Who Regulates the Feed? Platform Power & Public Interest
Social media platforms shape what billions of people see, read, and believe. Their algorithms determine which voices are amplified and which are suppressed; their policies define acceptable speech; their enforcement actions can silence individuals or movements. This power over public discourse rivals that of governments—yet platforms operate as private companies with limited public accountability. Understanding platform power and debates about its governance helps citizens engage with one of the defining questions of contemporary democracy.
The Scope of Platform Power
Billions of users depend on platforms. Facebook, YouTube, Twitter/X, TikTok, and similar platforms reach populations larger than most countries. What happens on these platforms affects public discourse globally.
Algorithmic curation decides what spreads. Platforms don't just host content; they actively promote some items and suppress others. These curatorial decisions shape which ideas reach audiences and which disappear without notice.
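To make the mechanism concrete, here is a minimal, purely illustrative ranking sketch in Python. The post fields, weights, and scoring formula are invented for this example; no real platform's system is this simple. The structural point survives the simplification: whoever sets the weights decides what spreads.

    # Illustrative only: a toy engagement-weighted feed ranker.
    # Fields and weights are hypothetical, not any platform's real system.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        shares: int
        comments: int
        reports: int  # user flags for possible policy violations

    def score(post: Post) -> float:
        # Shares weighted most heavily: reshared content travels furthest.
        engagement = post.likes + 3 * post.comments + 5 * post.shares
        # A demotion term: flagged posts sink in the feed without being removed.
        return engagement - 20 * post.reports

    def rank_feed(posts: list[Post]) -> list[Post]:
        # The feed is simply posts sorted by score; changing the weights
        # above silently changes which voices are amplified.
        return sorted(posts, key=score, reverse=True)

    posts = [
        Post("a", likes=10, shares=0, comments=2, reports=0),  # score 16
        Post("b", likes=2,  shares=4, comments=0, reports=0),  # score 22
        Post("c", likes=50, shares=1, comments=5, reports=3),  # score 10
    ]
    print(rank_feed(posts)[0].text)  # "b" tops the feed despite the fewest likes

Even in this toy version, adjusting a single coefficient, say the demotion weight, reshapes what reaches audiences, and nothing visible to users signals that it happened.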
Content moderation defines acceptable speech. Platform rules about hate speech, misinformation, harassment, and other content categories determine what can be said. Enforcement of these rules removes content and bans users.
Economic dominance gives platforms leverage. Their control of digital advertising translates into economic power over publishers and creators, and dependence on platform distribution leaves content producers vulnerable to platform decisions.
Private Governance of Public Discourse
Platforms make quasi-governmental decisions. Defining speech rules, enforcing them, and punishing violations are functions historically associated with states. Platforms perform these functions privately.
Terms of service function as law. Users must agree to platform rules that govern their behavior. These private contracts effectively regulate speech with limited opportunity for negotiation or appeal.
Scale makes platform decisions consequential. When a platform with billions of users changes its policies, the effects are enormous. Small policy tweaks can significantly reshape public discourse.
Accountability is limited. Platforms aren't elected, face no referendums, and can't be voted out. Users can exit in theory, but network effects make exit costly. Democratic accountability mechanisms don't apply.
The Case for Platform Autonomy
Private property rights protect platform choices. Platforms are private companies entitled to set rules for their own services. Government regulation of their editorial choices raises First Amendment concerns in the United States and free expression concerns elsewhere.
Platforms moderate harmful content. Without platform moderation, online spaces become hostile environments dominated by harassment, spam, and abuse. Some governance of these spaces serves users.
Government regulation could be worse. Governments might use regulatory power to censor political speech, suppress dissent, or favor incumbent politicians. Private platforms may protect speech that governments would restrict.
Competition could discipline platforms. If users dislike platform policies, they can use alternatives. Market competition could constrain platform power without government intervention—if competition actually exists.
The Case for Public Regulation
Public functions warrant public accountability. When private companies perform public functions—governing speech in public squares—public accountability becomes appropriate regardless of private ownership.
Market failures limit competition. Network effects, switching costs, and data advantages create monopolistic dynamics. Competition that might discipline platforms in theory doesn't exist in practice.
Power concentration threatens democracy. Unaccountable control over public discourse by a few companies concentrated in one country poses risks to democratic self-governance everywhere.
Externalities affect non-users. Platform decisions reach societies beyond their user bases. Election integrity, public health, and social cohesion are all shaped by platform choices that users alone can't adequately govern.
Regulatory Approaches
Transparency requirements mandate disclosure. Laws requiring platforms to explain their algorithms, report on moderation, and provide researcher access would reduce opacity without controlling content decisions.
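As a hypothetical illustration of what mandated moderation reporting could involve, the sketch below aggregates individual enforcement actions into counts by action type and policy category, roughly the level of aggregate disclosure transparency laws contemplate. The record format and category names are assumptions made up for this example, not any statute's actual requirements.

    # Hypothetical sketch: aggregating moderation actions for a transparency report.
    # Record fields and category names are invented for illustration.
    from collections import Counter

    actions = [
        {"action": "remove", "category": "harassment"},
        {"action": "remove", "category": "spam"},
        {"action": "label",  "category": "misinformation"},
        {"action": "remove", "category": "harassment"},
    ]

    def transparency_report(records: list[dict]) -> dict[str, int]:
        # Counts per (action, category) pair: aggregate figures that can be
        # published without exposing individual users' content.
        counts = Counter((r["action"], r["category"]) for r in records)
        return {f"{act}:{cat}": n for (act, cat), n in counts.items()}

    print(transparency_report(actions))
    # {'remove:harassment': 2, 'remove:spam': 1, 'label:misinformation': 1}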
Due process requirements protect users. Requiring notice of enforcement actions, opportunity to respond, and meaningful appeal would give users procedural protections against arbitrary treatment.
Interoperability requirements reduce lock-in. Mandating that platforms interoperate, for example letting users message across services or take their data and social connections with them when leaving, would lower the network-effect barriers to competition.
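The sketch below illustrates the data-portability half of this idea: exporting a user's posts and follow graph to a neutral format another service could import. The schema is invented for this example; real interoperability efforts build on open protocols such as ActivityPub, which this toy format does not implement.

    # Illustrative sketch: exporting an account in a portable JSON format.
    # The schema identifier and field names are hypothetical.
    import json

    def export_account(username: str, follows: list[str], posts: list[str]) -> str:
        # A departing user takes connections and content along; a competing
        # service could read this file to rebuild the account elsewhere.
        return json.dumps({
            "schema": "toy-portability/1.0",  # invented identifier
            "user": username,
            "follows": follows,
            "posts": posts,
        }, indent=2)

    print(export_account("alice", ["bob", "carol"], ["hello, world"]))

Portability alone lowers switching costs; full interoperability, where users on one service can follow and message users on another, attacks the network effect itself.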
Liability rules shape incentives. Adjusting platform liability for user content, currently limited by intermediary protections such as Section 230 in the United States, would change moderation incentives, though in contested directions.
Antitrust enforcement addresses concentration. Breaking up dominant platforms or preventing acquisitions of competitors would create more competition and reduce any single platform's power.
Challenges of Regulation
Global platforms face national regulation. Platforms operate across borders while laws stop at them; how to regulate global services with territorial rules remains an unsolved problem.
Technical complexity challenges regulators. Platforms' AI systems, recommendation algorithms, and technical infrastructure are difficult for non-technical regulators to understand and oversee.
Rapid change outpaces law. Technology evolves faster than legislation. By the time laws pass, the platforms and practices they targeted may have changed.
Enforcement against well-resourced companies is difficult. Platforms can afford top lawyers, lobby against regulations, and restructure to evade rules. Enforcement requires sustained will and resources.
Platform Self-Governance
Oversight boards create quasi-judicial review. Meta's Oversight Board (established as the Facebook Oversight Board) reviews moderation decisions with some independence from the company. This model provides appeal and legitimacy, though critics question its actual independence.
Industry standards could establish norms. Platform companies agreeing to common standards for moderation, transparency, and due process could improve practices across the industry—if agreements have teeth.
Stakeholder governance would include affected parties. Governance structures that include users, civil society, advertisers, and creators alongside company management could broaden accountability—if platforms would accept such constraints.
Democratic Stakes
Information environment shapes democracy. What people know, believe, and discuss depends increasingly on platforms. Democratic deliberation requires an information environment that serves the public interest.
Concentrated power threatens democratic values. Whether held by governments or corporations, concentrated unaccountable power over speech threatens democratic norms. Platform power raises these concerns regardless of who exercises it.
Citizens have stakes in platform governance. As users, as members of societies affected by platforms, and as democratic citizens, people have legitimate interests in how platforms are governed—interests that current arrangements don't adequately address.
Conclusion
Platform power over public discourse is immense and inadequately accountable. Private governance of functions this consequential raises democratic concerns that neither pure market competition nor traditional regulation fully addresses. Finding governance arrangements that constrain platform power, protect speech, serve public interest, and respect legitimate platform autonomy is one of the defining challenges of contemporary democracy. The question of who regulates the feed admits no easy answer—but pretending the question doesn't exist serves only those who benefit from unaccountable power.