SUMMARY - Regulating Big Tech Power
A small business owner watches her company disappear overnight after a platform changes its algorithm, the traffic that sustained her enterprise vanishing without explanation, appeal, or recourse, her years of work building a business dependent on a platform she does not control evaporating because decisions made in a distant headquarters by people she will never meet determined that her visibility would end, the platform under no obligation to explain, justify, or reverse what it has done. An entrepreneur develops an application that gains traction, attracting the attention of a dominant platform that first offers to acquire it, then, when refused, launches a competing feature using data gathered from observing the application's success on its platform, the startup unable to match the resources that can be deployed against it and watching its user base migrate to the integrated alternative, the platform having used its position to identify, copy, and crush a potential competitor. A researcher attempts to study how a platform's recommendation systems affect public discourse, finding that the data necessary for research is controlled by the platform, that the platform's terms prohibit the access her research would require, and that the opacity she seeks to penetrate is itself the barrier that prevents penetration, the subjects of her study controlling what can be known about them. A legislator proposes breaking up a technology conglomerate, facing the response that the company's services are so integrated that separation would harm consumers, that the company's scale enables investments that benefit everyone, and that regulation would advantage foreign competitors, the arguments perhaps self-serving but not obviously wrong, the question of whether breakup would help or harm genuinely difficult to answer.
A user contemplates leaving a platform where her friends, photos, professional network, and digital history reside, finding that departure means losing connections she cannot recreate elsewhere, that the friction of switching exceeds what she can bear, and that her theoretical freedom to leave masks practical captivity to a service she increasingly dislikes but cannot abandon. The concentration of power in a handful of technology companies has become one of the defining features of the contemporary economy, raising questions that existing regulatory frameworks were not designed to answer: what happens when private companies control infrastructure as essential as public utilities once were, when market dominance enables not just economic power but influence over information, discourse, and democratic participation itself, and when the tools developed to address industrial-age monopolies may not fit digital-age platforms whose power operates through mechanisms that antitrust law struggles to see.
The Case for Aggressive Intervention
Critics argue that technology platform concentration has reached levels that threaten economic competition, democratic governance, and individual autonomy, that existing regulatory tools have proven inadequate, and that aggressive intervention including structural separation is necessary to address harms that behavioral remedies cannot reach. From this view, the power that has accumulated requires a response commensurate with the threat.
Concentration has reached unprecedented levels. A handful of companies dominate search, social media, mobile operating systems, cloud computing, e-commerce, and digital advertising. Their combined market capitalization exceeds the GDP of most nations. The concentration is not merely economic but extends to control over information flows, public discourse, and the infrastructure on which modern life depends. No previous era has seen such power concentrated in so few private hands.
Market power produces documented harms. Dominant platforms can raise prices, reduce quality, and stifle innovation without losing customers who have nowhere else to go. Sellers dependent on platforms face terms they cannot negotiate. Competitors face barriers they cannot overcome. The harms from market power are not theoretical but observable in higher prices, reduced choice, and suppressed innovation.
Network effects and data advantages create durable dominance. Platforms become more valuable as more people use them, making established platforms difficult to displace regardless of quality. Data accumulated from users enables targeting and personalization that newcomers cannot match. The dynamics that created dominance perpetuate it. Competition that might discipline other markets cannot discipline platforms whose dominance is self-reinforcing.
Acquisitions have eliminated potential competitors. Platforms have acquired hundreds of companies, many of which might have grown into competitive threats. Instagram, WhatsApp, YouTube, and countless smaller acquisitions removed potential competitors before they could challenge incumbents. The pattern suggests a strategy of eliminating competition rather than legitimate business expansion.
Platform power extends beyond economics to governance. Platforms make decisions about speech, access, and visibility that affect democratic participation. Their content moderation choices shape public discourse. Their algorithmic amplification determines what ideas spread. The exercise of quasi-governmental power by private companies without democratic accountability represents a threat to self-governance that economic analysis alone does not capture.
From this perspective, addressing platform power requires: recognition that existing frameworks have failed to prevent harmful concentration; structural remedies including breakup where behavioral remedies are insufficient; prohibition of acquisitions that eliminate potential competition; interoperability requirements that reduce switching costs and enable competition; and understanding that platform power is not merely economic but threatens democratic governance itself.
The Case for Measured Response
Others argue that platform concentration reflects genuine efficiencies that benefit consumers, that aggressive intervention risks harming the innovation and services people value, and that measured approaches addressing specific harms serve better than structural intervention whose consequences are uncertain. From this view, the power platforms have accumulated reflects success in serving consumers that intervention might undermine.
Consumers have benefited enormously. The services dominant platforms provide are often free to users, high quality, and continuously improving. Search that organizes the world's information, social connections across distance, access to vast product selection, and countless other benefits flow from platforms people choose to use. The claim that consumers are harmed sits uneasily with the reality that consumers choose these services over alternatives.
Scale produces efficiencies that benefit everyone. Large platforms can invest in research, security, and infrastructure that smaller competitors cannot match. Cloud computing that powers countless businesses benefits from scale. The investments that platform revenue enables produce innovations that benefit users and the broader economy. Breaking up platforms might sacrifice efficiencies that serve public interest.
Market dominance is not permanent. Technology history shows dominant companies displaced by innovation. IBM dominated computing until Microsoft displaced it. Microsoft dominated until mobile platforms emerged. Today's dominance may be tomorrow's obsolescence. The market disciplines that intervention would provide may be unnecessary because technological change provides them.
Intervention carries risks. Structural separation of integrated services might harm consumers who benefit from integration. Regulatory remedies might entrench incumbents who can manage compliance that challengers cannot. Intervention designed for American companies might advantage foreign competitors, particularly Chinese platforms that would not face equivalent constraints. The consequences of intervention are uncertain and might be worse than the problems intervention addresses.
Existing tools may suffice. Antitrust enforcement has increased. Regulatory frameworks for specific harms are developing. Platform accountability for content, competition, and consumer protection can be achieved through targeted intervention. Structural breakup is not the only option.
From this perspective, appropriate response requires: recognition that platform services provide genuine consumer benefit; attention to whether specific harms warrant specific remedies rather than structural intervention; caution about intervention whose consequences cannot be predicted; consideration of international competitive effects; and measured approach that addresses identified problems without destroying what works.
The Antitrust Framework Challenge
Existing antitrust law, developed to address industrial-age monopolies, may not fit digital platforms whose power operates differently.
Traditional antitrust focuses on consumer welfare measured primarily through prices. The framework assumes that monopoly power manifests in higher prices that harm consumers. Intervention is warranted when prices exceed competitive levels.
Platform economics challenge this framework. Many platform services are free to users, supported by advertising or data monetization. If price is zero, price-based analysis suggests no consumer harm regardless of market power. The framework designed for industrial monopolies charging above-competitive prices does not capture harms from platforms that charge nothing.
Two-sided markets complicate analysis. Platforms serve multiple constituencies: users, advertisers, sellers, developers. Harms to one side might benefit another. Analysis that focuses on one side misses dynamics that operate across sides. Traditional single-market analysis may not capture platform economics.
Network effects and tipping create winner-take-all dynamics that traditional analysis may miss. Markets that tip to a single dominant platform differ from markets where multiple competitors can coexist. Analysis designed for the latter may not adequately address the former.
From one view, antitrust law requires updating to address platform economics. New legal frameworks that recognize platform-specific dynamics should replace or supplement existing doctrine.
From another view, existing antitrust tools are adequate if properly applied. The consumer welfare standard can encompass quality, innovation, and privacy harms beyond price. Existing tools need application, not replacement.
From another view, antitrust alone cannot address platform power. Competition policy addresses market power; platform power extends to governance, discourse, and democracy. Multiple frameworks are needed for problems that exceed antitrust's scope.
Whether existing antitrust frameworks can address platform power or whether new frameworks are needed shapes regulatory approach.
The Market Definition Problem
Antitrust analysis traditionally begins with defining relevant markets, but defining markets for platforms is contested.
Is the relevant market search, or advertising, or information services more broadly? Is social media one market, or are different types of social platforms different markets? Is Amazon competing in retail, in e-commerce, in cloud services, or in all of these?
Market definition affects assessment of dominance. Narrow definition makes dominance appear greater; broad definition makes it appear smaller. The definition often determines the outcome.
From one view, traditional market definition cannot capture platform competition. Platforms compete across traditional market boundaries. Zero-price services challenge frameworks that assume price competition defines markets.
From another view, market definition remains essential for disciplined analysis. Without market definition, claims about dominance become unfounded assertions. The challenge is defining markets correctly, not abandoning market definition.
From another view, market definition has become a tool for manipulation. Sophisticated advocates can define markets to produce desired conclusions. The framework enables strategic argumentation more than objective analysis.
How to define markets for platform services and whether market definition remains appropriate analytical tool shapes antitrust application.
The Consumer Harm Debate
What constitutes consumer harm from platform dominance is contested.
Price harm may be minimal when services are free. But consumers may pay through data they provide, attention they give, or reduced quality they cannot observe. Non-price harms including privacy degradation, addictive design, and reduced innovation may matter more than price.
Quality harm from reduced innovation may be significant but difficult to measure. Startups not funded, products not developed, and innovations not pursued because platform dominance deters them are invisible. The counterfactual of what would have emerged with more competition cannot be observed.
From one view, consumer harm from platforms is substantial even if not reflected in prices. Privacy degradation, manipulation, and reduced innovation are real harms that analysis should recognize.
From another view, claimed harms are speculative. Without observable price increases or quality reductions, consumer harm claims rest on assumptions that may not be accurate.
From another view, the consumer welfare standard should be supplemented or replaced. Harms to competition, to democracy, and to workers matter regardless of consumer effects. A broader framework than consumer welfare is needed.
What harms platform dominance produces and how to measure them shapes intervention justification.
The Killer Acquisition Problem
Platforms have acquired hundreds of companies, raising questions about acquisitions that eliminate potential competition.
Instagram's acquisition by Facebook in 2012 for one billion dollars has become the paradigmatic example. A potential competitor to Facebook's social networking dominance was acquired before it could fully challenge that dominance. The acquisition was approved under then-applicable standards; whether it should have been remains debated.
WhatsApp, YouTube, Android, and countless smaller acquisitions followed similar patterns: platforms acquiring potential competitors before they could threaten incumbent positions.
From one view, killer acquisitions have been primary mechanism through which platforms maintained dominance. Stricter merger enforcement would have produced more competitive markets.
From another view, acquisitions often benefit acquired companies and their users. Instagram thrived under Facebook ownership. WhatsApp expanded globally. Acquisitions that appear anticompetitive may have produced benefits that independent operation would not.
From another view, the harm from killer acquisitions cannot be known because the counterfactual cannot be observed. Whether Instagram would have become a meaningful competitor to Facebook absent the acquisition is unknowable.
Whether acquisitions have harmed competition and how to address future acquisitions shapes merger enforcement.
The Structural Separation Question
Structural remedies, including breakup, represent the most aggressive possibility for intervention.
Breaking up dominant platforms could separate different lines of business: search from advertising, marketplace from retail operations, social networking from messaging. Structural separation would prevent dominant platforms from leveraging dominance across lines of business.
Historical precedent exists. Standard Oil was broken up in 1911. AT&T was broken up in 1984. The resulting companies competed in ways the integrated monopoly prevented.
From one view, structural separation is necessary because behavioral remedies cannot address platform power. Integration enables leveraging that conduct remedies cannot effectively prevent. Only structural separation removes the ability to leverage.
From another view, structural separation would harm consumers who benefit from integration. Services work together. Separating them would sacrifice functionality people value. The integration that enables leveraging also enables consumer benefit.
From another view, separation would be practically difficult. Integrated systems cannot be easily divided. The technical challenge of separation exceeds what remedies in other industries required.
Whether structural separation would address platform power, whether it is practically feasible, and whether benefits would exceed costs shapes remedy debate.
The Interoperability Alternative
Interoperability requirements could enable competition without structural breakup.
Mandating interoperability would require platforms to allow communication with competing services. Users could message across platforms, port data between services, or use alternative clients for platform content.
Historical precedent exists. Telephone interoperability required competing networks to connect. Email interoperability enables communication across providers. Interoperability can enable competition while preserving network benefits.
From one view, interoperability would reduce switching costs that entrench incumbents. Users could gradually migrate without losing connections. Competition could emerge on top of existing networks.
From another view, interoperability would create security and quality concerns. Opening systems to third-party access creates vulnerabilities. Quality control becomes difficult when platforms cannot control what connects.
From another view, interoperability may not address underlying power. Platforms might remain dominant even with interoperability. Technical requirements might entrench incumbents who design interoperability to their advantage.
Whether interoperability would enable competition and what interoperability requirements should look like shapes regulatory design.
The Data Portability Dimension
Data accumulated on platforms creates competitive advantage that portability requirements might address.
Users generate data through platform use that platforms then use for competitive advantage. Data portability would enable users to take their data to competing services, potentially reducing data-based barriers to competition.
GDPR and other frameworks establish data portability rights. But exercising portability is cumbersome, and receiving services must be able to use ported data for portability to enable switching.
From one view, meaningful data portability could reduce lock-in and enable competition. Users who can take their data are freer to switch.
From another view, data portability alone is insufficient. Network effects persist regardless of data. The connections users have matter more than the data they can export.
From another view, data portability creates privacy and security concerns. Data ported to new services creates new risks. The privacy framework that applied to original data may not apply after portability.
Whether data portability can meaningfully enable competition and what portability requirements should include shapes regulatory design.
The Self-Preferencing Problem
Platforms that both operate marketplaces and sell on those marketplaces can preference their own products.
Amazon both hosts third-party sellers and sells its own products. It can observe what sells, copy successful products, and advantage its offerings in search and buy box placement. Similar dynamics apply to Apple's App Store, Google's search results, and other platforms that compete with those they host.
From one view, self-preferencing represents core competition concern. Platforms leveraging marketplace position to advantage their own offerings undermine competition that marketplaces should enable.
From another view, vertical integration that includes self-preferencing can be efficient. Integrated offerings may serve consumers better. Private labels compete with branded products in physical retail without general prohibition.
From another view, the power to self-preference distinguishes platform contexts from traditional retail. Platform control over visibility, discovery, and purchase flow creates leverage that traditional retail does not provide.
Whether self-preferencing should be prohibited, regulated, or accepted shapes platform competition policy.
The Gatekeeper Framework
The European Union's Digital Markets Act introduces a gatekeeper concept that defines obligations for platforms meeting specified criteria.
Gatekeepers are platforms with a significant impact on the internal market, operating core platform services that serve as important gateways for business users to reach end users, and enjoying an entrenched and durable position. Designated gatekeepers face specific obligations including prohibitions on self-preferencing, requirements for interoperability, and data access provisions.
From one view, the gatekeeper framework appropriately targets obligations at platforms whose position warrants specific regulation. Not all platforms are gatekeepers; those that are should face obligations that reflect their position.
From another view, the gatekeeper framework is regulatory overreach. Designating specific companies for specific obligations based on size creates discriminatory regulation. Neutral competition principles should apply to all, not special obligations for the successful.
From another view, the gatekeeper framework is an experiment whose effects remain to be seen. Early implementation will reveal whether the approach produces intended benefits.
Whether the gatekeeper concept represents appropriate regulatory innovation and how it should be applied shapes platform regulation.
The Content Moderation Power
Platform power includes control over content that affects public discourse.
Platforms decide what speech is permitted, what is amplified, and what is suppressed. These decisions affect political debate, shape public understanding, and determine what ideas spread. Content moderation is governance power exercised by private entities.
From one view, platform content control represents inappropriate private power over public discourse. Decisions that affect democracy should not be made by unaccountable private entities. Platform content power should be constrained.
From another view, platforms are private entities entitled to set their own rules. First Amendment concerns apply to government, not private actors. Platforms should be free to moderate as they choose.
From another view, the scale of platform influence transforms traditional private speech analysis. Platforms that effectively control public discourse cannot be analyzed as ordinary private speakers. Their power warrants public accountability regardless of private status.
Whether platform content moderation power should be regulated, and how, shapes free expression in the digital age.
The Algorithmic Amplification Question
Beyond what content is permitted, platform algorithms determine what content is amplified.
Recommendation systems, news feeds, and search results shape what users see. These systems can amplify content that generates engagement regardless of accuracy, social value, or harm. Algorithmic amplification may spread misinformation, extremism, and divisive content because such content drives engagement that serves platform business models.
From one view, algorithmic amplification creates harms that content moderation alone cannot address. Platforms may allow content but amplify it in ways that produce social harm. Regulating amplification may be necessary alongside content rules.
From another view, regulating algorithms intrudes on platform editorial discretion. The ordering of content is an editorial choice that platforms are entitled to make. Government regulation of algorithms raises First Amendment concerns.
From another view, transparency about algorithmic function may be more appropriate than regulation of algorithmic choices. Enabling users and researchers to understand how algorithms work could address harms without prescribing algorithmic design.
Whether and how to address algorithmic amplification shapes platform accountability.
The Political Influence Dimension
Platform power extends to political processes through advertising, information access, and mobilization capacity.
Political advertising on platforms can be microtargeted, personalized, and delivered at scale in ways traditional advertising cannot match. The data platforms possess about users enables political targeting that can manipulate democratic participation.
Information access affects what citizens know. Platforms that determine what news reaches users shape the information environment within which citizens form opinions and make political choices.
Mobilization capacity enables platforms to influence political outcomes. Platforms that can contact millions of users instantly have capacity to shape political participation in ways few other entities can match.
From one view, platform political power threatens democratic governance. Private entities with such influence over democratic processes undermine self-governance.
From another view, platforms provide political participation opportunities that prior media did not. Citizens can organize, access information, and participate politically through platforms in ways previously impossible.
From another view, platform political power requires transparency and limits but not elimination. Democratic societies can regulate political advertising, require disclosure, and limit targeted manipulation without eliminating platform political speech.
What limits on platform political influence are appropriate and how to balance them with platform speech rights shapes democratic regulation.
The Labor and Gig Economy Dimension
Platform power extends to labor relationships through gig economy models.
Platforms have created work arrangements where workers are classified as independent contractors, lacking employment protections, benefits, and collective bargaining rights that employment provides. Platform control over work allocation, pricing, and access creates power over workers who formally are not employees.
From one view, gig economy arrangements represent exploitation enabled by platform power. Workers bear risks while platforms capture value. The classification as independent contractors evades responsibilities that employment would require.
From another view, gig economy arrangements provide flexibility that workers value. Many prefer independent contractor status to traditional employment. Platform work enables work arrangements that traditional employment does not.
From another view, the classification question should be resolved through labor law rather than competition policy. Whether gig workers should be employees is a labor question that platform power analysis does not directly address.
How platform labor arrangements relate to platform power and what regulation is appropriate shapes gig economy policy.
The International Competition Dimension
Platform power operates globally, raising questions about international competitive effects of regulation.
American platforms dominate globally. European and other regulation of American platforms shapes global digital economy. Chinese platforms present alternative that might benefit from constraints on American competitors.
From one view, regulating American platforms advantages Chinese competitors who face no equivalent constraints. Weakening American platforms in the name of competition might produce global dominance by entities with less commitment to open internet values.
From another view, the international competition argument is deployed strategically to avoid regulation. American platforms invoke Chinese competition to resist constraints regardless of whether those constraints would actually benefit Chinese competitors.
From another view, international coordination on platform regulation would address competitive concerns. If all jurisdictions applied similar frameworks, competitive distortion would be reduced.
How international competition affects platform regulation decisions, and whether it should, shapes the geopolitical dimension of the debate.
The Innovation Effects
Platform power may affect innovation in ways that competition analysis should consider.
From one view, platform dominance suppresses innovation. Startups that might challenge incumbents cannot attract funding when platforms can copy or acquire them. The innovation that would occur with more competitive markets does not occur under platform dominance.
From another view, platforms drive innovation. The resources platform revenue provides enable research and development at scale that smaller competitors cannot match. Platform APIs and ecosystems enable innovation by others. Platforms may produce more innovation than competition would.
From another view, innovation effects are complex and context-dependent. Platforms may suppress some innovation while enabling other innovation. The net effect is difficult to assess.
Whether platform dominance increases or decreases innovation and how to assess innovation effects shapes intervention justification.
The Regulatory Capacity Challenge
Effective platform regulation requires capacity that regulators may lack.
Technical understanding of platform operations, economic expertise in platform markets, and legal capacity to pursue complex cases require resources and expertise that regulatory agencies may not possess.
Platform resources for litigation, for regulatory engagement, and for delaying tactics exceed what agencies can match. Regulatory proceedings can extend for years while platforms continue operating.
From one view, regulatory capacity limitations doom platform regulation to ineffectiveness. Agencies cannot match platform resources. Whatever rules are enacted cannot be effectively enforced.
From another view, regulatory capacity can be built. Investment in agency expertise and resources would enable effective regulation. Capacity limitations reflect choice, not necessity.
From another view, regulatory design should account for capacity constraints. Rules that are simpler to enforce may be more effective than complex rules that exceed enforcement capacity.
Whether regulatory capacity can match platform complexity and how to address capacity limitations shapes regulatory feasibility.
The Digital Markets Act and Platform Regulation
The European Union's Digital Markets Act represents the most comprehensive attempt at platform-specific regulation.
The DMA establishes ex ante obligations for designated gatekeepers. Rather than waiting for harm and pursuing ex post enforcement, the DMA imposes ongoing obligations that gatekeepers must satisfy. Obligations include prohibitions on self-preferencing, interoperability requirements, and limits on data combination.
From one view, the DMA represents regulatory innovation that addresses platform power that antitrust alone cannot. Ex ante obligations provide certainty and prevent harms rather than addressing them after the fact.
From another view, the DMA represents regulatory overreach that will harm innovation and disadvantage European consumers. Platform-specific regulation that applies primarily to American companies raises trade and competitiveness concerns.
From another view, the DMA is an experiment whose effects remain uncertain. Implementation will reveal whether the approach achieves intended goals without unintended harms.
What the DMA represents and whether its approach should be adopted elsewhere shapes platform regulation globally.
The American Legislative Proposals
Various American legislative proposals would address platform power through different mechanisms.
Proposals have included prohibitions on self-preferencing, requirements for interoperability, restrictions on acquisitions, and structural separation requirements. Different proposals reflect different theories about what platform harms are and how to address them.
From one view, comprehensive platform legislation is necessary to address harms that existing law cannot reach. Antitrust enforcement alone is insufficient.
From another view, existing antitrust law adequately addresses competition concerns if properly enforced. New legislation is unnecessary and risks unintended consequences.
From another view, bipartisan concern about platform power could produce legislation despite partisan division on other technology issues. Platform regulation may represent a rare area of potential agreement.
Whether American platform legislation will be enacted and what form it might take shapes American regulatory trajectory.
The Enforcement Trajectory
Antitrust enforcement against platforms has intensified.
The Department of Justice has pursued cases against Google concerning search distribution and advertising technology. The Federal Trade Commission has challenged Meta, including its acquisitions of Instagram and WhatsApp. State attorneys general have filed their own suits. The enforcement trajectory has shifted toward more aggressive platform scrutiny.
From one view, increased enforcement demonstrates that existing tools can address platform power. Enforcement, not new legislation, is what is needed.
From another view, enforcement cases are slow and outcomes uncertain. Litigation that takes years to resolve cannot address rapidly evolving markets. Enforcement complements but cannot substitute for ex ante regulation.
From another view, the enforcement trajectory may shift with administrations. Aggressive enforcement reflects current priorities that future leadership might not share.
What enforcement cases will achieve and whether enforcement can address platform power shapes assessment of existing tools.
The Private Litigation Dimension
Private antitrust litigation complements public enforcement.
Companies harmed by platform practices can pursue private damages actions. Developers, advertisers, and competitors have filed suits alleging platform antitrust violations. Private litigation can provide remedies that public enforcement does not and can generate information useful for public enforcement.
From one view, private litigation extends enforcement capacity beyond what public agencies can provide. Private attorneys general multiply enforcement resources.
From another view, private litigation may not serve public interest. Strategic suits may burden competitors rather than vindicate competition. Settlement incentives may not align with public interest.
From another view, private litigation complements but cannot substitute for public enforcement. Private parties pursue their own interests; public enforcers can pursue broader public interest.
What role private litigation should play in platform competition enforcement shapes litigation policy.
The Remedies Design Challenge
If platform power requires intervention, which remedies would be effective remains contested.
Structural remedies including breakup would separate integrated operations. But integration may benefit consumers, and separated entities might reconsolidate.
Behavioral remedies would prohibit specific conduct. But behavioral remedies are difficult to monitor and enforce. Platforms can find alternative means to achieve similar ends.
Interoperability remedies would require opening platforms to competitors. But mandated interoperability creates technical challenges and may not address underlying power.
From one view, the difficulty of designing effective remedies should not prevent intervention. Imperfect remedies are better than no intervention.
From another view, remedies whose effectiveness is uncertain risk doing more harm than good. Intervention should proceed cautiously when remedy design is unsettled.
From another view, different remedies may suit different contexts. Structural separation might be appropriate for some platforms while behavioral remedies suit others.
What remedies would effectively address platform power and how to design them shapes intervention strategy.
The Small Business and Seller Perspective
Businesses that depend on platforms for market access have a particular stake in platform regulation.
Small businesses that sell through Amazon, advertise through Google, or reach customers through Facebook depend on platforms they do not control. Platform decisions about visibility, fees, and access can determine business survival.
From one view, small business dependence on platforms demonstrates platform power that regulation should address. Businesses that can be destroyed by platform decisions face coercive power that competition should check.
From another view, platforms provide market access that small businesses could not otherwise achieve. Before platforms, small businesses could not reach global customers. Dependence reflects benefit that platform access provides.
From another view, small business concerns about platforms may not align with consumer welfare. Interventions that address seller concerns might not benefit consumers and might even harm them.
What small business dependence reveals about platform power and how small business interests should figure in regulation shapes stakeholder analysis.
The Worker and Creator Perspective
Workers and creators who generate value on platforms face power imbalances that regulation might address.
Gig workers, content creators, and developers create value that platforms capture. The terms on which they operate are set by platforms without meaningful negotiation. Collective action that might check platform power faces legal and practical obstacles.
From one view, worker and creator concerns demonstrate platform power that extends beyond traditional market analysis. Labor and creator perspectives should inform platform regulation.
From another view, worker and creator concerns are better addressed through labor law and intellectual property than through competition policy. Platform power analysis may not be appropriate frame.
From another view, worker and creator organizing could provide countervailing power that regulation alone cannot. Supporting collective action might address power imbalances that regulation struggles to reach.
How worker and creator interests relate to platform power and what protections are appropriate shapes labor and creator policy.
The Canadian Context
Canada addresses platform power within its particular circumstances.
Canada's Competition Bureau has examined platform markets and pursued some enforcement actions. Canadian courts have addressed platform competition issues. But Canadian enforcement capacity is limited relative to platform scale.
Canadian policy exists in the context of American platform dominance. American platforms serving Canadian users raise jurisdictional questions about which rules apply and which authorities can enforce them.
Canada has enacted platform-specific legislation, including the Online Streaming Act and the Online News Act, addressing content regulation and news compensation respectively. These represent a Canadian assertion of regulatory authority over platforms operating in Canada.
From one perspective, Canada should develop stronger platform competition frameworks that assert Canadian interests against global platforms.
From another perspective, Canadian market size limits regulatory leverage. Platforms might withdraw from the Canadian market rather than comply with requirements they consider unreasonable.
From another perspective, Canada should coordinate with like-minded nations to develop platform regulation that achieves leverage Canadian action alone cannot.
How Canada addresses platform power and what leverage Canada possesses shapes Canadian policy.
The Democratic Accountability Question
Platform power raises fundamental questions about accountability in democratic societies.
Platforms exercise power that affects democratic participation, public discourse, and individual autonomy. This power is not democratically accountable. Platform governance decisions are made by private entities accountable to shareholders, not citizens.
From one view, democratic accountability requires subjecting platforms to democratic control. Private power of this magnitude cannot be compatible with democratic self-governance.
From another view, democratic accountability through regulation rather than direct control is appropriate. Platforms remain private but operate within democratically determined rules.
From another view, some platform power should not be democratically controlled. Government control over speech platforms raises censorship concerns. Private platform power may be preferable to government speech control.
What democratic accountability for platform power requires and how to achieve it shapes governance theory.
The Future Trajectories
Platform power regulation may develop in various directions.
A fragmentation trajectory would see platforms broken up or required to interoperate, producing more competitive markets with multiple providers.
A utility trajectory would see dominant platforms regulated as public utilities, subject to the price, access, and service obligations that utilities face.
A continuation trajectory would see platform dominance persist despite regulatory effort, with platforms adapting to constraints while maintaining fundamental power.
Which trajectory materializes depends on regulatory choices, enforcement outcomes, and technological change.
The Fundamental Questions
Regulating big tech power raises fundamental questions about markets, governance, and democracy.
Can market competition discipline platforms whose power derives from network effects and data advantages that competition cannot easily overcome?
Can behavioral remedies effectively constrain platform conduct when platforms can adapt to rules while maintaining underlying power?
Can structural remedies produce competition without sacrificing efficiencies that benefit consumers?
Should platform power be understood through an economic framework or as governance power requiring political accountability?
These questions will shape platform regulation regardless of specific interventions attempted.
The Question
If a handful of companies control infrastructure as essential as public utilities once were, if their market power enables them to shape which businesses can reach customers, which ideas can spread, and which innovations can emerge, if existing antitrust frameworks developed for industrial-age monopolies may not fit digital-age platforms whose power operates through network effects, data advantages, and ecosystem control that traditional analysis struggles to capture, and if the acquisitions and practices that produced this concentration were permitted under rules that may have been inadequate to the challenge, should the response be aggressive intervention including structural breakup that might restore competition but risks harming consumers who benefit from integration, measured behavioral regulation that addresses specific harms while preserving platform benefits, acceptance that platform concentration reflects genuine efficiencies that intervention should not disturb, or something else entirely that current frameworks have not yet developed? When platforms can destroy businesses by changing algorithms, can copy and crush potential competitors, can control information flows that affect democratic participation, and can exercise governance power without democratic accountability, when the power they have accumulated exceeds what any previous private entities have possessed, and when their capacity to influence the political processes that might constrain them raises questions about whether democratic regulation is even possible, what regulatory approach can address platform power without producing harms greater than those it addresses, what remedies might actually work rather than merely provide the appearance of action, and is the choice truly between imperfect intervention and unacceptable concentration, or do possibilities exist that neither defenders of platform power nor its critics have yet adequately articulated?