The internet has been an incredibly powerful tool for enabling users to share a wide variety of content. Some have argued that large online platforms like Facebook and Twitter have too much power over what speech can be heard. This concern often leads to calls to change Section 230, a critical legal protection that enables content moderation and protects platforms from being held liable for their users’ content, or is offered as evidence of these platforms’ “monopoly” power. Using antitrust enforcement to address concerns about content moderation, however, is unlikely to result in the desired policy changes and could set a dangerous precedent for abusing antitrust enforcement for purposes unrelated to competition. Policymakers should not ignore the important role Section 230 plays in enabling a dynamic market for services that host user-generated content well beyond the social media context. History has shown that it is often hard to predict which small company may prove to be an innovative and disruptive success, but a framework including Section 230 that allows new entrants to start with minimal regulation is most likely to yield a competitive marketplace and benefit consumers.

By Jennifer Huddleston1

 

I. INTRODUCTION

In January 2021, several social media platforms banned then-President Donald Trump’s accounts. Similarly, Amazon Web Services’ cloud hosting and Apple’s and Google’s app stores removed the social media app Parler for violating their terms of service. These actions led some to question whether the ability of these platforms to “silence” a sitting president and remove a potential new entrant was evidence of “monopoly” power. But even prior to the events of January 2021, some critics had argued that Section 230, a liability protection that allows a wide range of online services to engage in content moderation and prevents them from being held liable, in most cases, for content created by their users, was unfairly benefiting “Big Tech” companies. These critics argue that Section 230 has become a special privilege that provides tech giants with unfair power, and some even suggest that this necessitates breaking up these successful companies.

But looking only at large companies misunderstands the critical role Section 230 plays in enabling platforms of all sizes to host user-generated content. Focusing only on the alleged harms of specific content moderation actions misses the benefits that a vibrant marketplace for many different types of user-generated content, including social media, has provided. In many ways, changing Section 230 would make it more difficult for new entrants and smaller players, while pursuing antitrust action to address concerns about content moderation decisions would set a dangerous precedent for the use of antitrust law and be unlikely to result in the desired policy changes.

 

II. ANTITRUST AND CONTENT MODERATION

Pursuing antitrust action to achieve changes in large platforms’ content moderation decisions would set a dangerous precedent for the potential political abuse of antitrust and would be unlikely to achieve the desired policy outcomes. A growing number of platforms compete to host user-generated content in a dynamic and ever-changing market. Rather than being harmed, consumers are benefitting from greater access to information and to services on which to share their own content. Antitrust is a poor fit as a policy tool to address concerns about content moderation decisions.

A. The Market for User-Generated Content Remains Dynamic

The internet is a dynamic and booming market for a range of user-generated content, including social media. A little more than a decade ago, antitrust concerns regarding big tech were focused on players such as America Online (“AOL”) and MySpace. These concerns missed the overall dynamics and changes that new market players such as Facebook and Google were bringing.2 While today’s giants might appear unstoppable and dominant, new players such as TikTok and Clubhouse provide different models of social media and are gaining traction, particularly with younger users.3

This competition is even clearer if one breaks social media platforms down into their many different components. For example, Facebook’s WhatsApp competes with other encrypted messaging services such as Signal as well as other forms of messaging such as iMessage and Google Chat. Google’s YouTube must compete with an increasing array of streaming entertainment options, including similar free and consumer-generated services such as Twitch. This explosion of options has benefited consumers, including by providing additional competitors to previously existing services.

Innovation is often our best competition policy and can completely disrupt existing markets. In the case of services hosting user-generated content, there is no clear evidence of the behavior typically associated with a monopolist.4 For example, while AT&T’s breakup increased competition among long-distance providers, it was the expansion of new technologies such as voice over internet protocol (“VoIP”) and mobile that truly transformed the consumer experience.5 Similarly, by focusing only on brick-and-mortar video rental, antitrust regulators missed that new, dynamic competitors were changing the way consumers consumed home entertainment.6

B. Antitrust Remedies Would Not Resolve Concerns About Content Moderation

The definition of the relevant market for these services is much debated. But even if one were to successfully determine that a giant such as Facebook or Google was a monopolist when it came to a certain type of user-generated content service, breaking up an existing tech giant would not address concerns about content moderation and could even exacerbate the perceived problems.

There is no guarantee that a smaller or separate platform would have different content moderation strategies than the existing giants. Services typically engage in content moderation to support the environment for the types of interactions they wish to host. Smaller platforms with general audiences would still be responding to the same consumer demands regarding content. Breaking up current large platforms would not change the way platforms respond to concerns about “cancel culture” or hate speech, given that existing standards are designed to support the user experience.

In some cases, breaking up larger tech giants into smaller separate companies might make addressing existing concerns about harmful content or certain actions even more difficult. Smaller platforms have fewer resources, and this can make content moderation more difficult. In some cases, they may no longer be able to afford the content moderation tools they had access to when part of a larger company.7 They are also likely to have more limited staff to dedicate to monitoring for problematic content. The result could be that content moderation overall becomes more difficult, that more problematic content such as violence or nudity gets left up, and that more legitimate content gets miscategorized and wrongly taken down.

As a result, not only is antitrust action ill-advised given the current dynamics of the market, but it would also fail to resolve the Big Tech critics’ concerns.

C. Expanding the Definition of Harm to Include Content Moderation Would Open Antitrust to Other Political Purposes

Even if one disagrees with this analysis of the current tech market and the potential impact of a breakup on content moderation policies, there should still be hesitancy to use antitrust enforcement to respond to concerns beyond its traditional scope. Antitrust is a powerful tool with dramatic consequences, designed to ensure consumers receive the benefits of a competitive market, and it should not be used to address other policy concerns or target unpopular industries. Using antitrust to achieve policy changes that do not stem from anti-competitive behavior, such as changes to content moderation policy, would shift antitrust law away from its current objective standard, which provides certainty for businesses of all sizes. Opening up what constitutes a potential violation to new concerns such as the impact of content moderation policies would create uncertainty about which behaviors are considered anti-competitive or harmful and likely to result in enforcement actions.8 This would make antitrust far more disruptive and would disturb dynamic and competitive markets. It should be particularly concerning when such proposed changes are coupled with arguments around speech.

Using antitrust enforcement to shape content moderation policy could result in such standards being abused for political purposes, such as targeting companies that supported political opponents or that have simply become unpopular.9

 

III. WOULD CHANGES TO SECTION 230 PROMOTE COMPETITION?

Not only would antitrust enforcement fail to resolve concerns about online content moderation, but changing Section 230 could also have a negative impact on competition. Without Section 230, new services carrying user-generated content would lack the legal certainty that protects them from potentially business-ending liability and would face additional costs when starting a service.

A. Small Platforms Would Lack Resources to Weather Increased Litigation

Section 230 has been important in allowing new services hosting user-generated content to emerge with minimal barriers to entry. Without Section 230, new platforms might find themselves subject to lawsuits either from users disgruntled with their content moderation decisions or over what another user stated on their service. Since the cost of defending these suits can quickly reach tens or even hundreds of thousands of dollars,10 large platforms with well-established legal teams may be able to afford the costs associated with this increased litigation, but small platforms would be less likely to have these resources. As the CEO of new social networking service MeWe wrote, “The big boys have deep pockets. They can easily hire the massive moderation and legal teams that would be necessary to defend themselves. I can’t. Revoking Section 230 would put hundreds of startups and other smaller companies out of business.”11

These fears are not without merit. As Techdirt’s Mike Masnick points out, smaller hosts of third-party content already find themselves subject to litigation, and the impetus behind many notable defamation cases against online services is rarely just the monetary compensation the defamed party may receive.12

To understand what this reality might look like, one can look at examples of how other liability regimes regarding certain kinds of content impact small companies online. Under the Digital Millennium Copyright Act (“DMCA”), companies are subject to a notice-and-takedown requirement for copyright claims. This means that a company that, after notification by the copyright owner, fails to take down or otherwise restrict content alleged to infringe copyright may be sued for its failure to do so. Nearly a third of DMCA claims may prove to be questionable, but because of the lack of liability protection, services that host content about which they have received a copyright claim still have to respond.13 This results in two concerning dynamics. The first is the unnecessary removal of speech that should have been allowed, driven by a desire to avoid the risk of litigation. The second is that small platforms have at times found themselves bankrupted by the accompanying litigation even when their decision about the associated content was vindicated in court.14

The ability to abuse such a system and impose burdensome litigation would dramatically increase in a world without Section 230, where platforms found themselves potentially liable for any type of user-generated content.

B. Revoking Section 230 Would Reduce New Entry Into the Market and Negatively Impact Consumers

Without Section 230, consumers would have fewer choices for user-generated content hosting services as startups struggled to meet the additional costs or investors were dissuaded by the potential litigation risk facing services without the most advanced tools.15 These difficulties may lead innovators to pursue other opportunities, resulting in fewer opportunities for online users to speak beyond the existing large platforms. This would be true not only for potential new entrants in social media, but also for a variety of other hosts of user-generated content such as video game streaming services, review sites, and websites with comment sections.16

Those new entrants that chose to pursue hosting user-generated content would quickly face “the moderator’s dilemma”: a choice between engaging in no moderation, in the hope of absolving themselves of any liability for the content, and engaging in very heavy-handed moderation that risks deleting allowable speech or detracting from the ability to interact in real time.17 Without content moderation, the experience on user-generated content services would be far from what most users desire, as they would be more likely to encounter pornography, violence, and other distasteful content along with the material they were looking for. As a result, to keep users, many platforms would be more likely to take the latter option and engage in heavy-handed moderation, but doing so would increase the costs to small platforms. Without sufficient resources, the user experience would likely be less enjoyable as users found their benign posts delayed in review to ensure bad actors were caught, or their legitimate content, such as a negative review, deleted after a complaint. Even the best content moderation system is likely to fail at some point, and these companies would still have to deal with the cost of litigation discussed above.18

Facing these struggles and the accompanying disadvantages would make it more difficult for new entrants to compete with existing, well-resourced giants.

C. Section 230 Enables a Market for Specialized Content and Services Beyond Social Media

One advantage of Section 230 is that it allows the creation of services that host user-generated content for specialized needs or smaller communities. This includes services that allow people to connect around certain medical conditions, serve minority communities, or focus on families. These communities often require content moderation tools that are more specific and tailored to the unique community’s needs. Faced with increased costs and litigation risks, such services would be less likely to be able to serve these smaller markets. This would limit the reach of communities that have traditionally struggled to connect and have their voices heard, leaving them with more limited options.19

Removing Section 230 would also limit conversations around important issues on general-use platforms. For example, without this protection, social media platforms might be less likely to carry information about sexual harassment during the #MeToo movement or to allow conversations about police brutality, for fear of the liability it could bring.20

Section 230’s usefulness also extends beyond social media. It is difficult to imagine how a review website could survive if it faced lawsuits over negative reviews that a business disagreed with, or how useful such sites would be if, to avoid litigation, they had to take down any disputed review. Similarly, content hosts such as Medium would be less able to allow writers to share their thoughts without the traditional publication process if they were liable for the content of every article. This extends to many other areas that may not initially be thought of as user-generated content, from user content on fitness apps such as Strava to Airbnb descriptions.21 Through these services, Section 230 has introduced new competitors into many markets and also lowered the barriers for a range of producers to share their services. As Santa Clara University law professor Eric Goldman notes, “there literally is no offline equivalent where complete strangers are comfortable enough with each other to blindly transact without doing any research on each other. That basic premise has unlocked hundreds of billions of dollars of wealth in our society (both producer surplus and consumer surplus).”22

While changes to Section 230 would impact many different services that host user-generated content, the effects would be most acutely felt by smaller services, including those that serve specialized audiences. These changes would limit the choices consumers have both to host their own speech and to access valuable information and services.

 

IV. CONCLUSION

In the current policy environment, conversations about Section 230 or other content moderation issues and antitrust often get muddled together. Aggressive antitrust enforcement does not seem to reflect the current market dynamics and is unlikely to resolve the concerns that some have about content moderation decisions by large platforms. The use of antitrust to resolve content moderation concerns would set a dangerous precedent for antitrust enforcement beyond traditional measures of consumer harm.

Section 230 is not part of the problem for those concerned about today’s tech giants; rather, it may hold the solution by allowing new entrants to host user-generated content with low barriers to entry. Ensuring liability protection while allowing services to determine appropriate content moderation rules for their audiences not only enables search engines and social media, but has also increased competition and information for many other services.

While individuals may be upset about particular content moderation decisions by certain platforms, it is important to recognize that the current framework for intermediary liability does not unfairly benefit large players and has been key to a flourishing of online voices. When considering questions of competition and content moderation, it is important that policymakers choose the right tools and carefully consider the consequences that intervention in a dynamic and innovative market may have. History has shown that it is often hard to predict which small company may prove to be an innovative and disruptive success, but a framework including Section 230 that allows new entrants to start with minimal regulation is most likely to yield a competitive marketplace and benefit consumers.


1 Jennifer Huddleston is the Director of Technology and Innovation Policy at American Action Forum. She has a BA from Wellesley College and a JD from the University of Alabama School of Law.

2 Adam Thierer & Trace Mitchell, The Crystal Ball of Antitrust Regulators is Cracked, National Review (Jul. 21, 2020), https://www.nationalreview.com/2020/07/antitrust-regulation-rapidly-changing-marketplace-requires-humility/.

3 More Young Teens Use TikTok than Facebook, Morning Consult, https://morningconsult.com/form/more-young-teens-use-tiktok-than-facebook/; Christopher Zara, Basically Everyone is on Clubhouse Now, Fast Company (Feb. 22, 2021), https://www.fastcompany.com/90606693/basically-everyone-is-on-clubhouse-now.

4 Adam Thierer & Jennifer Huddleston, Facebook and Antitrust, Part 1: What is the Relevant Market?, The Bridge (Jun. 7, 2019), https://www.mercatus.org/bridge/commentary/facebook-and-antitrust-part-1-what-relevant-market.

5 Adam Thierer & Jennifer Huddleston, Facebook and Antitrust, Part 3: Will Structural Remedies Solve Alleged Problems?, The Bridge (Jun. 18, 2019), https://www.mercatus.org/bridge/commentary/facebook-and-antitrust-part-3-will-structural-remedies-solve-alleged-problems.

6 Thierer & Mitchell, supra note 2.

7 Tyler Cowen, Breaking Up Facebook Would Be a Big Mistake, Slate (Jun. 13, 2019), https://slate.com/technology/2019/06/facebook-big-tech-antitrust-breakup-mistake.html.

8 Ryan Young, Antitrust Basics: Rule of Reason Standard vs. Consumer Welfare Standard, Competitive Enterprise Institute (Jul. 8, 2019), https://cei.org/blog/antitrust-basics-rule-of-reason-standard-vs-consumer-welfare-standard/.

9 Ben German, DOJ Whistleblower to Allege Political Interference in Antitrust Probes, Axios (Jun. 23, 2020), https://www.axios.com/justice-department-antitrust-whistleblower-6a506915-96c1-44a5-98e1-e6f93897fc5c.html.

10 Evan Engstrom, Primer: Value of Section 230, Engine, https://www.engine.is/news/primer/section230costs.

11 Mark Weinstein, Small Sites Need Section 230 to Compete, Wall Street Journal (Jan. 25, 2021), https://www.wsj.com/articles/small-sites-need-section-230-to-compete-11611602173.

12 Mike Masnick, How Mark Warner’s SAFE TECH Act Will Make Many People a Lot Less Safe, TechDirt (Mar. 26, 2021), https://www.techdirt.com/articles/20210323/07373746473/how-mark-warners-safe-tech-act-will-make-many-people-lot-less-safe.shtml.

13 Brent Lang, Policing the Pirates: 30% of Takedowns are Questionable (Study), Variety (Mar. 26, 2016), https://variety.com/2016/film/news/movie-piracy-robots-failing-copyright-protection-1201741331/.

14 Mike Masnick, Our Comment on DMCA Takedown: Let’s Return to First Principles (and the First Amendment), TechDirt (Apr. 1, 2016), https://www.techdirt.com/articles/20160401/11332234082/our-comment-dmca-takedowns-lets-return-to-first-principles-first-amendment.shtml.

15 Derek Bambauer, What Does the Day After Section 230 Reform Look Like?, Brookings Institution (Jan. 22, 2021), https://www.brookings.edu/techstream/what-does-the-day-after-section-230-reform-look-like/.

16 CDA 230, Electronic Frontier Foundation, https://www.eff.org/issues/cda230.

17 Bobby Allyn, As Trump Targets Twitter’s Legal Shield, Experts Have a Warning, NPR (May 30, 2020), https://www.npr.org/2020/05/30/865813960/as-trump-targets-twitters-legal-shield-experts-have-a-warning.

18 Mike Masnick, Masnick’s Impossibility Theorem: Content Moderation at Scale is Impossible to Do Well, TechDirt (Nov. 20, 2019), https://www.techdirt.com/articles/20191111/23032743367/masnicks-impossibility-theorem-content-moderation-scale-is-impossible-to-do-well.shtml.

19 Billy Easley, Revising the Law that Lets Platforms Moderate Content Will Silence Marginalized Voices, Slate (Oct. 29, 2020), https://slate.com/technology/2020/10/section-230-marignalized-groups-speech.html.

20 Eric Goldman, Section 230 Protects Hyperlinks in #MeToo “Whisper Network” – Comyack v. Giannella, Technology & Marketing Law Blog (Apr. 28, 2020), https://blog.ericgoldman.org/archives/2020/04/section-230-protects-hyperlinks-in-metoo-whisper-network-comyack-v-giannella.htm.

21 Jennifer Huddleston, Could Messing With Internet Law Mess Up Your Vacation?, InsideSources (Aug. 29, 2019), https://www.insidesources.com/could-messing-with-internet-law-mess-up-your-vacation/.

22 Eric Goldman, Amazon is Strictly Liable for Marketplace Items, Reinforcing that Online Marketplaces are Doomed—Bolger v. Amazon, Technology & Marketing Law Blog (Sep. 8, 2020), https://blog.ericgoldman.org/archives/2020/09/amazon-is-strictly-liable-for-marketplace-items-reinforcing-that-online-marketplaces-are-doomed-bolger-v-amazon.htm.