Most Section 230 reformers are motivated by a desire to constrain the power of large technology companies, but many proposals to reform Section 230 would do the opposite. They would shift power toward large technology platforms and away from people and governments, making it even harder for startups to compete with large platforms. This paper provides a typology of these power shifts, examining the specific ways in which reform proposals could increase the power of large tech platforms. The paper then offers alternative proposals for addressing some of the concerns raised by legislators that would instead shift power in the opposite direction: away from platforms and toward people and governments.

By Matt Perault1

I. INTRODUCTION

The CEOs of a handful of tech companies — Amazon, Apple, Facebook, Google, and Twitter — have testified repeatedly in congressional hearings over the past eight months, fielding questions about an offensive tweet, a horrific Facebook Group, or an objectionable YouTube video. Each time, the message from legislators has been the same: you have too much power to set the rules for speech on your platform, we don’t like the rules you set, and we don’t like how you enforce them. Democrats argue that tech companies censor too little, allowing harmful speech to run rampant on their platforms and causing particular harm to vulnerable communities. Republicans argue that tech companies censor too much, with left-leaning executives and employees developing content moderation practices that disproportionately restrict conservative speech.

But regardless of political party, for nearly all legislators, the culprit is clear: Section 230 of the Communications Decency Act. Enacted into law in 1996, Section 230 ensures that online content hosts aren’t liable for the speech of content creators.2 Its supporters refer to it as the internet’s Magna Carta.3 Its critics argue that it is a harmful relic of a previous era.4

In the last several months, legislators have gone on a Section 230 reform spree, as documented in the Section 230 Legislative Tracker, launched in March as a partnership between the Center on Science & Technology Policy at Duke, Future Tense, and the Washington College of Law at American University.5 The Tracker includes information on each Section 230 reform proposal introduced in the last Congress and the current one, including bill name, co-sponsors, status, reform type, and a summary of the proposal.

Although nearly all efforts to reform Section 230 appear to be motivated by a desire to constrain the power of large technology companies, many of the proposals would likely do the opposite. They would shift power toward large technology platforms and away from people and governments, making it even harder for startups to compete for users and advertisers and more deeply entrenching the market position of large tech platforms.

The purpose of this paper is to provide a typology of these power shifts, examining the specific ways in which reform proposals could increase the power of large tech platforms. The paper also offers alternative proposals for addressing some of the concerns raised by legislators that would instead shift power in the opposite direction: away from platforms and toward people and governments. Ultimately, putting more power in the hands of people (to control their experiences on the tech products they use) and governments (to strike the right balance in public welfare tradeoffs) will produce a better internet.

 

II. HOW REFORMS COULD INCREASE PLATFORM POWER

Several proposed reforms of Section 230 could increase the power of large tech platforms by creating stronger incentives to censor content, raising barriers to entry that reduce competition, stifling competition on quality and innovation, and aiding large news publishers at the expense of the cost and quality of news.

  A. Shifting Power from People to Platforms: Incentivizing Censorship

Several proposed reforms would likely lead to increased censorship, with platforms playing a stronger intermediary role and serving more as gatekeepers to expression than as facilitators of it. Any Section 230 reform that subjects platforms to more suits, extends the duration of suits, or results in more costly plaintiff verdicts is likely to create strong incentives for platforms to limit user expression. When people have their speech removed or are deterred from speaking in the first place, power shifts from people to platforms.

A number of proposed reforms would likely increase litigation costs sufficiently to create these strong censorship incentives. For example, a full repeal of Section 230 — as several members of Congress, Trump, and Biden have proposed6 — would subject platforms to increased legal liability. But even more modest reforms could have a similar impact on litigation risk. Making liability protections contingent upon a showing of reasonableness, as some experts have proposed,7 would necessitate litigating the question of whether a platform’s behavior was reasonable. This change would mean that platforms couldn’t use Section 230 to dismiss a case at the motion-to-dismiss phase of litigation, before defendants are forced to bear the expensive burdens of discovery. Similarly, “quid pro quo” approaches — where a platform must demonstrate that it has complied with certain substantive or procedural best practices8 — would make it difficult to resolve cases before discovery.

Platforms that host user-generated content would thus need to bear heightened litigation costs, both because costs rise as litigation becomes lengthier and because prospective plaintiffs would have more incentive to sue in the first place. And if hosting content becomes more expensive, platforms will take steps to limit those costs, either by removing content more aggressively or by changing their products to make it harder for users to speak and share.

Increased liability could also shift power from users to platforms in more subtle ways. When it becomes more expensive to host content, platforms might also seek to deploy technical tools to differentiate between low-risk content and high-risk content, since technology may be preferable to human review as a means of addressing risk at scale. Increasing the types of content that are subject to review by artificial intelligence systems puts more power in the machines deployed by platforms and advantages platforms that are able to invest in building those systems. Similarly, a shift toward technology-oriented, centralized moderation models might make it less likely that platforms could rely on community moderation models like Reddit’s,9 meaning that users would not only experience a reduced capacity to speak, but also a reduced capacity to self-govern.

  B. Shifting Power from People to Platforms: Raising Barriers to Entry

Any reform that significantly increases the cost of user-generated content business models will not only create incentives for censorship, but will also likely raise barriers that make it more difficult for smaller platforms to compete with larger ones. Platforms like Amazon, Apple, Google, and Facebook not only have huge user bases, but also employ tens of thousands of people.10

If litigation costs increase, large companies will have the ability to choose whether to maintain their current models in the face of these new expenses or change their products to decrease potential costs. A company like Facebook or Google could decide to bear the risk of increased costs if doing so protects their market position and makes it more difficult for other companies to compete. Competing platforms with fewer resources are unlikely to have the luxury of choice.

One example is the SESTA/FOSTA legislation that reformed Section 230 in an attempt to combat sex trafficking. The law not only failed to protect the community it was designed to protect,11 but it also forced smaller, less well-resourced dating sites to shut down. Facebook entered the online dating market shortly afterward.12

For smaller companies, prohibitive compliance costs could come not just from increased litigation expenses, but also from mandates for more extensive operational procedures, such as reporting channels, appeals mechanisms, and transparency reports. For instance, the Platform Accountability and Consumer Transparency (“PACT”) Act proposes several operational requirements for platforms, many of which are already standard practice at large tech platforms.13 In Europe, the Digital Services Act proposes similar requirements.14

But for smaller platforms, many of these operational standards are prohibitively burdensome, and diverting employee capacity away from engineering and sales and into legal and policy compliance functions could make it even more difficult for them to compete with established platforms.15

It’s also not surprising that larger platforms have endorsed reforms that could entrench their market position. Facebook CEO Mark Zuckerberg has expressed support for “quid pro quo” reforms that require companies to implement certain operational procedures in order to earn Section 230 protections.16 Facebook already has most of these procedures in place, and has the scale and resources to adopt additional procedures that Congress might require, but new operational requirements may prove burdensome for smaller providers that are seeking to compete with Facebook.

Finally, reform proposals that remove protections for hosting advertising are likely to erect barriers to entry in the online advertising market. The SAFE TECH Act, for instance, includes a carve-out from Section 230 protections for paid speech.17 The bill laudably seeks to reduce advertising fraud, but would increase compliance costs for platforms seeking to build advertising systems to rival Google, Facebook, and Amazon’s. That challenge is already difficult, and increasing litigation risks and operational burdens will make it even harder to compete.

  C. Shifting Power from People to Platforms: Reducing Competition on Quality and Innovation

Some reforms could shift power from people to platforms by stifling competition on quality and innovation. For instance, TikTok has grown rapidly as a rival to Facebook, Google, and Twitter in large part due to the strength of its algorithm.18 But the space for algorithmic competition may narrow if proposals like the Protecting Americans from Dangerous Algorithms Act become law. The bill removes Section 230 as a defense in civil rights and terrorism cases where algorithms are used to sort and distribute content to users, but provides exemptions if platforms offer the ability to sort content chronologically.19 If relying on algorithmic ranking could cause platforms to lose Section 230 protections, platforms may pivot to a heavier reliance on chronological sorting, which could reduce a competitive advantage for companies like TikTok.

Platforms also use Section 230 protections to compete on content moderation. For instance, while Twitter blocked sharing of the controversial New York Post story about Hunter Biden in October 2020, Facebook merely downranked it in News Feed.20 Parler quickly became appealing to conservatives because of a more lenient approach to moderation.21 Similarly, TikTok has sought to distinguish itself by offering more transparency about its algorithm and its moderation practices.22 Different approaches to content moderation enable users to make choices based on their moderation preferences.

Several proposals threaten this competition. Some bills would narrow the “Good Samaritan” provisions of Section 230, which provide broad protections to platforms when they moderate content,23 and others impose a neutrality requirement on platforms, reducing their freedom to set their own moderation policies.24 If platforms face increased liability when they take steps to moderate content or if they must receive a certification of neutrality from a government agency, they will have less room to compete on the quality of their content moderation offerings.25

  D. Shifting Power from People to Large Platforms and Publishers: Antitrust Exemptions for News Publishers

Some reformers also claim that Section 230 creates an unequal playing field between platforms and news publishers, since news publishers face liability for defamation and libel while platforms can use Section 230 as a defense in such cases.26 They claim that this imbalance gives platforms an unfair advantage in competition for user attention and advertising. In response, legislators have proposed exempting news publishers from antitrust laws to allow them to collude in negotiations with tech platforms.27

Antitrust exemptions are disfavored as a policy tool because antitrust law exists to ensure that consumers are protected from harm.28 By definition, exemptions from a body of law designed to protect consumers would immunize companies when they engage in conduct that produces harm, and would therefore increase the likelihood that companies will behave in ways that harm consumers.

Antitrust exemptions for the news industry could harm consumers by increasing the costs of consuming news, decreasing the supply of news, or reducing the quality of news. An antitrust exemption would also likely benefit large publishers at the expense of smaller ones, since larger publishers would likely be well-positioned to bargain for licensing agreements with platforms while smaller publishers might not be able to secure similar deals.29 Finally, an antitrust exemption would benefit large platforms over smaller platforms, since companies with deep pockets could afford to compensate publishers for their content, to divert engineering resources to product changes demanded by publishers, and to devote time and energy to the process of negotiating agreements.

 

III. RECOMMENDATIONS FOR SHIFTING POWER TO PEOPLE AND GOVERNMENTS

Aspiring reformers who believe that governments – and not companies – should set the regulatory agenda in tech policy or who believe that users should be more empowered to control their experience on tech products should consider an alternative reform agenda. To shift power away from large platforms and toward people and governments, reformers should consider the following proposals.

  A. Reforms that Shift Power from Platforms to People
  1. Algorithmic Choice

Several reforms would shift power from platforms to people, giving users more control over the content they see in their tech products. One obvious example is enhancing algorithmic choice, giving people more say over the content that is surfaced to them. Facebook recently announced that it will offer a “Feed Filter Bar” to make it easier for people to switch between an algorithmic feed, a chronological feed, and a new “Favorites” feed, and Twitter CEO Jack Dorsey proposed offering people the ability to select third-party algorithms.30

That approach is similar to a proposal made by my Duke colleague Barak Richman and Stanford professor Francis Fukuyama in a recent policy brief.31 The proposal would require tech platforms to offer APIs that integrate with “middleware” algorithm services, so that consumers could choose their preferred algorithms to sort their news, rank their searches, and order their tweets.
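To make the middleware concept more concrete, below is a minimal TypeScript sketch of the shape such an API might take. The type and function names (ContentItem, RankingMiddleware, buildFeed) are illustrative assumptions, not part of the Richman-Fukuyama proposal or any platform’s actual interface.

```typescript
// Illustrative sketch only: names and shapes are assumptions, not drawn
// from the Richman/Fukuyama brief or any real platform API.

// A platform-agnostic content item exposed to third-party rankers.
interface ContentItem {
  id: string;
  author: string;
  text: string;
  postedAt: Date;
}

// A middleware provider implements one function: given candidate items,
// return them in the order the user should see them.
interface RankingMiddleware {
  name: string; // e.g. "chronological" or "local-news-first"
  rank(candidates: ContentItem[]): ContentItem[];
}

// Example middleware: a simple reverse-chronological ranker a user might
// select in place of the platform's engagement-based default.
const chronological: RankingMiddleware = {
  name: "chronological",
  rank: (candidates) =>
    [...candidates].sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime()),
};

// The platform's side of the integration: it supplies the candidate
// content, and the user-selected middleware supplies the ordering.
function buildFeed(candidates: ContentItem[], choice: RankingMiddleware): ContentItem[] {
  return choice.rank(candidates);
}
```

The key design point is the separation of roles: the platform continues to host content, while the ordering logic becomes a swappable component that the user, not the platform, selects.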

Of course, platforms could also offer more consumer choice in their own products, even in the absence of third-party algorithms. Platforms could offer users a menu of options to set more individualized content preferences, and offer more specialized feeds like “political junkie,” “sports fan,” or “pet lover.”

  2. Transparency

Many platforms have supported increased transparency about their content practices, publishing transparency reports that detail the volume and types of content they remove. Several platforms have also sought to provide more information to users about why they see certain ads or content, both by expanding the content they offer in help centers and by experimenting with in-line context and labeling.32

But despite these efforts – most of which exceed the transparency offered in other industries33 – it is still difficult to get the granular data that would illuminate where platforms are falling short in their content moderation practices and which interventions would be most successful in addressing user concerns. To understand those dynamics, researchers must have better access to data.

Currently, researchers fear that they may be prosecuted under the Computer Fraud and Abuse Act if they try to obtain company data, and companies fear that they may be held liable if they share data with researchers who subsequently misuse it.34 In the wake of data sharing controversies, platforms have taken steps to reduce third-party data access.35 To facilitate the study of content moderation, policymakers must develop a more sensible regulatory regime for data sharing.

Congress or the FTC could make it easier for researchers to access data by immunizing researchers who obtain company data consistent with a company’s terms and by immunizing platforms that provide data to researchers consistent with privacy and security best practices. Congress could achieve this result by including safe harbors for researcher data access in federal privacy legislation or as a standalone reform. Alternatively, the FTC could issue a policy statement indicating that it will not pursue enforcement action when researchers and platforms share data consistent with security and privacy safeguards.

  3. User Control and Competition

Legislators might also consider reforms that bolster competition by putting more power in people’s hands, lowering barriers to entry, and spurring innovation. One example is to encourage more data portability, which is the ability to take data from one service to another. Mandatory data portability has gained support from industry and a bipartisan group of lawmakers.36 By enabling people to move their data more easily from one product to another, it reduces the ability of a platform to “lock in” users even when higher-quality or lower-priced options are available elsewhere in the market.
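As a rough illustration of how portability could work mechanically, here is a minimal TypeScript sketch loosely inspired by adapter-based efforts such as the Data Transfer Project. The PortableArchive format and adapter names are hypothetical, not an actual specification.

```typescript
// Hypothetical sketch: a shared export format plus per-service adapters.
// None of these types come from a real portability standard.

// A service-neutral representation of a user's data.
interface PortableArchive {
  user: { id: string; displayName: string };
  posts: { text: string; createdAt: string }[];
  contacts: { displayName: string }[];
}

// Each service implements adapters into and out of the shared format, so
// any exporter can feed any importer.
interface PortabilityAdapter {
  exportUserData(userId: string): Promise<PortableArchive>;
  importUserData(archive: PortableArchive): Promise<void>;
}

// Moving a user's data between services is then a two-step pipeline.
async function transfer(
  from: PortabilityAdapter,
  to: PortabilityAdapter,
  userId: string
): Promise<void> {
  const archive = await from.exportUserData(userId);
  await to.importUserData(archive);
}
```

Because every service speaks the same intermediate format, adding a new service requires writing one adapter rather than a bespoke integration with every other platform, which is what lowers switching costs for users.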

  B. Reforms that Shift Power from Platforms to Government
  1. Modernizing Federal Election Law

In some circumstances, it may be preferable to put decision-making power in government hands, rather than expecting platforms to make decisions about critical aspects of governance. One example is election regulation. During the 2020 election, advocacy organizations and news reports suggested that tech platforms should play a more aggressive role in policing voter suppression.37 In response, several platforms announced measures to remove, downrank, or contextualize content aimed at suppressing the vote.38

But governments, not private companies, should set the rules for elections. Extensive federal and state laws exist to govern the process, and an entire federal agency, the Federal Election Commission, is dedicated to developing and enforcing the law. In light of this existing regulatory backdrop, it seems odd to ask tech companies to play a primary role in regulating election speech.39

Rather than gutting Section 230 in the hopes of policing election speech, Congress should modernize voting law by passing new criminal law on deceptive practices in voting, prohibiting speech that intentionally suppresses the vote. While some states have laws that prohibit deceptive practices, no equivalent federal law exists. The Deceptive Practices and Voter Intimidation Prevention Act – which was first introduced in 2006 and is included in the For the People Act of 2021 that recently passed the House40 – is one way to achieve this objective. To increase the chances of garnering bipartisan support, the legislation could also include additional protections against voting fraud.

This law would need to be narrowly crafted to survive a challenge under the First Amendment, but that challenge would not necessarily be fatal. For instance, the Supreme Court has upheld laws restricting election-related speech when those laws are needed to “protect[] voters from confusion and undue influence” and to “ensur[e] that an individual’s right to vote is not undermined by fraud in the election process.”41

Modernizing federal law on voting would serve several purposes. First, because Section 230 cannot be used as a defense against claims based on federal criminal law, a new law would eliminate Section 230 as a bar in voter suppression cases, as long as they are brought under federal law.

Second, even in cases where platforms were not criminally liable, new law would enable prosecutors to pursue cases against individuals who use online platforms to engage in deceptive practices that suppress the vote. The risk of prosecution will likely deter some people from engaging in deceptive practices, reducing the volume of problematic content that harms the election process.42

Third, federal criminal law on voter suppression would give platforms a basis for cooperating with law enforcement to prosecute voter suppression. Currently, platforms regularly provide data in response to requests they receive from law enforcement.43 But in order to provide data in these cases, a platform must receive a lawful request, and in the absence of law criminalizing the conduct, a platform has no basis for providing data to a law enforcement body. With new law, the government could request relevant data held by platforms, and platforms would have a lawful basis for complying with those requests.

  2. Modernizing Federal Criminal Law on Incitement to Riot

The issue of online incitement to riot has come into sharper focus in the wake of the January 6th attack on the Capitol, with allegations that the attacks were facilitated by coordination that occurred on tech platforms.44 These allegations have been accompanied by calls for tech companies to be more aggressive in policing content that might incite violence.45

To address these concerns, Congress should modernize federal criminal law on incitement to riot. There is existing law on incitement, but it was passed in 1968, before the rise of the internet, mobile technologies, and social media, and appeals courts have found some of its provisions to be unconstitutional.46 The law is outdated and in need of reform.

As with new federal criminal law on voter suppression, passing new criminal law on incitement would bar Section 230 from being used as a defense in cases brought pursuant to that law. And as with other criminal law, it would likely have a deterrent effect and would provide a basis for platforms to disclose data to law enforcement to assist with investigations.

Passing new law in this area would shift power toward legislators and judges in determining what constitutes incitement and what does not, and would put platforms in the position of complying with government rules rather than setting the rules on incitement themselves.

  3. Facilitating Off-Platform Resolution of Disputes

While Section 230 immunizes platforms for certain speech created by others, it doesn’t immunize anyone — platforms or users — for the speech they create. Policymakers could shift power away from platforms and toward governments and people by creating new reporting flows that would make it easier to hold people accountable and resolve disputes.

Platforms already use reporting flows for this purpose. Some platforms recommend that users report certain behavior to law-enforcement officials.47 Platforms also sometimes facilitate reporting to trusted members of a community, such as in bullying cases where a potential victim might prefer to report an incident to a teacher instead of to a platform or to law enforcement.48

Similar design features might make it easier for users to resolve disputes and easier for governments to enforce existing law. For instance, platforms could provide functionality that enables people to report content not only to the platform, but also to the offices of state attorneys general.49 Alternatively, platforms could provide options to report false voting information to an election-monitoring organization, to report harassment to victims’ support services, or to report defamation to lawyers who specialize in defamation law.
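A minimal TypeScript sketch of such a reporting flow appears below. The report categories, destinations, and function names are hypothetical, invented for illustration rather than drawn from any existing product.

```typescript
// Hypothetical sketch of a reporting flow that routes reports both to the
// platform and to an off-platform body positioned to act on them.

type ReportCategory = "false_voting_info" | "harassment" | "defamation";

interface Report {
  category: ReportCategory;
  contentUrl: string;
  details: string;
}

interface ReportDestination {
  name: string;
  submit(report: Report): Promise<void>;
}

// Stand-in for forwarding a report to an external recipient (a state
// attorney general's portal, an election monitor, a victims' service).
function destination(name: string): ReportDestination {
  return {
    name,
    submit: async (report) => {
      console.log(`Forwarding ${report.category} report to ${name}`);
    },
  };
}

// Each category routes to the platform's own queue plus the external body
// best placed to act on it.
const routes: Record<ReportCategory, ReportDestination[]> = {
  false_voting_info: [destination("platform moderation"), destination("election monitor")],
  harassment: [destination("platform moderation"), destination("victim support service")],
  defamation: [destination("platform moderation"), destination("defamation counsel referral")],
};

async function fileReport(report: Report): Promise<void> {
  await Promise.all(routes[report.category].map((d) => d.submit(report)));
}
```

The design choice is simply to treat off-platform bodies as first-class reporting destinations, so that accountability does not begin and end with the platform’s own moderation queue.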

  C. Institutionalizing the Study of Power: Regulatory Curiosity and Section 230 Reform

No matter what reforms we pursue, their effects will be uncertain. We may embark on reform intended to shift power from platforms to people,50 and find that it has the opposite effect. Reforms that aim to shift power from platforms to governments may instead aggregate power in platforms’ hands, just as some argue that privacy reform in Europe has dug a moat around large platforms’ market position.51

The uncertainty should be humbling. Outcomes that appear certain may be anything but. In the face of this uncertainty, an agenda for reforming Section 230 should focus not only on substance, but also on process. Regulators should implement reforms that institutionalize learning, providing them with information about the reform agenda’s performance and creating opportunities for intervention to change course. In the absence of certainty, we should encourage regulatory curiosity.

Regulators have several options for institutionalizing curiosity. For instance, they could deploy regulatory sandboxes, a policy tool that trials a novel approach for a fixed period of time, gathers detailed data about the performance of the regime during that period, and then implements changes to reflect what was learned during the trial.52

Due to the potential power implications of Section 230 reform, regulators should use a sandbox to monitor how power dynamics change in response to reform. For instance, regulators should work with researchers to develop metrics for evaluating user empowerment, barriers to entry, and product quality so that they can monitor the progress of reforms against these benchmarks.

These metrics might be particularly relevant for certain reforms, such as data portability. Portability has the potential to make it easier for people to leave established platforms for smaller ones, but it may instead entrench large platforms by giving them access to data at competing startups. It could also constitute a step backward for user empowerment, if it enables people to transfer data to less secure services or if it makes it easier to transfer data that is relevant to them but that they don’t control, such as photos they are tagged in. With so much unknown, a regulatory sandbox on portability would help us learn which policies work best and whether they bring us closer to our policy objectives.

 

IV. CONCLUSION

According to its proponents, Section 230 reform is necessary to reduce the power of large technology companies, to level the playing field with other industries, and to empower users. But most proposals to reform Section 230 would do the opposite, aggregating power in the hands of large platforms.

To empower users and governments, rather than large platforms, reformers should consider an alternative set of proposals focused on increasing user choice, promoting competition, and situating the government as the primary decision-maker for setting the rules that govern online behavior. To ensure that the reforms we enact achieve the objectives we desire, we should institutionalize “regulatory curiosity” so that we can learn from reform efforts and improve them over time.

1 Matt Perault is the Director of the Center on Science & Technology Policy at Duke University. He previously served as a director of public policy at Facebook. He led the company’s global public policy planning efforts on issues such as competition, law enforcement, and human rights and oversaw public policy for WhatsApp, Oculus, and Facebook Artificial Intelligence Research. Prior to joining Facebook, Matt was Counsel at the Congressional Oversight Panel. Matt holds a law degree from Harvard Law School, a Master’s degree in Public Policy from Duke University’s Sanford School of Public Policy, and a Bachelor’s degree in political science from Brown University.

2 47 U.S.C. § 230.

3 A digital Magna Carta, QUARTZ (December 2, 2020), https://qz.com/emails/quartz-obsession/1940615/.

4 Issie Lapowsky & Emily Birnbaum, Mark Warner is ready to fight for Section 230 reform, PROTOCOL (March 22, 2021), https://www.protocol.com/policy/mark-warner-section-230.

5 Kiran Jeevanjee et al., All the Ways Congress Wants to Change Section 230, SLATE (March 23, 2021), https://slate.com/technology/2021/03/section-230-reform-legislative-tracker.html.

6 Kiran Jeevanjee et al., All the Ways Congress Wants to Change Section 230, SLATE (March 23, 2021), https://slate.com/technology/2021/03/section-230-reform-legislative-tracker.html. President Biden suggested repealing Section 230 as a candidate, but has not announced a recommendation for reforming Section 230 since he became president. Makena Kelly, Joe Biden Wants to Revoke Section 230, THE VERGE (January 17, 2020), https://www.theverge.com/2020/1/17/21070403/joe-biden-president-election-section-230-communications-decency-act-revoke.

7 Danielle Keats Citron & Benjamin Wittes, The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity, 86 Fordham L. Rev. 401 (2017).

8 Stigler Committee on Digital Platforms, Final Report (2019), https://www.publicknowledge.org/wp-content/uploads/2019/09/Stigler-Committee-on-Digital-Platforms-Final-Report.pdf.

9 Spandana Singh, Everything in Moderation (July 22, 2019), https://www.newamerica.org/oti/reports/everything-moderation-analysis-how-internet-platforms-are-using-artificial-intelligence-moderate-user-generated-content/case-study-reddit.

10 Amazon has more than 1 million employees, Apple has more than 145,000, Google has more than 135,000, and Facebook has more than 50,000. In contrast, Twitter has roughly 4,500 employees, Snap has fewer than 3,000 employees, and Reddit has roughly 700 employees.

11 Rose Conlon, Sex workers say anti-trafficking law fuels inequality, MARKETPLACE (April 30, 2019), https://www.marketplace.org/2019/04/30/sex-workers-fosta-sesta-trafficking-law-inequality/.

12 Elliot Harmon, Changing Section 230 Would Strengthen the Biggest Tech Companies, THE NEW YORK TIMES (October 16, 2019), https://www.nytimes.com/2019/10/16/opinion/section-230-freedom-speech.html?referringSource=articleShare.

13 Platform Accountability and Consumer Transparency Act, S. 797 (introduced March 17, 2021), https://www.congress.gov/117/bills/s797/BILLS-117s797is.pdf.

14 European Commission, The Digital Services Act: ensuring a safe and accountable online environment, https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en.

15 It is therefore not surprising that a coalition of smaller tech platforms responded to the PACT Act by warning that reform could “unintentionally harm smaller sites and organizations.”

16 Written testimony of Mark Zuckerberg, Hearing Before the United States House of Representatives Committee on Energy and Commerce Subcommittees on Consumer Protection & Commerce and Communications & Technology (March 25, 2021), https://energycommerce.house.gov/sites/democrats.energycommerce.house.gov/files/documents/Witness%20Testimony_Zuckerberg_CAT_CPC_2021.03.25.pdf.

17 Senator Mark Warner, Warner, Hirono, Klobuchar Announce the SAFE TECH Act to Reform Section 230 (February 5, 2021), https://www.warner.senate.gov/public/index.cfm/2021/2/warner-hirono-klobuchar-announce-the-safe-tech-act-to-reform-section-230.

18 Ben Thompson, The TikTok War, STRATECHERY (July 14, 2020), https://stratechery.com/2020/the-tiktok-war/.

19 Representative Tom Malinowski, Reps. Malinowski and Eshoo introduce bill to hold tech platforms liable for algorithmic promotion of extremism (October 20, 2020), https://malinowski.house.gov/media/press-releases/reps-malinowski-and-eshoo-introduce-bill-hold-tech-platforms-liable-algorithmic.

20 Shannon Bond, Facebook And Twitter Limit Sharing ‘New York Post’ Story About Joe Biden, NPR (October 14, 2020), https://www.npr.org/2020/10/14/923766097/facebook-and-twitter-limit-sharing-new-york-post-story-about-joe-biden.

21 Shannon Bond, Conservatives Flock To Mercer-Funded Parler, Claim Censorship On Facebook And Twitter, NPR (November 14, 2020).

22 Vanessa Pappas, TikTok to launch Transparency Center for moderation and data practices, TIKTOK NEWSROOM (March 11, 2020), https://newsroom.tiktok.com/en-us/tiktok-to-launch-transparency-center-for-moderation-and-data-practices.

23 Limiting Section 230 Immunity to Good Samaritans Act, S. 3983 (introduced June 17, 2020), https://www.congress.gov/bill/116th-congress/senate-bill/3983/text?r=6&s=1.

24 Ending Support for Internet Censorship Act, S. 1914 (introduced June 19, 2019), https://www.congress.gov/bill/116th-congress/senate-bill/1914/text.

25 Evelyn Douek has written extensively about the subtle implications of cooperation and competition between providers on content moderation. For instance, see Evelyn Douek, The Rise of Content Cartels, KNIGHT FIRST AMENDMENT INSTITUTE (February 11, 2020), https://knightcolumbia.org/content/the-rise-of-content-cartels.

26 This claim is misleading, since publishers benefit from Section 230 protections when they act as an interactive computer service and host user comments, and platforms cannot use Section 230 as a defense when they act as an information content provider and create content.

27 Senator Amy Klobuchar, Senator Klobuchar and Representative Cicilline Introduce Legislation to Protect Journalism in the United States (March 10, 2021), https://www.klobuchar.senate.gov/public/index.cfm/2021/3/senator-klobuchar-and-representative-cicilline-introduce-legislation-to-protect-journalism-in-the-united-states#:~:text=The%20Journalism%20Competition%20and%20Preservation%20Act%20only%20allows%20coordination%20by,discriminatory%20to%20other%20news%20publishers.

28 See, e.g., Antitrust Modernization Commission, Report and Recommendations (April 2007), https://govinfo.library.unt.edu/amc/report_recommendation/amc_final_report.pdf.

29 The agreements that Google and Facebook have struck in Australia, for instance, suggest that negotiations would likely focus on large deals with large publishers, and leave smaller publishers at a competitive disadvantage relative to others in the publishing industry. See, e.g., News Corp and Google Agree To Global Partnership On News, NEWS CORP (February 17, 2021), https://newscorp.com/2021/02/17/news-corp-and-google-agree-to-global-partnership-on-news/.

30 Written testimony of Jack Dorsey, Hearing Before the United States Senate Committee on Commerce, Science, and Transportation (October 20, 2020), https://www.commerce.senate.gov/services/files/7A232503-B194-4865-A86B-708465B2E5E2.

31 Francis Fukuyama, Barak Richman, et al., Middleware for Dominant Digital Platforms: A Technological Solution to a Threat to Democracy, STANFORD UNIVERSITY, https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/cpc-middleware_ff_v2.pdf. The authors made similar arguments in subsequent articles in Foreign Affairs and The Wall Street Journal.

32 Why you’re seeing an ad, GOOGLE ACCOUNT HELP, https://support.google.com/accounts/answer/1634057?hl=en#:~:text=Your%20activity%3A&text=Your%20previous%20interactions%20with%20ads,Your%20activity%20on%20another%20device. Facebook Newsroom, TWITTER (April 7, 2021), https://twitter.com/fbnewsroom/status/1379887135998763014?s=20. Providing information alongside a post enables people to get more context at the same time as they are seeing the content, while also connecting them directly to controls that enable them to change the content they receive.

33 Cf. Transparency Report, AT&T (February 2021), https://about.att.com/content/dam/csr/2019/transparency/2021/2021-February-Report.pdf.

34 FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook, FEDERAL TRADE COMMISSION (July 24, 2019), https://www.ftc.gov/news-events/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions.

35 A Platform Update, FACEBOOK NEWSROOM (July 2, 2018), https://about.fb.com/news/2018/07/a-platform-update/.

36 Data Transfer Project (https://datatransferproject.dev/); Representative Ken Buck, The Third Way, https://buck.house.gov/sites/buck.house.gov/files/wysiwyg_uploaded/Buck Report.pdf; Senator Mark Warner, Senators Introduce Bipartisan Bill to Encourage Competition in Social Media (October 22, 2019), https://www.warner.senate.gov/public/index.cfm/2019/10/senators-introduce-bipartisan-bill-to-encourage-competition-in-social-media.

37 Investigate Facebook: Profiting off voter suppression, https://actionnetwork.org/petitions/investigate-facebook-profiting-off-voter-suppression/.

38 Twitter Safety, Expanding our policies to further protect the civic conversation, TWITTER BLOG (September 10, 2020), https://blog.twitter.com/en_us/topics/company/2020/civic-integrity-policy-update.html.

39 It is even more puzzling because many of the same people calling for platforms to be more aggressive in censoring speech during elections also advocate for more constraints on platform power.

40 For the People Act (January 4, 2021), https://www.congress.gov/bill/117th-congress/house-bill/1/text.

41 Burson v. Freeman, 504 U.S. 191 (1992).

42 Five Things about Deterrence, NATIONAL INSTITUTE OF JUSTICE, (June 5, 2016), https://nij.ojp.gov/topics/articles/five-things-about-deterrence#one.

43 See, e.g., Google Transparency Report, GOOGLE, https://transparencyreport.google.com/?hl=en.

44 Katie Paul et al., Analysis: Facebook and Twitter crackdown around Capitol siege is too little, too late, REUTERS (January 8, 2021), https://www.reuters.com/article/uk-usa-election-hate-analysis/analysis-facebook-and-twitter-crackdown-around-capitol-siege-is-too-little-too-late-idUKKBN29D2WA?edition-redirect=uk.

45 Adam Satariano, After Barring Trump, Facebook and Twitter Face Scrutiny About Inaction Abroad, NEW YORK TIMES (January 14, 2021), https://www.nytimes.com/2021/01/14/technology/trump-facebook-twitter.html.

46 Josh Gerstein, Court pares back federal Anti-Riot Act, POLITICO (August 24, 2020), https://www.politico.com/news/2020/08/24/court-pares-back-federal-anti-riot-act-400999.

47 Making it easier to report threats to law enforcement, TWITTER BLOG (March 17, 2015), https://blog.twitter.com/en_us/a/2015/making-it-easier-to-report-threats-to-law-enforcement.html; What should I do if someone is asking me to share non-consensual intimate images with them over Messenger?, FACEBOOK HELP CENTER, https://www.facebook.com/help/messenger-app/android/330915461156611?helpref=uf_permalink.

48 Dustin Petty, Facebook Fights Bullying, (April 2, 2012), https://news.jrn.msu.edu/bullying/2012/04/02/facebook-fights-bullying/.

49 For an example of the existing systems that state attorneys general use for receiving complaints, see the North Carolina Attorney General’s website: https://ncdoj.gov/file-a-complaint/.

50 For example, SESTA/FOSTA legislation was designed to protect sex workers and has instead made them more vulnerable. Ted Andersen et al., The Scanner: Sex workers returned to SF streets after Backpage.com shut down, SAN FRANCISCO CHRONICLE (October 15, 2018).

51 Nick Kostov & Sam Schechner, GDPR Has Been a Boon for Google and Facebook, WALL STREET JOURNAL (June 17, 2019), https://www.wsj.com/articles/gdpr-has-been-a-boon-for-google-and-facebook-11560789219.

52 Regulatory sandboxes have been used widely in other countries. A journey through the FCA regulatory sandbox, DELOITTE, https://www2.deloitte.com/content/dam/Deloitte/uk/Documents/financial-services/deloitte-uk-fca-regulatory-sandbox-project-innovate-finance-journey.pdf.