
By Dr. Joseph Downing[1]

 

Debates about the divergent demands of freedom of expression on one hand and the need to regulate social media on the other have been reinvigorated in the past year by Elon Musk’s acquisition of Twitter and the contending opinions about whether it will improve freedom of speech and transparency, as he has promised,[2] or turn Twitter into an “extremist ghetto” by offering a space for radical and xenophobic views.[3] However, little in the broader public, media, or political discourse has considered that this promise is not necessarily in Musk’s hands, because neither he, nor Twitter, nor social media more generally exists in a vacuum. National governments and, more recently, transnational institutions are increasingly seeking to regulate, and where necessary sanction, social media companies.

Europe is currently experiencing a renewed raft of social media regulation with the newly adopted Digital Services Act. This is significant because it demonstrates the European Union intervening further in the technology and digital arena. This Europeanisation of digital services legislation is muscular: it sets out significant provisions for sanctioning social media companies for non-compliance and presents those companies with a range of difficulties. Even so, the measures are unlikely to be a “silver bullet” solution to the range of problems presented by social media platforms. This intervention comes within a European context where American big tech has been blamed for many contemporary political and social ills, including fueling the rise of extremist politics[4] and spreading disinformation during the COVID-19 pandemic.[5]

 

I. THE DIGITAL SERVICES ACT: KEY PROVISIONS

The Digital Services Act makes a range of provisions for the regulation of technology companies. These rules emerge in response to the rapid and widespread growth of digital services that intrude ever further into citizens’ and consumers’ daily lives. Against this backdrop, the EU’s intervention, the Digital Services Act, aims to “create a safer digital space where the fundamental rights of users are protected”[6] and to “establish a level playing field to foster innovation, growth and competitiveness.”[7] The scope of the Digital Services Act is vast. Its rules focus primarily on online intermediaries and platforms, which cover a huge area of online activity including marketplaces, social networks, and content-sharing platforms, in addition to the “gatekeeper online platforms” that sit between businesses and consumers.[8] This article, however, focuses on three potential issues the Digital Services Act presents: a democratic deficit, the inherently difficult nature of digital content moderation, and the Act’s inability to account for the agility with which extremists migrate to new platforms.

 

II. EUROPEANISATION AND THE DEMOCRATIC DEFICIT CREATED BY THE DIGITAL SERVICES ACT

The Digital Services Act is the European Commission’s landmark provision for regulating a range of digital services in the European space. However, given the multinational and transnational nature of both the digital economy and the companies that operate within it, the Act affects digital services provision globally. Indeed, the European Commission openly promotes the Digital Services Act as having regulatory importance “both in the European Single Market and globally.”[9] This framing, however, omits one of the key, and highly problematic, aspects of the Digital Services Act, one that bears directly on current debates about social media and free speech.

This is because the Act itself demonstrates that the transnational legislative ability of the European Commission can be used to pass legislation that has been defeated at the national level. Here, the Digital Services Act exposes a questionable side of the process of Europeanisation. Similar legislation was struck down by France’s Constitutional Council as posing a significant risk to freedom of expression.[10] This time around, the legislation has been heavily pushed by Macron as part of his assertion of the French position in Europe.[11] Macron’s use of European legislative institutions to promote and adopt regulations defeated by his own country’s highest court is problematic: a key principle of EU membership is that member states’ legislation follows EU legislation,[12] so France will have regulations pushed onto it from above that it rejected at the member state level.

 

III. MODERATING CONTENT: UNPRECEDENTED OVERSIGHT AND OPERATIONAL ISSUES

A key provision of the Digital Services Act is the creation of Europe-wide content moderation mechanisms that are separate from the social media companies themselves, giving the European Commission unprecedented oversight over what is, or is not, permissible discourse on social media. While this is problematic in itself, a further issue with this ambitious take on content moderation arises in the implementation phase of the legislation, reflecting a much broader challenge common to all legislation and policy: the unpredictable process of implementation and operationalization. It is straightforward to promise a “safer digital space” and to “safeguard users’ rights,” but far more difficult to actually deliver on such promises.

A key aim of the package is to tackle the online spread of illegal content and misinformation. This has been a significant problem for some time, but two issues emerge here. First, imposing Europe-wide standards on what counts as “illegal” content, decided on by unelected bureaucrats in Brussels, sets a dangerous precedent for freedom of speech. It was on these grounds that the French Constitutional Council struck down very similar measures formulated by the Macron government. The rules set out a framework for platforms to work with specialized “trusted flaggers”[13] to identify and remove content. However, the training and retention of such flaggers, and the grounds on which one becomes “trusted,” are ambiguous and reproduce many of the criticisms already leveled at platform moderation for being unaccountable and expensive.[14] Indeed, the potential commercial burden for social media companies is enormous, and even the maximum fines of 6 percent of global annual turnover[15] (actual fines are likely to be much smaller) could be seen as the cheaper option and factored in as a business cost. This is not to mention the huge toll content moderation takes on human workers,[16] something likely to prove extremely problematic in terms of staff training and retention, as well as staff wellbeing.
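To make this commercial calculus concrete, consider a minimal back-of-the-envelope sketch. All figures below are hypothetical assumptions, not drawn from any real platform’s accounts or from the Act itself (beyond the 6 percent ceiling):

```python
# Illustrative only: all figures are hypothetical assumptions.
annual_turnover = 50_000_000_000  # EUR, assumed global turnover of a large platform
compliance_cost = 3_000_000_000   # EUR/year, assumed cost of full moderation compliance

MAX_FINE_RATE = 0.06        # the DSA's ceiling: 6 percent of annual worldwide turnover
expected_fine_rate = 0.01   # assumption: actual fines likely far below the ceiling
enforcement_prob = 0.5      # assumption: chance of actually being fined in a given year

max_fine = annual_turnover * MAX_FINE_RATE
expected_fine = annual_turnover * expected_fine_rate * enforcement_prob

print(f"Maximum fine:    EUR {max_fine:,.0f}")        # 3,000,000,000
print(f"Expected fine:   EUR {expected_fine:,.0f}")   # 250,000,000
print(f"Compliance cost: EUR {compliance_cost:,.0f}") # 3,000,000,000

# Whenever expected_fine < compliance_cost, a rational platform could
# treat the fine as a cost of doing business rather than comply in full.
```

Under these assumed numbers, the expected fine is an order of magnitude below the cost of full compliance, which is precisely the “business cost” calculation described above.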

Complementing humans with algorithms and AI seems a “safer” and more logical alternative. Yet these algorithms have been criticized in the past for being opaque and lacking transparency,[17] and for actually missing harmful content[18] because of the complex nuances of the text-, image-, video-, and audio-based nature of the social media landscape. The Digital Services Act specifies that these algorithms should be made transparent.[19] Again, this is not as straightforward as it may seem: algorithms also sort content to generate the revenue social media outlets need to survive,[20] and they are therefore extremely commercially sensitive. Platforms invest huge amounts of money in the human and machine infrastructure behind these complex models and are highly unlikely to be willing to openly offer up their trade secrets.
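The difficulty of capturing nuance can be illustrated with a deliberately naive sketch of automated moderation. The blocklist, posts, and outcomes below are invented for illustration; real moderation systems are vastly more sophisticated, but they face the same underlying problem of context:

```python
# A toy keyword-based moderator -- vocabulary and examples invented for illustration.
BLOCKLIST = {"attack", "destroy"}

def flag(post: str) -> bool:
    """Flag a post if it contains any blocklisted word, ignoring case and punctuation."""
    words = {w.strip(".,!?;:").lower() for w in post.split()}
    return bool(words & BLOCKLIST)

posts = [
    "Our team will attack the second half with real energy!",  # benign sports talk
    "Time to take out the trash. You all know who I mean.",    # coded hostility
]
for post in posts:
    print(flag(post), "->", post)
# Output: True for the benign post (a false positive),
#         False for the coded one (a false negative).
```

A word-level filter flags innocuous speech while missing harmful speech that contains no blocklisted term; this context problem is exactly what makes both automated moderation and the oversight of it so difficult.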

 

IV. PLATFORM MIGRATION AND GETTING AROUND THE DIGITAL SERVICES ACT

The Digital Services Act sets itself an extremely ambitious scope: regulating a huge number of independent and international entities.

A final issue that could significantly limit the new legislation’s effectiveness in combating fake news and hate speech stems from the remarkable agility of users themselves. Social media regulation and platform censorship aimed at taking down violent or hateful content is nothing new. However, users have shown significant agility in getting around these attempts through platform migration. Both ISIS[21] and alt-right and conspiracy theory influencers[22] have demonstrated this by simply side-stepping censorship attempts and moving to apps like Telegram. It matters here that many conspiracy theories thrive on ideas of victimhood and persecution by “the elite”[23] and on a paranoia[24] that “they” are trying to stop “us” from discovering the truth: increased censorship attempts only add fuel to this fire. As social media platforms continue to proliferate, questionable content will always be able to find a home.

 

V. CONCLUSIONS ON THE DIGITAL SERVICES ACT: THE PARADOX OF REGULATION

Social media regulation is complex and problematic, but it is also difficult to imagine today’s digital world with social media left unregulated. However, it is much easier for regulators to make promises than to operationalize them or to gain compliance from large multinational companies. Moreover, Macron’s push for legislation at the European level after similar rules were struck down in France demonstrates a problematic aspect of Europeanisation and the democratic deficit: the Commission deciding how a member state should manage digital free speech by going over the head of that state’s own Constitutional Council. Additionally, content moderation has become ever more contentious, and given both its operational difficulties and users’ proven ability to migrate to new platforms, the Digital Services Act is unlikely to be the “silver bullet” its proponents hope for.


[1] Senior Lecturer in International Relations and Politics, Aston University.

[2] Bradford Betz, “Elon Musk Teases Twitter Files on Free Speech Suppression: ‘Public Deserves to Know,’” Fox Business, 2022, https://www.foxbusiness.com/politics/elon-musk-teases-twitter-files-free-speech-suppression-public-deserves-know.

[3] Nesrine Malik, “Elon Musk’s Twitter Is Fast Proving That Free Speech at All Costs Is a Dangerous Fantasy,” The Guardian, November 28, 2022, https://www.theguardian.com/commentisfree/2022/nov/28/elon-musk-twitter-free-speech-donald-trump-kanye-west.

[4] Emily Nussbaum, “How Jokes Won the Election,” The New Yorker, January 23, 2017; Zeynep Tufekci, “Opinion | YouTube, the Great Radicalizer,” The New York Times, 2018.

[5] Wasim Ahmed and others, “COVID-19 and the 5G Conspiracy Theory: Social Network Analysis of Twitter Data,” Journal of Medical Internet Research, 22.5 (2020), e19458, https://doi.org/10.2196/19458.

[6] European Commission, “The Digital Services Act Package | Shaping Europe’s Digital Future,” https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.

[7] European Commission, “The Digital Services Act Package | Shaping Europe’s Digital Future.”

[8] European Commission, “The Digital Services Act Package | Shaping Europe’s Digital Future.”

[9] European Commission, “The Digital Services Act Package | Shaping Europe’s Digital Future.”

[10] EFF, “Victory! French High Court Rules That Most of Hate Speech Bill Would Undermine Free Expression,” Electronic Frontier Foundation, 2020, https://www.eff.org/press/releases/victory-french-high-court-rules-most-hate-speech-bill-would-undermine-free-expression.  

[11] Laura Kayali, “Macron Goes after Online Platforms, Foreign ‘Propaganda’ Media,” POLITICO, 2022, https://www.politico.eu/article/emmanuel-macron-online-platforms-foreign-propaganda-media.

[12] European Commission, “Applying EU Law,” https://ec.europa.eu/info/law/law-making-process/applying-eu-law_en.

[13] European Commission, “Questions and Answers: Digital Services Act,” https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348, accessed November 29, 2022.

[14] Marietje Schaake & Rob Reich, “Election 2020: Content Moderation and Accountability,” 6.

[15] European Commission, supra note 13.

[16] Jonathan Crossfield, “The Hidden Consequences of Moderating Social Media’s Dark Side,” Content Marketing Institute, 2019, https://contentmarketinginstitute.com/cco-digital/july-2019/social-media-moderators-stress, accessed November 29, 2022.

[17] Natalie Alana Ashton & Rowan Cruft, “Social Media Regulation: Why We Must Ensure It Is Democratic and Inclusive,” The Conversation, 2022, http://theconversation.com/social-media-regulation-why-we-must-ensure-it-is-democratic-and-inclusive-179819, accessed November 22, 2022.

[18] Schaake & Reich, supra note 14.

[19] European Commission, supra note 13.

[20] Sang Ah Kim, “Social Media Algorithms: Why You See What You See,” Georgetown Law Technology Review, 2017, https://georgetownlawtechreview.org/social-media-algorithms-why-you-see-what-you-see/GLTR-12-2017.

[21] Mitch Prothero, “ISIS Supporters Secretly Staged a Mass Migration from Messaging App Telegram to a Little-Known Russian Platform after the London Bridge Attack,” Insider, 2019, https://www.insider.com/isis-sympathisers-telegram-tamtam-london-bridge-2019-12.

[22] Richard Rogers, “Deplatforming: Following Extreme Internet Celebrities to Telegram and Alternative Social Media,” European Journal of Communication, 35.3 (2020), 213–29, https://doi.org/10.1177/0267323120922066.

[23] Conspiracy Theories and the People Who Believe Them, ed. by Joseph E. Uscinski (New York, NY: Oxford University Press, 2018).

[24] J. Eric Oliver & Thomas J. Wood, “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” American Journal of Political Science, 58.4 (2014), 952–66.