Various regulatory attempts at taming global Internet platforms have entered the stage worldwide. These set out to renegotiate the cornerstones of a workable social contract and the expectations of the various participants in terms of social roles, acceptable behavior and reasonable means. In this vein, the European Digital Services Act (“DSA”) takes on questions of platforms’ responsibility for content moderation with an asymmetric system of due diligence obligations. This comprises the assessment of systemic risks that may arise from platform services, including risks resulting from the dissemination of illegal content or risks that negatively affect the exercise of fundamental rights such as the freedom of expression and information. The fact that the responsibility for this systemic risk assessment and the deployment of mitigation measures against these risks rest primarily with very large online platforms (“VLOPs”) and their interpretive sovereignty raises various concerns. A major question is what cultural imprint this will leave on fundamental rights in Europe and what normative values will eventually be accentuated.

By Natascha Just[1]

 

I. Introduction

When William Shakespeare wrote his comedy “The Taming of the Shrew” in the late 16th century, the term and the social role of the shrew were apparently set, as were the means to achieve what was expected of them. Etymologically, the shrew – a small insectivorous, mouse-like mammal then thought to have a venomous nature – first became associated with spiteful people in general and later came to stand only for unpleasant, ill-tempered and maladjusted women. The latter’s ideal social role was to be meek and submissive, and the male means to achieve this were thuggish. But Shakespeare’s play is also about the problem of illusion and reality, about false and true identities. This theme is set in the framing introduction, in which an illustrious hunting party gets a drunk tinker to wake up believing he is a lord. The wife they put beside him is a man in disguise, and the traveling players who eventually stage the main play are instructed to ignore his odd behavior, which does not befit his supposed rank.

Definitions and identities, social roles and norms, behavior, expectations and means – these are all important but similarly contested and rapidly changing facets in the governance of globally active Internet platforms, which are currently facing tough negotiations worldwide. Are they tech or media companies? Should they assume sovereign tasks? Are they the only shrews here? Barely a quarter century has passed since John Perry Barlow’s “A Declaration of the Independence of Cyberspace” of 1996.[2] It called on governments to stay out of cyberspace, declaring that its problems would be solved by its own social contract and that anyone would be able to enter without privilege or prejudice and to express beliefs without fear of being coerced into silence.

In the meantime, a handful of mostly U.S. companies with a corresponding cultural imprint have structurally transformed our societal communications system. Alphabet with its core moneymaker Google, Meta with Facebook and Instagram, or Twitter and the like are now taking over, complementing and changing social functions traditionally performed by national media and communications companies. At the same time, they dwarf any of these traditional companies in economic power and play a specific role in the construction of our realities through their intermediary gatekeeping powers[3] and their deployment of sophisticated, opaque automated algorithmic-selection services.[4] The large market shares and business practices of these platforms have increasingly become a cause for concern, as have their strategic role and influence regarding access to and curation of content.

This rise to power and the specific roles of Internet platforms within and for our public communications system increasingly direct attention to some more dysfunctional elements that have become evident, such as the dissemination of illegal content, disinformation or hate speech as well as chilling effects or discrimination. Events such as the U.S. Capitol riots in early 2021, which were incited by then-President Donald Trump’s tweets alleging vote fraud, mark a sad turning point, directing attention away from the Internet’s democracy-enhancing potential to its potentially democracy-endangering force. The early libertarian euphoria has given way to a kind of disillusionment, and various regulatory attempts at taming have entered the stage worldwide. These set out to renegotiate the cornerstones of a workable social contract and the expectations placed on the various participants in terms of social roles to be filled, acceptable behavior and reasonable means.

One of the more recent European attempts at this is the Digital Services Act (“DSA”),[5] which was proposed by the European Commission alongside the Digital Markets Act (“DMA”)[6] in mid-December 2020. The latter is an ex ante regulation to assure fair and contestable markets and specifically targets large gatekeepers of core platform services such as online search engines and online social networking services.[7] The former aims to contribute to a safe, predictable and trusted online environment, where fundamental rights are effectively protected.[8] Among other things, it takes on questions of platforms’ responsibility for content moderation with a system that leaves a surprisingly broad scope for discretion to very large online platforms (“VLOPs”) in what are socially particularly sensitive areas where fundamental rights such as free speech are at stake.

The relevant European institutions reached agreement on the DSA in late April 2022, but the final text is not yet publicly available. The following comments are therefore based on the European Commission’s December 2020 proposal and the amendments adopted by the European Parliament in its first reading in January 2022.[9]

 

II. The European Digital Services Act

Over the last twenty years, Internet platforms have been regulated in Europe by the Directive on electronic commerce of 2000 (hereinafter the e-Commerce Directive),[10] which introduced liability privileges for content hosted by them. Accordingly, Internet platforms are not legally responsible for illegal content they host but are required to remove or disable access to it once they know of it. While the European liability provisions were originally inspired by the 1998 U.S. Digital Millennium Copyright Act and its rules relating to copyright infringements,[11] they also have parallels with Section 230 of the 1996 U.S. Communications Decency Act.[12] The latter protects providers from liability on the grounds that they are not to be treated as the publisher or speaker of any information provided by another content provider. They are also protected when, acting in good faith, they restrict access to or availability of content that the provider or users deem objectionable.

The European and the U.S. provisions were both introduced at a time when the Internet landscape was completely different. In light of recent platform power and attendant dysfunctions, discussions about the reasonableness and fairness of these rules and the extent of relief for Internet platforms from liability have moved center stage. Reforms are being suggested,[13] have already been implemented (e.g. the contentious Art. 17 of the European Copyright Directive, which governs the use of protected content by online content-sharing service providers),[14] or have been agreed upon, as in the case of the DSA.

The DSA is a Regulation which, unlike a Directive, does not require transposition into national law and will be directly applicable in all EU member states once it enters into force. The choice of a Regulation is deliberate and intended to counter the legal fragmentation of the European internal market that may arise upon transposition. Further, it aims at curbing national solo efforts such as recent laws that tackle content moderation and complaints regarding illegal online content, for example the KoPl-G in Austria (in force since 2021)[15] and the NetzDG in Germany (in force since 2017).[16] The DSA applies to all providers irrespective of their place of establishment, provided they offer services to recipients in the European Union, and it further amends the e-Commerce Directive, from which it carries over the liability regime for Internet platforms with some additions.

A. Liability Regime

The liability regime (Chapter II) and the due diligence obligations (Chapter III) are the linchpin for dealing with illegal content and therefore an important cornerstone of the DSA. The DSA essentially maintains the liability exemptions of the e‑Commerce Directive, which, as further elaborated in the case law of the Court of Justice of the European Union (“CJEU”), apply to passive or neutral providers of intermediary services.[17] There is no liability in cases of “mere conduit” or “caching” when the providers of such services assume no active role in the transmission of content. In cases of “hosting,” providers are excluded from liability if the content is not provided under their control or authority, they have no knowledge of the illegal content or activity, and they expeditiously remove or disable access to it once they become aware of it. Further, providers do not have any general monitoring or active fact-finding obligations.

Novel to the DSA is a Good Samaritan clause similar to that of Section 230. Accordingly, providers will not forfeit their liability exemptions if they voluntarily carry out activities to detect, identify and remove illegal content in a diligent manner and in good faith. Especially in this case, the distinction between the active and passive role of the provider as elaborated by the CJEU may be put to the test and prove difficult in practice. In addition, there are new rules that indicate how providers must react when they receive an order from a national judicial or administrative authority informing them about illegal content or requesting information on a specific user, and what such an order must contain. Among other things, providers have to explain how and when they have complied with the order. The authorities, in turn, have to explain why the specific content is illegal or why the information is required, indicate the exact URL or other information enabling its identification, and specify the territorial scope of the order and the available redress options.

B. Due Diligence Obligations

The DSA introduces a four-layered asymmetric system of due diligence obligations for Internet platforms that includes, among other things, notice-and-action mechanisms for illegal content, the possibility to challenge a platform’s content-moderation decisions, and the obligation to conduct assessments of systemic risks. The precise obligations depend on the role, size and impact of the provider and apply cumulatively to intermediary services, hosting providers, online platforms, and VLOPs.[18] This new scheme can be visualized either as a four-layered pyramid, where the bottom layer (i.e. intermediary services) has the fewest obligations and the apex (i.e. VLOPs) the most, or as a system of concentric nested layers, where the outermost layer has the fewest and the innermost the most.
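
To make the cumulative logic of this scheme concrete, the following minimal sketch (in Python, purely for illustration) treats each tier as a set of duties added on top of the duties of the layers below it; the tier and obligation labels are informal shorthand introduced here, not the Regulation’s legal terminology.

```python
# Purely illustrative sketch of the DSA's cumulative ("pyramid") scheme of
# due diligence obligations. Tier and obligation labels are informal shorthand
# introduced here for illustration, not the Regulation's legal wording.
from typing import Dict, List

OBLIGATIONS_BY_TIER: Dict[str, List[str]] = {
    "intermediary_service": [
        "transparent terms of service on content moderation",
        "transparency reporting on content moderation",
    ],
    "hosting_service": [
        "notice-and-action mechanism",
        "statement of reasons to affected users",
    ],
    "online_platform": [
        "internal complaint-handling system",
        "out-of-court dispute settlement",
        "priority treatment of trusted flaggers",
    ],
    "very_large_online_platform": [
        "systemic risk assessment and mitigation",
        "independent annual audits",
        "data access for vetted researchers",
        "advertising repository",
    ],
}

# Tiers ordered from the broadest (outermost) layer to the narrowest (innermost).
TIER_ORDER = [
    "intermediary_service",
    "hosting_service",
    "online_platform",
    "very_large_online_platform",
]


def obligations_for(tier: str) -> List[str]:
    """Return all obligations that apply cumulatively up to the given tier."""
    if tier not in OBLIGATIONS_BY_TIER:
        raise ValueError(f"unknown tier: {tier}")
    applicable: List[str] = []
    for level in TIER_ORDER:
        applicable.extend(OBLIGATIONS_BY_TIER[level])
        if level == tier:
            break
    return applicable


if __name__ == "__main__":
    # A VLOP is subject to every obligation of the lower tiers plus its own.
    for duty in obligations_for("very_large_online_platform"):
        print("-", duty)
```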

The following briefly summarizes some of the due diligence obligations related to content moderation and then focuses specifically on the risk assessments that VLOPs must conduct. These touch upon very sensitive areas of fundamental rights and thus raise the question of whether Internet platforms should be the ones in charge of assessing the systemic risks their services pose and of devising the appropriate mitigation measures themselves.

  1. Intermediary Services

In their terms of service, all providers are required to inform publicly and unambiguously about their content-moderation policies and procedures, including algorithmic decision-making and human review, and their activities have to respect the fundamental rights of the recipients as enshrined in the Charter of Fundamental Rights of the European Union.[19] In addition, they all have to publish transparency reports on content moderation, e.g. information on their own-initiative content moderation, the number of complaints received through their internal complaint-handling systems, the types of alleged illegal content concerned and the time needed for taking decisions.

  2. Hosting Services and Online Platforms

Providers of hosting services, including online platforms, must further put in place notice-and-action mechanisms that permit easy notification of illegal content, including the possibility to submit all information necessary to identify the allegedly illegal content (e.g. an explanation of why it is considered illegal, its URL, and the name of the submitter). Providers must confirm receipt of the notice and inform about their decision in a timely, diligent and objective manner. This also includes the obligation to provide the user who provided the content with a detailed statement of reasons for its removal (e.g. the alleged illegality or incompatibility with the terms of service, including reference to the contractual grounds relied on and information on whether the decision was reached by automated means), and the requirement to publish decisions and statements of reasons in a publicly available database.
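
Purely as an illustration of the kind of information such a notice-and-action workflow exchanges, the following sketch models a notice and the resulting reasoned decision as simple records; all names, fields and example values are assumptions introduced here, since the DSA prescribes the substance of the information to be conveyed, not any particular data format or implementation.

```python
# Illustrative sketch only: the kind of information a notice-and-action
# workflow carries, modeled as simple records. All names and fields are
# assumptions introduced for illustration; the DSA prescribes the substance
# of the information to be exchanged, not any data format or implementation.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Notice:
    explanation_of_illegality: str  # why the notifier considers the content illegal
    content_url: str                # exact location of the contested content
    submitter_name: Optional[str]   # identity of the submitting person or entity


@dataclass
class Decision:
    action: str      # e.g. "removal", "access disabled", "no action"
    grounds: str     # alleged illegality or terms-of-service clause relied on
    automated: bool  # whether the decision was reached by automated means


def handle_notice(notice: Notice) -> Decision:
    """Confirm receipt, assess the notice and return a reasoned decision."""
    print(f"Receipt confirmed for notice concerning {notice.content_url}")
    # A real provider would assess the notice here; this stub simply removes
    # the content and records the notifier's explanation as the grounds.
    return Decision(action="removal",
                    grounds=notice.explanation_of_illegality,
                    automated=False)


if __name__ == "__main__":
    decision = handle_notice(Notice(
        explanation_of_illegality="alleged copyright infringement",
        content_url="https://example.com/item/123",
        submitter_name="Jane Doe",
    ))
    print(decision)
```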

  3. Online Platforms

In addition to all of the above, online platforms are required to install user-friendly, easily accessible and free internal complaint-handling systems that allow for complaints in cases of content removal or the suspension and termination of service. Further, users have the right to resort to impartial certified out-of-court bodies to settle disputes relating to platforms’ content-moderation decisions. Online platforms must cooperate with these bodies and carry the costs of resolution. Moreover, they are obliged to process and decide on notices from certified trusted flaggers with priority and without delay. Trusted flaggers are impartial entities with proven expertise in the realm of illegal content that represent a collective interest. The status is awarded upon application by the Digital Services Coordinator of establishment, which is the primary national authority designated by each member state for the consistent application of the DSA. There is also a duty to notify suspicions of criminal offences, and there are various protections against misuse: online platforms, for example, are to suspend users who frequently provide illegal content, or to suspend the handling of complaints in cases of frequent unfounded notices. There are additional transparency-reporting obligations, among other things on the number of disputes submitted to out-of-court settlement bodies, the number of suspensions imposed and the use of automated content moderation.

  4. Very Large Online Platforms

VLOPs, which serve on average at least 45 million monthly active users in the EU, are the final category and are subject to all of the above plus additional obligations, owing to their specific systemic role in facilitating public debate and economic transactions and the attendant, highest level of risk to society that may stem from their activities. Accordingly, they are obligated to identify, analyze and assess significant systemic risks. These include risks resulting from the dissemination of illegal content, risks that negatively affect the exercise of fundamental rights such as the freedom of expression and information, and risks stemming from the intentional manipulation of their services, including through inauthentic use or automated exploitation, with actual or foreseeable effects on, among other things, civic discourse, electoral processes and public security. In this assessment they are particularly required to consider how their content-moderation practices as well as their recommender and ad-display systems affect these systemic risks.

The design of the risk-mitigation measures rests with the VLOPs too, and may involve adaptations to their content-moderation or recommender systems, restrictions on advertising, cooperation with trusted flaggers or the establishment and adjustment of codes of conduct. In turn, the Board – an independent advisory group of the Digital Services Coordinators – and the European Commission publish an annual report on the most prominent and recurring systemic risks as reported by the VLOPs, including best practices to mitigate such risks. In addition, VLOPs are subject, at their own cost, to annual independent audits assessing compliance with the DSA and may have to provide access to data to vetted researchers for investigations that contribute to the identification and understanding of systemic risks. There are further transparency obligations, for example regarding the mode of operation of recommender systems, including options for users to modify or influence the relevant parameters regarding the order of information presented or regarding profiling. Moreover, VLOPs are required to set up a publicly available repository of the advertisements they display, comprising information on their content, sponsors, reach and whether and to what extent specific targeting of users was involved. The European Commission further reserves an enhanced right to supervise, investigate and monitor VLOPs and to enforce the DSA, thus adding a further layer to an already scattered oversight and enforcement system.

The DSA places a particularly conspicuous accent on fundamental rights, which was partly strengthened by the amendments suggested by the European Parliament in its first reading. These expand the references to the EU Charter by including further articles and stress, for example, the freedom and pluralism of the media, the protection of personal data, human dignity, and effects on democratic values more generally.[20]

Despite comprehensive measures to ensure compliance by VLOPs through independent audits, enhanced supervision by the European Commission or sanctions, the fact that the responsibility for systemic risk assessments and the deployment of mitigation measures against these risks rest primarily with VLOPs and their interpretive sovereignty raises various concerns. A major question is what cultural imprint this will leave on fundamental rights in Europe and what normative values will eventually be accentuated.

 

III. What Values for Fundamental Rights in Europe?

Fundamental rights quite often appear to be set in stone in western democracies, and the colloquial understanding of their substance is almost monolithic. But a closer look reveals considerable differences in the way they are interpreted by courts and people, and widespread public ignorance of what is protected and whom these rights protect against. This is particularly important when platforms socialized within a U.S. speech-protection environment are entrusted with risk assessments regarding these fundamental rights and other central democratic functions in other jurisdictions. Most recently, Elon Musk’s deal to buy Twitter, currently on hold, is a good example of the conflicts that may arise in the interpretation of free speech. A 2018 survey by the Freedom Forum Institute on the state of the First Amendment found that 77 percent of U.S. citizens are supportive of it and the freedoms it protects, but two-fifths of them (40 percent) could not name a single freedom it guarantees and another third (36 percent) could name only one.[21]

While freedom of speech was the most commonly recalled right (56 percent),[22] this freedom has also been identified as a constitutional challenge given the new ways of expressing opinions, of constituting publics and public space, and of censoring and moderating speech online.[23] Communication rights, such as the freedom of speech and expression or the freedom to receive and disseminate information, are rights where the cultural imprint is particularly accentuated, and there are significant differences between the U.S. and Europe. In the U.S., free speech is an almost absolute right with only a few restraints, while in Europe – which should not be considered a monolithic bloc either – freedom of expression does not trump all other rights, is not necessarily granted a preferred position and needs to be balanced against other competing rights, for example rights to privacy. As research shows, the reasons why and the ways in which platforms moderate content are also related to the underlying free-speech norms.[24] Of the various interpretations of the First Amendment by the U.S. Supreme Court, it is probably the marketplace-of-ideas theory that enjoys the most support.[25] It posits the discovery of truth through the competition of contrasting ideas, including ideas that may not be legal in European contexts, such as certain forms of hate speech or Holocaust denial.

In addition, the traditional function of fundamental rights is to defend against the state and against state actions and powers that inhibit these rights. They thus do not directly protect against private companies or community censorship – two forces that currently pose a great risk to freedom of expression, together with new techniques of speech control embedded in the design of the network infrastructure itself or in the operations of its applications, such as search engines or recommender systems.[26] Altogether, this calls for a rethinking of how to protect freedom of expression online and of whether and how private entities should be directly or indirectly bound by these rights.

Communication rights, moreover, fulfil individual and societal functions that need to be balanced. The individual dimension is essentially aimed at personal self-realization regardless of benefits to society at large, while the social dimension sees communication rights as instruments for the protection of democracy. Communication rights thus guarantee further fundamental rights and aid in the control of political power, in political will formation and in the free exercise of political rights. It is these social functions and values that need to be accentuated in an environment where platforms increasingly orient their content moderation toward individuals with the aim of maximizing private economic advantage, and where the right of the speaker is considered more important than the disadvantage to those who have to listen and face the potentially negative consequences that may arise from such speech.

Altogether, the DSA is a further step in tackling definitions and identities, social roles, norms, behavior, expectations and means in the governance of Internet platforms. There are high hopes that it will remedy many of the visible dysfunctions, but in the end the extent to which it will indeed contribute to taming Internet platforms, and – indirectly – all Internet users’ socially detrimental behavior, remains to be seen.


[1] Professor of communication at the IKMZ – Department of Communication and Media Research of the University of Zurich. Chair of the Media & Internet Governance Division.

[2] John P. Barlow, “A Declaration of the Independence of Cyberspace,” 1996, https://www.eff.org/de/cyberspace-independence.  

[3] Natascha Just and Michael Latzer, “Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet,” Media, Culture & Society 39, no. 2 (2017): 238–58, https://doi.org/10.1177/0163443716643157.

[4] Michael Latzer et al., “The Economics of Algorithmic Selection on the Internet,” in Handbook on the Economics of the Internet, ed. Johannes M. Bauer and Michael Latzer (Cheltenham, Northampton: Edward Elgar, 2016), 395–425.

[5] European Commission, “Proposal for a Regulation of the European Parliament and the Council on a Single Market For Digital Services (Digital Services Act) and Amending Directive 2000/31/EC, COM(2020) 825 Final,” 2020.

[6] European Commission, “Proposal for a Regulation of the European Parliament and the Council on Contestable and Fair Markets in the Digital Sector (Digital Markets Act), COM(2020) 842 Final,” 2020.

[7] For an assessment see Prabhat Agarwal, “The EU’s Proposal for a Digital Markets Act – an Ex Ante Landmark,” TechREG Chronicle, January (2022): 9–15.

[8] DSA proposal, art. 1 para 2.

[9] European Parliament, “Amendments Adopted by the European Parliament on 20 January 2022 on the Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and Amending Directive 2000/31/EC (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD)), P9_TA(2022)0014,” 2022.

[10] European Parliament and Council of the European Union, “Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market (‘Directive on Electronic Commerce’),” 2000.

[11] 17 USC Section 512 (c).

[12] 47 U.S.C. § 230.

[13] For debates on the reform of Section 230 see, for example Danielle Keats Citron and Mary Anne Franks, “The Internet as a Speech Machine and Other Myths Confounding Section 230 Reform,” The University of Chicago Legal Forum, 2020, 45–76; Jeff Kosseff, “A User’s Guide to Section 230, and a Legislator’s Guide to Amending It (or Not),” 2021, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3905347.

[14] European Parliament and Council of the European Union, “Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on Copyright and Related Rights in the Digital Single Market and Amending Directives 96/9/EC and 2001/29/EC,” 2019.

[15] “Bundesgesetz über Maßnahmen zum Schutz der Nutzer auf Kommunikationsplattformen (Kommunikationsplattformen-Gesetz – KoPl-G) (Federal Act on Measures for the Protection of Users on Communication Platforms (Communications Platforms Act – KoPl-G)) StF: BGBl. I Nr. 151/2020 (NR: GP XXVII RV 463 AB 509 S. 69. BR: 10457 AB 10486 S. 917.),” 2020.

[16] “Gesetz Zur Verbesserung Der Rechtsdurchsetzung in Sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG) (Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act)) of September 1, 2017 (BGBl. I p. 3352), as Last Amended by Article 1 of the Act to Amend the Network Enforcement Act of June 3, 2021 (BGBl. I p. 1436),” 2017.

[17] According to the DSA, intermediary services include three categories: (1) mere conduit services, which transmit information of a recipient or provide access to a communication network, (2) caching services, which transmit information and involve its automatic or temporary storing for the sole purpose of more efficient onward transmission, and (3) hosting services, which store the information provided by and at the request of a recipient. The latter comprises online platforms and very large online platforms that store and disseminate information to the public at the request of a recipient of the service.

[18] See also supra note 17.

[19] European Union, Charter of Fundamental Rights of the European Union, Official Journal of the European Union C83, vol. 53 (Brussels: European Union, 2010).

[20] Supra note 9. As noted, the final agreed version of the DSA is not yet public, and the suggested amendments may or may not have been taken up.

[21] Freedom Forum Institute, “The 2018 State of the First Amendment,” 2018, https://www.freedomforuminstitute.org/wp-content/uploads/2018/06/2018_FFI_SOFA_Report.pdf, at 3.

[22] Ibid.

[23] Gero Kellermann, “Die Meinungsfreiheit als verfassungspolitische Herausforderung (Freedom of Expression as a Constitutional Challenge),” Datenschutz und Datensicherheit – DuD 45, no. 6 (2021): 363–67.

[24] Kate Klonick, “The New Governors: The People, Rules and Processes Governing Online Speech,” Harvard Law Review 131 (2018): 1598–1670.

[25] Clay Calvert, Dan V. Kozlowski, and Derigan Silver, Mass Media Law, 20th ed. (McGraw Hill, 2018).

[26] Jack M. Balkin, “Old-School/New-School Speech Regulation,” Harvard Law Review 127 (2014): 2296–2342; Wolfgang Hoffmann-Riem, Recht Im Sog der Digitalen Transformation (Law in the Wake of Digital Transformation) (Tübingen: Mohr Siebeck, 2022); Matthew P Hooker, “Censorship, Free Speech & Facebook: Applying the First Amendment to Social Media Platforms Via The Public Function Exception,” Washington Journal of Law, Technology and Arts 1, no. 15 (2019): 36–73; Simon Jobst, “Konsequenzen einer unmittelbaren Grundrechtsbindung Privater (Consequences of a Direct Commitment of Private Parties to Fundamental Rights),” NJW – Neue Juristische Wochenschrift, 2020, 11–16; Lawrence Lessig, Code and Other Laws of Cyberspace (New York: Basic Books, 1999); Joel R. Reidenberg, “Lex Informatica: The Formulation of Information Policy Rules through Technology,” Texas Law Review 76, no. 3 (1998): 553–93; Tim Wu, “Is the First Amendment Obsolete?,” Michigan Law Review 3, no. 117 (2018): 547–81.