In this contribution we look at the upcoming changes in EU platform regulation. More specifically, we focus on the Digital Services Act (“DSA”) from a competition perspective. The DSA is less frequently discussed from this angle than its companion regulation, the Digital Markets Act (“DMA”), which explicitly aims to increase the contestability of digital markets. We argue that the DSA, via the modification of liability rules for platforms, may also bring competitive effects to the platform economy. To set the scene, we discuss why an update of the liability regime was necessary in the first place. Then we conjecture how platforms may adapt to the new rules and argue that more content screening can be expected. Finally, we hypothesize how the DSA may affect competition between large and small platforms via changes in content curation behavior. We sketch conditions under which the existing differences in size between platforms could decrease, leading to a more balanced market landscape.

By Maciej Sobolewski & Néstor Duch-Brown[1]

 

I. INTRODUCTION

The past 20 years or so have witnessed the rapid development of novel digital services based on the notion of the social web, or Internet 2.0. This innovation in the way digital content is generated and consumed has provided abundant value to consumers and, at the same time, has allowed the emergence of new firms and business models that have completely changed the competitive landscape of digital markets. However, the legal and regulatory framework in which these developments took place was designed with the characteristics of the previous phase of the internet's development in mind. In that period, content was mostly consumed passively from static websites offered by a relatively small number of content producers. Although misinformation and illegal activities existed at that time too, technology was a rather small industry and did not affect most people's lives in a significant way. Today, when software and algorithms have become mainstream, all the problems we face as a society have a manifestation in software and algorithms as well. As a greater proportion of individuals adopt digital solutions and start using them regularly, the online world replicates the good and bad things that happen in the offline world. However, in the online dimension the problems are amplified, not only because they involve many more people, but because they combine, feed each other and generate new externalities. In this novel online setting, with new actors emerging in economic and social activities, there is a need to rethink the rules of the game.

In what follows, we focus on the new rules included in the Digital Services Act (“DSA”). However, instead of taking a fundamental rights approach, we look at it from a competition perspective. Since the DSA has been less frequently approached from this perspective than the Digital Markets Act (“DMA”), which aims at increasing the contestability of digital markets, we think we offer a somewhat novel angle. We argue that the DSA, via the modification of liability rules for platforms, may also bring competitive effects to the platform economy. We structure our thinking as follows. First, we discuss why an update of the liability regime was necessary in the first place. Second, we sketch some mechanisms that explain how platforms may adapt to the new rules, and we argue that more content screening can be expected. Third, we hypothesize how the DSA may affect competition between large and small platforms via changes in content curation behavior. We delineate some scenarios under which the existing differences in size between platforms could decrease, leading to a more balanced market landscape.

 

II. WHY DO WE NEED THE DIGITAL SERVICES ACT?

The current legal framework for online activities was set out in the Electronic Commerce Directive (“ECD”) more than twenty years ago, when the internet ecosystem was still in a nascent phase. Over these two decades, the types of online services have evolved substantially, and so has the scale of their use. The role of providers changed from mere conduit to the creation of services based on data that leverage positive externalities among users. Finally, a new type of private enterprise, acting as online intermediaries on multisided markets, emerged on the digital scene.

These platforms orchestrate interactions among various types of participating users. Because of their huge success in facilitating online transactions and exchanges of user-generated content of all sorts, these online platforms quickly expanded into complex and powerful ecosystems. These ecosystems now have a systemic impact on the economy and society, both intended and unintended. For example, recent research extensively discusses the side-effects of the widespread use of recommender algorithms by social media on the contagious spread of propaganda and fake news. The Facebook-Cambridge Analytica scandal, in turn, demonstrated how user data can be abused for psychological targeting or, worse, the manipulation of political preferences according to a hidden private agenda. To address these systemic challenges and ensure better protection of users and their fundamental rights in the rapidly growing digital space, the European Commission decided that the legal framework for online activities needed modernization. The DSA introduces updated, harmonized liability rules for all providers of digital services on the Digital Single Market. Additional measures are imposed on very large online platforms (reaching more than 45 million users in the EU) of various types: search engines, marketplaces and social networks, in recognition of their pivotal role in mitigating systemic risks such as manipulation of elections, censorship, spread of disinformation, illegal hate speech, cyber violence or harm to minors.

The ECD liability regime was established in 2000, when major digital services like social media and big online marketplaces did not yet exist. Without an exemption from primary liability for service providers, the online services as we know them today would not have developed, because of litigation costs. In the ECD, the conditions for liability exemption are linked to the so-called knowledge standard. They apply mostly to providers who host content uploaded by third parties. A platform hosting a particular item, such as a pirated movie or a racist post, will not be held liable as long as it is not aware of its illegal nature. Once the platform learns about a concrete infringing item, it has to block it in order to maintain the liability exemption. This action has to be expeditious and preceded by an appropriate evaluation. A platform may come into possession of “red flag knowledge” in two ways: it may discover the infringing item via its own screening procedures, such as filtering or automated content moderation, or it may receive a notification from a third party that located the item on a particular account administered by the platform. While the above rules are logically consistent, it is not hard to see why they may not be fit for purpose when user-generated content is being uploaded at a scale of billions of items every hour. There is a legitimate concern that hosting services would choose to limit the inflow of red flag knowledge from third parties rather than engage in costly handling of infringing items. This dysfunctional outcome could easily be accomplished with small modifications to the user interface that degrade the user-friendliness of the reporting process. Against this opaque incentive, which leads to less illegal content being blocked, the DSA pushes for greater empowerment of third parties coupled with more active engagement in content management by the platforms, both ultimately leading to greater suppression of illegal items.

Importantly, the new regulation does not force platforms to moderate all uploaded content items, nor does it impose any technical solutions with regard to content curation. Such an obligation would quickly generate a prohibitive economic burden on smaller online providers experiencing rapid growth in content volumes. Indeed, content moderation requires a great deal of financial resources, skills and labor. Automated moderation based on machine learning algorithms does not guarantee perfect accuracy in detecting truly infringing items. Despite the overall technical progress of the past years, misclassification rates are often high and there are no magical shortcuts. For example, catching a higher share of truly infringing items always comes at the cost of wrongly rejecting more legitimate ones, which leads to undesirable over-moderation. This shows that human judgement is still crucial in the process and will remain so in the near future. Human moderation can be 5 to 20 times more expensive than AI-based moderation, depending on the type of content and wages in local labor markets. This makes the entire business process hard to scale. Human moderators usually work only on “grey zone” cases, those that require advanced contextual judgement. The largest online platforms contract several thousand moderators, and their total wage bill for content moderation runs into hundreds of millions of dollars annually.
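To make this trade-off concrete, the minimal sketch below simulates a moderation classifier on made-up score distributions (purely illustrative assumptions, not any platform's actual system): lowering the removal threshold catches a larger share of infringing items, but the number of legitimate items wrongly removed grows at the same time.

```python
# Illustrative only: toy classifier scores for 1,000 infringing and 9,000
# legitimate items. The two score distributions overlap, so no threshold
# separates them perfectly.
import random

random.seed(0)
infringing = [random.gauss(0.7, 0.15) for _ in range(1_000)]
legitimate = [random.gauss(0.3, 0.15) for _ in range(9_000)]

for threshold in (0.8, 0.6, 0.4):
    caught = sum(s >= threshold for s in infringing)           # true positives
    wrongly_removed = sum(s >= threshold for s in legitimate)  # false positives
    print(f"threshold={threshold:.1f}  "
          f"infringing caught={caught / len(infringing):.0%}  "
          f"legitimate removed={wrongly_removed}")
```

The point is general: whatever accuracy the underlying model reaches, pushing it to catch more illegal content mechanically increases over-moderation unless humans review the borderline cases.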

To achieve its main goal of ensuring better protection of users and of fundamental rights online, the DSA introduces a seemingly minor modification to the conditions required for the safe harbor. Yet this change has far-reaching consequences for the behavior of platforms. In order to maintain the liability exemption, all online platforms must implement a new, user-friendly notice and action procedure that simplifies the notification of specific items considered illegal by the notifying parties. In practical terms, this procedure facilitates the submission of notices about potentially harmful elements by third parties, in particular private persons, copyright holders and rights enforcement organizations with a legitimate interest in screening content. If the platform agrees with the assessment of the notifying party, it has to swiftly remove or disable access to that content. Additionally, the platform is obliged to put in place an efficient complaint and redress mechanism and to accommodate trusted flaggers, who may submit notifications on a mass scale. As under the ECD, the DSA presumes that a platform acquires “red flag knowledge” about a particular infringing element upon receiving a valid notice, which includes information on the internet location of that element. Easily accessible notifications guarantee that avoiding red flag knowledge will be practically impossible.
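Purely for illustration, the sketch below shows how such a notice-and-action pipeline might be recorded and processed in code; the field names, types and decision logic are our own hypothetical assumptions, not requirements taken from the DSA text or from any platform's implementation.

```python
# A stylized notice-and-action record and handler (hypothetical structure).
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    item_url: str                 # internet location of the allegedly illegal item
    reason: str                   # explanation provided by the notifying party
    notifier: str                 # private person, rights holder, or organization
    trusted_flagger: bool = False # trusted flaggers' notices get priority handling

@dataclass
class Decision:
    action: str                   # "remove", "disable" or "reject"
    statement_of_reasons: str     # written justification for the decision
    decided_at: Optional[datetime] = None

def handle_notice(notice: Notice, assessed_illegal: bool) -> Decision:
    """A valid notice is presumed to create 'red flag knowledge', so the
    platform must act expeditiously to keep its liability exemption."""
    now = datetime.now(timezone.utc)
    if assessed_illegal:
        return Decision(
            "remove",
            f"Item at {notice.item_url} assessed as illegal: {notice.reason}",
            now,
        )
    # Rejections must be justified and can be contested, possibly escalating
    # to out-of-court dispute settlement.
    return Decision("reject", "Reported item assessed as lawful after review.", now)
```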

Submitted notices are quite costly to handle, as they will typically require human evaluation and processing. This is why the DSA, while presuming diligence and good faith of all parties, also contains safeguards against mass submission of unfounded notices that abuse the notice and action mechanism. If a platform decides to reject a notice, it has to provide a written justification, which may be contested by the affected party, possibly escalating to out-of-court dispute settlement. By increasing the ease of submitting notices, the DSA provides additional economic incentives for platforms to engage, at least partially, in their own ex ante content screening so as to reduce the number of legitimate notices to deal with. This outcome can be achieved with hash-based filtering, which compares newly uploaded content against already blacklisted items, or with ex ante automated moderation. It is important to note that the business process leveraging these technologies can either be developed in-house or outsourced to third-party providers offering content moderation in a software-as-a-service mode. The choice between the two options is determined by platform scale: for sufficiently large content volumes, an own custom-made solution will be more cost-effective per item than the unit price of a third-party solution, although it requires substantial upfront investment.
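As a rough sketch of both points, the snippet below implements exact hash-based filtering against a blacklist and a back-of-the-envelope break-even calculation between an in-house pipeline and a software-as-a-service moderation API; all cost figures and names are hypothetical assumptions of ours, and real systems typically rely on perceptual rather than exact hashes so that re-encoded copies are still caught.

```python
# Hash-based ex ante filtering (illustrative): newly uploaded content is
# fingerprinted and compared against a blacklist of known infringing items.
import hashlib

blacklist: set[str] = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def blacklist_item(content: bytes) -> None:
    blacklist.add(fingerprint(content))

def is_blocked(upload: bytes) -> bool:
    return fingerprint(upload) in blacklist

blacklist_item(b"known pirated file")
assert is_blocked(b"known pirated file") and not is_blocked(b"original upload")

# Back-of-the-envelope break-even between in-house and SaaS moderation, with
# made-up numbers: in-house has a lower marginal cost per item but requires a
# substantial upfront investment.
UPFRONT_INHOUSE = 2_000_000   # hypothetical fixed cost of building in-house
PER_ITEM_INHOUSE = 0.001      # hypothetical marginal cost per item, in-house
PER_ITEM_SAAS = 0.01          # hypothetical SaaS unit price per item
break_even = UPFRONT_INHOUSE / (PER_ITEM_SAAS - PER_ITEM_INHOUSE)
print(f"In-house pays off above roughly {break_even:,.0f} items")  # ~222 million
```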

 

III. HOW WILL PLATFORMS REACT TO THE DSA?

In the previous section we argued that the updated liability rules, and most notably the notice and action procedure, may push platforms towards more intensive content screening to avoid an overflow of notices. Additional ex ante screening efforts and the scrutiny of items flagged in notices will result in better curation of user-generated content and fewer counterfeit products available online. The better quality of content-based services will likely increase the satisfaction of the various user groups on a platform. However, users will also pay a price for the efforts undertaken by the platform operator.

Like any profit-maximizing entity, a platform will react to the increased volume of notices and the additional screening effort by increasing the price of its services. In this way, a platform will try to shift the increased cost of content curation onto one or more groups of users. This pass-through effect may take different forms in practice, depending on the type of platform and the adopted business model. For example, a social network could widen the scope of data requested from users in exchange for the service or increase their exposure to ads. Instead of raising the implicit price denominated in data, a social network could also increase monetary fees for advertisers. Similarly, a marketplace operator might raise transaction fees for business users to recover part of the costs related to tracking counterfeit goods. Economic models of multisided markets suggest that, in order to absorb a cost increase, a monopoly or dominant platform will first exploit the group of users with the least elastic demand. Typically this will be advertisers or business users, who are less likely to quit because of the limited substitutability of their target audiences. Monetization of data is often combined with service innovation to derive more value from economies of scope in data aggregation. For example, a platform that has widened the scope of collected data may expand into adjacent markets in order to add complementary services to its core offering. Such an ecosystem expansion strategy will reduce the negative effects of the price adjustment on current users.

Intuitively, the pass-through effect on users will be determined by several factors, such as (i) the platform's adaptation costs related to the additional content screening triggered by the DSA; (ii) users' taste for content quality; (iii) users' preferences for privacy preservation; and (iv) the proportion of captive users in a platform's total user base. Unlike contestable users, captive users are loyal and can thus be more easily exploited by the platform. The pass-through will also depend on the degree of horizontal differentiation between competing platforms, which determines the intensity of competition for the contestable segment. It can be expected that, ceteris paribus, larger platforms will be able to pass a greater proportion of costs on to users than smaller platforms. This is caused by the difference in network externalities, which favors the larger platform. On the other hand, larger platforms may not necessarily bear a higher level of adaptation costs induced by the DSA, due to two opposing effects at play. The first effect is positive for big platforms and relates to economies of scale in in-house content moderation. Bigger platforms have access to better AI skills, larger training datasets and cheaper storage and computing power, all of which provide higher detection precision in comparison to external software-as-a-service solutions. Consequently, larger platforms will enjoy a lower per-item cost of automated moderation. The second effect is negative and relates to greater content scrutiny by trusted flaggers. Intuitively, the attention of trusted flaggers, copyright owners and other monitoring organizations will naturally be focused on dominant platforms, where harm from illegal content is amplified by large network externalities. Consequently, a bigger platform will receive more notices that it must handle diligently in order to preserve the liability exemption. It is impossible to say a priori which of the two effects prevails, especially because large and small platforms may also differ in other relevant factors, such as audience profile, the organic rate of content toxicity and the moderation technology used to date, which also determine the level of adaptation costs under the DSA.
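One compact, purely illustrative way to express this tension (our own notation, not a model taken from the DSA or the literature) is to write platform i's adaptation cost as

$$C_i = F_i + c(V_i)\,V_i + h\,N_i,$$

where V_i is the volume of items screened ex ante, c(V_i) the per-item cost of automated moderation (decreasing in scale for platforms that invest in-house), F_i the upfront investment in the moderation pipeline, N_i the number of notices requiring diligent human handling, and h the cost of processing a single notice. Larger platforms benefit from a lower c(V_i) but face a higher N_i, so the sign of the difference in C_i between large and small platforms is ambiguous a priori.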

 

IV. WHAT ARE POSSIBLE COMPETITION EFFECTS OF THE DSA?

Based on the previous considerations, we argue that the DSA will likely result in more intensive screening and curation of content on the platforms' side, leading to higher costs of service provision. The magnitude of the cost increase will vary across platforms in a complex way. As discussed above, per-item adaptation costs may not necessarily be higher for big platforms, although big platforms will most likely be able to shift a greater proportion of this cost to users. For these reasons, various outcomes with regard to the competition effects of the DSA may materialize.

In general terms, competition between platforms will be stronger the larger the segment of contestable users and the less differentiated the service. However, bigger platforms also enjoy an incumbency advantage stemming from direct and indirect network externalities. This “bigness” advantage translates into more loyal (captive) consumers on average, who cannot easily be captured by other platforms via higher content quality or lower prices. The big platform will need to balance the opposing incentives of exploiting its captive users and competing with other platforms for contestable consumers. Additional screening effort leaves users with more utility from enjoying a less toxic environment. On the contestable part of the market, this additional utility will attract new users. This indirect, positive competition effect reduces the pressure on platforms to increase prices. On the other hand, a platform faces an increase in its marginal cost of serving users, which it will try to shift onto users via a higher price (monetary or implicit). The two effects have opposite signs, but when additional screening costs are high, the pass-through effect is likely to outweigh the competition effect. In that case, a platform will react to the DSA by increasing prices more, ceteris paribus, than in the case of low adaptation costs. Conversely, if the DSA adaptation cost is small, the competition effect will prevail, and the platform could lower its price to attract more customers.
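A highly stylized way to summarize this balance, under our own simplifying assumptions of a linear pass-through and an additive competition effect (not a model from the DSA or any specific paper), is

$$\Delta p \approx \rho\,\Delta c - \sigma\,\Delta q,$$

where \Delta c is the DSA-induced increase in the per-user cost of content curation, \rho the pass-through rate (higher for platforms with many captive users), \Delta q the quality gain users derive from a less toxic environment, and \sigma the intensity of competition for contestable users. When \rho \Delta c exceeds \sigma \Delta q, the platform raises its monetary or implicit price; when adaptation costs are low, the competition term dominates and the price can fall.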

Building on the above considerations, there are four qualitatively different outcomes of competition between large and small platforms that may occur under the DSA: all platforms increase their prices; all platforms decrease their prices; big platforms increase prices while small ones decrease them; or small platforms increase prices while big ones decrease them. A priori, none of these options can be ruled out and, in fact, they may appear simultaneously on different multisided markets. For the two asymmetric options, the consequences for market equilibrium are clear. If only the big platforms adjust their prices upwards, the DSA will have a levelling effect on the market. A smaller platform will gain more users and will in turn attract more advertisers. Under this scenario, the DSA will increase the financial viability of smaller platforms and will diminish the size asymmetry. If, however, big platforms decrease their prices while the small ones increase them, the existing differences will be further amplified, leading to an even more cornered market outcome. In the symmetric scenario with low content curation costs for all platforms, competition could result in higher quality of content for users at unchanged or lower prices. Such an outcome would be preferable from a social welfare perspective. It could be supported by policy measures aimed at reducing the costs of moderation for smaller platforms by improving access to cloud infrastructure, large training datasets and AI skills.

Naturally, the strong network effects enjoyed by dominant platforms limit contestability and competition between platforms of different sizes. It remains to be seen how these externalities will affect the costs of content curation and the pricing of big platforms as opposed to smaller ones. The answer to this question will largely determine which of the market outcomes of the DSA materializes in reality.

 

V. FINAL REMARKS

We have argued that the updated liability rules introduced by the DSA may push platforms towards more intensive content screening to avoid an overflow of notices. However, this will push the platforms' marginal costs of operation upwards, as well as their prices. This pass-through effect may take different forms in practice, depending on the type of platform and the adopted business model. Similarly, it may affect competition differently, depending on size asymmetries and on how the platforms modify their prices in response to increased moderation costs. In the specific case in which only big platforms increase their prices, the DSA may have a pro-competitive effect, by allowing smaller platforms to attract more users and, in turn, more advertisers, increasing their financial viability and reducing size asymmetries.

While we have tried to explore some competition effects deriving from the DSA, a more in-depth analysis of the intersection between the DSA and the DMA would, in our opinion, be extremely interesting and much needed. The DMA links in a number of ways with the above discussion of competition effects. For example, it attempts to increase market contestability by imposing an asymmetric prohibition on gatekeeper platforms pooling data across their services. Other obligations included in the DMA may have similar expected effects on competition.

Similarly, other recent policy initiatives in the digital domain, such as the GDPR and the Data Act, also link with the DSA in mitigating excessive data extraction from users. In the case of the Data Act, measures promoting data sharing could have a direct effect in reducing the cost of content moderation. This could be the case if increased access to data allowed the creation of larger and better-curated datasets, which smaller platforms could use to improve prediction accuracy and thereby compensate for their disadvantages from weaker network effects.


[1] Joint Research Centre – European Commission. The opinions expressed in this article are the authors’ and do not necessarily reflect those of the European Commission.