Deceptive and manipulative choice architectures have received significant attention in the academic literature. These dark patterns can take the form of nudges that lead individuals to act against their own interests, or of sludges that hinder the implementation of beneficial decisions. Their development is fostered by the potential of the data economy and by ever more powerful predictive algorithms. They raise legitimate concerns in terms of competition and consumer protection. Numerous reports suggest regulatory measures, which should be assessed in light of their possible effects. This contribution shows that while such measures are necessary, dark patterns are not the preserve of dominant operators, and preventing them should not preclude the net gains that can result from the personalization of algorithmic recommendations.

By Frédéric Marty & Jeanne Torregrossa[1]

 

Dark patterns, widely acknowledged to amount to manipulative practices, have been fiercely debated during the Digital Services Act negotiations. They have been added to the already long list of issues facing the digital economy. But what exactly is behind them?

The OECD provides a definition that captures the relatively broad scope of the practices covered by this term. It defines them as “user interfaces used by some online businesses to lead consumers into making decisions they would not have otherwise made if fully informed and capable of selecting alternatives.”[2]

Sketching out a more precise definition of dark patterns first requires separating them from their nearest equivalents in the “old world,” namely marketing. A deceptive interface aims to “manipulate the consumer into doing something that is inconsistent with their preferences, in contrast to marketing efforts that are designed to alter those preferences.”[3] These so-called “deceptive and manipulative” interfaces have been proliferating for years and every internet user has encountered them online.

The best-known examples are “hidden subscriptions” (“the consumer incurs a recurring fee under the pretense of a one-time fee or a free trial period”),[4] “hidden costs” (“new, additional, and often unusually high charges are added just before a consumer is about to complete a purchase”),[5] and “pressured selling” (“defaults or high-pressure tactics that steer consumers into purchasing a more expensive version of a product (upselling) or related products (cross-selling)”).[6]

The academic literature has addressed this broad, multidisciplinary subject for many years now. While the initial aim was to achieve a sound technical understanding of the phenomenon,[7] the aim today is to grasp its underlying mechanisms and its actual impact on consumers and competition. It is therefore necessary to determine the extent of the problem and – above all – to assess, as with many new phenomena, the need for specific regulations while weighing their expected effectiveness against their potential side effects.

Unsurprisingly, these misleading interfaces have not escaped the vigilance of competition and regulatory authorities. The UK competition authority, the Competition and Markets Authority (“CMA”), at the vanguard on many online issues, opened an investigation in November 2022 into the online practices of the company Emma Sleep concerning so-called “pressured selling”[8] techniques. It identified time-limited offers and countdowns in advertisements that would, for example, lead consumers to believe that a discount would no longer be valid at the end of the indicated period, thus pressing them to purchase quickly without making a fully informed choice.[9] This investigation is part of the CMA’s wider work devoting some of its resources to manipulative online sales practices, “Online Choice Architecture,”[10] and a program to help consumers spot these sales techniques, “Rip off Tip off.”[11]

These two major UK initiatives, aimed at curbing such practices while raising consumer awareness of them, echo a recent survey of online sales techniques conducted by the European Commission and national consumer protection authorities, with rather alarming results: of 399 online shops screened, 148 contained at least one sales technique that can be considered a dark pattern – fake countdowns, manipulative consumer guidance, or hidden information.[12]

As Yeung (2017) notes, the creation and development of these interfaces, now under the scrutiny of the authorities, originate in two well-known phenomena: massive data and algorithms.[13] The author stresses that such data is collected precisely to become a valuable and exploitable asset, pointing to one of the most significant issues of the digital economy. To become valuable and exploitable, Yeung (2017) explains, these data must be embedded in a much broader combination of predictive processes and information-processing technologies, culminating in what can be called “machine learning,” which creates logical links far beyond what the human mind can achieve.

It is no longer a question of moving into an information economy as it was previously understood, but into a prediction economy based on efficient data collection and processing. Deceptive or not, these interfaces are to traditional sales techniques what targeted advertising was – and still is – to contextual advertising: a major disruption based on the ability to collect and exploit data.

Whether in advertising or in interfaces, the place of information in the economy is continually being redesigned under the effect of digitalization, revealing some of its hitherto hidden dimensions. Whereas contextual advertising – historically used, for instance, in print or broadcast media – was limited to choosing the advertisement to be shown according to the context in which it was inserted, targeted advertising identifies people individually in order to deliver specific advertising messages based on their idiosyncratic characteristics. While the former technique requires no information about the consumer, the effectiveness of the latter depends almost entirely on the amount of information held about the user and on how it is processed.

The sharing and possession of information are decisive here. They have always been the keystone of markets: the consumer must know in order to choose, and the company must know its consumers in order to offer products that meet their needs. However, a very difficult balance must be struck: too much information exchanged between companies – or made available – can lead to explicit or tacit collusion between them, while too much information about consumers can jeopardize their welfare. The digital economy and the development of artificial intelligence exacerbate these issues.

The issues raised by dark patterns are thus a matter of both consumer and competition protection. At the consumer level, they reduce the scope of available choices and enable personalized, dynamic manipulation of preferences. The resulting practices are all the more damaging as the exposed consumers are vulnerable.[14] The lower the level of consumer expertise and information, the easier it is to implement manipulative strategies. Not only can dark patterns enable online players to extract an additional share of consumer surplus, they can also reduce consumers’ ability to exercise their sovereignty by hindering the comparison of rival firms’ offers or the assessment of the costs and constraints associated with a switching decision. Dark patterns can therefore develop all the more easily when consumers have already opted for single-homing strategies and when the digital ecosystem at stake presents strong immersive characteristics.

From a competition law and economics perspective, dark patterns can give rise to both inter-ecosystem and intra-ecosystem competition concerns.

In the context of inter-ecosystem competition, they may lessen the competitive pressure exerted by rivals and, to a certain extent, introduce a vector of unfair competition, as they involve biased information on the characteristics of the products offered or manipulative techniques. In other words, to quote Rohit Chopra’s dissenting opinion in the Zoom case dealt with by the FTC: “deception distorts competition.”[15] In that case, the company was accused of not honoring its commitments regarding call encryption. Generalizing, the companies that make the most use of dark patterns could gain a competitive advantage over their rivals. Incentives would then drive a downward alignment: the large ecosystems would all have a unilateral interest in making their offerings less transparent and more confusing for consumers.[16]

As for intra-ecosystem competition, dark patterns can reinforce the effectiveness of self-preferencing strategies by drawing consumers towards a particular offer. They can thus make it possible either to exclude an as-efficient and possibly more attractive competitor, by artificially reducing its visibility or diverting consumers from its offer,[17] or to implement exploitative strategies by forcing commercial partners to contract for additional services in order to escape a possible demotion, conduct that is particularly difficult to evidence in litigation.[18]

Two examples of such architectures and their impacts can be mentioned. Firstly, drip-pricing practices are well known, and their effects have long been evaluated in the academic literature, as shown by the work of Blake et al. published in 2021.[19] Through an experiment, the authors showed that abandoning such strategies can lead to a 28 percent loss of revenue for an online vendor. Secondly, in the domain of retail banking fees, a White House press release of February 1, 2023 on the proposed Junk Fee Prevention Act illustrates the burden of these “unfair and costly junk fees” on the most vulnerable consumers, who are also the most exposed to manipulative practices.[20] In the field of banking services, two reports published in 2021 by the Consumer Financial Protection Bureau (“CFPB”) show that these unanticipated fees not only have a significant impact on consumer welfare but also reduce competition between banking institutions by impeding the transparency necessary for price competition.[21]

All of these factors demonstrate that the concern surrounding dark patterns is legitimate, but it should not obscure a number of risks and limits that need to be taken into account in public policy design.

Firstly, personalization is not a competitive problem as such. Personalized recommendations, especially those based on algorithmic predictions grounded in massive data collection and processing, contribute to economic efficiency and consumer satisfaction. Directing consumers towards a particular choice can reduce transaction costs and collectively lead to efficiency gains through volume or scale effects. Moreover, nudges and sludges can have desirable effects not only collectively but also individually. They can help to counteract existing biases in favor of incumbent suppliers and thus defend consumers against themselves, for example when they exhibit addictive behavior or an excessive aversion to change that leads them not to seek out competition when they should. They can also help to overcome consumer inertia.[22]

Secondly, dark patterns are not the exclusive preserve of dominant digital firms. They may be deployed in brick-and-mortar stores (albeit with less efficiency and refinement) and by non-dominant operators, including those without a data advantage or specific artificial intelligence capabilities. Consumers are therefore also exposed to the risk of being harmed by non-dominant market players.

While it is therefore legitimate to be concerned about dark patterns, possible remedies should be carefully considered.

Firstly, dark patterns are not exclusive to “gatekeepers” in the sense of the Digital Markets Act. They can hardly be remedied by asymmetric regulation. However, any symmetrical regulation can have a negative effect on competition insofar as the costs of compliance weigh relatively more on small players than on large ones. This is the case for the GDPR and will be even more so with the interoperability requirements contained in the draft Data Act. Overly intrusive regulation that is imposed on all players may have the effect of strengthening the competitive position of the most powerful.

Secondly, even from the sole perspective of consumer protection, the prevention and sanctioning of dark patterns require substantial investigative resources. While blatantly manipulative practices must be prohibited per se, certain patterns call for a balancing exercise, since the personalization of the offer can only be assessed through an effects-based approach.

Thirdly, a socially responsible company, mindful of all its stakeholders and more specifically of its most vulnerable consumers, could refrain from implementing commercial practices based on biased information or manipulative choice architectures. The absence of dark patterns could therefore be integrated into an ethical approach and a compliance policy. Such policies can respond to the firm’s intrinsic motivations but also to extrinsic ones, linked to the reputational cost that could result from the exposure of such practices and their effects. Within this framework, the recommendations formulated regarding algorithmic liability could be extended to dark patterns:[23] a firm that deploys an algorithm has a clear interest in investing in risk prevention both ex ante and throughout its use. Certification of choice architectures and periodic audits could be part of self-regulation measures complementing public supervision policies, under which firms that pay little attention to the effects of their practices expose themselves to sanctions.


[1] Respectively CNRS – GREDEG – Université Côte d’Azur; OFCE – Sciences Po, Paris; CIRANO, Montréal; and Altermind.

[2] OECD, Roundtable on Dark Commercial Patterns Online, Summary of Discussion (February 19, 2021).

[3] Jamie Luguri & Lior J. Strahilevitz, Shining a Light on Dark Patterns, Journal of Legal Analysis, 13(1), pp.43–109 (2021).

[4] OECD, Roundtable on Dark Commercial Patterns Online, Summary of Discussion (February 19, 2021).

[5] OECD, Roundtable on Dark Commercial Patterns Online, Summary of Discussion (February 19, 2021).

[6] OECD, Roundtable on Dark Commercial Patterns Online, Summary of Discussion (February 19, 2021).

[7] Michael Toth, Nataliia Bielova & Vincent Roca, On dark patterns and manipulation of website publishers by CMPs, Proceedings on Privacy Enhancing Technologies (PoPETs), pp.478–497 (2022).

[8] OECD, Roundtable on Dark Commercial Patterns Online, Summary of Discussion (February 19, 2021).

[9] Press Release, Competition and Markets Authority, CMA investigates online selling practices based on ‘urgency’ claims (November 30, 2022).

[10] Competition and Markets Authority, Online choice architecture work (November 30, 2022).

[11] Press Release, Competition and Markets Authority, 7 out of 10 people have experienced potential rip-offs online, worrying new CMA research reveals (February 9, 2022).

[12] Press Release, European Commission, Consumer protection: manipulative online practices found on 148 out of 399 online shops screened (January 30, 2023).

[13] Karen Yeung, ‘Hypernudge’: Big Data as a mode of regulation by design, Information, Communication & Society, 20(1), pp.1–19 (2017).

[14] Renu Isidore R. & Christie P., The relationship between the income and behavioural biases, Journal of Economics, Finance and Administrative Science, 24(47), pp.127–144 (2019).

[15] Federal Trade Commission, Dissenting Statement of Commissioner Rohit Chopra Regarding Zoom Video Communications, Inc. (November 6, 2020).

[16] Robert Edwards, Pricing and obfuscation with complexity averse consumers, Oxford Economic Papers, 71(3), pp.777–798 (2019).

[17] Patrice Bougette, Axel Gautier & Frédéric Marty, Business Models and Incentives: For an Effects-Based Approach of Self-Preferencing?, Journal of European Competition Law & Practice, 13(2), pp.136–143 (2022).

[18] Frédéric Marty, From Demoting to Squashing? Competitive Issues Related to Algorithmic Corrections: An Application to the Search Advertising Sector, Competition Policy International (April 2019), https://www.competitionpolicyinternational.com/wp-content/uploads/2019/04/CPI-Marty.pdf.

[19] Tom Blake, Sarah Moshary, Kane Sweeney & Steve Tadelis, Price Salience and Product Choice, Marketing Science, 40(4), pp.619–636 (2021).

[20] The White House, Fact Sheet: President Biden highlights new progress on his competition agenda (February 1, 2023).

[21] Consumer Financial Protection Bureau, Office of Research Publication, Data Point: Overdraft/NSF Fee Reliance Since 2015 – Evidence from Bank Call Reports (December 1, 2021).

[22] Competition and Markets Authority, Tackling the loyalty penalty (September 28, 2018).

[23] Nathalie De Marcellis-Warin, Frédéric Marty, Eva Thelisson & Thierry Warin, Artificial intelligence and consumer manipulations: from consumer’s counter algorithms to firm’s self-regulation tools, AI & Ethics, 2(2), pp.259–268 (2022).