There is widespread consensus that design practices involving psychological manipulation and deceit should be banned. When it comes to defining the concept of “dark patterns,” however, the challenge is to identify the line that separates legitimate user interface design from deceptive practices. It is crucial to have clear guidance, based on robust research, on what might constitute a dark pattern, and to assess the real impact and intention behind a practice on a case-by-case basis. Online persuasive design practices must be distinguished from deceptive ones so that online businesses enjoy the same commercial rights as brick-and-mortar ones. Any initiative must be limited to “dark patterns” that are illegitimate. Regulators should not take the easy way out and standardize online interfaces: a one-size-fits-all approach would not work for the variety of online services and would harm competition among similar brands. In Europe, a well-equipped consumer acquis already addresses “dark patterns.” Instead of adding another layer of measures, policymakers should focus on better and more consistent enforcement of existing rules.

By Victoria de Posson[1]

 

I. INTRODUCTION

Practices such as pop-ups offering “free prizes,” false countdown timers promoting special deals, and automatic billing after a free trial without prior notification not only manipulate users but also significantly degrade their online experience. Many businesses already avoid such misleading or unfair commercial practices, which are prohibited under existing legislation. Nevertheless, these practices are resurfacing in the policy debate under a new label: “dark patterns.”

This article aims to examine in more detail the concept of “dark patterns” and the necessity for their regulation. It will begin by exploring the origin and definition of the term, comparing online and offline techniques, and evaluating the need for flexible design interface rules. Finally, this article will take a closer look at the regulations in the European Union (“EU”), as it is widely recognized as a global leader in regulating the online sphere and protecting consumers.

 

II. DARK PATTERNS: ORIGIN AND DEFINITION OF THE TERM

As politicians seek to ban “dark patterns,” it is crucial to establish a clear definition of what constitutes a “dark pattern.” This will ensure that consumers are safeguarded against misleading practices while simultaneously avoiding any hindrance to the development of intuitive and user-friendly interfaces that serve legitimate purposes.

The term “dark patterns” was coined in 2010 by English user experience specialist Dr. Harry Brignull, who holds a PhD in Cognitive Science. Brignull defines “dark patterns” as “tricks used in websites and apps that make you do things that you didn’t mean to.”[2]

When it comes to defining the concept of “dark patterns,” the challenge is to identify the line that separates legitimate user interface design from deceptive practices. Over the last few years, the use of the term has drifted further and further away from Brignull’s initial definition. It has become a catch-all term that encompasses even some legitimate business marketing practices.

One example is the pressure to ban interfaces that remind consumers of their previous choices, even though such reminders can be a valid and well-intentioned practice. The choices presented can vary based on time and context, reflecting different use cases and intentions. Users should have the ability to revisit their choices when there is a clear demand or user interest, for instance where users are asked to review their privacy settings periodically.[3]

“Dark patterns” would be better defined as design choices that intentionally distort the behavior of the average user for manipulative purposes. Prohibitions should not target practices that are made in good faith and have a legitimate purpose or are justified in specific situations. For example, requests for location access to improve user preferences or awareness tools that enhance safety and privacy should be allowed.

Measures must be limited to “dark patterns” that are illegitimate in any scenario and tackle the issue comprehensively across the internet. Given the inherent vagueness of the concept and its lack of legal foundation, it is crucial to have clear guidance based on robust research on what might constitute a dark pattern. Sufficient flexibility should be left for a case-by-case assessment of the real impact and intention behind a practice.

 

III. DARK PATTERNS: ONLINE AND OFFLINE MARKETING TECHNIQUES

Online businesses and platforms are often associated with a tendency to manipulate customers, but this perception is outdated. It stems from the inaccurate belief that the digital world is still unregulated and chaotic, a view more representative of the internet’s early days than of where it is today. Despite the significant increase in regulatory texts on online practices in recent years, under the motto “what is forbidden offline must be forbidden online,” remnants of this fear of the digital world are still evident. This perception heavily penalizes online businesses compared to brick-and-mortar ones, in particular when it comes to the ambiguous notion of “dark patterns.” Indeed, the push for additional regulation marks a turning point, as marketing practices that are legal offline are becoming illegal online.

Visual merchandising in the offline world is the equivalent of website design in the online world. It involves strategically presenting, arranging, and displaying merchandise in stores to attract customers and boost sales. The concept was introduced to the retail industry in 1883 by Harry Gordon Selfridge, the American entrepreneur who established Selfridges, a London-based department store.[4]

Some of the practices called out as “dark patterns” are in fact visual merchandising techniques used in brick-and-mortar retail. For example, interface designs that highlight or de-emphasize certain information or sections of a website correspond to visual techniques used by stores when displaying products. The choice of where to place products on a shelf, or where to place the shelf itself within a store, is purely strategic marketing. The display of popular products at the bottom or at the top of a shelf instead of at eye level has never been called out as a “hidden in plain sight” deceptive commercial practice.[5] The same goes for the de-emphasis of a product displayed alongside many other products on a shelf, which has never been considered a “too many options” deceptive practice.[6] The way a product is displayed and emphasized or not, based on factors such as its location, the use of color contrasts, or neon lighting, is a legitimate marketing technique in physical retail.

Another example would be an interface with messages pointing out limited time for a promotion, countdowns, or information on stock and quantity. The same type of messages can be found on the windows of stores. Words, colors, and illustrations are strategically used to encourage passers-by to enter shops. The same goes for messages on ongoing or soon-to-end promotions strategically displayed inside the store on shelves and walls, or even orally announced to customers.

Where legitimate, these visual commercial techniques are accepted in physical retail, and the same should hold for the digital world. Online persuasive design practices should be distinguished from deceptive ones, both to grant online businesses the same commercial rights as brick-and-mortar ones and to ensure the best online user experience.

 

IV. DARK PATTERNS: NEED FOR FLEXIBLE DESIGN INTERFACE RULES

It is evident that practices that deceive or mistreat consumers should be prohibited. Regulators should not take the easy way out by standardizing online interfaces. Instead, they should enable the best consumer experience online and foster a competitive and innovative environment that incentivizes business creativity.

Differentiation of online interfaces and visual elements is crucial for businesses to establish their brand identity and for users to identify and distinguish between brands. This distinction is vital for business success and optimal user experience. Implementing a standardized approach could limit freedom of enterprise and innovation, creating a homogenous online landscape.

A standardized interface would also be detrimental to the consumer experience, as a one-size-fits-all approach would not work for most services, particularly emerging ones. For instance, for some services it makes sense to provide access to customer support on the homepage, while for others it belongs on a separate support page because the homepage is intentionally minimalist to benefit the consumer experience. Regulators must keep a flexible approach that takes into account the variety of online business models and allows businesses to implement rules in a way that makes sense for their services and products. Otherwise, well-intentioned efforts may prove counterproductive, harming the customer journey on the website and undermining the overall customer experience.

To protect entrepreneurship and ensure the best user experience, regulations on interface design need to offer flexibility and adaptability and to follow a case-by-case approach.

 

V. DARK PATTERNS: FOCUS ON THE EUROPEAN UNION

A. The Web of EU Rules

Let us examine the regulation of “dark patterns” in the European Union, a global leader in regulating the online world and protecting consumers. Various EU initiatives, including the 2005 Unfair Commercial Practices Directive (“UCPD”), the 2011 Consumer Rights Directive, and the 2016 General Data Protection Regulation (“GDPR”), already address “dark pattern” techniques through their provisions on misleading and unfair commercial practices.[7]

The term “dark patterns” first appeared in an EU text in a study titled “Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation,” commissioned by the European Commission’s Directorate-General for Justice and Consumers and published in 2022.[8] The report defined “dark patterns” as “a concept that is generally used to refer to practices in digital interfaces that steer, deceive, coerce, or manipulate consumers into making choices that often are not in their best interests.”[9] This report sparked the interest of European regulators in “dark patterns.”

The EU further protects its consumers from deceptive practices by updating its legislative framework, including through the 2019 Directive on better enforcement and modernization of Union consumer protection rules (the “Omnibus Directive”) and the 2021 Guidance on unfair business-to-consumer commercial practices in the internal market.[10]

The recently adopted Digital Services Act (“DSA”) is the first EU regulation to define the term “dark patterns.” It describes them as “practices on online interfaces of online platforms that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions.”[11]

The term “dark patterns” has also been introduced into ongoing legislative proposals, such as the General Product Safety Regulation (“GPSR”), the Empowering Consumers for the Green Transition Directive, and the Distance Marketing of Financial Services Directive.[12]

Recently, the European Commission launched a Fitness Check of EU consumer law on digital fairness to evaluate existing regulations and their adequacy for ensuring a high level of online consumer protection.[13] This initiative could lead to new rules on “dark patterns.”

However, before considering new EU consumer legislation, policymakers should assess the consistency of the Omnibus Directive, which has only applied since May 2022, and of the other EU consumer protection measures enforced across the EU Single Market. Sufficient time should be allowed for these rules to produce their intended effects before the rulebook is amended once more.

Instead of introducing new provisions for “dark patterns,” clarifying guidelines would be a reasonable next step, as outlined in the DSA, to ensure alignment, coherence, and consistency between existing and future legislation. This is particularly important due to the multitude of digital business models and sector-specific requirements. It is also crucial to prevent any overlap or inconsistency in the regulations that could create legal uncertainty for businesses and consumers.

B. Enforcement

The real problem with unfair commercial practices is not a lack of regulation but insufficient enforcement of existing rules. In Europe, enforcement should target all companies interacting with EU consumers equally, irrespective of their country of origin or of whether they operate online or offline. Selectively enforcing rules against certain players while paying less attention to others is detrimental to consumer protection and can also create market distortions. This would be the case, for example, under the new DSA and GPSR obligations for marketplaces, which do not apply to extra-EU retailers.

A more harmonized approach to the implementation of consumer protection legislation is also needed to ensure coherent and consistent enforcement of EU rules, given the cross-border operations of businesses. Divergent interpretations and enforcement lead to uneven consumer standards across EU Member States, generating legal uncertainty for businesses and constraining their potential for cross-border trade. Effective collaboration among Member States (e.g. via the Consumer Protection Cooperation Network) can help ensure more uniformity in the interpretation and enforcement of EU rules.

The EU legislator should incentivize Member States’ sectoral authorities (e.g. consumer, competition, data protection, and telecommunications authorities) to cooperate better in order to ensure a pro-innovation as well as coherent and harmonized application of EU rules. A holistic approach should be adopted at the national level. In other words, silos and diverging interpretations must be avoided both within the same country and among EU Member States.

 

VI. CONCLUSION

“Dark patterns” are design choices intentionally made to manipulate the average user’s behavior for deceptive purposes. The term was first coined in 2010 by Harry Brignull, but its definition has since expanded to encompass even some legitimate business marketing practices. The challenge therefore lies in identifying the line that separates legitimate user interface design from deceptive practices, which is why clear examples of “dark patterns” supported by robust research are crucial.

Although online businesses and platforms are often associated with a tendency to manipulate customers, it is important to distinguish online persuasive design practices from deceptive ones so that online businesses enjoy the same commercial rights as brick-and-mortar ones. Measures must be limited to “dark patterns” that are illegitimate in any scenario and must tackle the issue comprehensively across the internet.

It is important to avoid taking the easy way out and standardizing online interfaces, as differentiation of online interfaces and visual elements is crucial for businesses to establish their brand identity and for users to identify and distinguish between brands.

In Europe, regular assessment of consumer protection rights is to be welcomed. However, before adding another layer to the already well-equipped consumer acquis, EU policymakers should focus on better and more consistent enforcement of existing rules and allow time for these rules to take effect. That said, EU guidance would be welcomed in areas where EU rules overlap and/or conflict, as this would also support a more coherent and uniform interpretation and enforcement of the rules across the EU.


[1] Victoria is the Secretary General of the European Tech Alliance (“EUTA”) which gathers major European digital champions and scaleups successfully built across Europe. The EUTA aims to develop smart policies promoting European tech innovation, investments, and competitiveness. See https://eutechalliance.eu.

[2] Harry Brignull, What are deceptive patterns?, April 14, 2023, accessible at: https://www.deceptive.design/.

[3] Article 29 Working Party, Guidelines on transparency under Regulation 2016/679, November 29, 2017, accessible at: https://ec.europa.eu/newsroom/article29/redirection/document/51025.

[4] Johnson & Wales University, How Visual Merchandising Serves as Marketing: Understanding the Impact Across Industries, April 12, 2023, accessible at: https://online.jwu.edu/blog/how-visual-merchandising-serves-marketing-understanding-impact-across-industries.

[5] European Data Protection Board, Guidelines 3/2022 on Dark patterns in social media platform interfaces: How to recognise and avoid them, March 14, 2022, p. 66, accessible at: https://edpb.europa.eu/system/files/2022-03/edpb_03-2022_guidelines_on_dark_patterns_in_social_media_platform_interfaces_en.pdf.  

[6] Op. cit., p. 67.

[7] EU Directive 2005/29/EC concerning unfair business-to-consumer commercial practices in the internal market (‘Unfair Commercial Practices Directive’) (2005), OJ L 149, p. 22–39; EU Directive 2011/83/EU on consumer rights (2011), OJ L 304, p. 64–88; EU Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (2016), OJ L 119, p. 1–88.

[8] European Commission, Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation, final report, April 2022, accessible at: https://op.europa.eu/en/publication-detail/-/publication/606365bc-d58b-11ec-a95f-01aa75ed71a1/language-en/format-PDF/source-257599418.

[9] Op. cit., p. 20.

[10] EU Directive (EU) 2019/2161 on the better enforcement and modernisation of Union consumer protection rules (2019), OJ L 328, p. 7–28; EU Guidance C/2021/9320 on the interpretation and application of Directive 2005/29/EC of the European Parliament and of the Council concerning unfair business-to-consumer commercial practices in the internal market (2021), OJ C 526, p. 1–129.

[11] EU Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act) (2022), OJ L 277, p. 1–102.

[12] European Commission, Proposal for a Directive empowering consumers for the green transition through better protection against unfair practices and better information, COM/2022/143 final; European Commission, Proposal for a Directive concerning financial services contracts concluded at a distance and repealing Directive 2002/65/EC, COM/2022/204 final.

[13] European Commission, Digital fairness – fitness check on EU consumer law, April 19, 2023, accessible at: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13413-Digital-fairness-fitness-check-on-EU-consumer-law_en.