Privacy regulators are increasingly looking beyond a company’s privacy policy to scrutinize the user interfaces of its websites, apps, and other online services, and challenging designs that they view as manipulating consumer choice. In this pursuit, regulators and privacy advocates increasingly use the term “dark patterns” as an umbrella concept to describe the wide array of practices that may be considered manipulative design in user interfaces. The “dark patterns” concept also gives regulators and legislators a tool to challenge practices that they believe undermine meaningful consumer choice. In this article, we examine the developing dark pattern regulatory enforcement landscape from a data privacy perspective, with a focus on recent U.S. and EU regulatory developments.

By Christine Chong & Christine Lyon[1]


I. WHAT IS MEANT BY “DARK PATTERNS” IN THE PRIVACY CONTEXT?

The term “dark patterns” was reportedly coined in 2010 by Harry Brignull, a user interface designer, and it has since been increasingly and formally adopted by privacy advocates and regulators.[2] In his original piece, Brignull suggested that deceptive user interfaces are common on the web because dark patterns may be subtle and easy to overlook: “in isolation they’re usually so small that each one is barely annoying enough for people to do anything about them.”[3] While dark patterns may be just “barely annoying” for an individual user, he noted that

...