A dark pattern (also known as a "deceptive design pattern") is "a user interface that has been carefully crafted to trick users into doing things, such as buying overpriced insurance with their purchase or signing up for recurring bills". [1] [2] [3] User experience designer Harry Brignull coined the neologism on 28 July 2010 with the registration of darkpatterns.org, a "pattern library with the specific goal of naming and shaming deceptive user interfaces". [4] [5] [6] In 2023 he released the book Deceptive Patterns. [7]
In 2021 the Electronic Frontier Foundation and Consumer Reports created a tip line to collect information about dark patterns from the public. [8]
"Privacy Zuckering" – named after Facebook co-founder and Meta Platforms CEO Mark Zuckerberg – is a practice that tricks users into sharing more information than they intended to.[citation needed] Users may give up this information unknowingly or through practices that obscure or delay the option to opt out of sharing their private information.
California has approved regulations that limit this practice by businesses in the California Consumer Privacy Act. [9]
Bait-and-switch patterns advertise a product or service that is free (or greatly reduced in price) but wholly unavailable or stocked in small quantities. After announcing the product's unavailability, the page presents similar products at higher prices or of lesser quality. [10] [11]
Drip pricing is a pattern where a headline price is advertised at the beginning of a purchase process, followed by the incremental disclosure of additional fees, taxes or charges. The objective of drip pricing is to gain a consumer's interest in a misleadingly low headline price without the true final price being disclosed until the consumer has invested time and effort in the purchase process and made a decision to purchase.
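The cumulative effect of drip pricing can be illustrated with a minimal sketch; the prices and fee names below are invented for illustration and do not come from any real retailer:

```python
# Hypothetical illustration of drip pricing: the headline price shown at the
# start of checkout is only part of the total the consumer ultimately pays,
# because additional charges are disclosed one step at a time.
headline_price = 49.99          # price advertised at the start of the process
drip_fees = {                   # fees revealed incrementally during checkout
    "service fee": 12.50,
    "facility charge": 4.00,
    "processing fee": 3.25,
    "taxes": 6.30,
}

final_price = headline_price + sum(drip_fees.values())
markup = (final_price - headline_price) / headline_price

print(f"Headline: ${headline_price:.2f}")
print(f"Final:    ${final_price:.2f} ({markup:.0%} above the headline price)")
```

By the time the final price is visible, the consumer has already invested effort in the purchase, which is precisely what the pattern exploits.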
Confirmshaming uses shame to drive users to act, such as when websites word an option to decline an email newsletter in a way that shames visitors into accepting. [11] [12]
Common in software installers, misdirection presents the user with a button in the fashion of a typical continuation button. A dark pattern would show a prominent "I accept these terms" button asking the user to accept the terms of a program unrelated to the one they are trying to install. [13] Since the user typically will accept the terms by force of habit, the unrelated program can subsequently be installed. The installer's authors do this because the authors of the unrelated program pay for each installation that they procure. The alternative route in the installer, allowing the user to skip installing the unrelated program, is much less prominently displayed, [14] or seems counter-intuitive (such as declining the terms of service).
Some websites that ask for information that is not required also use misdirection. For example, one would fill out a username and password on one page, and after clicking the "next" button, the page asks the user for their email address with another "next" button as the only option. [15] This hides the option to press "next" without entering the information. In some cases, the page shows the method to skip the step as a small, greyed-out link instead of a button, so it does not stand out to the user. [16] Other examples include sites offering a way to invite friends by entering their email address, to upload a profile picture, or to identify interests.
Confusing wording may also be used to trick users into formally accepting an option which they believe has the opposite meaning. For example, a personal data processing consent button may use a double negative such as "don't not sell my personal information". [17]
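The inversion caused by a double-negative label can be sketched as follows; the labels and variable names are illustrative only:

```python
# With a plainly worded label, checking the box opts the user out of data sales.
plain = {"label": "Do not sell my personal information", "checked": True}
plain_opted_out = plain["checked"]          # True: data will not be sold

# With a double-negative label, the same checked state means the opposite:
# "don't not sell" reduces to "sell", so a checked box authorises the sale,
# even though a user skimming the label likely believes they have opted out.
double_negative = {"label": "Don't not sell my personal information",
                   "checked": True}
double_negative_opted_out = not double_negative["checked"]  # False

print(f"Plain label, checked -> opted out: {plain_opted_out}")
print(f"Double negative, checked -> opted out: {double_negative_opted_out}")
```

The same checkbox state yields opposite legal meanings purely as a function of the label's wording.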
A roach motel or a trammel net design provides an easy or straightforward path to get in but a difficult path to get out. [18] Examples include businesses that require subscribers to print and mail their opt-out or cancellation request. [10] [11]
For example, during the 2020 United States presidential election, Donald Trump's WinRed campaign employed a similar dark pattern, steering users toward committing to a recurring monthly donation. [19]
Another common version of this pattern is any service which enables one to sign up and start the service online, but which requires a phone call (often with long wait times) to terminate the service. Examples include cable TV, internet service, and credit monitoring. [20]
In 2021, in the United States, the Federal Trade Commission (FTC) announced that it would ramp up enforcement against dark patterns, such as the roach motel, that trick consumers into signing up for subscriptions or make it difficult to cancel. The FTC has stated key requirements related to information transparency and clarity, express informed consent, and simple and easy cancellation. [21]
In 2016 and 2017, research documented social media anti-privacy practices using dark patterns. [22] [23] In 2018 the Norwegian Consumer Council (Forbrukerrådet) published "Deceived by Design," a report on deceptive user interface designs of Facebook, Google and Microsoft. [24] A 2019 study investigated practices on 11,000 shopping websites, identifying 1,818 dark patterns in total and grouping them into 15 categories. [25]
Research from April 2022 found that dark patterns are still commonly used in the marketplace, highlighting a need for further scrutiny of such practices by the public, researchers and regulators. [26]
Under the European Union General Data Protection Regulation (GDPR), all companies must obtain unambiguous, freely-given consent from customers before they collect and use ("process") their personally identifiable information. A 2020 study found that "big tech" companies often used deceptive user interfaces in order to discourage their users from opting out. [27] In 2022 a report by the European Commission found that "97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern." [28]
Research on advertising network documentation shows that information presented to mobile app developers on these platforms is focused on complying with legal regulations, and puts the responsibility for such decisions on the developer. Also, sample code and settings often have privacy-unfriendly defaults laced with dark patterns to nudge developers’ decisions towards privacy-unfriendly options such as sharing sensitive data to increase revenue. [29]
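A minimal sketch of the kind of privacy-unfriendly default described above; the class and field names are invented for illustration and do not correspond to any real advertising SDK:

```python
# Hypothetical ad-SDK configuration object. Sample code shipped with such
# SDKs often enables data sharing by default, so a developer who copies the
# sample opts users in without ever making a conscious privacy decision.
from dataclasses import dataclass


@dataclass
class AdConfig:
    share_location: bool = True         # privacy-unfriendly default
    share_advertising_id: bool = True   # privacy-unfriendly default
    child_directed: bool = False        # developer must remember to flip this

# Copying the sample verbatim yields the privacy-unfriendly configuration.
default_cfg = AdConfig()

# Opting users out requires the developer to actively override each default.
privacy_cfg = AdConfig(share_location=False, share_advertising_id=False)

print(f"Default shares location: {default_cfg.share_location}")
print(f"Privacy config shares advertising ID: {privacy_cfg.share_advertising_id}")
```

The nudge lies in the defaults: the revenue-maximising configuration is the path of least resistance, while the privacy-preserving one requires extra work.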
Bait-and-switch is a form of fraud that violates US law. [30]
On 9 April 2019, US senators Deb Fischer and Mark Warner introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act, which would make it illegal for companies with more than 100 million monthly active users to use dark patterns when seeking consent to use their personal information. [31]
In March 2021, California adopted amendments to the California Consumer Privacy Act, which prohibits the use of deceptive user interfaces that have "the substantial effect of subverting or impairing a consumer's choice to opt-out." [17]
In October 2021, the Federal Trade Commission issued an enforcement policy statement, announcing a crackdown on businesses using dark patterns that "trick or trap consumers into subscription services." As a result of rising numbers of complaints, the agency is responding by enforcing these consumer protection laws. [21]
In 2022, New York Attorney General Letitia James fined Fareportal $2.6 million for using deceptive marketing tactics to sell airline tickets and hotel rooms [32] and the Federal Court of Australia fined Expedia Group's Trivago A$44.7 million for misleading consumers into paying higher prices for hotel room bookings. [33]
In March 2023, the United States Federal Trade Commission fined Fortnite developer Epic Games $245 million for use of "dark patterns to trick users into making purchases." The $245 million will be used to refund affected customers and is the largest refund amount ever issued by the FTC in a gaming case. [34]
In the European Union, the GDPR requires that a user's informed consent to processing of their personal information be unambiguous, freely-given, and specific to each usage of personal information. This is intended to prevent attempts to have users unknowingly accept all data processing by default (which violates the regulation). [35] [36] [37] [38] [39]
According to the European Data Protection Board, the "principle of fair processing laid down in Article 5 (1) (a) GDPR serves as a starting point to assess whether a design pattern actually constitutes a 'dark pattern'." [40]
At the end of 2023 the final version of the Data Act [41] was adopted. It is one of three EU legislative acts which deal expressly with dark patterns; [42] another is the Digital Services Act. [43] The third EU legislative act on dark patterns in force is the directive on financial services contracts concluded at a distance. [44] Germany's public consumer protection organisation claims Big Tech uses dark patterns to violate the Digital Services Act. [45]
In April 2019, the UK Information Commissioner's Office (ICO) issued a proposed "age-appropriate design code" for the operations of social networking services when used by minors, which prohibits using "nudges" to draw users into options that have low privacy settings. This code would be enforceable under the Data Protection Act 2018. [46] It took effect 2 September 2020. [47] [48]
The Children's Online Privacy Protection Act of 1998 (COPPA) is a United States federal law, located at 15 U.S.C. §§ 6501–6506.
Information privacy is the relationship between the collection and dissemination of data, technology, the public expectation of privacy, contextual information norms, and the legal and political issues surrounding them. It is also known as data privacy or data protection.
The Information Commissioner's Office (ICO) is a non-departmental public body which reports directly to the Parliament of the United Kingdom and is sponsored by the Department for Science, Innovation and Technology. It is the independent regulatory office dealing with the Data Protection Act 2018, the General Data Protection Regulation, and the Privacy and Electronic Communications Regulations 2003 across the UK; and the Freedom of Information Act 2000 and the Environmental Information Regulations 2004 in England, Wales and Northern Ireland and, to a limited extent, in Scotland.
A privacy policy is a statement or legal document that discloses some or all of the ways a party gathers, uses, discloses, and manages a customer or client's data. Personal information can be anything that can be used to identify an individual, not limited to the person's name, address, date of birth, marital status, contact information, ID issue, and expiry date, financial records, credit information, medical history, where one travels, and intentions to acquire goods and services. In the case of a business, it is often a statement that declares a party's policy on how it collects, stores, and releases personal information it collects. It informs the client what specific information is collected, and whether it is kept confidential, shared with partners, or sold to other firms or enterprises. Privacy policies typically represent a broader, more generalized treatment, as opposed to data use statements, which tend to be more detailed and specific.
Privacy and Electronic Communications Directive 2002/58/EC on Privacy and Electronic Communications, otherwise known as the ePrivacy Directive (ePD), is an EU directive on data protection and privacy in the digital age. It presents a continuation of earlier efforts, most directly the Data Protection Directive. It deals with the regulation of a number of important issues such as confidentiality of information, treatment of traffic data, spam and cookies. This Directive has been amended by Directive 2009/136, which introduces several changes, especially in what concerns cookies, that are now subject to prior consent.
Data portability is a concept to protect users from having their data stored in "silos" or "walled gardens" that are incompatible with one another, i.e. closed platforms, thus subjecting them to vendor lock-in and making the creation of data backups or moving accounts between services difficult.
In the middle of 2009 the Federal Trade Commission filed a complaint against Sears Holdings Management Corporation (SHMC) for unfair or deceptive acts or practices affecting commerce. SHMC operates the sears.com and kmart.com retail websites for Sears Holdings Corporation. As part of a marketing effort, some users of sears.com and kmart.com were invited to download an application developed for SHMC that ran in the background on users' computers, collecting information on nearly all internet activity. The tracking aspects of the program were disclosed only in legalese in the middle of the End User License Agreement. The FTC found this was insufficient disclosure given consumers' expectations and the detailed information being collected. On September 9, 2009, the FTC approved a consent decree with SHMC requiring full disclosure of its activities and destruction of previously obtained information.
In re Gateway Learning Corp, 138 F.T.C. 443 File No. 042-3047, was an investigatory action by the Federal Trade Commission (FTC) of the Gateway Learning Corporation, distributor of Hooked on Phonics. In its complaint, the FTC alleged that Gateway had committed both unfair and deceptive trade practices by violating the terms of its own privacy policy and making retroactive changes to its privacy policy without notifying its customers. Gateway reached a settlement with the FTC, entering into a consent decree in July 2004, before formal charges were filed.
Do Not Track (DNT) is a formerly official HTTP header field, designed to allow internet users to opt out of tracking by websites—which includes the collection of data regarding a user's activity across multiple distinct contexts, and the retention, use, or sharing of data derived from that activity outside the context in which it occurred.
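Server-side handling of the DNT header can be sketched minimally; the function below is illustrative only, assuming request headers arrive as a simple dictionary:

```python
# Sketch of server-side handling of the (formerly official) DNT header.
# Per the specification, a value of "1" signals that the user opts out of
# cross-context tracking; any other value, or absence, does not.
def tracking_allowed(headers: dict) -> bool:
    """Return False when the request carries the header DNT: 1."""
    return headers.get("DNT") != "1"


print(tracking_allowed({"DNT": "1"}))   # False: user opted out of tracking
print(tracking_allowed({"DNT": "0"}))   # True: user explicitly allows tracking
print(tracking_allowed({}))             # True: no preference expressed
```

Because honouring the header was voluntary, a server could ignore it entirely, which is part of why the standardisation effort ultimately failed.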
Privacy by design is an approach to systems engineering initially developed by Ann Cavoukian and formalized in a joint report on privacy-enhancing technologies by a joint team of the Information and Privacy Commissioner of Ontario (Canada), the Dutch Data Protection Authority, and the Netherlands Organisation for Applied Scientific Research in 1995. The privacy by design framework was published in 2009 and adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010. Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The concept is an example of value sensitive design, i.e., taking human values into account in a well-defined manner throughout the process.
Do Not Track legislation protects Internet users' right to choose whether or not they want to be tracked by third-party websites. It has been called the online version of "Do Not Call". This type of legislation is supported by privacy advocates and opposed by advertisers and services that use tracking information to personalize web content. Do Not Track (DNT) is a formerly official HTTP header field, designed to allow internet users to opt-out of tracking by websites—which includes the collection of data regarding a user's activity across multiple distinct contexts, and the retention, use, or sharing of that data outside its context. Efforts to standardize Do Not Track by the World Wide Web Consortium did not reach their goal and ended in September 2018 due to insufficient deployment and support.
The General Data Protection Regulation, abbreviated GDPR (French: RGPD), is a European Union regulation on information privacy in the European Union (EU) and the European Economic Area (EEA). The GDPR is an important component of EU privacy law and human rights law, in particular Article 8(1) of the Charter of Fundamental Rights of the European Union. It also governs the transfer of personal data outside the EU and EEA. The GDPR's goals are to enhance individuals' control and rights over their personal information and to simplify the regulations for international business. It supersedes the Data Protection Directive 95/46/EC and, among other things, simplifies the terminology.
Chris Jay Hoofnagle is an American professor at the University of California, Berkeley who teaches information privacy law, computer crime law, regulation of online privacy, internet law, and seminars on new technology. Hoofnagle has contributed to the privacy literature by writing privacy law legal reviews and conducting research on the privacy preferences of Americans. Notably, his research demonstrates that most Americans prefer not to be targeted online for advertising and despite claims to the contrary, young people care about privacy and take actions to protect it. Hoofnagle has written scholarly articles regarding identity theft, consumer privacy, U.S. and European privacy laws, and privacy policy suggestions.
United States v. Google Inc., No. 3:12-cv-04177, is a case in which the United States District Court for the Northern District of California approved a stipulated order for a permanent injunction and a $22.5 million civil penalty judgment, the largest civil penalty the Federal Trade Commission (FTC) has ever won. The FTC and Google Inc. consented to the entry of the stipulated order to resolve the dispute, which arose from Google's violation of its privacy policy. In this case, the FTC found Google liable for misrepresenting "privacy assurances to users of Apple's Safari Internet browser". The order was reached after the FTC concluded that, by placing advertising tracking cookies in the Safari web browser while serving targeted advertisements, Google violated the FTC's 2011 administrative order issued in FTC v. Google Inc.
The ePrivacy Regulation (ePR) is a proposal for the regulation of various privacy-related topics, mostly in relation to electronic communications within the European Union. Its full name is "Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC." It would repeal the Privacy and Electronic Communications Directive 2002 and would be lex specialis to the General Data Protection Regulation. It would particularise and complement the latter in respect of privacy-related topics. Key fields of the proposed regulation are the confidentiality of communications, privacy controls through electronic consent and browsers, and cookies.
NOYB – European Center for Digital Rights is a non-profit organization based in Vienna, Austria, established in 2017 with a pan-European focus. Co-founded by Austrian lawyer and privacy activist Max Schrems, NOYB aims to launch strategic court cases and media initiatives in support of the General Data Protection Regulation (GDPR), the proposed ePrivacy Regulation, and information privacy in general. The organisation was established after a funding period during which it raised €250,000 in annual donations from supporting members. Currently, NOYB is financed by more than 4,400 supporting members.
The gathering of personally identifiable information (PII) refers to the collection of public and private personal data that can be used to identify individuals for various purposes, both legal and illegal. PII gathering is often seen as a privacy threat by data owners, while entities such as technology companies, governments, and organizations utilize this data to analyze consumer behavior, political preferences, and personal interests.
The Age appropriate design code, also known as the Children's Code, is a British internet safety and privacy code of practice created by the Information Commissioner's Office (ICO). The draft Code was published in April 2019, as instructed by the Data Protection Act 2018 (DPA). The final regulations were published on 27 January 2020 and took effect 2 September 2020, with a one-year grace period before the beginning of enforcement. The Children's Code is written to be consistent with GDPR and the DPA, meaning that compliance with the Code is enforceable under the latter.
The following is an overview of laws and regulations that aim to protect consumers from microtransactions.
Consent-or-pay, also called pay-or-okay, is a compliance tactic used by certain companies, most notably Meta, to drive up the rates at which users consent to data processing under the European Union's General Data Protection Regulation (GDPR). It consists of presenting the user with a tracking consent notice, but only allowing a binary choice: either the user consents to the data processing, or they are required to pay to use the service, which is otherwise free to use if data processing is consented to. The tactic has been criticised by privacy advocates and non-governmental organisations such as NOYB and Wikimedia Europe, who claim that it is illegal under the GDPR. On 17 April 2024, the European Data Protection Board released a non-binding opinion stating that in most cases, consent-or-pay models do not constitute valid consent within the meaning of the GDPR.
Image captions from the article's illustrations:
- The offer is displayed on the screen, and below it a gray decline button and a green accept button.
- An optional field that can be skipped by leaving it blank.
- To proceed without entering the information, the user must find the tiny "Skip this step" link at the bottom right; the link is placed outside of the blue box which ostensibly contains all relevant info and controls.