Dark pattern

A dark pattern (also known as a "deceptive design pattern") is "a user interface that has been carefully crafted to trick users into doing things, such as buying overpriced insurance with their purchase or signing up for recurring bills". [1] [2] [3] User experience designer Harry Brignull coined the neologism on 28 July 2010 with the registration of darkpatterns.org, a "pattern library with the specific goal of naming and shaming deceptive user interfaces". [4] [5] [6]

In 2021 the Electronic Frontier Foundation and Consumer Reports created a tip line to collect information about dark patterns from the public. [7]

Patterns

Privacy Zuckering

"Privacy Zuckering" – named after Facebook co-founder and Meta Platforms CEO Mark Zuckerberg – is a practice that tricks the user into sharing more information than they intended to. [8] Users may give up this information unknowingly or through practices that obscure or delay the option to opt out of sharing their private information.

California has approved regulations that limit this practice by businesses in the California Consumer Privacy Act. [9]

Bait-and-switch

Bait-and-switch patterns advertise a product or service that is free or greatly discounted but is in fact unavailable or stocked in only small quantities. After announcing that the product is unavailable, the page presents similar products at higher prices or of lesser quality. [10] [11]

Confirmshaming

Confirmshaming uses shame to drive users to act, such as when websites word an option to decline an email newsletter in a way that shames visitors into accepting. [11] [12]

Misdirection

Common in software installers, misdirection presents the user with a button styled like a typical continuation button. A dark pattern would show a prominent "I accept these terms" button that asks the user to accept the terms of a program unrelated to the one they are trying to install. [13] Because users typically accept terms by force of habit, the unrelated program can then be installed. The installer's authors do this because the authors of the unrelated program pay for each installation they procure. The alternative route through the installer, which allows the user to skip the unrelated program, is displayed much less prominently, [14] or seems counter-intuitive (such as declining the terms of service).

Some websites that ask for information that is not required also use misdirection. For example, one would fill out a username and password on one page, and after clicking the "next" button, the page asks the user for their email address with another "next" button as the only option. [15] This hides the option to press "next" without entering the information. In some cases, the page shows the method to skip the step as a small, greyed-out link instead of a button, so it does not stand out to the user. [16] Other examples include sites offering a way to invite friends by entering their email address, to upload a profile picture, or to identify interests.
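As a purely illustrative sketch of the pattern just described (the function name and messages are invented, not taken from any real site), the key trick is that the backend accepts a blank value even though the interface implies the field is required:

```python
# Hypothetical sketch of the misdirection pattern described above: a signup
# step that looks mandatory in the interface even though the backend accepts
# a blank value. All names and messages here are invented for illustration.

def email_step(submitted_email: str) -> str:
    """Server-side handling of the 'enter your email' step."""
    # The server quietly accepts an empty field ...
    if submitted_email.strip() == "":
        return "account created without email"
    return "account created with email"

# ... but the page shows only a prominent "Next" button, with no visible
# "skip" option, so most users assume the field is required and fill it in.
print(email_step(""))                  # account created without email
print(email_step("user@example.com"))  # account created with email
```

The deception lies entirely in the interface layer: the hidden or greyed-out "skip" affordance, not the server logic, is what steers users into handing over the data.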

Confusing wording may also be used to trick users into formally accepting an option which they believe has the opposite meaning. For example, a consent button for personal data processing may use a double negative such as "Don't not sell my personal information". [17]
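To see why double-negative wording is disorienting, consider a minimal sketch (the label, function, and logic are invented for illustration, not modelled on any real consent dialog) in which each added negation flips what a checked box means:

```python
# Hypothetical illustration of double-negative consent wording. Nothing here
# models a real site's logic; it only shows how stacked negations flip the
# effect of ticking the box.

NEGATIONS = {"don't", "not", "never"}

def opted_out_of_selling(label: str, checked: bool) -> bool:
    """Return True if the user ends up opted OUT of data selling."""
    flips = sum(word in NEGATIONS for word in label.lower().split())
    # With an odd number of negations ("Do not sell ..."), checking the box
    # opts the user out. An even count ("Don't not sell ...") cancels out,
    # so the checked box actually permits selling.
    return checked if flips % 2 == 1 else not checked

# Single negative: ticking the box opts the user out, as expected.
assert opted_out_of_selling("Do not sell my personal information", True)
# Double negative: the negations cancel, so the ticked box permits selling.
assert not opted_out_of_selling("Don't not sell my personal information", True)
```

Users who parse the label quickly tend to treat any negated label as "checking this protects me", which is exactly the habit the double negative exploits.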

Roach motel

A roach motel or a trammel net design provides an easy or straightforward path to get in but a difficult path to get out. [18] Examples include businesses that require subscribers to print and mail their opt-out or cancellation request. [10] [11]

For example, during the 2020 United States presidential election, Donald Trump's WinRed campaign employed this pattern, pushing users towards committing to a recurring monthly donation. [19]

Another common version of this pattern is a service that lets users sign up and start the service online but requires a phone call (often with long wait times) to cancel. Examples include cable TV, internet service, and credit monitoring. [citation needed]

In 2021, the United States Federal Trade Commission (FTC) announced that it would ramp up enforcement against dark patterns, such as the roach motel, that trick consumers into signing up for subscriptions or make cancellation difficult. The FTC stated key requirements concerning information transparency and clarity, express informed consent, and simple, easy cancellation. [20]

Research

In 2016 and 2017, research documented social media anti-privacy practices that use dark patterns. [21] [22] In 2018 the Norwegian Consumer Council (Forbrukerrådet) published "Deceived by Design", a report on deceptive user interface designs used by Facebook, Google and Microsoft. [23] A 2019 study investigated practices on 11,000 shopping websites, identifying 1,818 instances of dark patterns and grouping them into 15 categories. [24]

Research from April 2022 found that dark patterns are still commonly used in the marketplace, highlighting a need for further scrutiny of such practices by the public, researchers and regulators. [25]

Under the European Union General Data Protection Regulation (GDPR), all companies must obtain unambiguous, freely-given consent from customers before they collect and use ("process") their personally identifiable information. A 2020 study found that "big tech" companies often used deceptive user interfaces in order to discourage their users from opting out. [26] In 2022 a report by the European Commission found that "97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern." [27]

Research on advertising network documentation shows that the information presented to mobile app developers on these platforms focuses on complying with legal regulations and places responsibility for such decisions on the developer. Sample code and settings also often default to privacy-unfriendly options, nudging developers towards choices such as sharing sensitive data to increase revenue. [28]

Legality

United States

Bait-and-switch is a form of fraud that violates US law. [29]

On 9 April 2019, US senators Deb Fischer and Mark Warner introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act, which would make it illegal for companies with more than 100 million monthly active users to use dark patterns when seeking consent to use their personal information. [30]

In March 2021, California adopted amendments to the California Consumer Privacy Act, which prohibits the use of deceptive user interfaces that have "the substantial effect of subverting or impairing a consumer's choice to opt-out." [17]

In October 2021, the Federal Trade Commission issued an enforcement policy statement announcing a crackdown on businesses using dark patterns that "trick or trap consumers into subscription services." The move responds to a rising number of consumer complaints. [20]

In 2022, New York Attorney General Letitia James fined Fareportal $2.6 million for using deceptive marketing tactics to sell airline tickets and hotel rooms [31] and the Federal Court of Australia fined Expedia Group's Trivago A$44.7 million for misleading consumers into paying higher prices for hotel room bookings. [32]

In March 2023, the United States Federal Trade Commission fined Fortnite developer Epic Games $245 million for use of "dark patterns to trick users into making purchases." The $245 million will be used to refund affected customers and is the largest refund amount ever issued by the FTC in a gaming case. [33]

European Union

In the European Union, the GDPR requires that a user's informed consent to processing of their personal information be unambiguous, freely-given, and specific to each usage of personal information. This is intended to prevent attempts to have users unknowingly accept all data processing by default (which violates the regulation). [34] [35] [36] [37] [38]

According to the European Data Protection Board, the "principle of fair processing laid down in Article 5 (1) (a) GDPR serves as a starting point to assess whether a design pattern actually constitutes a 'dark pattern'." [39]

At the end of 2023 the final version of the Data Act [40] was adopted. It is one of three EU legislative acts that deal expressly with dark patterns. [41] The second is the Digital Services Act, [42] and the third is the directive on financial services contracts concluded at a distance. [43]

United Kingdom

In April 2019, the UK Information Commissioner's Office (ICO) issued a proposed "age-appropriate design code" for the operations of social networking services when used by minors, which prohibits using "nudges" to draw users into options that have low privacy settings. This code would be enforceable under the Data Protection Act 2018. [44] It took effect 2 September 2020. [45] [46]


References

  1. Campbell-Dollaghan, Kelsey (21 December 2016). "The Year Dark Patterns Won". CO.DESIGN. Retrieved 29 May 2017.
  2. Singer, Natasha (14 May 2016). "When Websites Won't Take No For An Answer". The New York Times. Retrieved 29 May 2017.
  3. Nield, David (4 April 2017). "Dark Patterns: The Ways Websites Trick Us Into Giving Up Our Privacy". Gizmodo. Retrieved 30 May 2017.
  4. Brignull, Harry (1 November 2011). "Dark Patterns: Deception vs. Honesty in UI Design". A List Apart. Retrieved 29 May 2017.
  5. Grauer, Yael (28 July 2016). "Dark Patterns Are Designed to Trick You, and They're All Over the Web". Ars Technica. Retrieved 29 May 2017.
  6. Fussell, Sidney (2 August 2019). "The Endless, Invisible Persuasion Tactics of the Internet". The Atlantic.
  7. "Coalition Launches 'Dark Patterns' Tip Line to Expose Deceptive Technology Design" (Press release). Electronic Frontier Foundation. 19 May 2021. Archived from the original on 19 May 2021. Retrieved 27 May 2021.
  8. "Dark Patterns - Types of Dark Pattern". www.darkpatterns.org. Retrieved 13 December 2021.
  9. "Attorney General Becerra Announces Approval of Additional Regulations That Empower Data Privacy Under the California Consumer Privacy Act". State of California - Department of Justice - Office of the Attorney General. 15 March 2021. Retrieved 13 December 2021.
  10. Snyder, Jesse (10 September 2012). "Dark Patterns in UI and Website Design". Envato Tuts+. Archived from the original on 26 December 2022. Retrieved 29 May 2017.
  11. Brignull, Harry. "Types of Dark Patterns". Dark Patterns. Retrieved 29 May 2017.
  12. "UX Dark Patterns: Manipulinks and Confirmshaming". UX Booth. Retrieved 2 November 2019.
  13. "Terms of service for McAffee in μTorrent installer". 2017. Retrieved 13 October 2018.
  14. Brinkmann, Martin (17 July 2013). "SourceForge's new Installer bundles program downloads with adware" . Retrieved 13 October 2018. ... The offer is displayed on the screen, and below that a gray decline button, a green accept button ...
  15. "Why do we need email addresses to create Reddit accounts now?". 2017. Retrieved 13 October 2018. ... you can skip it by leaving it blank.
  16. Schlosser, Dan (5 June 2016). "LinkedIn Dark Patterns" . Retrieved 13 October 2018. ... you need to find the tiny "Skip this step" link at the bottom right to proceed. Moreover, the link is placed outside of the blue box which ostensibly contains all relevant info or controls. ...
  17. Vincent, James (16 March 2021). "California bans 'dark patterns' that trick users into giving away their personal data". The Verge. Retrieved 21 March 2021.
  18. Brignull, Harry (29 August 2013). "Dark patterns: Inside the interfaces designed to trick you". The Verge. Retrieved 29 May 2017.
  19. Goldmacher, Shane (3 April 2021). "How Trump Steered Supporters Into Unwitting Donations". The New York Times . Archived from the original on 1 May 2021.
  20. "FTC to Ramp up Enforcement against Illegal Dark Patterns that Trick or Trap Consumers into Subscriptions". Federal Trade Commission. 28 October 2021. Retrieved 13 December 2021.
  21. Bösch, Christoph; Erb, Benjamin; Kargl, Frank; Kopp, Henning; Pfattheicher, Stefan (1 October 2016). "Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns". Proceedings on Privacy Enhancing Technologies. 2016 (4): 237–254. doi: 10.1515/popets-2016-0038 . ISSN   2299-0984.
  22. Fritsch, Lothar (2017). Privacy dark patterns in identity management. Gesellschaft für Informatik, Bonn. ISBN   978-3-88579-671-8.
  23. Moen, Gro Mette, Ailo Krogh Ravna, and Finn Myrstad: Deceived by Design - How tech companies use dark patterns to discourage us from exercising our rights to privacy. Archived 11 October 2020 at the Wayback Machine , 2018, Consumer council of Norway / Forbrukerrådet. Report.
  24. Mathur, Arunesh; Acar, Gunes; Friedman, Michael J.; Lucherini, Elena; Mayer, Jonathan; Chetty, Marshini; Narayanan, Arvind (November 2019). "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites". Proceedings of the ACM on Human-Computer Interaction. 3 (CSCW): 81:1–81:32. arXiv: 1907.07032 . Bibcode:2019arXiv190707032M. doi:10.1145/3359183. ISSN   2573-0142. S2CID   196831872.
  25. Runge, Julian; Wentzel, Daniel; Huh, Ji Young; Chaney, Allison (14 April 2022). ""Dark patterns" in online services: a motivating study and agenda for future research". Marketing Letters. 34: 155–160. doi: 10.1007/s11002-022-09629-4 . ISSN   1573-059X. S2CID   248198573.
  26. Human, Soheil; Cech, Florian (2021). "A Human-Centric Perspective on Digital Consenting: The Case of GAFAM" (PDF). In Zimmermann, Alfred; Howlett, Robert J.; Jain, Lakhmi C. (eds.). Human Centred Intelligent Systems. Smart Innovation, Systems and Technologies. Vol. 189. Singapore: Springer. pp. 139–159. doi:10.1007/978-981-15-5784-2_12. ISBN   978-981-15-5784-2. S2CID   214699040.
  27. European Commission. Directorate General for Justice and Consumers (2022). Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation : final report. LU: Publications Office. doi:10.2838/859030. ISBN   9789276523161.
  28. Tahaei, Mohammad; Vaniea, Kami (8 May 2021). "Developers Are Responsible": What Ad Networks Tell Developers About Privacy (PDF). pp. 1–11. doi:10.1145/3411763.3451805. ISBN   978-1-4503-8095-9. S2CID   233987185.
  29. Title 16 of the Code of Federal Regulations § 238
  30. Kelly, Makena (9 April 2019). "Big Tech's 'dark patterns' could be outlawed under new Senate bill". The Verge. Retrieved 10 April 2019.
  31. "Assurance of discontinuance" (PDF). March 2022.
  32. "Australia fines Expedia Group's Trivago $33 million on misleading hotel room rates". au.finance.yahoo.com. 22 April 2022. Retrieved 14 June 2022.
  33. "Fortnite Video Game Maker Epic Games to Pay More Than Half a Billion Dollars over FTC Allegations of Privacy Violations and Unwanted Charges". March 2023.
  34. "Understanding 'trust' and 'consent' are the real keys to embracing GDPR". The Drum. Retrieved 10 April 2019.
  35. "Facebook and Google hit with $8.8 billion in lawsuits on day one of GDPR". The Verge. Archived from the original on 25 May 2018. Retrieved 26 May 2018.
  36. "Max Schrems files first cases under GDPR against Facebook and Google". The Irish Times. Archived from the original on 25 May 2018. Retrieved 26 May 2018.
  37. "Facebook, Google face first GDPR complaints over 'forced consent'". TechCrunch. 25 May 2018. Archived from the original on 26 May 2018. Retrieved 26 May 2018.
  38. Meyer, David. "Google, Facebook hit with serious GDPR complaints: Others will be soon". ZDNet. Archived from the original on 28 May 2018. Retrieved 26 May 2018.
  39. "Guidelines 3/2022 on Dark patterns in social media platform interfaces: How to recognise and avoid them" (PDF). European Data Protection Board .
  40. Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules on fair access to and use of data and amending Regulation (EU) 2017/2394 and Directive (EU) 2020/1828 (Data Act), 13 December 2023, retrieved 10 January 2024
  41. Pál, Szilágyi (3 December 2023). "Consensus on the Data Act at the Council". Dark patterns, neuromarketing. Retrieved 10 January 2024.
  42. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act). OJ L 277, 27.10.2022, p. 1–102.
  43. Pál, Szilágyi (11 January 2024). "Dark patterns everywhere: ESMA". Dark patterns, neuromarketing. Retrieved 11 January 2024.
  44. "Under-18s face 'like' and 'streaks' limits". BBC News. 15 April 2019. Retrieved 15 April 2019.
  45. Lomas, Natasha (22 January 2020). "UK watchdog sets out 'age appropriate' design code for online services to keep kids' privacy safe". TechCrunch. Retrieved 9 April 2023.
  46. Lomas, Natasha (1 September 2021). "UK now expects compliance with children's privacy design code". TechCrunch. Retrieved 9 April 2023.