Privacy by design

Privacy by design is an approach to systems engineering initially developed by Ann Cavoukian and formalized in a 1995 joint report on privacy-enhancing technologies by the Information and Privacy Commissioner of Ontario (Canada), the Dutch Data Protection Authority, and the Netherlands Organisation for Applied Scientific Research. [1] [2] The privacy by design framework was published in 2009 [3] and adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010. [4] Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The concept is an example of value sensitive design, i.e., taking human values into account in a well-defined manner throughout the process. [5] [6]

Cavoukian's approach to privacy has been criticized as vague, [7] difficult to enforce [8] and to apply to certain disciplines, [9] [10] challenging to scale up to networked infrastructures, [10] as well as for prioritizing corporate interests over consumers' interests [7] and placing insufficient emphasis on minimizing data collection. [9] Recent developments in computer science and data engineering, such as support for encoding privacy in data [11] and the availability and quality of privacy-enhancing technologies (PETs), partly offset these critiques and help to make the principles feasible in real-world settings.

The European Union's General Data Protection Regulation (GDPR) incorporates privacy by design. [12]

History and background

The privacy by design framework was developed by Ann Cavoukian, Information and Privacy Commissioner of Ontario, following her joint work with the Dutch Data Protection Authority and the Netherlands Organisation for Applied Scientific Research in 1995. [1] [12] In 2009, the Information and Privacy Commissioner of Ontario co-hosted an event, Privacy by Design: The Definitive Workshop, with the Israeli Law, Information and Technology Authority at the 31st International Conference of Data Protection and Privacy Commissioners (2009). [13] [14]

In 2010 the framework achieved international acceptance when the International Assembly of Privacy Commissioners and Data Protection Authorities unanimously passed a resolution on privacy by design [15] recognising it as an international standard at their annual conference. [14] [16] [17] [4] Among other commitments, the commissioners resolved to promote privacy by design as widely as possible and foster the incorporation of the principle into policy and legislation. [4]

Foundational principles

Privacy by design is based on seven "foundational principles": [3] [18] [19] [20]

  1. Proactive not reactive; preventive not remedial [3] [18] [19] [20]
  2. Privacy as the default setting [3] [18] [19] [20]
  3. Privacy embedded into design [3] [18] [19] [20]
  4. Full functionality – positive-sum, not zero-sum [3] [18] [19] [20]
  5. End-to-end security – full lifecycle protection [3] [18] [19] [20]
  6. Visibility and transparency – keep it open [3] [18] [19] [20]
  7. Respect for user privacy – keep it user-centric [3] [18] [19] [20]

The principles have been cited in over five hundred articles [21] referring to the Privacy by Design in Law, Policy and Practice white paper by Ann Cavoukian. [22]

Principles in detail

Proactive not reactive; preventive not remedial

The privacy by design approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy invasive events before they happen. Privacy by design does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred — it aims to prevent them from occurring. In short, privacy by design comes before-the-fact, not after. [18] [19] [20]

Privacy as the default setting

Privacy by design seeks to deliver the maximum degree of privacy by ensuring that personal data are automatically protected in any given IT system or business practice. If an individual does nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy — it is built into the system, by default. [18] [19] [20]
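
As an illustration only (the principle itself is technology-neutral), the following minimal Python sketch shows what "privacy as the default" can look like in application code. Every setting name here is hypothetical; the point is simply that the most privacy-protective value is the one a user gets without taking any action.

```python
from dataclasses import dataclass

@dataclass
class AccountPrivacySettings:
    """Hypothetical per-user settings: every field defaults to the most
    privacy-protective value, so a user who does nothing stays protected."""
    share_profile_publicly: bool = False    # sharing is opt-in, never opt-out
    allow_analytics_tracking: bool = False
    allow_third_party_sharing: bool = False
    retain_history_days: int = 0            # no history kept unless requested

# A newly created account is private by default; any data sharing
# requires an explicit, affirmative action by the user.
settings = AccountPrivacySettings()
assert not settings.share_profile_publicly
```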

PbD practices
  • Purpose Specification - The purposes for which personal data are collected, used, retained, and disclosed must be communicated to the data subject at or before the time of collection, and those purposes must be limited and relevant to the stated needs. [18]
  • Collection Limitation - Collection of data must be fair, lawful, and limited to the stated purpose. [18]
  • Data Minimization - The collection of personal data should be kept to the minimum necessary, and technologies should default to the non-identifiability and non-observability of individuals; where identifiability is strictly necessary, it should be minimized. [18]
  • Use, Retention, and Disclosure - The use, retention, and disclosure of personal data must be limited to the purposes to which the individual has consented, except where otherwise required by law. Information should be retained only for the stated amount of time needed and then securely erased (a minimal code sketch of these practices follows this list). [18]
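
The sketch below is a minimal, hypothetical illustration of the practices above in Python: data is accepted only for declared purposes (purpose specification), only the fields needed for that purpose are stored (data minimization), and records are dropped once their declared retention period has passed. The purposes, field names, and retention periods are the editor's assumptions, not values prescribed by the framework.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical declarations of purposes, the fields each purpose may use,
# and how long data for each purpose may be retained.
ALLOWED_PURPOSES = {"order_fulfilment": ["name", "shipping_address"],
                    "billing": ["name", "payment_token"]}
RETENTION = {"order_fulfilment": timedelta(days=90),
             "billing": timedelta(days=365)}

@dataclass
class Record:
    purpose: str
    data: dict
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def collect(purpose: str, submitted: dict) -> Record:
    if purpose not in ALLOWED_PURPOSES:
        raise ValueError(f"No declared purpose: {purpose}")   # purpose specification
    allowed = ALLOWED_PURPOSES[purpose]
    minimized = {k: v for k, v in submitted.items() if k in allowed}  # data minimization
    return Record(purpose, minimized)

def purge_expired(records: list[Record]) -> list[Record]:
    """Retention limitation: drop records older than their purpose allows."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r.collected_at < RETENTION[r.purpose]]
```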

Privacy embedded into design

Privacy by design is embedded into the design and architecture of IT systems as well as business practices. It is not bolted on as an add-on, after the fact. The result is that privacy becomes an essential component of the core functionality being delivered. Privacy is integral to the system without diminishing functionality. [18] [19] [20]

Full functionality – positive-sum, not zero-sum

Privacy by design seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. Privacy by design avoids the pretense of false dichotomies, such as privacy versus security, demonstrating that it is possible to have both. [18] [19] [20]

End-to-end security – full lifecycle protection

Privacy by design, having been embedded into the system prior to the first element of information being collected, extends securely throughout the entire lifecycle of the data involved — strong security measures are essential to privacy, from start to finish. This ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion. Thus, privacy by design ensures cradle-to-grave, secure lifecycle management of information, end-to-end. [18] [19] [20]
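
One way (among many) to make full-lifecycle protection concrete is cryptographic erasure: each record is encrypted under its own key at collection, and destroying the key at end of life renders the remaining ciphertext unrecoverable, including in backups. The sketch below is an editor's illustration, not a technique prescribed by the principle, and assumes the third-party Python `cryptography` package.

```python
# Illustrative only: cryptographic erasure ("crypto-shredding") as one way to
# give data full lifecycle protection. Requires: pip install cryptography
from cryptography.fernet import Fernet

class RecordStore:
    def __init__(self):
        self._ciphertexts = {}   # record_id -> encrypted bytes
        self._keys = {}          # record_id -> per-record key

    def store(self, record_id: str, plaintext: bytes) -> None:
        key = Fernet.generate_key()          # one key per record
        self._keys[record_id] = key
        self._ciphertexts[record_id] = Fernet(key).encrypt(plaintext)

    def read(self, record_id: str) -> bytes:
        return Fernet(self._keys[record_id]).decrypt(self._ciphertexts[record_id])

    def destroy(self, record_id: str) -> None:
        # Retiring the key renders the remaining ciphertext useless,
        # even in backups that still contain it.
        del self._keys[record_id]

store = RecordStore()
store.store("user-42", b"date of birth: 1990-01-01")
print(store.read("user-42"))
store.destroy("user-42")   # end of lifecycle: data is no longer recoverable
```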

Visibility and transparency – keep it open

Privacy by design seeks to assure all stakeholders that whatever business practice or technology involved is in fact operating according to the stated promises and objectives, subject to independent verification. The component parts and operations remain visible and transparent, to users and providers alike. Remember to trust but verify. [18] [19] [20]

Respect for user privacy – keep it user-centric

Above all, privacy by design requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options. Keep it user-centric. [18] [19] [20]

Design and standards

The International Organization for Standardization (ISO) approved the Committee on Consumer Policy (COPOLCO) proposal for a new ISO standard, Consumer Protection: Privacy by Design for Consumer Goods and Services (ISO/PC317). [23] The standard aims to specify the design process for consumer goods and services that meet consumers' domestic processing privacy needs as well as the personal privacy requirements of data protection. The United Kingdom holds the secretariat, with thirteen participating members and twenty observing members. [24]

The Standards Council of Canada (SCC) is one of the participating members and has established a mirror Canadian committee to ISO/PC317. [25]

The OASIS Privacy by Design Documentation for Software Engineers (PbD-SE) [26] Technical Committee provides a specification to operationalize privacy by design in the context of software engineering. Privacy by design, like security by design, is a normal part of the software development process and a risk-reduction strategy for software engineers. The PbD-SE specification translates the PbD principles into conformance requirements within software engineering tasks and helps software development teams produce artifacts as evidence of adherence to the principles. Following the specification facilitates the documentation of privacy requirements from software conception to retirement, thereby providing a plan for adherence to privacy by design principles and to other privacy guidance, such as NIST SP 800-53 Appendix J and the Fair Information Practice Principles (FIPPs) (PMRM-1.0). [26]

Relationship to privacy-enhancing technologies

Privacy by design originated from privacy-enhancing technologies (PETs) in a joint 1995 report by Ann Cavoukian and John Borking. [1] In 2007 the European Commission provided a memo on PETs. [27] In 2008 the British Information Commissioner's Office commissioned a report titled Privacy by Design – An Overview of Privacy Enhancing Technologies. [28]

Privacy by design has many facets: there is the technical side, such as software and systems engineering, [29] as well as administrative elements (e.g. legal, policy, and procedural), other organizational controls, and operating contexts. Privacy by design evolved from early efforts to express fair information practice principles directly in the design and operation of information and communications technologies. [30] In his publication Privacy by Design: Delivering the Promises, [2] Peter Hustinx acknowledges the key role played by Ann Cavoukian and John Borking, then Deputy Privacy Commissioners, in the joint 1995 publication Privacy-Enhancing Technologies: The Path to Anonymity. [1] This 1995 report focused on exploring technologies that permit transactions to be conducted anonymously.

Privacy-enhancing technologies allow online users to protect the privacy of their Personally Identifiable Information (PII) provided to and handled by services or applications. Privacy by design evolved to consider the broader systems and processes in which PETs were embedded and operated. The U.S. Center for Democracy & Technology (CDT) in The Role of Privacy by Design in Protecting Consumer Privacy [31] distinguishes PET from privacy by design noting that “PETs are most useful for users who already understand online privacy risks. They are essential user empowerment tools, but they form only a single piece of a broader framework that should be considered when discussing how technology can be used in the service of protecting privacy.” [31]

Global usage

Germany enacted a statutory provision on the topic (§ 3 Sec. 4 of the Teledienstedatenschutzgesetz [Teleservices Data Protection Act]) as early as July 1997. [32] The EU General Data Protection Regulation (GDPR) includes 'data protection by design' and 'data protection by default', [33] [34] [12] the latter corresponding to the second foundational principle of privacy by design. Canada's Privacy Commissioner included privacy by design in its report Privacy, Trust and Innovation – Building Canada's Digital Advantage. [35] [36] In 2012, the U.S. Federal Trade Commission (FTC) recognized privacy by design as one of its three recommended practices for protecting online privacy in its report Protecting Consumer Privacy in an Era of Rapid Change, [37] and included privacy by design as one of the key pillars of its final Commission report on protecting consumer privacy. [38] In Australia, the Commissioner for Privacy and Data Protection for the State of Victoria (CPDP) has formally adopted privacy by design as a core policy to underpin information privacy management in the Victorian public sector. [39] The UK Information Commissioner's Office website highlights privacy by design [40] and data protection by design and default. [41] In October 2014, the Mauritius Declaration on the Internet of Things, made at the 36th International Conference of Data Protection and Privacy Commissioners, included privacy by design and default. [42] The Privacy Commissioner for Personal Data, Hong Kong, held an educational conference on the importance of privacy by design. [43] [44]

In the private sector, Sidewalk Toronto commits to privacy by design principles; [45] Brendon Lynch, Chief Privacy Officer at Microsoft, wrote an article called Privacy by Design at Microsoft; [46] and Deloitte relates certifiable trustworthiness to privacy by design. [47]

Criticism and recommendations

The privacy by design framework has attracted academic debate, particularly following the 2010 International Data Commissioners resolution, with legal and engineering experts offering criticism and suggestions for how to apply the framework in various contexts. [7] [9] [8]

Privacy by design has been critiqued as "vague" [7] and as leaving "many open questions about their application when engineering systems". [9] Suggestions have been made to instead start with, and focus on, minimizing data, which can be done through security engineering. [9]

In 2007, researchers at K.U. Leuven published Engineering Privacy by Design, noting that “The design and implementation of privacy requirements in systems is a difficult problem and requires translation of complex social, legal and ethical concerns into systems requirements”. The principles of privacy by design "remain vague and leave many open questions about their application when engineering systems". The authors argue that "starting from data minimization is a necessary and foundational first step to engineer systems in line with the principles of privacy by design". The objective of their paper is to provide an "initial inquiry into the practice of privacy by design from an engineering perspective in order to contribute to the closing of the gap between policymakers’ and engineers’ understanding of privacy by design." [9] However, extended peer consultations performed ten years later in an EU project confirmed persistent difficulties in translating legal principles into engineering requirements. This is partly a structural problem: legal principles are abstract and open-ended, admitting different interpretations and exceptions, whereas engineering practice requires unambiguous meanings and formal definitions of design concepts. [10]

In 2011, the Danish National IT and Telecom Agency published a discussion paper arguing that privacy by design is a key goal for creating digital security models, extending the concept to "security by design". The objective is to balance anonymity and surveillance by eliminating identification as much as possible. [48]

Another criticism is that current definitions of privacy by design do not address the methodological side of systems engineering, such as the use of sound systems engineering methods, e.g. those that cover the complete system and data life cycle. [7] The problem is further exacerbated by the move to networked digital infrastructure initiatives such as smart cities or the Internet of Things. Whereas privacy by design has mainly focused on the responsibilities of individual organisations for a given technology, these initiatives often require the interoperability of many different technologies operated by different organisations. This requires a shift from organisational to infrastructural design. [10]

The concept of privacy by design also focuses not on the role of the actual data holder but on that of the system designer. This role is not known in privacy law, so the concept of privacy by design is not grounded in law. This, in turn, undermines trust by data subjects, data holders and policy-makers. [7] Questions have also been raised in science and technology studies about whether privacy by design will change the meaning and practice of rights through its implementation in technologies, organizations, standards and infrastructures. [49] From a civil society perspective, some have raised the possibility that poor use of these design-based approaches can lead to the danger of "bluewashing": the minimal, instrumental use of privacy design by organizations, without adequate checks, in order to portray themselves as more privacy-friendly than is factually justified. [10]

It has also been pointed out that privacy by design is similar to voluntary compliance schemes in industries affecting the environment, and thus lacks the teeth necessary to be effective and may differ from company to company. In addition, the evolutionary approach currently taken to the development of the concept will come at the cost of privacy infringements, because evolution also implies letting unfit phenotypes (privacy-invading products) live until they are proven unfit. [7] Some critics have pointed out that certain business models are built around customer surveillance and data manipulation, and that voluntary compliance is therefore unlikely. [8]

In 2013, Rubinstein and Good used Google and Facebook privacy incidents to conduct a counterfactual analysis in order to identify lessons of value for regulators when recommending privacy by design. The first was that “more detailed principles and specific examples” would be more helpful to companies. The second was that “usability is just as important as engineering principles and practices”. The third was that there needs to be more work on “refining and elaborating on design principles – both in privacy engineering and usability design”, including efforts to define international privacy standards. The final lesson was that “regulators must do more than merely recommend the adoption and implementation of privacy by design.” [8]

The advent of the GDPR, with its maximum fine of 4% of global turnover, now weighs the business benefit of non-compliance against a penalty tied to turnover, addressing the voluntary-compliance criticism and Rubinstein and Good's requirement that “regulators must do more than merely recommend the adoption and implementation of privacy by design”. [8] Rubinstein and Good also showed that privacy by design could result in applications that better exemplified its principles, and their work was well received. [50] [8]

In May 2018, European Data Protection Supervisor Giovanni Buttarelli's paper Preliminary Opinion on Privacy by Design stated: "While privacy by design has made significant progress in legal, technological and conceptual development, it is still far from unfolding its full potential for the protection of the fundamental rights of individuals. The following sections of this opinion provide an overview of relevant developments and recommend further efforts". [12]

The opinion's executive summary makes recommendations to EU institutions and sets out actions that the EDPS itself will take. [12]

Implementing privacy by design

The European Data Protection Supervisor Giovanni Buttarelli set out the requirement to implement privacy by design in his article. [51] The European Union Agency for Network and Information Security (ENISA) provided a detailed report on implementation, Privacy and Data Protection by Design – From Policy to Engineering. [52] The Summer School on real-world crypto and privacy provided a tutorial on "Engineering Privacy by Design". [53] The OWASP Top 10 Privacy Risks Project for web applications gives hints on how to implement privacy by design in practice. The OASIS Privacy by Design Documentation for Software Engineers (PbD-SE) [26] offers a privacy extension/complement to OMG's Unified Modeling Language (UML) and serves as a complement to OASIS' eXtensible Access Control Markup Language (XACML) and Privacy Management Reference Model (PMRM). Privacy by design guidelines have been developed to operationalise some of the high-level privacy-preserving ideas into more granular, actionable advice, [54] [55] such as recommendations on how to implement privacy by design in existing (data) systems. However, the application of privacy by design guidelines by software developers remains a challenge. [56]
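
As a small example of the kind of granular, actionable advice such guidelines aim at, the sketch below pseudonymizes direct identifiers with a keyed hash before they enter logs or analytics, keeping raw PII out of secondary systems. This specific technique and all names in the code are the editor's illustration under stated assumptions, not something taken from the cited guidelines.

```python
import hashlib
import hmac
import os

# Hypothetical illustration: replace direct identifiers with keyed pseudonyms
# before data leaves the primary system (e.g. for logging or analytics).
# The key must be stored separately from the pseudonymized data.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "change-me").encode()

def pseudonymize(identifier: str) -> str:
    """Return a stable, keyed pseudonym for an identifier."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def log_event(user_email: str, action: str) -> None:
    # The analytics/log record never contains the raw email address.
    print(f"user={pseudonymize(user_email)} action={action}")

log_event("alice@example.com", "login")
```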


References

  1. Hes, R. "Privacy Enhancing Technologies: the path to anonymity" (PDF).
  2. Hustinx, Peter (2010). "Privacy by Design: Delivering the Promises". Identity in the Information Society. 3 (2): 253–255. doi:10.1007/s12394-010-0061-z.
  3. Cavoukian, Ann. "7 Foundational Principles" (PDF).
  4. "32nd International Conference of Data Protection and Privacy Commissioners, Jerusalem, Israel, 27–29 October 2010: Resolution on Privacy by Design" (PDF).
  5. Xu, Heng; Crossler, Robert E.; Bélanger, France (2012-12-01). "A Value Sensitive Design Investigation of Privacy Enhancing Tools in Web Browsers". Decision Support Systems. 54 (1): 424–433. doi:10.1016/j.dss.2012.06.003. ISSN   0167-9236. S2CID   14780230.
  6. Cavoukian, Ann (2011). "Privacy by Design" (PDF). Information and Privacy Commissioner.
  7. van Rest, Jeroen (2014). "Designing Privacy-by-Design". Privacy Technologies and Policy. Lecture Notes in Computer Science. Vol. 8319. pp. 55–72. doi:10.1007/978-3-642-54069-1_4. ISBN 978-3-642-54068-4.
  8. Rubinstein, Ira; Good, Nathan (2012-08-11). "Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents". SSRN 2128146.
  9. Gurses, Seda; Troncoso, Carmela; Diaz, Claudia. "Engineering Privacy by Design" (PDF).
  10. van Dijk, Niels; Tanas, Alessia; Rommetveit, Kjetil; Raab, Charles (2018-04-10). "Right engineering? The redesign of privacy and personal data protection". International Review of Law, Computers & Technology. 32 (2): 230–256. doi:10.1080/13600869.2018.1457002. hdl:20.500.11820/fc11577d-3520-4ae4-abfd-3d767aeac906. S2CID 65276552.
  11. "Toward Privacy by Design for Data" (PDF). IEEE Data Engineering Bulletin, Special issue on the system implications of GDPR. Retrieved 2022-07-29.
  12. Buttarelli, Giovanni. "Preliminary Opinion on privacy by design" (PDF).
  13. "Privacy Conference 2009 Fifth Plenary Session – Privacy by Design".
  14. "Report on the State of PbD to the 33rd International Conference of Data Protection and Privacy Commissioners" (PDF).
  15. Cavoukian, Ann (2010). "Privacy by Design: the definitive workshop. A foreword by Ann Cavoukian, Ph.D" (PDF). Identity in the Information Society. 3 (2): 247–251. doi: 10.1007/s12394-010-0062-y . S2CID   144133793.
  16. "'Privacy by Design' approach gains international recognition". 2010-11-04.
  17. "Landmark Resolution passed to preserve the Future of Privacy". Archived from the original on 2010-11-08.
  18. Cavoukian, Ann (January 2011). "The 7 Foundational Principles Implementation and Mapping of Fair Information Practices" (PDF). Information and Privacy Commissioner of Ontario. Archived from the original (PDF) on 2022-10-20.
  19. Cavoukian, Ann. "Privacy by Design – Primer" (PDF).
  20. Cavoukian, Ann. "Privacy by Design – The 7 Foundational Principles" (PDF). Privacy and Big Data Institute.
  21. "Citations for Privacy by Design in Law, Policy and Practice". Google Scholar.
  22. Cavoukian, Ann. "Privacy by Design in Law, Policy and Practice – A White Paper for Regulators, Decision-makers and Policy-makers" (PDF).
  23. "ISO/PC 317 - Consumer protection: privacy by design for consumer goods and services". 2018-05-11.
  24. "ISO/PC 317 - Participating Members".
  25. "SCC ISO/PC 317 - Consumer protection: privacy by design for consumer goods and services". 2018-04-09.
  26. "OASIS Privacy by Design Documentation for Software Engineers (PbD-SE) TC".
  27. "Privacy Enhancing Technologies (PETs)".
  28. "Privacy by Design – An Overview of Privacy Enhancing Technologies" (PDF).
  29. Danezis, George; Domingo-Ferrer, Josep; Hansen, Marit; Hoepman, Jaap-Henk; Le Metayer, Daniel; Tirtea, Rodica; Schiffner, Stefan (2015). "Privacy and Data Protection by Design – From Policy to Engineering". ENISA. arXiv:1501.03726. doi:10.2824/38623. ISBN 9789292041083. S2CID 7917275.
  30. Cavoukian, Ann. "Privacy by Design: Origins, Meaning, and Prospects for Assuring Privacy and Trust in the Information Era".
  31. "The Role of Privacy by Design in Protecting Consumer Privacy".
  32. "Bundesgesetzblatt".
  33. "Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)". European Commissioner (January 2012).
  34. "European Commission - Fact Sheet Questions and Answers – General Data Protection Regulation".
  35. "Privacy, Trust and Innovation – Building Canada's Digital Advantage". 2010.
  36. "Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. Report of the Standing Committee on Access to Information, Privacy and Ethics" (PDF).
  37. "Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for businesses and policy-makers" (PDF). FTC Report (March 2012).
  38. "FTC Issues Final Commission Report on Protecting Consumer Privacy". 2012-03-26.
  39. "Office of the Victorian Information Commissioner - Privacy by Design".
  40. "UK ICO - Privacy by Design". Archived from the original on 2018-05-24.
  41. "UK ICO - Data protection by design and default". 2018-11-23.
  42. "Mauritius Declaration on the Internet of Things" (PDF).
  43. "About the Privacy by Design Conference".
  44. "Privacy Commissioner for Personal Data – Privacy by Design".
  45. "Sidewalk Toronto commits to privacy by design principles amid citizen concerns". 2018-05-07.
  46. "Privacy by Design at Microsoft". 2010-11-30.
  47. "Ryerson, Deloitte partner to offer privacy certifications".
  48. "New Digital Security Models" (PDF). Danish National It and Telecom Agency.
  49. Rommetveit, Kjetil; Van Dijk, Niels (2022). "Privacy Engineering and the Techno-regulatory Imaginary". Social Studies of Science. 52 (Online first): 853–877. doi:10.1177/03063127221119424. PMC   9676411 . PMID   36000578. S2CID   251767267.
  50. "Why 'Privacy By Design' Is The New Corporate Hotness". Kashmir Hill.
  51. "Privacy by Design - Privacy Engineering" (PDF). Giovanni Buttarelli.
  52. "Privacy and Data Protection by Design – from policy to engineering". ENISA.
  53. "Engineering privacy by design" (PDF).
  54. Perera, Charith; Barhamgi, Mahmoud; Bandara, Arosha K.; Ajmal, Muhammad; Price, Blaine; Nuseibeh, Bashar (February 2020). "Designing privacy-aware internet of things applications". Information Sciences. 512: 238–257. arXiv: 1703.03892 . doi:10.1016/j.ins.2019.09.061. S2CID   60044.
  55. "Implementing Privacy By Design". Privacy Policies. Retrieved 2020-12-13.
  56. Tahaei, Mohammad; Li, Tianshi; Vaniea, Kami (2022-04-01). "Understanding Privacy-Related Advice on Stack Overflow". Proceedings on Privacy Enhancing Technologies. 2022 (2): 114–131. doi: 10.2478/popets-2022-0038 . hdl: 1983/07daf61b-7df0-48c1-88e1-85079294a59c .