Personal data

Personal data, also known as personal information or personally identifiable information (PII), [1] [2] [3] is any information related to an identifiable person.

The abbreviation PII is widely used in the United States, but the phrase it abbreviates has four common variants based on personal or personally, and identifiable or identifying. Not all are equivalent, and for legal purposes the effective definitions vary depending on the jurisdiction and the purposes for which the term is being used. [lower-alpha 1] Under European Union and United Kingdom data protection regimes, which centre primarily on the General Data Protection Regulation (GDPR), [4] the term "personal data" is significantly broader, and determines the scope of the regulatory regime. [5]

National Institute of Standards and Technology Special Publication 800-122 [6] defines personally identifiable information as "any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual's identity, such as name, social security number, date and place of birth, mother's maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information." For instance, a user's IP address is not classed as PII on its own, but is classified as a linked PII. [7]
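The two-part NIST definition lends itself to a simple field-level reading. The following Python sketch is purely illustrative; the field names and category sets are assumptions chosen for the example, not lists taken from SP 800-122:

```python
# Illustrative tagging of record fields against the two NIST SP 800-122 categories.

# Category (1): information that can be used to distinguish or trace an individual's identity.
DISTINGUISHING = {
    "name", "social_security_number", "date_of_birth", "place_of_birth",
    "mothers_maiden_name", "biometric_record",
}

# Category (2): information that is linked or linkable to an individual.
LINKED_OR_LINKABLE = {
    "medical_history", "education_record", "salary", "employer",
    "ip_address",  # an IP address on its own is treated here as linked PII
}

def classify_field(field_name: str) -> str:
    """Return which SP 800-122 category a field falls under in this sketch."""
    if field_name in DISTINGUISHING:
        return "category (1): distinguishes or traces identity"
    if field_name in LINKED_OR_LINKABLE:
        return "category (2): linked or linkable information"
    return "not classified as PII in this sketch"

for field in ("social_security_number", "ip_address", "favorite_color"):
    print(f"{field}: {classify_field(field)}")
```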

Personal data is defined under the GDPR as "any information which [is] related to an identified or identifiable natural person". [8] [6] The IP address of an Internet subscriber may be classed as personal data. [9]

The concept of PII has become prevalent as information technology and the Internet have made it easier to collect PII, leading to a profitable market in collecting and reselling PII. PII can also be exploited by criminals to stalk or steal the identity of a person, or to aid in the planning of criminal acts. As a response to these threats, many website privacy policies specifically address the gathering of PII, [10] and lawmakers such as the European Parliament have enacted legislation such as the GDPR to limit the distribution and accessibility of PII. [11]

Important confusion arises around whether PII means information which is identifiable (that is, can be associated with a person) or identifying (that is, associated uniquely with a person, such that the PII identifies them). In prescriptive data privacy regimes such as the US federal Health Insurance Portability and Accountability Act (HIPAA), PII items have been specifically defined. In broader data protection regimes such as the GDPR, personal data is defined in a non-prescriptive principles-based way. Information that might not count as PII under HIPAA can be personal data for the purposes of GDPR. For this reason, "PII" is typically deprecated internationally.

Definitions

The U.S. government used the term "personally identifiable" in 2007 in a memorandum from the Executive Office of the President, Office of Management and Budget (OMB), [12] and that usage now appears in US standards such as the NIST Guide to Protecting the Confidentiality of Personally Identifiable Information (SP 800-122). [13] The OMB memorandum defines PII as follows:

Information which can be used to distinguish or trace an individual's identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother's maiden name, etc.

Office of Management and Budget, Memorandum M-07-16 [12]

A term similar to PII, "personal data", is defined in EU directive 95/46/EC, for the purposes of the directive: [14]

Article 2a: 'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;

In the EU rules, there has been a more specific notion that the data subject can potentially be identified through additional processing of other attributes—quasi- or pseudo-identifiers. In the GDPR, personal data is defined as:

Any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person [15]

A simple example of this distinction: the color name "red" by itself is not personal data, but that same value stored as part of a person's record as their "favorite color" is personal data; it is the connection to the person that makes it personal data, not (as in PII) the value itself.
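The point can be made concrete in a few lines of code. This is a minimal sketch, with invented names and a deliberately simplified test for identifiability:

```python
# A bare value is not personal data; the same value attached to an identifiable
# person's record is. All identifiers below are invented for illustration.

favorite_color = "red"  # a standalone value: not personal data on its own

subject_record = {
    "subject_id": "DS-1042",     # links the record to an identifiable natural person
    "name": "Jane Example",
    "favorite_color": "red",     # now relates to an identified person
}

def is_personal_data(value, record=None) -> bool:
    """In this sketch, a value is personal data only when it appears in a record
    that also identifies a natural person."""
    if record is None:
        return False
    identifies_person = "name" in record or "subject_id" in record
    return identifies_person and value in record.values()

print(is_personal_data(favorite_color))                  # False
print(is_personal_data(favorite_color, subject_record))  # True
```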

Another term similar to PII, "personal information", is defined in a section of the California data breach notification law, SB1386: [16]

(e) For purposes of this section, "personal information" means an individual's first name or first initial and last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted: (1) Social security number. (2) Driver's license number or California Identification Card number. (3) Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual's financial account. (f) For purposes of this section, "personal information" does not include publicly available information that is lawfully made available to the general public from federal, state, or local government records.

The concept of information combination given in the SB1386 definition is key to correctly distinguishing PII, as defined by OMB, from "personal information", as defined by SB1386. Information, such as a name, that lacks context is not SB1386 "personal information", but it is PII as defined by OMB. For example, the name "John Smith" has no meaning in the current context and is therefore not SB1386 "personal information", but it is PII. Likewise, a Social Security Number (SSN) without a name or some other associated identity or context information is not SB1386 "personal information" but is still PII: the SSN 078-05-1120 by itself is PII, but it is not SB1386 "personal information". The combination of a valid name with the correct SSN, however, is SB1386 "personal information". [16]
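The SB1386 combination rule can be expressed as a short predicate. The sketch below is a paraphrase of the statutory text quoted above, not legal advice; the record keys and encryption flags are assumptions made for the example:

```python
# SB1386 "personal information": first name or initial plus last name, combined with
# at least one listed data element, where either the name or the elements are unencrypted.

SB1386_DATA_ELEMENTS = {
    "social_security_number",
    "drivers_license_number",
    "california_id_card_number",
    "financial_account_with_access_code",
}

def is_sb1386_personal_information(record: dict) -> bool:
    has_name = bool(record.get("first_name_or_initial")) and bool(record.get("last_name"))
    has_element = any(record.get(element) for element in SB1386_DATA_ELEMENTS)
    either_unencrypted = (not record.get("name_encrypted", False)
                          or not record.get("elements_encrypted", False))
    return has_name and has_element and either_unencrypted

# An SSN alone is PII in the OMB sense, but not SB1386 "personal information":
print(is_sb1386_personal_information({"social_security_number": "078-05-1120"}))   # False

# A valid name combined with the SSN is SB1386 "personal information":
print(is_sb1386_personal_information({
    "first_name_or_initial": "J",
    "last_name": "Smith",
    "social_security_number": "078-05-1120",
}))                                                                                 # True
```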

The combination of a name with a context may also be considered PII; for example, if a person's name is on a list of patients for an HIV clinic. However, it is not necessary for the name to be combined with a context in order for it to be PII. The reason for this distinction is that bits of information such as names, although they may not be sufficient by themselves to make an identification, may later be combined with other information to identify persons and expose them to harm.

The scope of the term "sensitive personal data" varies by jurisdiction. In the UK, personal health data is treated as "sensitive" and in need of additional data protection measures. [17] According to the OMB, in the United States it is not always the case that PII is "sensitive", and context may be taken into account in deciding whether certain PII is or is not sensitive. [12] [ full citation needed ]

When a person wishes to remain anonymous, descriptions of them will often employ several of the above, such as "a 34-year-old white male who works at Target". Information can still be private, in the sense that a person may not wish for it to become publicly known, without being personally identifiable. Moreover, sometimes multiple pieces of information, none sufficient by itself to uniquely identify an individual, may uniquely identify a person when combined; this is one reason that multiple pieces of evidence are usually presented at criminal trials. It has been shown that, in 1990, 87% of the population of the United States could be uniquely identified by gender, ZIP code, and full date of birth. [18]
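The re-identification risk from combining quasi-identifiers can be illustrated by counting how many records in a dataset share a given (gender, ZIP code, date of birth) combination. The records below are invented for the sketch:

```python
# Count records that are uniquely identified by the quasi-identifier combination
# (gender, ZIP code, date of birth); unique combinations are re-identification risks.
from collections import Counter

records = [
    {"gender": "F", "zip": "15213", "dob": "1960-07-01"},
    {"gender": "M", "zip": "15213", "dob": "1985-03-12"},
    {"gender": "M", "zip": "15213", "dob": "1985-03-12"},  # shares its combination
    {"gender": "F", "zip": "02139", "dob": "1972-11-30"},
]

def combo(record):
    """Quasi-identifier tuple for a record."""
    return (record["gender"], record["zip"], record["dob"])

counts = Counter(combo(r) for r in records)
unique = sum(1 for r in records if counts[combo(r)] == 1)

print(f"{unique} of {len(records)} records are uniquely identified "
      f"by (gender, ZIP, date of birth)")   # 2 of 4 in this toy dataset
```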

In hacker and Internet slang, the practice of finding and releasing such information is called "doxing". [19] [20] It is sometimes used to deter collaboration with law enforcement. [21] On occasion, the doxing can trigger an arrest, particularly if law enforcement agencies suspect that the "doxed" individual may panic and disappear. [22]

Laws and standards

Australia

In Australia, the Privacy Act 1988 deals with the protection of individual privacy, using the OECD Privacy Principles from the 1980s to set up a broad, principles-based regulatory model (unlike in the US, where coverage is generally not based on broad principles but on specific technologies, business practices or data items). Section 6 has the relevant definition. [23] The critical detail is that the definition of 'personal information' also applies to where the individual can be indirectly identified:

"personal information" means information or an opinion about an identified individual, or an individual who is reasonably identifiable whether the information or opinion is true or not; and whether the information or opinion is recorded in a material form or not.

It appears that this definition is significantly broader than the Californian example given above, and thus that Australian privacy law may cover a broader category of data and information than in some US law.

In particular, online behavioral advertising businesses based in the US that surreptitiously collect information from people in other countries through cookies, bugs, trackers and the like, and that seek to avoid the implications of building a psychographic profile of a particular person by relying on the rubric of "we don't collect personal information", may find that this position does not hold up under a broader definition such as that in the Australian Privacy Act.

The term "PII" is not used in Australian privacy law.

Canada

European Union

European Union data protection law does not use the concept of personally identifiable information; its scope is instead determined by the non-synonymous, wider concept of "personal data".

Further examples can be found on the EU privacy website. [24]

Hong Kong

On 1 June 2023, the Hong Kong Office of the Privacy Commissioner for Personal Data published an investigation report on a data breach involving the unauthorised access of a credit reference database platform. The Report highlights the need for organizations to take adequate steps to protect personal data as the mere imposition of contractual obligations and policies is insufficient if such obligations and policies are not effective or are not enforced. The Report also clarifies that credit data is a form of "sensitive" personal data. [25]

United Kingdom

New Zealand

New Zealand enacted the Privacy Act 2020 to promote and protect individual privacy, replacing the Privacy Act 1993 and its twelve Information Privacy Principles. [28]

Switzerland

The Federal Act on Data Protection of 19 June 1992 (in force since 1993) established protection of privacy by prohibiting virtually any processing of personal data that is not expressly authorized by the data subjects. [29] The protection is subject to the authority of the Federal Data Protection and Information Commissioner. [29]

Additionally, any person may ask a company (managing data files) in writing to correct or delete any personal data concerning them. [30] The company must respond within thirty days. [30]

United States

The Privacy Act of 1974 (Pub.L. 93–579, 88 Stat. 1896, enacted 31 December 1974, 5 U.S.C. § 552a), a United States federal law, establishes a Code of Fair Information Practice that governs the collection, maintenance, use, and dissemination of personally identifiable information about individuals that is maintained in systems of records by federal agencies. [31]

One of the primary focuses of the Health Insurance Portability and Accountability Act (HIPAA) is to protect a patient's Protected Health Information (PHI), which is similar to PII. The U.S. Senate proposed the Privacy Act of 2005, which attempted to strictly limit the display, purchase, or sale of PII without the person's consent. Similarly, the (proposed) Anti-Phishing Act of 2005 attempted to prevent the acquiring of PII through phishing.

U.S. lawmakers have paid special attention to the social security number because it can be easily used to commit identity theft. The (proposed) Social Security Number Protection Act of 2005 and (proposed) Identity Theft Prevention Act of 2005 each sought to limit the distribution of an individual's social security number.

Additional U.S.-specific personally identifiable information [32] includes, but is not limited to, I-94 records, Medicaid ID numbers, and Internal Revenue Service (IRS) documentation. The existence of such U.S.-specific categories of personally identifiable information highlights national data security concerns [33] and the role of personally identifiable information in U.S. federal data management systems.

State laws and significant court rulings

  • California
    • The California state constitution declares privacy an inalienable right in Article 1, Section 1.
    • Online Privacy Protection Act (OPPA) of 2003
    • SB 1386 requires organizations to notify individuals when PII (in combination with one or more additional, specific data elements) is known or believed to be acquired by an unauthorized person.
    • In 2011, the California Supreme Court ruled that a person's ZIP code is personal identification information. [34]
  • Nevada
    • Nevada Revised Statutes 603A – Security of Personal Information [35]
  • Massachusetts
    • 201 CMR 17.00: Standards for the Protection of Personal Information of Residents of the Commonwealth [36]
    • In 2013, the Massachusetts Supreme Judicial Court ruled that ZIP codes are personal identification information. [37]

Federal law

NIST definition

The National Institute of Standards and Technology (NIST) is a physical sciences laboratory and a non-regulatory agency of the United States Department of Commerce. Its mission is to promote innovation and industrial competitiveness.

The following data, often used for the express purpose of distinguishing individual identity, clearly classify as personally identifiable information under the definition used by the NIST (described in detail below): [13]

The following are less often used to distinguish individual identity, because they are traits shared by many people. However, they are potentially PII, because they may be combined with other personal information to identify an individual.

Forensics

In forensics, particularly the identification and prosecution of criminals, personally identifiable information is critical in establishing evidence in criminal procedure. Criminals may go to great trouble to avoid leaving any PII,[ citation needed ] such as by:

Personal safety

Personal data is a key component of online identity and can be exploited by individuals. For instance, data can be altered and used to create fake documents, hijack mailboxes and phone calls, or harass people, as happened in 2019 to a customer of the UK mobile operator EE. [43]

Another category can be referred to as financial identity theft, [44] which usually entails bank account and credit card information being stolen, and then being used or sold. [45]

Personal data can also be used to create fake online identities, including fake accounts and profiles (sometimes referred to as identity cloning [46] or identity fraud) of celebrities, in order to gather data from other users more easily. [47] Ordinary individuals can also be impersonated, usually for personal ends (a practice more widely known as sockpuppetry).

The most critical information, such as one's password, date of birth, ID documents or social security number, can be used to log in to different websites (e.g. password reuse and account verification) to gather more information and access more content.

Also, several agencies ask for discretion on subjects related to their work, for the safety of their employees. For this reason, the United States Department of Defense (DoD) has strict policies controlling release of personally identifiable information of DoD personnel. [48] Many intelligence agencies have similar policies, sometimes to the point where employees do not disclose to their friends that they work for the agency.

Similar identity protection concerns exist for witness protection programs, women's shelters, and victims of domestic violence and other threats. [49]

Personal information removal

Personal information removal services work by identifying data brokers holding their clients' personal information and requesting that these brokers delete it. This process can be manual or fully automated, but it remains complex because it involves dealing with numerous data brokers, each with different policies and procedures for data removal. [50] [51] [52]
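As a rough illustration of why automating this is complex, the sketch below drafts removal requests against a list of brokers with differing requirements. The broker names, contact addresses, and required fields are hypothetical:

```python
# Draft one removal request per data broker, respecting each broker's (hypothetical)
# required fields; real services must follow each broker's own opt-out procedure.

BROKERS = [
    {"name": "ExampleBrokerA", "contact": "privacy@broker-a.example", "requires": ["full_name", "email"]},
    {"name": "ExampleBrokerB", "contact": "optout@broker-b.example", "requires": ["full_name", "postal_address"]},
]

def build_removal_requests(client: dict) -> list[str]:
    """Return a draft request per broker, or a note about missing information."""
    drafts = []
    for broker in BROKERS:
        missing = [field for field in broker["requires"] if field not in client]
        if missing:
            drafts.append(f"{broker['name']}: cannot file request yet, missing {missing}")
        else:
            details = ", ".join(f"{field}={client[field]}" for field in broker["requires"])
            drafts.append(f"To {broker['contact']}: please delete records matching {details}")
    return drafts

for draft in build_removal_requests({"full_name": "Jane Example", "email": "jane@example.test"}):
    print(draft)
```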

Companies offering personal information removal also face some issues. They struggle to ensure comprehensive data removal as new data brokers emerge and existing ones don’t always comply with removal requests. Most of them are also limited to certain regions or countries. [53]

Trade of personal data

During the second half of the 20th century, the digital revolution introduced "privacy economics", or the trade of personal data. The value of data can change over time and over different contexts. Disclosing data can reverse information asymmetry, though the costs of doing so can be unclear. In relation to companies, consumers often have "imperfect information regarding when their data is collected, with what purposes, and with what consequences". [54]

Writing in 2015, Alessandro Acquisti, Curtis Taylor and Liad Wagman identified three "waves" in the trade of personal data:

  1. In the 1970s, the Chicago School of economics claimed that protection of privacy could have a negative impact on the market because it could lead to incorrect and non-optimal decisions. Other researchers, such as Andrew F. Daughety and Jennifer F. Reinganum, suggested that the opposite was true, and that an absence of privacy would likewise lead to such inefficiencies. [55]
  2. In the mid-1990s, Varian revisited the Chicago School approach and added a new externality, stating that the consumer would not always have perfect information on how their own data would be used. [56] Kenneth C. Laudon developed a model in which individuals own their data and have the ability to sell it as a product. He believed that such a system should not be regulated, in order to create a free market. [57]
  3. In the 2000s, researchers worked on price discrimination (Taylor, 2004 [58] ), two-sided markets (Cornière, 2011 [59] ) and marketing strategies (Anderson and de Palma, 2012 [60] ). The theories became complex, and showed that the impact of privacy on the economy highly depended on the context.[ clarification needed ]

Data brokers

A data broker is an individual or company that specializes in collecting personal data (such as income, ethnicity, political beliefs, or geolocation data) or data about people, mostly from public records but sometimes sourced privately, and selling or licensing such information to third parties for a variety of uses. Sources, usually Internet-based since the 1990s, may include census and electoral roll records, social networking sites, court reports and purchase histories. The information from data brokers may be used in background checks by employers and housing providers.

Regulations limiting the collection of information on individuals vary around the world. In the United States there is no federal regulation protecting consumers from data brokers, although some states have begun enacting such laws individually. In the European Union, the GDPR serves to regulate data brokers' operations. Some data brokers claim to hold large numbers of "data attributes" covering much of the population; Acxiom claims to have data on 2.5 billion different people.

See also

Notes

  1. In other countries with privacy protection laws derived from the OECD privacy principles, the term used is more often "personal information", which may be somewhat broader: in Australia's Privacy Act 1988 (Cth) "personal information" also includes information from which the person's identity is "reasonably ascertainable", potentially covering some information not covered by PII.


References

  1. "Management of Data Breaches Involving Sensitive Personal Information (SPI)". VA.gov. Washington, DC: Department of Veterans Affairs. 6 January 2012. Archived from the original on 26 May 2015. Retrieved 25 May 2015.
  2. Stevens, Gina (10 April 2012). "Data Security Breach Notification Laws" (PDF). fas.org. Retrieved 8 June 2017.
  3. Greene, Sari Stern (2014). Security Program and Policies: Principles and Practices. Indianapolis, IN, US: Pearson IT Certification. p. 349. ISBN   978-0-7897-5167-6.
  4. Skiera, Bernd; Miller, Klaus; Jin, Yuxi; Kraft, Lennart; Laub, René; Schmitt, Julia (2022). The impact of the GDPR on the online advertising market. Frankfurt am Main. ISBN 978-3-9824173-0-1. OCLC 1303894344.
  5. Schwartz, Paul M; Solove, Daniel (2014). "Reconciling Personal Information in the United States and European Union". California Law Review. 102 (4). doi:10.15779/Z38Z814. S2CID   141313154.
  6. 1 2 "NIST Special Publication 800-122" (PDF). nist.gov.PD-icon.svg This article incorporates public domain material from the National Institute of Standards and Technology
  7. Section 3.3.3 "Identifiability"
  8. "Personal Data". General Data Protection Regulation (GDPR). Retrieved 23 October 2020.
  9. "European Court of Justice rules IP addresses are personal data". The Irish Times. 19 October 2016. Retrieved 10 March 2019.
  10. Nokhbeh, Razieh (2017). "A study of web privacy policies across industries". Journal of Information Privacy & Security. 13: 169–185.
  11. "Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)". European Data Consilium. 11 June 2015. Retrieved 3 April 2019.
  12. 1 2 M-07-16, Subject: Safeguarding Against and Responding to the Breach of Personally Identifiable Information, archived 8 February 2020 at the Wayback Machine, from Clay Johnson III, Deputy Director for Management (22 May 2007).
  13. 1 2 "Guide to Protecting the Confidentiality of Personally Identifiable Information (PII)" (PDF). NIST. Special Publication 800-122.
  14. "Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data". Eur-lex.europa.eu. Retrieved 20 August 2013.
  15. "What is personal data?". TrueVault.
  16. 1 2 "Text of California Senate Bill SB 1386 ref paragraph SEC. 2 1798.29.(e)". California.
  17. Information Commissioner's Office, Data storage, sharing and security, updated on 30 August 2013, accessed on 7 October 2024
  18. "Comments of Latanya Sweeney, PhD on 'Standards of Privacy of Individually Identifiable Health Information'". Carnegie Mellon University. Archived from the original on 28 March 2009.
  19. Ragan, Steve (19 December 2011). "The FBI's warning about doxing was too little too late". The Tech Herald. Archived from the original on 31 October 2012. Retrieved 23 October 2012.
  20. Sheets, Connor Adams (4 January 2012). "Anonymous's Operation Hiroshima: Inside the Doxing Coup the Media Ignored (VIDEO)". International Business Times. Retrieved 12 August 2023.
  21. "Did LulzSec Trick Police into Arresting the Wrong Guy?". The Atlantic Wire. 28 July 2011. Archived from the original on 29 October 2013. Retrieved 23 October 2012.
  22. Bright, Peter (7 March 2012). "Doxed: how Sabu was outed by former Anons long before his arrest". Ars Technica. Retrieved 23 October 2012.
  23. "Privacy Act 1988" . Retrieved 15 May 2019.
  24. "Data protection". European Commission. 11 April 2017.
  25. "Less Is (Not) More: The Need for Adequate Data Protection Practices When Monetizing Personal Information". Mayer Brown. 28 September 2023.
  26. "Data Protection Act 2018", legislation.gov.uk , The National Archives, 2018 c. 12, retrieved 14 August 2018
  27. "The Telecommunications (Lawful Business Practice) (Interception of Communications) Regulations 2000", legislation.gov.uk , The National Archives, SI 2000/1
  28. "New Zealand - Data Protection Overview". DataGuidance. 5 December 2023. Retrieved 14 March 2024.
  29. 1 2 Federal Act on Data Protection of 19 June 1992 (status as of 1 January 2014), Federal Chancellery of Switzerland (page visited on 18 September 2016).
  30. 1 2 (in French) Cesla Amarelle, Droit suisse, Éditions Loisirs et pédagogie, 2008.
  31. "Privacy Act of 1974". www.justice.gov. 16 June 2014. Retrieved 6 December 2020.
  32. Rana, R.; Zaeem, R. N.; Barber, K. S. (October 2018). "US-Centric vs. International Personally Identifiable Information: A Comparison Using the UT CID Identity Ecosystem". 2018 International Carnahan Conference on Security Technology (ICCST). pp. 1–5. doi:10.1109/CCST.2018.8585479. ISBN   978-1-5386-7931-9. S2CID   56719139.
  33. "HIGH-RISK SERIES Urgent Actions Are Needed to Address Cybersecurity Challenges Facing the Nation" (PDF). United States Government Accountability Office. September 2018. Retrieved 16 November 2020.
  34. "California Supreme Court Holds that Zip Code is Personal Identification Information". Bullivant Houser Bailey Business Matters eAlert. LexisNexis.
  35. "NRS: CHAPTER 603A - SECURITY AND PRIVACY OF PERSONAL INFORMATION". www.leg.state.nv.us.
  36. "201 CMR 17.00: Standards for The Protection of Personal Information of Residents of the Commonwealth" (PDF). Commonwealth of Massachusetts.
  37. Tyler v. Michaels Stores, Inc., 984 N.E.2d 737, 739 (2013)
  38. "EU-US data transfers". European Commission. 10 July 2023. Retrieved 12 August 2023.
  39. "Anonymity and PII". cookieresearch.com. Archived from the original on 17 June 2011. Retrieved 6 May 2015.
  40. Sawer, Patrick (13 December 2008). "Police use glove prints to catch criminals" . Telegraph. Archived from the original on 11 January 2022. Retrieved 20 August 2013.
  41. James W.H. McCord and Sandra L. McCord, Criminal Law and Procedure for the paralegal: a systems approach, supra, p. 127.
  42. John J. Harris, Disguised Handwriting, 43 J. Crim. L. Criminology & Police Sci. 685 (1952-1953)
  43. Davies, Tom (8 February 2019). "EE failures show how data breaches damages lives". PrivSec Report. Archived from the original on 5 February 2021.
  44. Miller, Michael (2008). Is It Safe? Protecting Your Computer, Your Business, and Yourself Online. Que. p. 4. ISBN   9780132713900.
  45. "Card data of 20,000 Pakistani bank users sold on dark web: report". Dunya News. 6 November 2018.
  46. Miller, Michael (2008). Is It Safe? Protecting Your Computer, Your Business, and Yourself Online. Que. p. 6. ISBN   9780132713900.
  47. Krombholz, Katharina; Merkl, Dieter; Weippl, Edgar (26 July 2012). "Fake Identities in Social Media: A Case Study on the Sustainability of the Facebook Business Model". Journal of Service Science Research. 4 (2): 175–212. doi:10.1007/s12927-012-0008-z. S2CID   6082130.
  48. "Memorandum for DoD FOIA Offices" (PDF). United States Department of Defense. Archived from the original (PDF) on 6 August 2020. Retrieved 1 April 2019.
  49. "Protection of victims of sexual violence: Lessons learned" (PDF). Office of the United Nations High Commissioner for Human Rights. 2019.
  50. "How to Remove Personal Information From Data Broker Sites". McAfee. 24 August 2022. Archived from the original on 27 February 2024.
  51. "How to remove your personal information from the Internet". OneRep. 27 December 2023. Retrieved 16 May 2024.
  52. Osborne, Charlie (8 December 2023). "The best services for deleting yourself from the internet in 2024". ZDNET. Retrieved 20 February 2024.
  53. Fadilpašić, Sead (16 November 2023). "What Are Data Removal Services, and What Data Can They Remove?". MUO. Archived from the original on 17 January 2024. Retrieved 29 November 2023.
  54. Acquisti, Alessandro; Taylor, Curtis; Wagman, Liad (2015). The Economics of Privacy (PDF).
  55. Daughety, A.; Reinganum, J. (2010). "Public goods, social pressure, and the choice between privacy and publicity". American Economic Journal: Microeconomics. 2 (2): 191–221. CiteSeerX   10.1.1.544.9031 . doi:10.1257/mic.2.2.191.
  56. Varian, H. R. (1997). "Economic aspects of personal privacy". Privacy and Self-regulation in the Information Age.
  57. Laudon, K. (1997). Extensions to the theory of markets and privacy: Mechanics of pricing information (PDF).
  58. Taylor, C. R. (2004). "Consumer privacy and the market for customer information". The RAND Journal of Economics. 35 (4): 631–650. hdl: 10161/2627 . JSTOR   1593765.
  59. Cornière, A. D. (2011). "Search advertising". American Economic Journal: Microeconomics. 8 (3): 156–188. doi:10.1257/mic.20130138.
  60. Anderson, S.; de Palma, A. (2012). "Competition for attention in the information (overload) age". The RAND Journal of Economics. 43: 1–25. doi:10.1111/j.1756-2171.2011.00155.x. S2CID   11606956.