Gathering of personally identifiable information

The gathering of personally identifiable information (PII) refers to the collection of public and private personal data that can be used to identify individuals for various purposes, both legal and illegal. PII gathering is often seen as a privacy threat by data owners, while entities such as technology companies, governments, and organizations utilize this data to analyze consumer behavior, political preferences, and personal interests.

With advances in information technology, access to and sharing of PII have become easier. Smartphones and social media have significantly contributed to the widespread collection of personal data, making it a pervasive and controversial issue. [1]

Recent cases of illegal PII collection, such as the Cambridge Analytica scandal involving the data of over 87 million Facebook users, have heightened concerns about privacy violation and increased demands for stronger data protection laws. Major breaches at companies like Equifax, Target, Yahoo, Home Depot, and the United States Office of Personnel Management have compromised the personal and financial data of millions of Americans, leading to calls for improved information security and PII protection. [2]

Definition

Currently, there is no universally accepted definition of PII gathering. According to the U.S. National Institute of Standards and Technology (NIST), PII is defined as: [3]

(1) Any information that can be used to distinguish or trace an individual's identity, such as a name, social security number, date and place of birth, mother's maiden name, or biometric records. (2) Any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.

PII gathering refers to the collection, organization, manipulation, analysis, exchange, or sharing of such data.

Collectors

Governments

Governments collect PII to provide social and legal benefits, improve services, and fulfill legal obligations. [4] Depending on the type of government, whether democratic or authoritarian, methods for collecting PII may vary, but the goals are generally similar. [5]

United States

In the U.S., PII is gathered through processes like tax filing, property registration, and driver's license applications. [6] The government also collects PII for crime prevention and national security purposes, though such practices, especially by the National Security Agency (NSA), remain controversial. [7]

China

China uses big data to enhance governance, employing advanced surveillance networks such as the "Skynet" system, which comprises 20 million cameras. Although regulations protect PII collected by private companies, there are no limitations on government collection of such data, and no plans to introduce any have been announced. [8] [9]

European Union

European Union nations have stringent domestic and international PII regulations. [10] For example, the General Data Protection Regulation (GDPR) provides comprehensive protections for personal data. [11]

Companies

With advancements in internet and mobile technologies, private companies collect PII through user registrations, location tracking, cookies, and other methods. Data brokers buy, sell, and analyze PII from various sources, often without user consent. [12] The Facebook–Cambridge Analytica data scandal is an example of data misuse, where only a fraction of the users whose data was collected had consented. [13]

Hackers

Hackers illegally collect PII for financial or political gain. Notable examples include North Korean hackers targeting Sony Pictures and the large-scale breach at Equifax that exposed sensitive data from millions of users.

Regulations

PII gathering is often associated with privacy violations and is frequently opposed by privacy advocates. Democratic countries, such as the United States and those in the European Union, have more developed privacy laws governing PII gathering. Laws in the European Union offer more comprehensive and uniform protection of personal data, while in the United States federal data protection laws are approached sector by sector. [14] Authoritarian countries often lack PII gathering protections for citizens. For example, Chinese citizens enjoy legislative protection against private companies but have no protection from government violations. [15]

European Union

The GDPR took effect on May 25, 2018, and offers comprehensive privacy protection that is consistent across all sectors and industries. The regulation applies to all businesses and government agencies in European Union countries, as well as to foreign companies and organizations offering services in Europe. Violation of or non-compliance with the GDPR may result in penalties of up to 4 percent of a business' worldwide annual revenue. The GDPR requires businesses and government agencies to obtain consent for data processing, anonymize collected data, provide prompt notification of data breaches, handle cross-border data transfers safely, and appoint data protection officers. [16]

United States

Section 5 of the Federal Trade Commission Act (FTC Act) is used to require companies to safeguard collected PII. [17] A company in the United States is not required to have a privacy policy, but if it discloses one, it is obliged to comply with it. A company also cannot retroactively change its data collection policy without offering users an opportunity to opt out. The FTC imposed a $100 million penalty on LifeLock for failing to protect customers' PII, such as social security numbers, credit card numbers, and bank account numbers, and for violating the terms of a 2010 federal court order. [18]

The FTC also uses its self-regulatory principles for online behavioral advertising to provide guidelines and suggestions for website operators on data collection practices, activity tracking, and opt-out mechanisms. Website operators are asked to obtain express consent before sensitive PII, such as social security numbers, financial data, health information, and data about minors, is collected and used. The principles also call for reasonable security to protect collected personal data and for data retention limited to as long as is necessary to fulfill a legitimate business or law enforcement need. The principles are self-regulatory and intended to encourage further discussion and development by all interested parties. [19]

Concerns

Public concern about PII gathering centers on privacy violations and potential discrimination. The unauthorized collection and use of data, as seen in the Cambridge Analytica scandal, has fueled distrust in major platforms such as Facebook, with many users demanding stricter government regulation. [20] [21] Risks of PII gathering include discrimination, the loss of individual and collective freedom, and monetary, social, physical, and psychological risk. [22]

Related Research Articles

Consumer privacy is information privacy as it relates to the consumers of products and services.

Information privacy is the relationship between the collection and dissemination of data, technology, the public expectation of privacy, contextual information norms, and the legal and political issues surrounding them. It is also known as data privacy or data protection.

Information Commissioner's Office

The Information Commissioner's Office (ICO) is a non-departmental public body which reports directly to the Parliament of the United Kingdom and is sponsored by the Department for Science, Innovation and Technology. It is the independent regulatory office dealing with the Data Protection Act 2018, the General Data Protection Regulation, and the Privacy and Electronic Communications Regulations 2003 across the UK, and with the Freedom of Information Act 2000 and the Environmental Information Regulations 2004 in England, Wales and Northern Ireland and, to a limited extent, in Scotland.

Data security means protecting digital data, such as those in a database, from destructive forces and from the unwanted actions of unauthorized users, such as a cyberattack or a data breach.

Internet privacy involves the right or mandate of personal privacy concerning the storage, re-purposing, provision to third parties, and display of information pertaining to oneself via the Internet. Internet privacy is a subset of data privacy. Privacy concerns have been articulated from the beginnings of large-scale computer sharing and especially relate to mass surveillance.

A privacy policy is a statement or legal document that discloses some or all of the ways a party gathers, uses, discloses, and manages a customer or client's data. Personal information can be anything that can be used to identify an individual, including but not limited to the person's name, address, date of birth, marital status, contact information, ID issue and expiry date, financial records, credit information, medical history, where one travels, and intentions to acquire goods and services. In the case of a business, it is often a statement that declares a party's policy on how it collects, stores, and releases personal information it collects. It informs the client what specific information is collected, and whether it is kept confidential, shared with partners, or sold to other firms or enterprises. Privacy policies typically represent a broader, more generalized treatment, as opposed to data use statements, which tend to be more detailed and specific.

Personal data, also known as personal information or personally identifiable information (PII), is any information related to an identifiable person.

Information privacy, data privacy, or data protection laws provide a legal framework for how to obtain, use, and store the data of natural persons. The various laws around the world describe the rights of natural persons to control who is using their data. This usually includes the right to learn which data is stored and for what purpose, and to request deletion when that purpose no longer applies.

Privacy law is a set of regulations that govern the collection, storage, and utilization of personal information from healthcare, governments, companies, public or private entities, or individuals.

The United States Federal Trade Commission's fair information practice principles (FIPPs) are guidelines that represent widely accepted concepts concerning fair information practice in an electronic marketplace.

Privacy by design is an approach to systems engineering initially developed by Ann Cavoukian and formalized in a joint report on privacy-enhancing technologies by a joint team of the Information and Privacy Commissioner of Ontario (Canada), the Dutch Data Protection Authority, and the Netherlands Organisation for Applied Scientific Research in 1995. The privacy by design framework was published in 2009 and adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010. Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The concept is an example of value sensitive design, i.e., taking human values into account in a well-defined manner throughout the process.

Do Not Track legislation protects Internet users' right to choose whether or not they want to be tracked by third-party websites. It has been called the online version of "Do Not Call". This type of legislation is supported by privacy advocates and opposed by advertisers and services that use tracking information to personalize web content. Do Not Track (DNT) is a formerly official HTTP header field, designed to allow internet users to opt-out of tracking by websites—which includes the collection of data regarding a user's activity across multiple distinct contexts, and the retention, use, or sharing of that data outside its context. Efforts to standardize Do Not Track by the World Wide Web Consortium did not reach their goal and ended in September 2018 due to insufficient deployment and support.

General Data Protection Regulation

The General Data Protection Regulation (GDPR; French: RGPD) is a European Union regulation on information privacy in the European Union (EU) and the European Economic Area (EEA). The GDPR is an important component of EU privacy law and human rights law, in particular Article 8(1) of the Charter of Fundamental Rights of the European Union. It also governs the transfer of personal data outside the EU and EEA. The GDPR's goals are to enhance individuals' control and rights over their personal information and to simplify the regulations for international business. It supersedes the Data Protection Directive 95/46/EC and, among other things, simplifies the terminology.

Chris Hoofnagle

Chris Jay Hoofnagle is an American professor at the University of California, Berkeley who teaches information privacy law, computer crime law, regulation of online privacy, internet law, and seminars on new technology. Hoofnagle has contributed to the privacy literature by writing privacy law legal reviews and conducting research on the privacy preferences of Americans. Notably, his research demonstrates that most Americans prefer not to be targeted online for advertising and despite claims to the contrary, young people care about privacy and take actions to protect it. Hoofnagle has written scholarly articles regarding identity theft, consumer privacy, U.S. and European privacy laws, and privacy policy suggestions.

Corporate surveillance describes the practice of businesses monitoring and extracting information from their users, clients, or staff. This information may consist of online browsing history, email correspondence, phone calls, location data, and other private details. Acts of corporate surveillance frequently look to boost results, detect potential security problems, or adjust advertising strategies. These practices have been criticized for violating ethical standards and invading personal privacy. Critics and privacy activists have called for businesses to incorporate rules and transparency surrounding their monitoring methods to ensure they are not misusing their position of authority or breaching regulatory standards.

Privacy engineering is an emerging field of engineering which aims to provide methodologies, tools, and techniques to ensure systems provide acceptable levels of privacy. Its focus lies in organizing and assessing methods to identify and tackle privacy concerns within the engineering of information systems.

A dark pattern is "a user interface that has been carefully crafted to trick users into doing things, such as buying overpriced insurance with their purchase or signing up for recurring bills". User experience designer Harry Brignull coined the neologism on 28 July 2010 with the registration of darkpatterns.org, a "pattern library with the specific goal of naming and shaming deceptive user interfaces". In 2023 he released the book Deceptive Patterns.

The right of access, also referred to as right to access and (data) subject access, is one of the most fundamental rights in data protection laws around the world. For instance, the United States, Singapore, Brazil, and countries in Europe have all developed laws that regulate access to personal data as privacy protection. The European Union states that: "The right of access occupies a central role in EU data protection law's arsenal of data subject empowerment measures." This right is often implemented as a Subject Access Request (SAR) or Data Subject Access Request (DSAR).

California Privacy Rights Act

The California Privacy Rights Act of 2020 (CPRA), also known as Proposition 24, is a California ballot proposition that was approved by a majority of voters after appearing on the ballot for the general election on November 3, 2020. This proposition expands California's consumer privacy law and builds upon the California Consumer Privacy Act (CCPA) of 2018, which established a foundation for consumer privacy regulations.

Personal Information Protection Law of the People's Republic of China

The Personal Information Protection Law of the People's Republic of China, referred to as the Personal Information Protection Law ("PIPL"), protects personal information rights and interests, standardizes personal information handling activities, and promotes the rational use of personal information. It also addresses the transfer of personal data outside of China.

References

  1. Li, Xiao Bai; Motiwalla, Luvai F. (2016). "Unveiling consumers' privacy paradox behavior in an economic exchange". International Journal of Business Information Systems. 23 (3): 307–329. doi:10.1504/IJBIS.2016.10000351. PMC 5046831. PMID 27708687.
  2. "Cybersecurity Incidents". U.S. Office of Personnel Management.
  3. Milne, George R.; Pettinico, George; Hajjat, Fatima M.; Markos, Ereni (March 2017). "Information Sensitivity Typology: Mapping the Degree and Type of Risk Consumers Perceive in Personal Data Sharing". Journal of Consumer Affairs. 51 (1): 133–161. doi:10.1111/joca.12111.
  4. Cappello, Lawrence (December 15, 2016). "Big Iron and the Small Government: On the History of Data Collection and Privacy in the United States". Journal of Policy History. 29 (1): 177–196. doi:10.1017/S0898030616000397.
  5. Li, Xiao Bai; Motiwalla, Luvai F. (2016). "Unveiling consumers' privacy paradox behavior in an economic exchange". International Journal of Business Information Systems. 23 (3): 307–329. doi:10.1504/IJBIS.2016.10000351. PMC 5046831. PMID 27708687.
  6. Cappello, Lawrence (December 15, 2016). "Big Iron and the Small Government: On the History of Data Collection and Privacy in the United States". Journal of Policy History. 29 (1): 177–196. doi:10.1017/S0898030616000397.
  7. Taylor, Isaac (2017). "Data collection, counterterrorism and the right to privacy". Politics, Philosophy & Economics. 16 (3): 326–346. doi:10.1177/1470594x17715249.
  8. Zeng, Jinghan (November 2016). "China's date with big data: will it strengthen or threaten authoritarian rule?". International Affairs. 92 (6): 1443–1462. doi:10.1111/1468-2346.12750.
  9. Zeng, Jinghan (November 2016). "China's date with big data: will it strengthen or threaten authoritarian rule?". International Affairs. 92 (6): 1443–1462. doi:10.1111/1468-2346.12750.
  10. Mikkonen, Tomi (April 1, 2014). "Perceptions of controllers on EU data protection reform: A Finnish perspective". Computer Law & Security Review. 30 (2): 190–195. doi:10.1016/j.clsr.2014.01.011. ISSN 0267-3649.
  11. Mikkonen, Tomi (April 1, 2014). "Perceptions of controllers on EU data protection reform: A Finnish perspective". Computer Law & Security Review. 30 (2): 190–195. doi:10.1016/j.clsr.2014.01.011. ISSN 0267-3649.
  12. "Privacy considerations of online behavioral tracking". European Union Agency for Network and Information Security.
  13. Solon, Olivia (April 4, 2018). "Facebook says Cambridge Analytica may have gained 37m more users' data". the Guardian.
  14. "A Comparison Between US and EU Data Protection Legislation for Law Enforcement Purposes - Think Tank". www.europarl.europa.eu.
  15. Zeng, Jinghan (November 2016). "China's date with big data: will it strengthen or threaten authoritarian rule?". International Affairs. 92 (6): 1443–1462. doi:10.1111/1468-2346.12750.
  16. "Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance)". May 4, 2016.
  17. "Privacy & Data Security Update (2016)". Federal Trade Commission. January 18, 2017.
  18. "LifeLock to Pay $100 Million to Consumers to Settle FTC Charges it Violated 2010 Order". Federal Trade Commission. December 17, 2015.
  19. "Online Behavioral Advertising: Moving the Discussion Forward to Possible Self-Regulatory Principles: Statement of the Bureau of Consumer Protection Proposing Governing Principles For Online Behavioral Advertising and Requesting Comment". Federal Trade Commission. January 16, 2014. Archived from the original on February 9, 2021. Retrieved April 19, 2018.
  20. "Facebook Users' Privacy Concerns Up Since 2011". Gallup.
  21. "The Spotlight's on Facebook, but Google Is Also in the Privacy Hot Seat". NDTV Gadgets360.com.
  22. Milne, George R.; Pettinico, George; Hajjat, Fatima M.; Markos, Ereni (March 2017). "Information Sensitivity Typology: Mapping the Degree and Type of Risk Consumers Perceive in Personal Data Sharing". Journal of Consumer Affairs. 51 (1): 133–161. doi:10.1111/joca.12111.