Gathering of personally identifiable information

The gathering of personally identifiable information (PII) is the practice of collecting public and private personal data that can be used to identify an individual, for both legal and illegal purposes. PII owners often view PII gathering as a threat to and a violation of their privacy, while entities such as information technology companies, governments, and other organizations use PII to analyze consumer shopping behavior, political preferences, and personal interests.

With the development of new information technology, PII is easier to access and share than ever before. The spread of smartphones and social media has made PII gathering widespread: PII can now be collected almost anywhere, at any time. This dissemination of personal data has made PII gathering a hotly debated social issue. [1]

Recent illegal PII gathering by data collection companies, such as Cambridge Analytica's harvesting of data on over 87 million Facebook users, has caused increasing concern over privacy violations and has renewed calls for more comprehensive data protection laws. Major security breaches at Equifax, Target, Yahoo, Home Depot, and the United States Office of Personnel Management exposed the personal and financial information of millions of Americans, prompting calls for stronger information technology security and better protection of PII by businesses and government agencies. [2]

Definition

There is no precise definition of PII gathering. According to the U.S. National Institute of Standards and Technology (NIST), PII is defined as: [3]

(1) any information that can be used to distinguish or trace an individual's identity, such as name, social security number, date and place of birth, mother's maiden name, or biometric records and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial and employment information.

PII gathering is any activity that collects, organizes, manipulates, analyzes, exchanges, or shares this data.

Collectors

Governments

Governments openly collect PII to extend social and legal benefits, for example when improving social services or fulfilling legal obligations. [4] [5]

Depending on a country's form of government, democratic or authoritarian, PII gathering is conducted using different methods. Regardless, countries share similar goals for PII gathering, as demonstrated by the examples below. [6]

United States

In the United States, PII is gathered through applications for assistance, property registration, tax filings, Selective Service registration, driver's license applications, government employment, professional licensure, and other voluntary and mandatory information submissions. PII is stored, accessed, and shared among different levels of government, departments, agencies, non-governmental entities, and the public. [7] For example, a potential home buyer can look up whether a real estate agent is licensed. The government also gathers PII for crime prevention and national security purposes, and many of these programs are highly controversial with the U.S. public. For example, the National Security Agency (NSA) collects and analyzes PII, including phone calls, emails, and social media interactions, from large numbers of people to uncover potential threats. [8]

China

The Chinese government has made big data part of its governance strategy, with the stated goal of a more efficient and transparent government through the use of digital technology. It has implemented one of the most technologically advanced surveillance networks in the world, known as Skynet (天网监控系统). The system uses artificial intelligence, including facial recognition, and roughly 20 million cameras installed to cover nearly every public space in the country. [9] PII protection in China addresses collection by private companies and organizations; there has been no discussion or proposal to limit the government's own collection, aggregation, and analysis of PII. [10]

Finland

Even among Western democracies, constraints on PII gathering differ; European Union member states regulate PII collection more strictly than the United States. [11] Personal data processing in Finland, for example, has long been protected under comprehensive regulations and laws. The Personal Data Act of 1999 was the main national privacy regulation, alongside the 1995 European Union Data Protection Directive. Other data regulations enacted in Finland include the Act on the Protection of Privacy in Electronic Communications, the Act on the Protection of Privacy in Working Life, and the Act on the Openness of Government Activities. The Personal Data Act was replaced by the European Union General Data Protection Regulation (GDPR), which took effect in May 2018. Enforcement of privacy regulations has become stricter in recent years following a ruling by the European Court of Human Rights that found a Finnish hospital had failed to safeguard personal data. [12]

Companies

With the rapid growth of Internet and mobile technologies, private companies can collect personal data more quickly and effectively than before. Companies gather PII by storing profile information when users register new accounts, tracking users' locations, reading data stored locally on users' devices, and using cookies and other anonymous identifiers.
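
As a rough illustration of cookie-based tracking, the hypothetical sketch below (written with the Flask web framework, and not taken from any particular company's code) assigns a persistent anonymous identifier to each visitor and records the pages they view against it; the cookie name, routes, and storage are assumptions made for the example.

```python
# Hypothetical sketch of cookie-based visitor tracking; names are illustrative.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
visit_log = {}  # anonymous identifier -> list of pages viewed (in memory only)

@app.route("/<path:page>")
def serve(page):
    # Reuse the visitor's existing identifier if present, otherwise mint one.
    visitor_id = request.cookies.get("anon_id") or uuid.uuid4().hex
    visit_log.setdefault(visitor_id, []).append(page)

    resp = make_response(f"Viewing {page}")
    # A long-lived cookie lets later visits be linked back to the same profile.
    resp.set_cookie("anon_id", visitor_id, max_age=60 * 60 * 24 * 365)
    return resp

if __name__ == "__main__":
    app.run()
```

Linking such an identifier to registration details, location history, or other records is what turns an otherwise anonymous browsing profile into PII.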

Data brokers, also known as information brokers, are the major players in gathering, transforming, packaging, and selling personal data. They gather PII from three main sources:

  1. Government documents and records, such as registration information and criminal records.
  2. Publicly available sources, including social media, blogs, and other websites. For example, Facebook users frequently post personal information and share their preferred links; because the site requires users to register with their real identities, it gives data brokers the opportunity to store and analyze individuals' personalities and preferences. [13]
  3. Approved companies, businesses, or services that users authorize, willingly or sometimes unknowingly, to access their personal profiles. Online users are often asked to provide PII in order to register an account on a website. The website then informs users about the data gathering and the benefits of storing the data, such as not having to re-enter a password and receiving more relevant personalized advertising. However, these approved companies may sell the PII they collect and store to data brokers, mostly without users' knowledge or consent. [14]

The Facebook–Cambridge Analytica data scandal is an example. Cambridge Analytica inferred personality traits from potential voters' activities on Facebook, such as their "likes" and locations, and used this personal information to predict voting behavior. Cambridge Analytica acquired PII on over 87 million users, yet only about 270,000 of them had consented to the use of their data, and only for academic purposes; the remaining users' PII was collected without consent. [15] [16]

Hackers

Hackers are individuals or organizations that collect PII illegally. They are driven mostly by financial interests, but sometimes by political ones, as in the hacking of Sony Pictures attributed to North Korea: hackers targeted the studio in retaliation for the planned release of "The Interview," a movie about a fictional assassination of North Korean leader Kim Jong-un, and the incident resulted in the release of Social Security numbers, salary information, and medical records of Sony employees. Hackers use spyware, viruses, backdoors, social engineering, and other methods to steal PII from individuals, companies, governments, and other organizations. For example, Equifax, one of the largest credit reporting companies in the world, was compromised by hackers, and the PII of millions of Americans was stolen.

Regulations

PII gathering is often associated with violations of privacy and is often opposed by privacy advocates. Democratic countries, such as the United States and those in the European Union, have more developed privacy laws governing PII gathering. Laws in the European Union offer more comprehensive and uniform protection of personal data, whereas in the United States federal data protection is handled sector by sector. [17] Authoritarian countries often lack protections for citizens against PII gathering. For example, Chinese citizens enjoy legislative protection against collection by private companies but have no protection against collection by the government. [18]

European Union

The GDPR took effect on May 25, 2018, and offers comprehensive privacy protection that is consistent across all sectors and industries. The regulation applies to all businesses and government agencies in European Union countries, and also to foreign companies and organizations offering services in Europe. Violation of or non-compliance with the GDPR may result in penalties of up to 4 percent of a business's worldwide annual revenue. The GDPR requires businesses and government agencies to obtain consent for data processing, anonymize collected data, provide prompt notification of data breaches, handle cross-border data transfers safely, and appoint a data protection officer. [19]
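
In practice, the anonymization requirement is often approached through pseudonymization. The minimal sketch below is an illustration only, not a method prescribed by the GDPR: it replaces direct identifiers in a record with a keyed hash before the record is stored or analyzed, and the field names and key handling are assumptions made for the example.

```python
# Illustrative pseudonymization sketch; the key and field names are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-secret-kept-outside-the-dataset"

def pseudonymize(record: dict) -> dict:
    """Replace direct identifiers with keyed hashes, leaving other fields intact."""
    out = dict(record)
    for field in ("name", "email", "national_id"):
        if field in out:
            digest = hmac.new(SECRET_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

print(pseudonymize({"name": "Jane Doe", "email": "jane@example.com", "city": "Espoo"}))
```

Because the hash is keyed, the same person maps to the same token across records, but re-identification requires access to the separately held key.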

United States

Section 5 of the Federal Trade Commission Act (FTC Act) is used to require companies to safeguard the PII they collect. [20] A company in the United States is not required to have a privacy policy, but it is obliged to comply with one once it has disclosed it, and it cannot retroactively change its data collection policy without offering users an opportunity to opt out. The FTC imposed a $100 million penalty on LifeLock for failing to protect customers' PII, such as Social Security numbers, credit card numbers, and bank account numbers, and for violating the terms of a 2010 federal court order. [21]

The FTC also uses its Behavioral Advertising Principles to provide guidelines and suggestions for website operators on data collection practices, activity tracking, and opt-out mechanisms. A website operator is asked to obtain express consent before sensitive PII, such as Social Security numbers, financial data, health information, and data about minors, is collected and used. The principles also call for reasonable security measures to protect collected personal data and for limited data retention, keeping data only as long as necessary to fulfill a legitimate business or law enforcement need. The principles are self-regulatory and intended to encourage further discussion and development by all interested parties. [22]
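
As a rough illustration of how an operator might honor an opt-out preference and a retention limit, the sketch below (an assumption-laden example, not an FTC-provided implementation) skips collection when an opt-out cookie is present and purges records older than an assumed 90-day window; the cookie name and window length are invented for the example.

```python
# Illustrative opt-out and retention sketch; cookie name and window are assumptions.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed retention window for the example

def should_track(cookies: dict) -> bool:
    # Collect behavioral data only if the visitor has not opted out.
    return cookies.get("ad_opt_out") != "1"

def purge_expired(records: list) -> list:
    # Keep only records newer than the retention cutoff.
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]

if should_track({"ad_opt_out": "1"}):
    print("tracking enabled")
else:
    print("visitor opted out; no data collected")

print(purge_expired([{"collected_at": datetime.now(timezone.utc)}]))
```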

Concerns

PII gathering is usually viewed by the public as a violation of privacy. A major concern is that PII gathering allows individuals and groups to be classified, which can lead to discrimination and the loss of individual and collective freedoms. Other perceived risks include: "(1) monetary risk is the risk associated with potential financial loss, (2) social risk is the risk associated with threats to an individual's self-esteem, reputation, and/or the perceptions of others, (3) physical risk is the risk associated with bodily injury, and (4) psychological risk is the risk associated with potential negative emotions such as anxiety, distress, and/or conflicts with self-image." [23]

A 2018 Gallup poll indicated that more people became concerned about invasion of privacy and data gathering after the revelation that personal data of Facebook users had been collected and shared with Cambridge Analytica without consent. The survey showed that 43% of Facebook users were "very concerned," compared with 30% in 2011, with similar responses from Google users. [24] There are also increasing concerns that personal data is being collected even when users are not logged in or not using the services, with the data used to target users with tailored advertising. [25] Concerns over unauthorized data collection and use have led many users to stop using Facebook or to move to other social media platforms, and have increased calls for broad government privacy regulation, including the ability for users to opt out of data collection completely. [26]

See also

* Consumer privacy
* Information privacy
* Information Commissioner's Office
* Internet privacy
* Data broker
* Privacy policy
* Personal data
* Information privacy law
* Privacy law
* Digital privacy
* Privacy concerns with social networking services
* Privacy by design
* Do Not Track
* General Data Protection Regulation
* Chris Hoofnagle
* Corporate surveillance
* Dark pattern
* Biometric Information Privacy Act
* California Consumer Privacy Act (CCPA)
* Personal Information Protection Law of the People's Republic of China

References

  1. Li, Xiao Bai; Motiwalla, Luvai F. (2016). "Unveiling consumers' privacy paradox behavior in an economic exchange". International Journal of Business Information Systems. 23 (3): 307–329. doi:10.1504/IJBIS.2016.10000351. PMC 5046831. PMID 27708687.
  2. "Cybersecurity Incidents". U.S. Office of Personnel Management.
  3. Milne, George R.; Pettinico, George; Hajjat, Fatima M.; Markos, Ereni (March 2017). "Information Sensitivity Typology: Mapping the Degree and Type of Risk Consumers Perceive in Personal Data Sharing". Journal of Consumer Affairs. 51 (1): 133–161. doi:10.1111/joca.12111.
  4. Cappello, Lawrence (December 15, 2016). "Big Iron and the Small Government: On the History of Data Collection and Privacy in the United States". Journal of Policy History. 29 (1): 177–196. doi:10.1017/S0898030616000397.
  5. Baek, Young Min; Bae, Young; Jeong, Irkwon; Kim, Eunmee; Rhee, June Woong (December 2014). "Changing the default setting for information privacy protection: What and whose personal information can be better protected?". The Social Science Journal. 51 (4): 523–533. doi:10.1016/j.soscij.2014.07.002.
  6. Li, Xiao Bai; Motiwalla, Luvai F. (2016). "Unveiling consumers' privacy paradox behavior in an economic exchange". International Journal of Business Information Systems. 23 (3): 307–329. doi:10.1504/IJBIS.2016.10000351. PMC 5046831. PMID 27708687.
  7. Cappello, Lawrence (December 15, 2016). "Big Iron and the Small Government: On the History of Data Collection and Privacy in the United States". Journal of Policy History. 29 (1): 177–196. doi:10.1017/S0898030616000397.
  8. Taylor, Isaac (2017). "Data collection, counterterrorism and the right to privacy". Politics, Philosophy & Economics. 16 (3): 326–346. doi:10.1177/1470594x17715249.
  9. Taylor, Rebecca (September 26, 2017). "China installs 20million cameras in 'world's most advanced surveillance system'". mirror.
  10. Zeng, Jinghan (November 2016). "China's date with big data: will it strengthen or threaten authoritarian rule?". International Affairs. 92 (6): 1443–1462. doi:10.1111/1468-2346.12750.
  11. Bennett, Colin J. (November 2016). "Voter databases, micro-targeting, and data protection law: can political parties campaign in Europe as they do in North America?". International Data Privacy Law. 6 (4): 261–275. doi:10.1093/idpl/ipw021.
  12. Mikkonen, Tomi (April 1, 2014). "Perceptions of controllers on EU data protection reform: A Finnish perspective". Computer Law & Security Review. 30 (2): 190–195. doi:10.1016/j.clsr.2014.01.011. ISSN   0267-3649.
  13. "Privacy considerations of online behavioural tracking". European Union Agency for Network and Information Security.
  14. Eugene E., Hutchinson. "Keeping Your Personal Information Personal: Trouble for the Modern Consumer". Hofstra Law Review. 43 (4).
  15. Solon, Olivia (April 4, 2018). "Facebook says Cambridge Analytica may have gained 37m more users' data". the Guardian.
  16. González, Roberto J. (June 1, 2017). "Hacking the citizenry?: Personality profiling, 'big data' and the election of Donald Trump". Anthropology Today. 33 (3): 9–12. doi:10.1111/1467-8322.12348. ISSN   1467-8322.
  17. "A Comparison Between US and EU Data Protection Legislation for Law Enforcement Purposes - Think Tank". www.europarl.europa.eu.
  18. Zeng, Jinghan (November 2016). "China's date with big data: will it strengthen or threaten authoritarian rule?". International Affairs. 92 (6): 1443–1462. doi:10.1111/1468-2346.12750.
  19. "Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance)". May 4, 2016.
  20. "Privacy & Data Security Update (2016)". Federal Trade Commission. January 18, 2017.
  21. "LifeLock to Pay $100 Million to Consumers to Settle FTC Charges it Violated 2010 Order". Federal Trade Commission. December 17, 2015.
  22. "Online Behavioral Advertising: Moving the Discussion Forward to Possible Self-Regulatory Principles: Statement of the Bureau of Consumer Protection Proposing Governing Principles For Online Behavioral Advertising and Requesting Comment". Federal Trade Commission. January 16, 2014. Archived from the original on February 9, 2021. Retrieved April 19, 2018.
  23. Milne, George R.; Pettinico, George; Hajjat, Fatima M.; Markos, Ereni (March 2017). "Information Sensitivity Typology: Mapping the Degree and Type of Risk Consumers Perceive in Personal Data Sharing". Journal of Consumer Affairs. 51 (1): 133–161. doi:10.1111/joca.12111.
  24. "Facebook Users' Privacy Concerns Up Since 2011". Gallup.
  25. Velasco, Carl (April 17, 2018). "Here's How Facebook Collects Your Data Even When You Never Use It". Tech Times.
  26. "The Spotlight's on Facebook, but Google Is Also in the Privacy Hot Seat". NDTV Gadgets360.com.