Chris Hoofnagle

Citizenship: American
Known for: Survey research on consumer privacy, Federal Trade Commission
Fields: Privacy, computer crime, identity theft
Notable students: Ashkan Soltani
Chris Hoofnagle speaking at a panel.

Chris Jay Hoofnagle is an American professor at the University of California, Berkeley, who teaches information privacy law, computer crime law, regulation of online privacy, internet law, and seminars on new technology. [1] Hoofnagle has contributed to the privacy literature through law review articles on privacy law and through survey research on the privacy preferences of Americans. Notably, his research shows that most Americans prefer not to be targeted online for advertising and that, despite claims to the contrary, young people care about privacy and take actions to protect it. [2] [3] [4] Hoofnagle has written scholarly articles on identity theft, consumer privacy, U.S. and European privacy law, and recommendations for privacy policy.

Career

In addition to his academic position, Hoofnagle is an attorney at the law firm Gunderson Dettmer LLP. [5] He has served as an advisor for several student projects at the University of California, Berkeley School of Information, including advising Ashkan Soltani and his colleagues on their article "Flash Cookies and Privacy". [6]

Hoofnagle and Soltani published a follow-up to this work in 2011 documenting the use of HTTP ETags to store persistent identifiers. [7] This research was also published in the Harvard Law & Policy Review as "Behavioral Advertising: The Offer You Cannot Refuse" [8] and won the CPDP 2014 Multidisciplinary Privacy Research Award. [9]
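
The mechanism behind ETag "respawning" can be sketched roughly as follows. This is an illustrative example, not code from the study: a hypothetical tracking server assigns a unique ETag to a tiny cacheable resource, and the browser's normal cache revalidation returns that ETag on later visits, re-identifying the visitor even after cookies have been cleared.

```python
# Minimal sketch of ETag-based re-identification (illustrative only; not code from
# the Hoofnagle/Soltani study). The server tags a tiny cacheable resource with a
# unique ETag; the browser sends that ETag back in If-None-Match on later visits,
# re-identifying the visitor even if cookies were deleted in the meantime.
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

class ETagTracker(BaseHTTPRequestHandler):
    def do_GET(self):
        returned = self.headers.get("If-None-Match")
        if returned:
            visitor_id = returned.strip('"')      # visitor recognized, no cookie needed
            self.send_response(304)               # Not Modified: browser keeps its copy
            self.send_header("ETag", f'"{visitor_id}"')
            self.end_headers()
            return
        visitor_id = uuid.uuid4().hex             # first visit: mint a fresh identifier
        body = b"pixel"
        self.send_response(200)
        self.send_header("ETag", f'"{visitor_id}"')
        self.send_header("Cache-Control", "private, max-age=31536000")
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), ETagTracker).serve_forever()
```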

Privacy literature contributions

Identity theft

Today, most information about identity theft incidents comes from the victims whose identities are stolen. [10] As a result, many aspects of identity theft remain unknown: data are missing on synthetic identity theft (cases in which victims are unaware of the crime), most victims do not report identity theft to criminal authorities, and the FBI may decline to investigate identity theft cases for lack of resources. [10] In fact, fewer than one in 32 victims of identity theft file an official complaint. [11]

Because of these gaps, Hoofnagle argues that identity theft information should instead be gathered from financial institutions. [10] Financial institutions are central actors in identity theft: they are the institutions impostors use to steal money, they suffer the resulting nonpayment, and they then charge victims after that nonpayment. Because they have the most interaction with the impostor, Hoofnagle argues, financial institutions are best positioned to gather information about identity theft. [10] He believes they should be required to track the number of identity theft incidents that have taken place or been avoided, identify the product targeted by the thief, and report the loss suffered or avoided. [10] He argues that these policies would yield more information about identity theft and help institutions prevent it in the future. [10]
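
The reporting Hoofnagle proposes could take a very simple form in practice. The sketch below is my own illustration (the field names and figures are hypothetical, not drawn from the article): each incident records whether the fraud succeeded or was avoided, which product was targeted, and the amount lost or saved, and the institution reports the aggregates.

```python
# Hypothetical sketch of the aggregate reporting Hoofnagle proposes for financial
# institutions; the record structure and numbers are illustrative, not from the article.
from collections import Counter

incidents = [
    {"outcome": "suffered", "product": "credit card", "amount": 2400.0},
    {"outcome": "avoided",  "product": "credit card", "amount": 5100.0},
    {"outcome": "suffered", "product": "checking account", "amount": 800.0},
]

def summarize(incidents):
    # Aggregate the three quantities the proposal asks institutions to report:
    # incident counts, targeted products, and losses suffered or avoided.
    return {
        "incidents_total": len(incidents),
        "incidents_avoided": sum(1 for i in incidents if i["outcome"] == "avoided"),
        "targeted_products": Counter(i["product"] for i in incidents),
        "loss_suffered": sum(i["amount"] for i in incidents if i["outcome"] == "suffered"),
        "loss_avoided": sum(i["amount"] for i in incidents if i["outcome"] == "avoided"),
    }

print(summarize(incidents))
```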

Hoofnagle's research also found that larger institutions focused on credit card accounts had relatively higher rates of identity fraud than smaller institutions. [11] He argues that this may contradict consumer expectations, since consumers may assume that larger institutions have the tools needed to prevent identity fraud. [11]

Consumer privacy

Social networking services

Although social networking services (SNSs) like Instagram and Facebook cost no money to join, Hoofnagle argues that the transaction carries a significant price: the collection of personal information. [12] As consumers post more on SNSs, the services gather more and more personal information about them. [13] Data can be collected directly, by tracking the smartphone owner's posts or by collecting information from other applications on the device. [14] It can also be collected indirectly, from information that other people store about the smartphone owner on their own devices. [14] Hoofnagle argues that this transaction represents a loss of privacy for consumers: by freely revealing personal information, they leave themselves more vulnerable to data collection, identity theft, fraud, and stalking. [12]

Additionally, consumers do not know how their information will be used in the future. [13] It is almost impossible to delete information once it has been posted on an SNS, and consumers cannot know how that information will later be handled. [13]

Internet tracking

There are many methods of internet tracking, including Flash cookies, ETags, HTML5 local storage, evercookies, and browser fingerprinting. [15] In a study, Hoofnagle and colleagues found a dramatic increase in the use of standard cookies between 2009 and 2011. [15] Most cookies were placed by third-party hosts, chiefly advertisers. [15]
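
The first-party/third-party distinction behind that finding can be illustrated with a small sketch (my own, not the survey's crawler, and with hypothetical hostnames): given the page being visited and the hosts that set cookies during the page load, each cookie is classified by comparing registrable domains.

```python
# Illustrative sketch (not the study's measurement code): classify cookies observed
# during a page load as first- or third-party by comparing the host that set each
# cookie with the host of the page being visited. Hostnames are hypothetical.
from urllib.parse import urlparse

def base_domain(host: str) -> str:
    # Naive registrable-domain heuristic; a real crawl would use the Public Suffix List.
    parts = host.lower().rstrip(".").split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def classify_cookies(page_url: str, setting_hosts: list) -> dict:
    page_domain = base_domain(urlparse(page_url).hostname or "")
    counts = {"first-party": 0, "third-party": 0}
    for host in setting_hosts:
        kind = "first-party" if base_domain(host) == page_domain else "third-party"
        counts[kind] += 1
    return counts

# Example: a news page whose embedded ad and analytics scripts set most of the cookies.
print(classify_cookies(
    "https://news.example.com/article",
    ["news.example.com", "ads.tracker.example", "analytics.tracker.example"],
))
```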

Hoofnagle argues that modern privacy regulation would give consumers more choices in the marketplace. [15] He denies that government intervention of this kind is paternalistic in nature.

Commercial data brokers

Commercial data brokers (CDBs) are businesses that collect personal information on individuals and sell it. [16] Hoofnagle argues that CDBs like ChoicePoint perform law enforcement functions by allowing police to download collections of information about individuals, and that CDBs should therefore be regulated under the Privacy Act of 1974. [16] He argues that government access to CDBs gives law enforcement information it could not legally collect on its own, presenting a significant legal issue. [16]

Hoofnagle presents three policy solutions for protecting personal data from law enforcement: the legal treatment of commercial and government collection of information should not be distinct, public records rules should be made compatible with modern technology, and the Privacy Act of 1974 should apply to CDBs. [16]

Physical vs digital goods

In "What We Buy When We Buy Now," authors Aaron Perzanowski and Chris Hoofnagle explore a common misconception regarding consumer rights when buying digital goods; specifically, the misconception that the same regulations govern physical and digital media. The authors called their study The Mediashop Study. After conducting a web-based survey, they discovered that most consumers believe that digital goods and physical goods have the same rights to use and transfer. For example, just like an individual can easily transfer a physical book to someone, most consumers believe they have this same ability with digital books. This is not the case under current digital ownership rights. The study also revealed that consumers would be willing to pay more for the right to transfer digital goods and that adding a short notice that explains consumers’ digital rights would be effective in reducing consumer misperceptions. [17]

Privacy policies

The Federal Trade Commission (FTC) is the primary consumer protection agency in the United States. [18] Hoofnagle argues that there are limits to the FTC's privacy policy approach: despite the FTC's commitment to self-regulation of privacy, consumers remain very concerned about the collection of their personal information. In "The Federal Trade Commission and Consumer Privacy in the Coming Decade," Hoofnagle and his co-authors report that most Americans believe a company's privacy policy means their information will be kept private, when in reality privacy policies merely describe how a website will use a consumer's personal information. Based on their research, the authors conclude that privacy notices alone are insufficient to protect consumer privacy. To advance privacy, they suggest that the FTC take three steps: police the term "privacy policy," consult with usability experts to create privacy-protecting mechanisms, and set benchmarks for self-regulation. [19]

Privacy law

Europe

The European Union's General Data Protection Regulation (GDPR) is the E.U.'s law on data protection. Hoofnagle argues that the GDPR is "the most consequential regulatory development in information policy in a generation." [20] The GDPR applies whenever "personal data" is "processed," so virtually all actions involving personal data fall under it. The regulation also places a significant burden on data controllers (e.g., companies) to ensure the privacy of consumer information: among other things, they must keep records of their data processing, adopt a data protection policy, and be transparent about how they use data. Exemptions include data processing for purely personal use and for national security. Breaches of the GDPR can lead to sanctions and fines, and Data Protection Authorities are its main enforcers.
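
As a rough illustration of what such record-keeping can look like in practice, the sketch below defines a minimal, machine-readable record of a processing activity. The field names loosely echo the kinds of items GDPR record-keeping covers, but the structure and example values are hypothetical, not taken from the regulation or from Hoofnagle's work.

```python
# Minimal sketch of a controller's record of a processing activity. Field names
# are an illustrative approximation of GDPR-style record-keeping, not a compliance tool.
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    controller: str                     # who decides the purposes and means of processing
    purpose: str                        # why the data is processed
    data_categories: list               # e.g. contact details, payment data
    data_subjects: list                 # e.g. customers, employees
    recipients: list                    # who the data is disclosed to
    retention: str                      # how long the data is kept
    security_measures: list = field(default_factory=list)

registry = [
    ProcessingRecord(
        controller="Example Shop Ltd.",           # hypothetical company
        purpose="order fulfilment",
        data_categories=["name", "address", "payment data"],
        data_subjects=["customers"],
        recipients=["payment processor", "courier"],
        retention="6 years (tax law)",
        security_measures=["encryption at rest", "access logging"],
    )
]

for record in registry:
    print(record)
```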

The United States

The U.S. Privacy Act of 1974 and the Fair Credit Reporting Act of 1970 (FCRA) form the framework of U.S. privacy law. [21] Hoofnagle argues that these statutes do not adequately protect privacy because many companies have found loopholes in them. The problem with the Privacy Act, he argues, is that it applies only to the federal government and to private companies that work for the government; it does not apply to other private companies or to data brokers. [21] Hoofnagle likewise criticizes the FCRA for applying only to "consumer reporting agencies" that use "consumer reports." [21] Consumer reports cover only communications about a consumer relating to credit evaluation, employment screening, insurance underwriting, or licensing; all other uses fall outside the FCRA. [21]

Hoofnagle and Daniel Solove propose a series of regulations, which they call a "Model Regime," to address the problems they identify in U.S. privacy law. These solutions include: [22]

  1. Universal notice of when companies collect individuals’ private information
  2. Meaningful consent of consumers when data is collected
  3. Meaningful exercise of consumers’ rights
  4. Effective individual management of consumer reporting
  5. Accessing personal information that companies store
  6. Greater security of information
  7. Disclosing security breaches
  8. Limiting use of social security numbers
  9. Regulating access to public records
  10. Limiting use of background checks
  11. Regulating private investigators
  12. Limiting government access to business and financial records
  13. Regulating government data mining
  14. Updating the Privacy Act
  15. Effectively enforcing privacy rights

Europe and U.S. compared

One key divergence between the United States and Europe with regard to privacy is how privacy is framed legally. In the U.S., such issues are broadly categorized as "privacy" or "information privacy" issues. European law, by contrast, distinguishes between information privacy and data protection: while data protection ensures the due process of data, privacy refers to the right to a private life (e.g., private family life and a private home). [23]

Additionally, while the GDPR places the burden of protecting consumer information on data controllers, U.S. privacy law places this burden on data subjects. [23] In the U.S., consumers are responsible for reading privacy notices and deciding for themselves whether their information will be adequately protected. [23]

New transaction systems

In 2013, Hoofnagle, Jennifer Urban, and Su Li surveyed American opinion on privacy in new transaction systems (e.g., mobile payment systems). An advantage of mobile payment systems is that they serve as a digital wallet, giving consumers the convenience of making transactions online; they also have the potential for better payment security. The privacy concern Hoofnagle and his co-authors identify is that this technology lets merchants collect personally identifiable contact information about consumers, something a typical credit card transaction does not provide. Their research suggests that Americans oppose systems that track them as they browse stores and that share their information (e.g., their phone number) after purchases. [24]

Cybercrime

In "Deterring Cybercrime: Focus on Intermediaries,” authors Aniket Kesari, Chris Hoofnagle, and Damon McCoy prove how intermediaries can limit cybercrime. According to the authors, cybercriminals rely on many intermediaries to commit illegal acts. These include methods of acquiring new customers, web hosting, collecting payments, and the delivery of products. While most of the legal scholarship on cybercrime grants intermediaries’ general immunity from the illegal acts of users, the authors argue that intermediaries should be required to take action against criminal activities of users. The authors list examples of current methods to force intermediaries to take action. An example of a government-led intervention includes domain name seizures. This is authorized by the PRO-IP Act, giving the federal government the authority to seize a website accused of illegal activity. An example of private companies limiting the harm of cybercriminals includes the eBay Verified Rights Online (VeRO) Program. This program prevents sellers from illegally marketing and selling items. [25]

The tethered economy

Tethering typically refers to sharing a mobile device's internet connection with other devices. In "The Tethered Economy," however, Chris Hoofnagle, Aniket Kesari, and Aaron Perzanowski use the term for the connection of goods to, and their dependence on, sellers for their operation. Examples of tethered devices include Google Home, Amazon Alexa, smart kitchen appliances, and other Internet of things devices, all of which depend on the seller for their continued functionality. The benefits of tethering are that tethered products can increase trade and generativity, may be safer to use, and can gain new and personalized functions over time. Among the harms, manufacturers end up deciding the durability of products through bricking, feature reduction, and altering the terms of the bargain. Tethering also presents information risks, since tethered devices constantly collect information about consumer behavior. Lastly, tethering reduces choice and competition in the market by raising switching costs that can lock consumers into particular devices or platforms; for example, it may be hard to switch to Microsoft devices once a consumer already owns many Apple devices. [26]

The authors present legal interventions that can change the relationship between sellers and buyers and address the tethering of the economy. Contracts, tort law, and antitrust and consumer protection laws are all suggested reforms to address consumer problems that arise from tethering; however, the authors argue that no single approach will solve all of the problems discussed in the article. [26]

See also

Consumer privacy is information privacy as it relates to the consumers of products and services.

Data security means protecting digital data, such as those in a database, from destructive forces and from the unwanted actions of unauthorized users, such as a cyberattack or a data breach.

Internet privacy involves the right or mandate of personal privacy concerning the storage, re-purposing, provision to third parties, and display of information pertaining to oneself via the Internet. Internet privacy is a subset of data privacy. Privacy concerns have been articulated from the beginnings of large-scale computer sharing and especially relate to mass surveillance.

Personal data, also known as personal information or personally identifiable information (PII), is any information related to an identifiable person.

Information privacy, data privacy or data protection laws provide a legal framework on how to obtain, use and store the data of natural persons. The various laws around the world describe the rights of natural persons to control who is using their data. This usually includes the right to know what data is stored and for what purpose, and to request its deletion when that purpose no longer applies.

<span class="mw-page-title-main">HTTP cookie</span> Small pieces of data stored by a web browser while on a website

HTTP cookies are small blocks of data created by a web server while a user is browsing a website and placed on the user's computer or other device by the user's web browser. Cookies are placed on the device used to access a website, and more than one cookie may be placed on a user's device during a session.

ePrivacy Directive

Privacy and Electronic Communications Directive 2002/58/EC on Privacy and Electronic Communications, otherwise known as the ePrivacy Directive (ePD), is an EU directive on data protection and privacy in the digital age. It presents a continuation of earlier efforts, most directly the Data Protection Directive. It deals with the regulation of a number of important issues such as confidentiality of information, treatment of traffic data, spam and cookies. This Directive has been amended by Directive 2009/136, which introduces several changes, especially in what concerns cookies, which are now subject to prior consent.

Security breach notification laws or data breach notification laws are laws that require individuals or entities affected by a data breach, unauthorized access to data, to notify their customers and other parties about the breach, as well as take specific steps to remedy the situation based on state legislature. Data breach notification laws have two main goals. The first goal is to allow individuals a chance to mitigate risks against data breaches. The second goal is to promote company incentive to strengthen data security. Together, these goals work to minimize consumer harm from data breaches, including impersonation, fraud, and identity theft.

Web tracking is the practice by which operators of websites and third parties collect, store and share information about visitors’ activities on the World Wide Web. Analysis of a user's behaviour may be used to provide content that enables the operator to infer their preferences and may be of interest to various parties, such as advertisers. Web tracking can be part of visitor management.

<span class="mw-page-title-main">Evercookie</span> JavaScript application programming interface

Evercookie is a JavaScript application programming interface (API) that identifies and reproduces intentionally deleted cookies on the clients' browser storage. It was created by Samy Kamkar in 2010 to demonstrate the possible infiltration from the websites that use respawning. Websites that have adopted this mechanism can identify users even if they attempt to delete the previously stored cookies.

A zombie cookie is a piece of data usually used for tracking users, which is created by a web server while a user is browsing a website, and placed on the user's computer or other device by the user's web browser, similar to regular HTTP cookies, but with mechanisms in place to prevent the deletion of the data by the user. Zombie cookies could be stored in multiple locations—since failure to remove all copies of the zombie cookie will make the removal reversible, zombie cookies can be difficult to remove. Since they do not entirely rely on normal cookie protocols, the visitor's web browser may continue to recreate deleted cookies even though the user has opted not to receive cookies.

Do Not Track (DNT) is a formerly official HTTP header field, designed to allow internet users to opt-out of tracking by websites—which includes the collection of data regarding a user's activity across multiple distinct contexts, and the retention, use, or sharing of data derived from that activity outside the context in which it occurred.

Privacy by design is an approach to systems engineering initially developed by Ann Cavoukian and formalized in a joint report on privacy-enhancing technologies by a joint team of the Information and Privacy Commissioner of Ontario (Canada), the Dutch Data Protection Authority, and the Netherlands Organisation for Applied Scientific Research in 1995. The privacy by design framework was published in 2009 and adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010. Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The concept is an example of value sensitive design, i.e., taking human values into account in a well-defined manner throughout the process.

<span class="mw-page-title-main">General Data Protection Regulation</span> EU regulation on the processing of personal data

The General Data Protection Regulation is a European Union regulation on information privacy in the European Union (EU) and the European Economic Area (EEA). The GDPR is an important component of EU privacy law and human rights law, in particular Article 8(1) of the Charter of Fundamental Rights of the European Union. It also governs the transfer of personal data outside the EU and EEA. The GDPR's goals are to enhance individuals' control and rights over their personal information and to simplify the regulations for international business. It supersedes the Data Protection Directive 95/46/EC and, among other things, simplifies the terminology.

<span class="mw-page-title-main">Ashkan Soltani</span> American computer scientist

Ashkan Soltani is the executive director of the California Privacy Protection Agency. He has previously been the Chief Technologist of the Federal Trade Commission and an independent privacy and security researcher based in Washington, DC.

Data re-identification or de-anonymization is the practice of matching anonymous data with publicly available information, or auxiliary data, in order to discover the person the data belong to. This is a concern because companies with privacy policies, health care providers, and financial institutions may release the data they collect after the data has gone through the de-identification process.

<span class="mw-page-title-main">NOYB</span> European data protection advocacy group

NOYB – European Center for Digital Rights is a non-profit organization based in Vienna, Austria established in 2017 with a pan-European focus. Co-founded by Austrian lawyer and privacy activist Max Schrems, NOYB aims to launch strategic court cases and media initiatives in support of the General Data Protection Regulation (GDPR), the proposed ePrivacy Regulation, and information privacy in general. The organisation was established after a funding period during which it has raised annual donations of €250,000 by supporting members. Currently, NOYB is financed by more than 4,400 supporting members.

The gathering of personally identifiable information (PII) is the practice of collecting public and private personal data that can be used to identify an individual for both legal and illegal applications. PII owners often view PII gathering as a threat and violation of their privacy. Meanwhile, entities such as information technology companies, governments, and organizations use PII for data analysis of consumer shopping behaviors, political preference, and personal interests.

A blockchain is a shared database that records transactions between two parties in an immutable ledger. Blockchain documents and confirms pseudonymous ownership of all transactions in a verifiable and sustainable way. After a transaction is validated and cryptographically verified by other participants or nodes in the network, it is made into a "block" on the blockchain. A block contains information about the time the transaction occurred, previous transactions, and details about the transaction. Once recorded as a block, transactions are ordered chronologically and cannot be altered. This technology rose to popularity after the creation of Bitcoin, the first application of blockchain technology, which has since catalyzed other cryptocurrencies and applications.

The California Consumer Privacy Act (CCPA) is a state statute intended to enhance privacy rights and consumer protection for residents of the state of California in the United States. The bill was passed by the California State Legislature and signed into law by the Governor of California, Jerry Brown, on June 28, 2018, to amend Part 4 of Division 3 of the California Civil Code. Officially called AB-375, the act was introduced by Ed Chau, member of the California State Assembly, and State Senator Robert Hertzberg.

References

  1. "Technology | Academics | Policy – Chris Hoofnagle". www.techpolicy.com. Retrieved April 1, 2021.
  2. Turow, Joseph; King, Jennifer; Hoofnagle, Chris Jay; Bleakley, Amy; Hennessy, Michael (September 29, 2009). "Americans Reject Tailored Advertising and Three Activities that Enable It". Rochester, NY. SSRN   1478214.{{cite journal}}: Cite journal requires |journal= (help)
  3. Clifford, Stephanie (September 29, 2009). "Two-Thirds of Americans Object to Online Tracking". The New York Times. ISSN   0362-4331 . Retrieved April 1, 2021.
  4. Hoofnagle, Chris Jay; King, Jennifer; Li, Su; Turow, Joseph (April 14, 2010). "How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?". Rochester, NY. SSRN   1589864.{{cite journal}}: Cite journal requires |journal= (help)
  5. "Biography Chris J. Hoofnagle". Gunderson Dettmer. Retrieved April 23, 2021.
  6. Soltani, Ashkan; Canty, Shannon; Mayo, Quentin; Thomas, Lauren; Hoofnagle, Chris Jay (August 10, 2009). "Flash Cookies and Privacy". SSRN   1446862.
  7. Ayenson, Mika D.; Wambach, Dietrich James; Soltani, Ashkan; Good, Nathan; Hoofnagle, Chris Jay (July 29, 2011). "Flash Cookies and Privacy II: Now with HTML5 and ETag Respawning". SSRN   1898390.
  8. Hoofnagle, Chris Jay; Soltani, Ashkan; Good, Nathan; Wambach, Dietrich James; Ayenson, Mika D. (August 28, 2012). "Behavioral Advertising: The Offer You Cannot Refuse". SSRN   2137601.
  9. Chris Hoofnagle's Behavioral Advertising Paper Receives the CPDP 2014 Multidisciplinary Privacy Research Award, TAP Blog, January 23, 2014
  10. 1 2 3 4 5 6 Hoofnagle, Chris Jay (2007–2008). "Identity Theft: Making the Known Unknowns Known". Harvard Journal of Law & Technology. 21: 97.
  11. 1 2 3 Hoofnagle, Chris Jay (2008–2009). "Towards a Market for Bank Safety". Loyola Consumer Law Review. 21: 155.
  12. 1 2 3 Hoofnagle, Chris Jay; Whittington, Jan (2013–2014). "Free: Accounting for the Costs of the Internet's Most Popular Price". UCLA Law Review. 61: 606.
  13. 1 2 3 Whittington, Jan; Hoofnagle, Chris Jay (2011–2012). "Unpacking Privacy's Price". North Carolina Law Review. 90: 1327.
  14. 1 2 Hoofnagle, Chris Jay; Urban, Jennifer M. (2014). "Alan Westin's Privacy Homo Economicus". Wake Forest Law Review. 49: 261.
  15. 1 2 3 4 Hoofnagle, Chris Jay; Soltani, Ashkan; Good, Nathaniel; Wambach, Dietrich J. (2012). "Behavioral Advertising: The Offer You Can't Refuse". Harvard Law & Policy Review. 6: 273.
  16. 1 2 3 4 Hoofnagle, Chris Jay (2003–2004). "Big Brother's Little Helpers: How ChoicePoint and Other Commercial Data Brokers Collect and Package Your Data for Law Enforcement". North Carolina Journal of International Law and Commercial Regulation. 29: 595.
  17. Perzanowski, Aaron; Hoofnagle, Chris Jay (2017). "What We Buy when We Buy Now". University of Pennsylvania Law Review. 165 (2): 315–378. ISSN   0041-9907. JSTOR   26600431.
  18. Hoofnagle, Chris Jay (September 1, 2017). "FTC Regulation of Cybersecurity and Surveillance". Rochester, NY. SSRN   3010205.{{cite journal}}: Cite journal requires |journal= (help)
  19. Turow, Joseph; Hoofnagle, Chris Jay; Mulligan, Deirdre K.; Good, Nathaniel (2007–2008). "The Federal Trade Commission and Consumer Privacy in the Coming Decade". I/S: A Journal of Law and Policy for the Information Society. 3: 723.
  20. Hoofnagle, Chris Jay; Sloot, Bart van der; Borgesius, Frederik Zuiderveen (January 2, 2019). "The European Union general data protection regulation: what it is and what it means". Information & Communications Technology Law. 28 (1): 65–98. doi: 10.1080/13600834.2019.1573501 . hdl: 2066/204503 . ISSN   1360-0834.
  21. 1 2 3 4 Solove, Daniel J.; Hoofnagle, Chris Jay (2006). "A Model Regime of Privacy Protection". University of Illinois Law Review. 2006: 357.
  22. Solove, Daniel J.; Hoofnagle, Chris Jay (2006). "A Model Regime of Privacy Protection". University of Illinois Law Review. 2006: 357.
  23. 1 2 3 Hoofnagle, Chris Jay; Sloot, Bart van der; Borgesius, Frederik Zuiderveen (January 2, 2019). "The European Union general data protection regulation: what it is and what it means". Information & Communications Technology Law. 28 (1): 65–98. doi: 10.1080/13600834.2019.1573501 . hdl: 2066/204503 . ISSN   1360-0834.
  24. Hoofnagle, Chris Jay; Urban, Jennifer M.; Li, Su (April 24, 2012). "Mobile Payments: Consumer Benefits & New Privacy Concerns". Rochester, NY. SSRN   2045580.{{cite journal}}: Cite journal requires |journal= (help)
  25. Kesari, Aniket; Hoofnagle, Chris; McCoy, Damon (2017). "Deterring Cybercrime: Focus on Intermediaries". Berkeley Technology Law Journal. 32: 1093.
  26. 1 2 Hoofnagle, Chris Jay; Kesari, Aniket; Perzanowski, Aaron (2019). "The Tethered Economy". George Washington Law Review. 87: 783.