P3P

Platform for Privacy Preferences
Abbreviation: P3P
Status: Retired
First published: 16 April 2002 (2002-04-16) [1] [2]
Latest version: 1.1 [2]
Committee: P3P Specification Working Group [2]
Editors:
  • Rigo Wenning [2]
  • Matthias Schunter [2]
Website: www.w3.org/TR/P3P11/

The Platform for Privacy Preferences Project (P3P) is an obsolete protocol that allowed websites to declare their intended use of the information they collected about web browser users. Designed to give users more control over their personal information when browsing, P3P was developed by the World Wide Web Consortium (W3C) and officially recommended on April 16, 2002. Development ceased shortly thereafter and there have been very few implementations. Internet Explorer and Microsoft Edge were the only major browsers to support P3P, and Microsoft dropped that support in Windows 10, where neither browser implements the protocol. [3] The president of TRUSTe has stated that P3P has not been implemented widely because of its difficulty and lack of value. [4]

Purpose

As the World Wide Web became a genuine medium for selling products and services, electronic commerce websites tried to collect more information about the people who purchased their merchandise. Some companies used controversial practices such as tracking cookies to ascertain users' demographic information and buying habits, using this information to deliver specifically targeted advertisements. Users who saw this as an invasion of privacy would sometimes turn off HTTP cookies or use proxy servers to keep their personal information private. P3P was designed to give users more precise control over the kinds of information they release. According to the W3C, the main goal of P3P "is to increase user trust and confidence in the Web through technical empowerment". [5]

P3P is a machine-readable language that expresses a website's data-management practices. P3P manages information through privacy policies: a website using P3P publishes a set of policies declaring its intended uses of personal information that may be gathered from site visitors, while a user of P3P sets their own preferences stating what personal information they are willing to let the sites they visit see. When the user visits a site, P3P compares the personal information the user is willing to release with the information the server wants to collect; if the two do not match, P3P informs the user and asks whether they are willing to proceed to the site and risk giving up more personal information. [6] As an example, a user may store in the browser preferences that information about their browsing habits should not be collected. If the policy of a website states that a cookie is used for this purpose, the browser automatically rejects the cookie. The main content of a privacy policy is a set of statements describing what data the site collects, for what purposes, who receives it, and how long it is retained.
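
As a rough illustration, a minimal full-format P3P policy might look like the following sketch. The element names (POLICIES, POLICY, ENTITY, ACCESS, STATEMENT, PURPOSE, RECIPIENT, RETENTION, DATA) come from the P3P vocabulary, but the company name, the discuri address and the particular choices shown here are placeholders, not values taken from any real site:

<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY name="sample-policy"
          discuri="http://www.example.com/privacy.html">
    <ENTITY>
      <DATA-GROUP>
        <DATA ref="#business.name">Example Corp.</DATA>
      </DATA-GROUP>
    </ENTITY>
    <!-- visitors cannot access identified data, because none is kept -->
    <ACCESS><nonident/></ACCESS>
    <STATEMENT>
      <PURPOSE><admin/><develop/></PURPOSE>     <!-- used for site administration and development -->
      <RECIPIENT><ours/></RECIPIENT>            <!-- used only by this site -->
      <RETENTION><stated-purpose/></RETENTION>  <!-- kept only as long as needed for that purpose -->
      <DATA-GROUP>
        <DATA ref="#dynamic.clickstream"/>      <!-- the data collected: clickstream... -->
        <DATA ref="#dynamic.http"/>             <!-- ...and HTTP protocol information -->
      </DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>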

The privacy policy can be retrieved as an XML file or can be included, in compact form, in the HTTP header. The location of the XML policy file that applies to a given document can be (see the sketch after this list):

  1. specified in the HTTP header of the document
  2. specified in the HTML head of the document
  3. if neither of the above is specified, the well-known location /w3c/p3p.xml is used (compare the similar convention for /favicon.ico )
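
As a sketch of the first two mechanisms (the example.com host is a placeholder; the P3P header with its policyref attribute and the rel="P3Pv1" link relation follow the specification's conventions), a site might announce its policy reference file like this:

P3P: policyref="http://www.example.com/w3c/p3p.xml"
<link rel="P3Pv1" href="/w3c/p3p.xml">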

P3P allows a max-age to be specified for caching. A dummy /w3c/p3p.xml file could use this feature:

<META xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY-REFERENCES>
    <EXPIRY max-age="10000000"/> <!-- about four months -->
  </POLICY-REFERENCES>
</META>

User agent support

Yahoo!'s P3P policy as viewed in Internet Explorer 6.

Microsoft's Internet Explorer and Edge were the only mainstream web browsers to support P3P. [7] Other browsers have not implemented it because of the perceived lack of value it provides. Internet Explorer can display P3P privacy policies and compare a site's P3P policy with the browser's settings to decide whether or not to allow cookies from that site. However, the P3P functionality in Internet Explorer extends only to cookie blocking; it does not alert the user when an entire website violates the user's active privacy preferences. Microsoft considers the feature deprecated in its browsers and removed P3P support entirely in Windows 10. [7]
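
Internet Explorer's cookie filtering relies on the compact form of the policy, sent as a P3P response header alongside cookies. A sketch of such a header follows; the CP tokens are three-letter abbreviations of full policy elements, and the particular set shown here is illustrative only, not a recommended or meaningful policy:

P3P: CP="NOI DSP COR CURa ADMa DEVa OUR BUS NAV INT"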

Mozilla supported some P3P features for a few years, but all P3P-related source code was removed by 2007. [8]

The Privacy Finder [9] service was created by Carnegie Mellon's Usable Privacy and Security Laboratory. It is a publicly available "P3P-enabled search engine": a user enters a search term along with their stated privacy preferences and is presented with a list of search results ordered by whether the sites comply with those preferences. The service works by crawling the web and maintaining a P3P cache for every site that has ever appeared in a search query; the cache is updated every 24 hours so that cached policies stay reasonably current. The service also lets users quickly determine why a site does not comply with their preferences, and view a dynamically generated natural-language privacy policy based on the P3P data. This is advantageous over reading the original natural-language privacy policy on a website, because many privacy policies are written in legalese and are extremely convoluted; in addition, the user does not have to visit the website to read its privacy policy.

Benefits

P3P allows browsers to understand a website's privacy policy in a simplified and organized form, rather than requiring the user to search for it throughout the entire website. By setting privacy preferences at a chosen level, the user enables P3P to automatically block any cookies that the user might not want on their computer. Additionally, the W3C explains that P3P will allow browsers to transfer user data to services, ultimately promoting an online sharing community.

Additionally, the P3P Toolbox [10] developed by the Internet Education Foundation recommends that anyone concerned with increasing users' trust and privacy should consider implementing P3P. The P3P Toolbox site explains how companies have taken individuals' data in order to promote new products or services, and how in recent years companies have built profiles from individuals' information and marketed them without the individual's consent. When such data is misused, consumers pay the price and worry about issues such as junk mail, identity theft and forms of discrimination; the Toolbox therefore argues that implementing P3P's protocol is beneficial for those who browse the Internet.

Moreover, as the number of people browsing the web has grown, more users are at risk of running into privacy problems. The Internet Education Foundation points out that "P3P has been developed to help steer the force of technology a step further toward automatic communication of data management practices and individual privacy preferences." [10]

Criticisms

The Electronic Privacy Information Center (EPIC) has been critical of P3P and believes it makes protecting privacy too difficult for users. [11] In June 2000 it published an assessment of P3P titled "Pretty Poor Privacy". [11] According to EPIC, some P3P software is too complex and difficult for the average person to understand, and many Internet users are unfamiliar with how to use the default P3P software on their computers or how to install additional P3P software. Another concern is that websites are not obligated to use P3P, and neither are Internet users. Moreover, the EPIC website claims that P3P's protocol would become burdensome for the browser and not as beneficial or efficient as intended.

A key problem with P3P is the lack of enforcement, so promises made to users through P3P can go unfulfilled. Although by using P3P a company or website makes a promise about privacy and about the use of gathered data to the site's users, there are no real legal ramifications if the company decides to use the information for other purposes. The United States has passed no comprehensive data-protection law. Although, ideally, companies should be honest about their use of customers' personal information, there is no binding reason that a company must actually adhere to the rules it says it will comply with. Although using P3P technically qualifies as a contract, the lack of federal regulation reduces the pressure on companies to abide by it. [12]

The agreement to use P3P not only puts unenforceable promises in place, it also delays the adoption of federal laws that would actually limit access to and use of private information. If the government were to step in and protect Internet users with federal laws specifying what information can be accessed and how user information can be used, companies would lose the leeway they now have to use information as they please, regardless of what they tell users. In 2002, then-EPIC employee Chris Hoofnagle argued that P3P was displacing chances for government regulation of privacy. [13]

Critics of P3P also argue that sites that do not adopt it are excluded. According to a study by the CyLab Privacy Interest Group at Carnegie Mellon University, [14] only 15% of the top 5,000 websites incorporate P3P. Many sites that do not publish P3P policies but nevertheless maintain high privacy standards will therefore be inaccessible to users who rely on P3P as their only online privacy guide.

EPIC also argues that the development and implementation of P3P could reinforce a monopoly on personal information. Since it tends to be only major companies that implement P3P on their websites, only these companies end up gathering such information, because only their privacy policies can be compared against users' privacy preferences. The EPIC website says, "The incredible complexity of P3P, combined with the way that popular browsers are likely to implement the protocol would seem to preclude it as a privacy-protective technology," and continues, "Rather, P3P may actually strengthen the monopoly position over personal information that U.S. data marketers now enjoy." [11]

The failure of P3P to win immediate adoption can also be related to its being a notice-and-choice approach that does not comply with the Fair Information Practices. According to the Chairman of the FTC, [15] privacy laws are key in today's society to protect consumers from providing too much personal information for others' benefit. Some believe there should be a limit to the collection and use of consumers' personal data online. Sites are not required under any United States law to comply with the privacy policies they publish, so P3P is controversial among consumers who are concerned about the release of their personal information yet can rely only on P3P's protocol to protect their privacy.

Michael Kaply of IBM is reported to have said the following when the Mozilla Foundation was considering removing P3P support from its browser line in 2004: [16]

Ah the memories.

We (IBM) wrote the original P3P implementation and then Netscape proceeded to write their own. So both our companies wasted immense amounts of time that everyone thought was a crappy proposal to begin with.

Remove it.

Live Leer, a PR manager for Opera Software, explained in 2001 the deliberate lack of P3P support in their browser: [17]

At the moment, we aren't sure whether P3P is the best solution. P3P is among the specifications we are considering for support in the future. There have been some issues with how well P3P will protect privacy, and for that reason we have decided to wait until these are resolved.

Alternatives

P3P user agents are not the only option for Internet users who want to protect their privacy. The main alternatives to P3P include web browsers' privacy modes, anonymous e-mailers and anonymous proxy servers.

The main alternative to P3P may not be these technologies, however, but stronger laws regulating what information websites can collect and retain about Internet users. In Europe, for example, the General Data Protection Regulation sets out principles governing how personal information is collected, along with individuals' rights to the protection of their personal data. [18] The regulation allows individuals to control the type of information that is collected about them. Its principles include the rule that an individual has the right, under certain conditions, to retrieve the data collected about them at any time; moreover, personal information may not be kept longer than necessary, nor used for purposes other than those originally agreed upon.

Currently, the United States has no federal law protecting the privacy of personal information shared online. However, some sectoral laws at the federal and state level offer protection for certain types of information collected about individuals. [19] For example, the Fair Credit Reporting Act (FCRA) of 1970 permits consumer reporting agencies to disclose personal information only under three specified circumstances: credit, employment or insurance evaluation; a government grant or license; or a "legitimate business need" involving the consumer. A list of other sectoral privacy laws in the United States can be viewed on the Consumer Privacy Guide's website. [19]

The future of P3P

Several groups have worked to further the development of P3P and make it easier for people to use. They include:

The Transparent Accountable Datamining Initiative (TAMI) is a group at MIT's Computer Science and Artificial Intelligence Laboratory. Its goal is to create the technical, legal, and policy foundations for transparency and accountability in large-scale data aggregation, and to help people manage privacy risks in a world where technology is constantly changing.

The Policy Aware Web (PAW) project aims to provide a scalable mechanism for the exchange of rules and proofs for access control on the Web. "It creates a system of Policy Aware infrastructure using systematic Web rules language with a theorem prover". [20]

References

  1. "The Platform for Privacy Preferences 1.1 (P3P1.1) Specification Publication History - W3C". W3C. Retrieved 2021-04-04.
  2. 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 Cranor, Lorrie; Dobbs, Brooks; Egelman, Serge; Hogben, Giles; Humphrey, Jack; Langheinrich, Marc; Marchiori, Massimo; Reagle, Joseph; Schunter, Matthias; Stampley, David A.; Wenning, Rigo. Wenning, Rigo; Matthias, Schunter (eds.). "The Platform for Privacy Preferences 1.1 (P3P1.1) Specification" . Retrieved 2021-04-04.
  3. "P3P is no longer supported". Microsoft Docs. 15 December 2016. Retrieved 8 July 2020.
  4. Richmond, Riva (17 September 2010). "A Loophole Big Enough for a Cookie to Fit Through". The New York Times: Bits. Retrieved 8 July 2020.
  5. "Michael Young – Binäre Optionen – Tipps und Tricks – p3ptoolbox.org". www.p3ptoolbox.org (in German).
  6. "Section 2 What is P3P and How Does it Work?". Archived from the original on 2002-06-12.
  7. 1 2 "Internet Explorer's and Edge's P3P Support". Archived from the original on 2018-01-26. Retrieved 2018-01-25.
  8. Bug 225287 - Remove p3p from the default build
  9. www.privacyfinder.org
  10. 1 2 "Section 1 Why Implement P3P?". Archived from the original on 2002-09-07.
  11. 1 2 3 "Pretty Poor Privacy: An Assessment of P3P and Internet Privacy". Electronic Privacy Information Center. June 2000.
  12. "P3P: Pretty Poor Privacy? By Karen Coyle".
  13. Tech Republic: Despite big-name support, new privacy standard slow to catch on, June 10, 2002
  14. 2006 Privacy Policy Trends Report
  15. Fair Information Practices In The Electronic Marketplace, 2000
  16. "225287 - Remove p3p from the default build".
  17. "P3P: Protector of Consumers' Online Privacy". 17 August 2001.
  18. "Data protection".
  19. 1 2 "ConsumerPrivacyGuide.org | Law Protection". Archived from the original on 2002-02-06. Retrieved 2008-03-08.
  20. W3C P3P site