Privacy-enhancing technologies

Privacy-enhancing technologies (PETs) are technologies that embody fundamental data protection principles by minimizing personal data use, maximizing data security, and empowering individuals. PETs allow online users to protect the privacy of their personally identifiable information (PII), which is often provided to and handled by services or applications. They employ techniques that minimize an information system's possession of personal data without losing functionality. [1] Generally speaking, PETs can be categorized as either hard or soft privacy technologies. [2]

Goals of PETs

The objective of PETs is to protect personal data and assure technology users of two key points: that their information is kept confidential, and that data protection is a priority for the organizations responsible for any PII. PETs allow users to take one or more of the following actions with respect to personal data sent to and used by online service providers, merchants, or other users (a control known as self-determination). PETs aim to minimize the personal data collected and used by service providers and merchants, to provide anonymity through pseudonyms or anonymous data credentials, and to achieve informed consent about giving personal data to online service providers and merchants. [3] In privacy negotiations, consumers and service providers establish, maintain, and refine privacy policies as individualized agreements through an ongoing choice among service alternatives, thereby making it possible to negotiate the terms and conditions under which personal data is given to online service providers and merchants (data handling/privacy policy negotiation). Within private negotiations, the transaction partners may additionally bundle the personal information collection and processing schemes with monetary or non-monetary rewards. [4]

PETs provide the possibility to remotely audit the enforcement of these terms and conditions at the online service providers and merchants (assurance), allow users to log, archive, and look up past transfers of their personal data (what data was transferred, when, to whom, and under what conditions), and facilitate the use of their legal rights of data inspection, correction, and deletion. PETs also give consumers and others who want privacy protection the opportunity to hide their personal identities by masking personal information and replacing it with pseudo-data or an anonymous identity.

Families of PETs

Privacy-enhancing Technologies can be distinguished based on their assumptions. [2]

Soft privacy technologies

Soft privacy technologies are used where it can be assumed that a third party can be trusted to process the data. This model is based on compliance, consent, control, and auditing. [2]

Example technologies are access control, differential privacy, and tunnel encryption (SSL/TLS).

An example of soft privacy technology is increased transparency and access. Transparency involves providing people with sufficient detail about the rationale used in automated decision-making processes. Efforts to grant users access to their data are also considered soft privacy technology; individuals are usually unaware of their right of access, or they face difficulties exercising it, such as the lack of a clear automated process. [5]
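Differential privacy, one of the techniques listed above, can be illustrated concretely: a service answers a statistical query only after adding random noise calibrated so that any single individual's record has a bounded effect on the output. The sketch below shows the Laplace mechanism for a counting query; it is a minimal illustration, not a production implementation, and the function names and dataset are invented for the example.

```python
import math
import random

def laplace_noise(scale):
    # Draw a sample from the Laplace(0, scale) distribution
    # via inverse transform sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one
    # record changes the true count by at most 1, so Laplace noise
    # with scale 1/epsilon gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative dataset: the analyst learns an approximate count,
# never whether any particular individual satisfies the predicate.
ages = [34, 29, 51, 42, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the design trade-off is entirely between accuracy of the answer and protection of any one record.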

Hard privacy technologies

With hard privacy technologies, no single entity can violate the privacy of the user. The assumption here is that third parties cannot be trusted. Data protection goals include data minimization and the reduction of trust in third parties. [2]

Examples of such technologies include onion routing, the secret ballot used for democratic elections, and VPNs. [6]
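The routing structure behind onion routing can be sketched as follows: the sender wraps a message in one layer per relay, and each relay peels exactly one layer, learning only the next hop. The toy sketch below shows only the nesting structure; real onion routing additionally encrypts each layer with the corresponding relay's public key, and all names here are illustrative.

```python
def wrap(message, route):
    # Build the onion from the inside out: the message is the
    # innermost layer, and the first relay's layer ends up outermost.
    onion = {"deliver": message}
    for hop in reversed(route[1:]):
        onion = {"next": hop, "payload": onion}
    return route[0], onion

def peel(onion):
    # A relay inspects only its own layer: it learns the next hop
    # and forwards an opaque payload.
    if "deliver" in onion:
        return None, onion["deliver"]  # exit relay delivers the message
    return onion["next"], onion["payload"]

first_hop, onion = wrap("hello", ["relayA", "relayB", "relayC"])
hop, onion = peel(onion)  # relayA learns only that relayB comes next
```

Because no relay sees both the sender and the final destination, no single entity can link the two, which is the defining property of a hard privacy technology.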

Existing PETs

PETs have evolved since their first appearance in the 1980s. At intervals, review articles have been published on the state of privacy technology:

Example PETs

Examples of existing privacy enhancing technologies are:

Future PETs

Examples of privacy enhancing technologies that are being researched or developed include limited disclosure technology, anonymous credentials, negotiation and enforcement of data handling conditions, and data transaction logs. [20]

Limited disclosure technology protects individuals' privacy by allowing them to share only as much personal information with service providers as is needed to complete an interaction or transaction. It is also designed to limit the tracking and correlation of users' interactions with these third parties. Limited disclosure uses cryptographic techniques that allow users to retrieve data vetted by a provider, transmit that data to a relying party, and have the relying party trust its authenticity and integrity. [21]

Anonymous credentials are asserted properties or rights of the credential holder that do not reveal the holder's true identity; the only information revealed is what the holder is willing to disclose. The assertion can be issued by the users themselves, by the provider of the online service, or by a third party (another service provider, a government agency, etc.). For example:

Online car rental. The car rental agency does not need to know the customer's true identity. It only needs to verify that the customer is, for example, over 23, holds a driver's license and health insurance (i.e., for accidents, etc.), and is paying. There is thus no real need to know the customer's name, address, or any other personal information. Anonymous credentials allow both parties to be comfortable: the customer reveals only as much data as the car rental agency needs to provide its service (data minimization), while the agency can verify its requirements and receive its payment. When ordering a car online, the user, instead of providing the classical name, address, and credit card number, provides credentials issued to pseudonyms (i.e., not to the real name of the customer).
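The data-minimization idea in this example can be sketched in code. The toy model below stands in for a real anonymous credential system: an issuer attests derived claims (such as "age>=23") bound to a pseudonym, and the verifier checks the attestation without ever seeing the underlying attributes. Real systems use digital signatures and zero-knowledge proofs rather than a shared MAC key; all names and keys here are invented for illustration.

```python
import hashlib
import hmac

# Stands in for the issuer's signing key. In a real deployment the
# verifier would check a public-key signature or a zero-knowledge
# proof instead of sharing this secret.
ISSUER_KEY = b"demo-issuer-key"

def issue_claim(pseudonym, claim):
    # The issuer attests a derived claim for a pseudonym, e.g. that
    # the holder is over 23, without embedding the birth date itself.
    msg = f"{pseudonym}:{claim}".encode()
    return hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()

def verify_claim(pseudonym, claim, tag):
    msg = f"{pseudonym}:{claim}".encode()
    expected = hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# The holder obtains claims bound to a pseudonym, not a legal identity.
pseudonym = "renter-7f3a"
tag = issue_claim(pseudonym, "age>=23")

# The rental agency checks exactly the claim it needs and nothing more.
assert verify_claim(pseudonym, "age>=23", tag)
```

The agency learns only the boolean facts it asked for, never the customer's name, address, or birth date, which is precisely the data minimization the paragraph above describes.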

Negotiation and enforcement of data handling conditions. Before ordering a product or service online, the user and the online service provider or merchant negotiate the type of personal data that is to be transferred to the service provider. This includes the conditions that shall apply to the handling of the personal data, such as whether or not it may be sent to third parties (profile selling) and under what conditions (e.g., only while informing the user), or at what time in the future it shall be deleted (if at all). After the transfer of personal data has taken place, the agreed-upon data handling conditions are technically enforced by the infrastructure of the service provider, which is capable of managing and processing data handling obligations. Moreover, this enforcement can be remotely audited by the user, for example by verifying chains of certification based on trusted computing modules or by verifying privacy seals/labels issued by third-party auditing organizations (e.g., data protection agencies). Thus, instead of having to rely on the mere promises of service providers not to abuse personal data, users can be more confident that the service provider adheres to the negotiated data handling conditions. [22]
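One common way to realize such enforcement is to attach the negotiated conditions to the data itself as a machine-readable "sticky" policy that the provider's infrastructure consults before every use. The sketch below is a minimal illustration under that assumption; the policy fields and names are invented for the example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class HandlingPolicy:
    # Machine-readable terms agreed during negotiation (illustrative fields).
    share_with_third_parties: bool
    notify_on_sharing: bool
    delete_after: date

@dataclass
class StickyData:
    # A personal data item travels together with its negotiated policy.
    value: str
    policy: HandlingPolicy

def may_share(item: StickyData, today: date) -> bool:
    # Enforcement point: the infrastructure honors the negotiated
    # terms before any transfer to a third party.
    if today >= item.policy.delete_after:
        return False  # past the agreed deletion date; must not be used
    return item.policy.share_with_third_parties

item = StickyData(
    value="jane@example.org",
    policy=HandlingPolicy(
        share_with_third_parties=False,
        notify_on_sharing=True,
        delete_after=date(2030, 1, 1),
    ),
)
assert may_share(item, date(2024, 6, 1)) is False
```

An external auditor can then check that every code path touching personal data passes through such an enforcement point, which is what makes the remote auditing described above meaningful.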

Lastly, the data transaction log allows users to record the personal data they send to service providers, the time at which they send it, and under what conditions. These logs are stored and allow users to determine what data they have sent to whom, or to establish what data is in the possession of a specific service provider. This leads to more transparency, which is a prerequisite of being in control.
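A data transaction log of this kind reduces to an append-only record that can be queried by recipient. A minimal sketch, with all names and fields invented for illustration:

```python
import time

class TransferLog:
    # Append-only record of personal-data transfers, kept on the
    # user's side and queryable by recipient.
    def __init__(self):
        self.entries = []

    def record(self, recipient, fields, conditions):
        self.entries.append({
            "timestamp": time.time(),
            "recipient": recipient,
            "fields": fields,          # which data items were sent
            "conditions": conditions,  # terms under which they were sent
        })

    def sent_to(self, recipient):
        # What has this particular service provider received?
        return [e for e in self.entries if e["recipient"] == recipient]

log = TransferLog()
log.record("shop.example", ["name", "address"], {"delete_after": "2030-01-01"})
log.record("bank.example", ["iban"], {"share_with_third_parties": False})
```

Such a log supports the transparency goal directly: the user can reconstruct, per provider, exactly which data was disclosed and under which negotiated conditions.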

See also

References

Notes

  1. ( van Blarkom, Borking & Olk 2003 )
  2. Deng, Mina; Wuyts, Kim; Scandariato, Riccardo; Preneel, Bart; Joosen, Wouter (2011-03-01). "A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements" (PDF). Requirements Engineering. 16 (1): 3–32. doi:10.1007/s00766-010-0115-7. ISSN 1432-010X. S2CID 856424. Archived (PDF) from the original on 2017-09-22. Retrieved 2019-12-06.
  3. The EU PRIME research project's Vision on privacy enhanced identity management Archived 2007-10-11 at the Wayback Machine
  4. "Key Facts on Privacy Negotiations". Archived from the original on 2020-04-13. Retrieved 2009-08-08.
  5. D'Acquisto, Giuseppe; Domingo-Ferrer, Josep; Kikiras, Panayiotis; Torra, Vicenç; de Montjoye, Yves-Alexandre; Bourka, Athena (2015). Privacy by design in big data: An overview of privacy enhancing technologies in the era of big data analytics. Publications Office. arXiv: 1512.06000 . doi:10.2824/641480.
  6. "Emotional and Practical Considerations Towards the Adoption and Abandonment of VPNs as a Privacy-Enhancing Technology". Archived from the original on 2024-04-04. Retrieved 2020-10-25.
  7. Pfitzmann, Andreas and Hansen, Marit (2010) A terminology for talking about privacy by data minimization: Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management, v0.34, Report, University of Dresden, http://dud.inf.tu-dresden.de/Anon_Terminology.shtml Archived 2021-02-25 at the Wayback Machine , accessed 09-Dec-2019
  8. Ian Goldberg, David Wagner and Eric Brewer (1997) Privacy-enhancing technologies for the Internet, University of California, Berkeley, https://apps.dtic.mil/dtic/tr/fulltext/u2/a391508.pdf Archived 2021-03-23 at the Wayback Machine , accessed 2019-12-09
  9. Fritsch, Lothar (2007): State of the Art of Privacy-enhancing Technology (PET) - Deliverable D2.1 of the PETweb project; NR Report 1013, Norsk Regnesentral, ISBN   978-82-53-90523-5, 34 pages, https://www.nr.no/publarchive?query=4589 Archived 2020-11-30 at the Wayback Machine , accessed 2019-12-09
  10. Lothar Fritsch, Habtamu Abie: Towards a Research Road Map for the Management of Privacy Risks in Information Systems. Sicherheit 2008: 1-15, Lecture Notes in Informatics vol. 128, http://cs.emis.de/LNI/Proceedings/Proceedings128/P-128.pdf#page=18 Archived 2020-08-06 at the Wayback Machine , accessed 2019-12-09
  11. Heurix, Johannes; Zimmermann, Peter; Neubauer, Thomas; Fenz, Stefan (2015-09-01). "A taxonomy for privacy enhancing technologies". Computers & Security. 53: 1–17. doi:10.1016/j.cose.2015.05.002. ISSN   0167-4048.
  12. Janic, M.; Wijbenga, J. P.; Veugen, T. (June 2013). "Transparency Enhancing Tools (TETs): An Overview". 2013 Third Workshop on Socio-Technical Aspects in Security and Trust. pp. 18–25. doi:10.1109/STAST.2013.11. ISBN   978-0-7695-5065-7. S2CID   14559293.
  13. Murmann, P.; Fischer-Hübner, S. (2017). "Tools for Achieving Usable Ex Post Transparency: A Survey". IEEE Access. 5: 22965–22991. Bibcode:2017IEEEA...522965M. doi: 10.1109/ACCESS.2017.2765539 . ISSN   2169-3536. Archived from the original on 2019-04-30. Retrieved 2024-02-20.
  14. Obfuscation. MIT Press. 4 September 2015. ISBN   9780262029735. Archived from the original on 16 April 2018. Retrieved 2 April 2018.
  15. "TrackMeNot". Archived from the original on 2018-04-05. Retrieved 2018-04-02.
  16. Al-Rfou', Rami; Jannen, William; Patwardhan, Nikhil (2012). "TrackMeNot-so-good-after-all". arXiv: 1211.0320 [cs.IR].
  17. Loui, Ronald (2017). "Plausible Deniability for ISP Log and Browser Suggestion Obfuscation with a Phrase Extractor on Potentially Open Text". 2017 IEEE 15th Intl Conf on Dependable, Autonomic and Secure Computing, 15th Intl Conf on Pervasive Intelligence and Computing, 3rd Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress(DASC/PiCom/DataCom/CyberSciTech). pp. 276–279. doi:10.1109/DASC-PICom-DataCom-CyberSciTec.2017.58. ISBN   978-1-5386-1956-8. S2CID   4567986.
  18. "Enhanced Privacy Id" (PDF). December 2011. Archived (PDF) from the original on 1 February 2017. Retrieved 5 November 2016.
  19. Torre, Lydia F. de la (2019-06-03). "What are Privacy-Enhancing Technologies (PETs)?". Medium. Archived from the original on 2020-10-22. Retrieved 2020-10-20.
  20. The EU PRIME research project's White Paper Archived 2007-08-17 at the Wayback Machine (Version 2)
  21. "Definition of Limited Disclosure Technology - Gartner Information Technology Glossary". Archived from the original on 2015-04-02. Retrieved 2015-03-06.
  22. "Enhancing User Privacy Through Data Handling Policies" (PDF). 2006. Archived (PDF) from the original on 6 November 2016. Retrieved 5 November 2016.
