Hard privacy technologies

Hard privacy technologies are methods of protecting data. Hard privacy technologies and soft privacy technologies both fall under the category of privacy-enhancing technologies. Hard privacy technologies allow online users to protect their privacy through different services and applications without having to trust third parties. [1] The data protection goals are data minimization and reduction of trust in third parties, together with the freedom (and the techniques) to conceal information or to communicate.

Applications of hard privacy technologies include onion routing, VPNs and the secret ballot [2] used for democratic elections. [3]

Systems for anonymous communications

Mix networks

Mix networks use both cryptography and permutations to provide anonymity in communications. [4] The combination makes monitoring end-to-end communications more challenging for eavesdroppers, since it breaks the link between senders and recipients. [5]
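
The core batching-and-shuffling idea can be illustrated with a minimal Python sketch. The toy_encrypt/toy_decrypt helpers and the single-byte key below are hypothetical stand-ins for the real layered public-key encryption a mix uses; the point is only the decrypt-and-permute step that severs the link between the messages a mix node receives and the messages it forwards.

```python
import base64
import random


def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    # Hypothetical stand-in for real layered public-key encryption (XOR + base64).
    return base64.b64encode(bytes(b ^ key for b in plaintext))


def toy_decrypt(ciphertext: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in base64.b64decode(ciphertext))


MIX_KEY = 0x42  # stand-in for the mix node's private key

# Senders submit messages encrypted to the mix node.
batch = [toy_encrypt(m, MIX_KEY)
         for m in (b"to carol: hello", b"to dave: hi", b"to erin: hey")]

# The mix node strips one encryption layer and outputs the batch in a random
# permutation, so an eavesdropper cannot match inputs to outputs by order
# or by ciphertext appearance.
stripped = [toy_decrypt(c, MIX_KEY) for c in batch]
random.shuffle(stripped)
for message in stripped:
    print(message.decode())
```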

Dining Cryptographers Net (DC-net)

DC-net is a communication protocol that enables secure, anonymous communication. [6] Its round-based design allows a participant to publish a one-bit message per round unobservably, i.e. without revealing which participant sent it. [7]
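
The following minimal Python sketch of one round (the three-party setup and names are illustrative, not taken from any cited source) shows the underlying mechanism: pairwise shared coin flips cancel out when all announcements are XORed together, so the broadcast bit is revealed without revealing the sender.

```python
import secrets
from itertools import combinations

participants = ["alice", "bob", "carol"]
message_bit = 1           # bit the anonymous sender wants to broadcast this round
sender = "alice"          # hidden from observers in a real run

# Each pair of participants shares one secret coin flip.
shared_coins = {pair: secrets.randbits(1) for pair in combinations(participants, 2)}

announcements = {}
for p in participants:
    bit = 0
    # XOR together every coin this participant shares with someone else.
    for pair, coin in shared_coins.items():
        if p in pair:
            bit ^= coin
    # The sender additionally XORs in the message bit.
    if p == sender:
        bit ^= message_bit
    announcements[p] = bit

# Every shared coin appears in exactly two announcements and cancels out, so
# the XOR of all announcements equals the message bit -- yet nothing in the
# announcements identifies which participant was the sender.
result = 0
for bit in announcements.values():
    result ^= bit

print("announcements:", announcements)
print("recovered broadcast bit:", result)  # equals message_bit
```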

The Integrated Services Digital Network (ISDN)

ISDN is based on a digital telecommunications network, i.e. a network of digital 64 kbit/s channels. Since ISDN is primarily used for switching between networks, it offers an effective service for communication. [8]

Attacks against anonymous communications

Attacks on anonymity systems typically rely on traffic analysis, which traces information such as who is talking with whom and extracts profiles of users. Traffic analysis can be used against both vanilla and hardened systems.

Examples of hard privacy technologies

Onion routing

Onion routing is an encrypted, internet-based technique for preventing eavesdropping, traffic analysis attacks and similar threats. Messages in an onion network are wrapped in layers of encryption, with the next destination encrypted inside each layer. Each router decrypts one layer with its private key, 'peeling' the onion, and then transmits the message to the next router. [9]
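
A minimal Python sketch of this layering, using a toy XOR cipher and hypothetical relay names rather than any real onion-routing implementation, might look as follows: the sender wraps the message once per router, and each router peels exactly one layer to learn only the next hop.

```python
import json


def toy_cipher(data: bytes, key: int) -> bytes:
    # Hypothetical stand-in for per-hop public-key encryption; XOR is its own inverse.
    return bytes(b ^ key for b in data)


def wrap(inner: bytes, next_hop: str, key: int) -> bytes:
    # Add one onion layer: encrypt the next hop together with the inner payload.
    layer = json.dumps({"next": next_hop, "payload": inner.hex()}).encode()
    return toy_cipher(layer, key)


def peel(onion: bytes, key: int):
    # Remove one layer: the router learns only the next hop and an opaque payload.
    layer = json.loads(toy_cipher(onion, key))
    return layer["next"], bytes.fromhex(layer["payload"])


route = ["router1", "router2", "router3"]            # hypothetical relay names
keys = {"router1": 0x11, "router2": 0x22, "router3": 0x33}
hops = route + ["destination"]

# The sender builds the onion from the innermost layer (last router) outward.
onion = b"hello"
for i in range(len(route) - 1, -1, -1):
    onion = wrap(onion, hops[i + 1], keys[route[i]])

# Each router peels exactly one layer and forwards the rest.
current = onion
for router in route:
    next_hop, current = peel(current, keys[router])
    print(router, "forwards to", next_hop)

print("destination receives:", current.decode())
```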

Tor is a free-to-use anonymity service built on the concept of onion routing. Among all PETs, Tor has one of the largest user bases. [10]

VPNs

A virtual private network (VPN) is one of the most widely used ways to protect personal information. A VPN extends a private network across a public network, allowing users to send and receive data over the public network as if their devices were directly connected to the private network. VPN users thus benefit from additional security. [11]

Future of hard privacy technology

The future of hard privacy technology includes limited disclosure technology and data protection under US disclosure legislation. [12]

Limited disclosure technology offers a mechanism for preserving individuals' privacy by encouraging them to provide only the minimum information needed to complete an interaction or purchase with service providers. It is designed to restrict data sharing between consumers and third parties. [13] The principle is sketched below.
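
As a hypothetical illustration of the idea (the field names and profile are invented for this sketch), a limited-disclosure step would release only the attributes a given transaction actually requires and withhold everything else.

```python
# Toy user profile; only some of these fields are ever needed for a purchase.
user_profile = {
    "name": "Alice Example",
    "date_of_birth": "1990-01-01",
    "email": "alice@example.org",
    "shipping_address": "1 Example Street",
    "browsing_history": ["..."],  # never required to complete a purchase
}


def limited_disclosure(profile: dict, required_fields: set) -> dict:
    """Return only the fields the transaction requires; withhold the rest."""
    return {k: v for k, v in profile.items() if k in required_fields}


# A checkout might only need an email address and a shipping address.
disclosed = limited_disclosure(user_profile, {"email", "shipping_address"})
print(disclosed)
```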

Data protection in US disclosure legislation: [14] Although the United States has no general federal legislation on data privacy, a range of federal data protection laws are sector-specific or focus on particular forms of data. [15] For example, the Children's Online Privacy Protection Act (COPPA) (15 U.S. Code § 6501) forbids the collection of information from children under the age of 13 over the internet or through digitally connected devices. [16] The Video Privacy Protection Act (18 U.S. Code § 2710 et seq.) restricts the release of video rental or sale records, including online streaming. [17] Finally, the Cable Communications Policy Act of 1984 (47 U.S. Code § 551) protects subscribers' information privacy. [18]

The LINDDUN methodology

LINDDUN is an acronym for its seven categories of privacy threats: linkability, identifiability, non-repudiation, detectability, disclosure of information, unawareness, and non-compliance. It is a privacy threat modeling methodology that supports analysts in systematically eliciting and mitigating privacy threats in software architectures. [19] Its main strength is its combination of methodological guidance and privacy knowledge support. [20]

References

  1. Trepte, Sabine; Reinecke, Leonard, eds. (2011). Privacy Online. doi:10.1007/978-3-642-21521-6. ISBN 978-3-642-21520-9.
  2. Bernhard, Matthew; Benaloh, Josh; Alex Halderman, J.; Rivest, Ronald L.; Ryan, Peter Y. A.; Stark, Philip B.; Teague, Vanessa; Vora, Poorvi L.; Wallach, Dan S. (2017). "Public Evidence from Secret Ballots". Electronic Voting. Lecture Notes in Computer Science. Vol. 10615. pp. 84–109. arXiv: 1707.08619 . doi:10.1007/978-3-319-68687-5_6. ISBN   978-3-319-68686-8. S2CID   34871552.
  3. Deng, Mina; Wuyts, Kim; Scandariato, Riccardo; Preneel, Bart; Joosen, Wouter (2011). "A privacy threat analysis framework: Supporting the elicitation and fulfillment of privacy requirements" (PDF). Requirements Engineering. 16: 3–32. doi:10.1007/s00766-010-0115-7. S2CID   856424.
  4. Sampigethaya, K.; Poovendran, R. (December 2006). "A Survey on Mix Networks and Their Secure Applications" (PDF). Proceedings of the IEEE. 94 (12): 2142–2181. doi:10.1109/JPROC.2006.889687. S2CID   207019876.
  5. Ardagna, Claudio A.; Jajodia, Sushil; Samarati, Pierangela; Stavrou, Angelos (2009). "Privacy Preservation over Untrusted Mobile Networks". In Bettini, Claudio; et al. (eds.). Privacy In Location-Based Applications: Research Issues and Emerging Trends. Lecture Notes in Computer Science. Vol. 5599. Springer. p. 88. Bibcode:2009LNCS.5599...84A. doi:10.1007/978-3-642-03511-1_4. ISBN   978-3-642-03511-1.
  6. Ievgen Verzun. "Secure Dynamic Communication Network And Protocol". Listat Ltd.
  7. Chaum DL (1988). "The dining cryptographers problem: unconditional sender and recipient untraceability". J Cryptol. 1 (1): 65–75. doi:10.1007/BF00206326. S2CID   2664614.
  8. ISDN The Integrated Services Digital Network: Concepts, Methods, Systems. Springer Berlin Heidelberg. 1988. ISBN   978-3-662-08036-8.
  9. "Onion Routing".
  10. Dingledine, Roger; Mathewson, Nick; Syverson, Paul (2004). "Tor: The Second-Generation Onion Router".
  11. Hoa Gia Bao Nguyen (2018). "Wireless Network Security: A Guide for Small and Medium Premises". Information Technology.
  12. "Do People Know About Privacy and Data Protection Strategies? Towards the "Online Privacy Literacy Scale"". OPLIS. Law, Governance and Technology Series. 20. 2015. doi:10.1007/978-94-017-9385-8. ISBN   978-94-017-9384-1.
  13. Corrales, Marcelo; Jurcys, Paulius; Kousiouris, George (2018). "Smart Contracts and Smart Disclosure: Coding a GDPR Compliance Framework". SSRN Electronic Journal. doi:10.2139/ssrn.3121658.
  14. Hahn, Robert W.; Layne-Farrar, Anne (2001). "The Benefits and Costs Of Online Privacy Legislation". SSRN. doi:10.2139/ssrn.292649. S2CID   167184959.
  15. Cobb, Stephen (2016). "Data privacy and data protection". US Law and Legislation.
  16. Hung, Cho Kiu; Fantinato, Marcelo; Roa, Jorge (2018). Children Privacy Protection. pp. 1–3. doi:10.1007/978-3-319-08234-9_198-1. ISBN 978-3-319-08234-9.
  17. Li, Xiangbo; Darwich, Mahmoud; Bayoumi, Magdy (2020). "A Survey on Cloud-Based Video Streaming Services".
  18. Wu, Yanfang; Lau, Tuenyu; Atkin, David J.; Lin, Carolyn A. (2011). "A comparative study of online privacy regulations in the U.S. and China". Telecommunications Policy. 35 (7): 603–616. doi:10.1016/j.telpol.2011.05.002.
  19. Sion, Laurens; Wuyts, Kim; Yskout, Koen; Van Landuyt, Dimitri; Joosen, Wouter (2018). "Interaction-Based Privacy Threat Elicitation". 2018 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW). pp. 79–86. doi:10.1109/EuroSPW.2018.00017. ISBN   978-1-5386-5445-3. S2CID   49655002.
  20. Robles-González, Antonio; Parra-Arnau, Javier; Forné, Jordi (2020). "A LINDDUN-Based framework for privacy threat analysis on identification and authentication processes". Computers & Security. 94: 101755. doi:10.1016/j.cose.2020.101755. hdl: 2117/190711 . S2CID   214007341.