Vitaly Shmatikov

Alma mater: Stanford University (PhD)

Scientific career
Fields: Computer security
Institutions: Cornell Tech
Thesis: Finite-State Analysis of Security Protocols (2000)
Doctoral advisor: John C. Mitchell

Vitaly Shmatikov is a professor at Cornell Tech working on computer security and privacy.

Biography

Shmatikov obtained an M.S. in engineering-economic systems and a Ph.D. in computer science, both from Stanford University. He completed the Ph.D. in 2000 under the supervision of John C. Mitchell, with a dissertation titled Finite-State Analysis of Security Protocols.[1] He has over 100 publications in the areas of computer security, privacy, and cryptography.[2]

Research

The Netflix Prize was a competition to predict how users would rate films based on their previous ratings of other films. The dataset contained ratings from 480,189 users for 17,770 movies. In 2008, with Arvind Narayanan, Shmatikov showed that it was possible to de-anonymize individual people in this dataset.[3][4] To do this, they used an auxiliary dataset of users' public IMDb ratings: by correlating users who gave similar ratings in both datasets, it was possible to re-identify, with high confidence, several people who appeared in both. For this work, Narayanan and Shmatikov received the inaugural Test of Time Award from the IEEE Symposium on Security and Privacy in 2019.[5]
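
The linkage idea can be illustrated with a small sketch. The field names, similarity rule, and thresholds below are invented for the example and are not the authors' published algorithm: an adversary scores each anonymized record against a known person's public ratings and claims a match only if the best score clearly stands out from the runner-up.

```python
# Illustrative sketch of cross-dataset linkage in the spirit of the
# Netflix/IMDb study; the data and scoring rule are invented, not the
# authors' actual algorithm.

# Toy "anonymized" dataset: record id -> {movie: (rating, day_of_rating)}
netflix = {
    "rec1": {"Heat": (5, 100), "Alien": (4, 102), "Fargo": (2, 200)},
    "rec2": {"Heat": (3, 50),  "Brazil": (5, 60)},
    "rec3": {"Alien": (4, 300), "Fargo": (2, 310), "Heat": (5, 305)},
}

# Auxiliary (public) data about one known person, e.g. their IMDb reviews.
aux = {"Heat": (5, 101), "Alien": (4, 103), "Fargo": (2, 198)}

def score(record, aux, rating_tol=1, day_tol=14):
    """Count movies where both the rating and the date roughly agree."""
    s = 0
    for movie, (r, d) in aux.items():
        if movie in record:
            r2, d2 = record[movie]
            if abs(r - r2) <= rating_tol and abs(d - d2) <= day_tol:
                s += 1
    return s

scores = {rid: score(rec, aux) for rid, rec in netflix.items()}
best = max(scores, key=scores.get)
ranked = sorted(scores.values(), reverse=True)
# Only claim a match if the best score clearly stands out from the runner-up.
if ranked[0] - ranked[1] >= 2:
    print("probable re-identification:", best, scores)
else:
    print("no confident match", scores)
```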

More recently, Shmatikov has studied the privacy of machine learning. With Reza Shokri, he introduced the first membership inference attacks on machine learning models, which allow an attacker to determine whether a particular record was part of a model's training data.[6]
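
Shokri and Shmatikov's attack trains "shadow models" to learn how a target model behaves on its own training data. The sketch below shows only a much simpler confidence-threshold heuristic, purely to illustrate the signal such attacks exploit: overfit models tend to be unusually confident on examples they were trained on. The model, data, and threshold are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a confidence-threshold membership test; a simplification
# for illustration, not the shadow-model attack from the Shokri-Shmatikov paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
# Noisy labels make the gap between memorization and generalization visible.
y = (X[:, 0] + X[:, 1] + rng.normal(scale=1.0, size=400) > 0).astype(int)
X_train, y_train = X[:200], y[:200]   # members (training data)
X_out, y_out = X[200:], y[200:]       # non-members (held out)

# An intentionally overfit target model leaks membership via its confidence.
target = RandomForestClassifier(n_estimators=50, max_depth=None).fit(X_train, y_train)

def guess_member(model, x, y_true, threshold=0.95):
    """Guess 'member' if the model is very confident in the true label."""
    prob = model.predict_proba(x.reshape(1, -1))[0][y_true]
    return prob >= threshold

member_hits = np.mean([guess_member(target, x, t) for x, t in zip(X_train, y_train)])
nonmember_hits = np.mean([guess_member(target, x, t) for x, t in zip(X_out, y_out)])
print(f"flagged as members: train {member_hits:.2f} vs held-out {nonmember_hits:.2f}")
```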

Shmatikov has also studied how machine learning can be used to attack user privacy. For example, in 2016 he showed that the "pixelation" used to obfuscate users' faces is insecure: a machine learning model can still identify people whose faces have been pixelated.[7]
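
The underlying observation is that a model trained on deliberately pixelated images of known identities can still recognize new pixelated images. A rough illustration follows, using handwritten digits as a stand-in for faces; this is not the paper's experimental setup.

```python
# Sketch of the idea that pixelation does not defeat a model trained on
# pixelated examples; digits stand in for faces, and the setup is
# illustrative, not the published experimental protocol.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def pixelate(img8x8, block=2):
    """Average over block x block cells, then blow back up to 8x8."""
    h, w = img8x8.shape
    small = img8x8.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return np.kron(small, np.ones((block, block)))

digits = load_digits()
images = np.array([pixelate(im) for im in digits.images])
X = images.reshape(len(images), -1)
X_tr, X_te, y_tr, y_te = train_test_split(X, digits.target, random_state=0)

clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
# Accuracy typically remains far above the 10% random-guess baseline.
print("accuracy on pixelated images:", clf.score(X_te, y_te))
```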

Awards

Shmatikov received the Caspar Bowden PET Award for Outstanding Research in Privacy Enhancing Technologies in 2008, 2014, and 2018.[8] He received the Test of Time Award at the 2019 IEEE Symposium on Security and Privacy, and a Test of Time Award from the ACM Conference on Computer and Communications Security.[9] He also received an Outstanding Paper Award at EMNLP 2023.

Related Research Articles

The Netflix Prize was an open competition for the best collaborative filtering algorithm to predict user ratings for films, based on previous ratings without any other information about the users or films, i.e. without the users being identified except by numbers assigned for the contest.

Private biometrics is a form of encrypted biometrics, also called privacy-preserving biometric authentication methods, in which the biometric payload is a one-way, homomorphically encrypted feature vector that is 0.05% the size of the original biometric template and can be searched with full accuracy, speed and privacy. The feature vector's homomorphic encryption allows search and match to be conducted in polynomial time on an encrypted dataset, and the search result is returned as an encrypted match. One or more computing devices may use an encrypted feature vector to verify an individual person or identify an individual in a datastore without storing, sending or receiving plaintext biometric data within or between computing devices or any other entity. The purpose of private biometrics is to allow a person to be identified or authenticated while guaranteeing individual privacy and fundamental human rights by only operating on biometric data in the encrypted space. Private biometric methods include fingerprint authentication, face authentication, and identity-matching algorithms based on bodily features. Private biometrics are constantly evolving based on the changing nature of privacy needs, identity theft, and biotechnology.

A device fingerprint or machine fingerprint is information collected about the software and hardware of a remote computing device for the purpose of identification. The information is usually assimilated into a brief identifier using a fingerprinting algorithm. A browser fingerprint is information collected specifically by interaction with the web browser of the device.
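
A minimal sketch of the idea is to serialize the collected attributes deterministically and hash them into a short identifier. The attribute names below are made up for the example; real fingerprinting scripts collect far more signals.

```python
# Illustrative sketch: hash a handful of device/browser attributes into a
# brief identifier. Attribute names and values are invented for the example.
import hashlib
import json

attributes = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080x24",
    "timezone": "America/New_York",
    "language": "en-US",
    "installed_fonts": ["DejaVu Sans", "Liberation Serif"],
}

# Serialize deterministically, then compress into a short fingerprint string.
canonical = json.dumps(attributes, sort_keys=True).encode()
fingerprint = hashlib.sha256(canonical).hexdigest()[:16]
print("device fingerprint:", fingerprint)
```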

Synthetic data is information that is artificially generated rather than produced by real-world events. Typically created using algorithms, synthetic data can be deployed to validate mathematical models and to train machine learning models.

In computer science, robustness is the ability of a computer system to cope with errors during execution and with erroneous input. Robustness can encompass many areas of computer science, such as robust programming, robust machine learning, and Robust Security Network. Techniques such as fuzz testing are essential to demonstrating robustness, since this type of testing involves invalid or unexpected inputs. Alternatively, fault injection can be used to test robustness. Various commercial products perform robustness testing of software.
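
As a toy illustration of fuzz testing (the parser and fuzzing loop below are invented for the example and do not correspond to any particular fuzzing tool), random inputs are fed to a target function and inputs that raise unexpected exceptions are recorded as findings.

```python
# Toy fuzz-testing sketch: throw random byte strings at a parser and record
# inputs that raise unexpected exceptions. Real fuzzers (e.g. coverage-guided
# ones) are far more sophisticated; this only illustrates the idea.
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """A deliberately fragile parser: trusts the length byte, no bounds checks."""
    length = data[0]          # raises IndexError on empty input
    return data[1:1 + length]

def fuzz(target, trials=1000, max_len=8, seed=0):
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except Exception as exc:   # any unhandled exception counts as a finding
            crashes.append((data, repr(exc)))
    return crashes

findings = fuzz(parse_length_prefixed)
print(f"{len(findings)} crashing inputs; first:", findings[:1])
```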

Evercookie is a JavaScript application programming interface (API) that identifies and reproduces intentionally deleted cookies in a client's browser storage. It was created by Samy Kamkar in 2010 to demonstrate how websites can track users through cookie "respawning". Websites that adopt this mechanism can identify users even if they attempt to delete previously stored cookies.

Dawn Song is a Chinese American academic and a professor at the University of California, Berkeley, in the Electrical Engineering and Computer Science Department.

Helen Nissenbaum is professor of information science at Cornell Tech. She is best known for the concept of "contextual integrity" and her work on privacy, privacy law, trust, and security in the online world. Specifically, contextual integrity has influenced the United States government's thinking about privacy issues.

Quasi-identifiers are pieces of information that are not of themselves unique identifiers, but are sufficiently well correlated with an entity that they can be combined with other quasi-identifiers to create a unique identifier.

k-anonymity is a property possessed by certain anonymized data. The term k-anonymity was first introduced by Pierangela Samarati and Latanya Sweeney in a paper published in 1998, although the concept dates to a 1986 paper by Tore Dalenius.
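
Checking k-anonymity with respect to a chosen set of quasi-identifiers amounts to finding the smallest group of records that share the same quasi-identifier values. A minimal sketch follows; the table and the choice of quasi-identifiers are invented for the example.

```python
# Sketch of computing k for a small table with respect to chosen
# quasi-identifier columns; data and column choice are illustrative.
from collections import Counter

rows = [
    {"zip": "148**", "age": "20-29", "sex": "F", "diagnosis": "flu"},
    {"zip": "148**", "age": "20-29", "sex": "F", "diagnosis": "asthma"},
    {"zip": "148**", "age": "30-39", "sex": "M", "diagnosis": "flu"},
]
quasi_identifiers = ("zip", "age", "sex")

def k_anonymity(rows, qi):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(r[c] for c in qi) for r in rows)
    return min(groups.values())

print("k =", k_anonymity(rows, quasi_identifiers))  # k = 1: the lone 30-39 male record
```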

Arvind Narayanan is a computer scientist and a professor at Princeton University. Narayanan is recognized for his research in the de-anonymization of data.

Canvas fingerprinting is one of a number of browser fingerprinting techniques for tracking online users that allow websites to identify and track visitors using the HTML5 canvas element instead of browser cookies or other similar means. The technique received wide media coverage in 2014 after researchers from Princeton University and KU Leuven University described it in their paper The Web never forgets.

Adversarial machine learning is the study of attacks on machine learning algorithms, and of the defenses against such attacks. A survey from May 2020 found that practitioners report a dire need for better protection of machine learning systems in industrial applications.
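
A minimal sketch of an adversarial example against a simple linear classifier (not any specific published attack, which typically targets deep networks via gradients) shifts an input just far enough along the model's weight vector to cross the decision boundary.

```python
# Minimal adversarial-example sketch against a linear model; illustrative only.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X, y = X[y < 2], y[y < 2]                 # two classes: an easy linear boundary
clf = LogisticRegression().fit(X, y)

x = X[0]                                   # a correctly classified class-0 flower
w = clf.coef_[0]
margin = clf.decision_function([x])[0]     # negative for class 0

# Moving along w changes the score by w . delta, so this step just crosses 0.
delta = (-margin + 0.1) * w / np.dot(w, w)
x_adv = x + delta

print("original prediction:   ", clf.predict([x])[0])      # class 0
print("adversarial prediction:", clf.predict([x_adv])[0])   # flipped to class 1
print("perturbation size:", np.linalg.norm(delta).round(3))
```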

Usability of web authentication systems refers to the efficiency and user acceptance of online authentication systems. Examples of web authentication systems are passwords, federated identity systems, email-based single sign-on (SSO) systems, QR code-based systems or any other system used to authenticate a user's identity on the web. Even though the usability of web authentication systems should be a key consideration in selecting a system, very few web authentication systems have been subjected to formal usability studies or analysis.

Data re-identification or de-anonymization is the practice of matching anonymous data with publicly available information, or auxiliary data, in order to discover the person the data belong to. This is a concern because companies with privacy policies, health care providers, and financial institutions may release the data they collect after the data has gone through the de-identification process.

Yaniv Erlich is an Israeli-American scientist. He formerly served as an Associate Professor of Computer Science at Columbia University and was the Chief Science Officer of MyHeritage. Erlich's work combines computer science and genomics.

Since the advent of differential privacy, a number of systems supporting differentially private data analyses have been implemented and deployed. This article tracks real-world deployments, production software packages, and research prototypes.
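
The basic building block behind many such systems is the Laplace mechanism: add noise scaled to the query's sensitivity divided by the privacy parameter epsilon. A minimal sketch for a counting query (sensitivity 1), with illustrative data:

```python
# Sketch of the Laplace mechanism for an epsilon-differentially private count.
import numpy as np

rng = np.random.default_rng(0)

def private_count(values, predicate, epsilon=0.5):
    """Release a count with epsilon-differential privacy (sensitivity 1)."""
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 52, 61, 33, 47]
print("noisy count of people over 40:", round(private_count(ages, lambda a: a > 40), 1))
```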

NordPass is a proprietary password manager launched in 2019. It is meant to help its users to organise their passwords and secure notes, keeping them in a single encrypted password vault. This service comes in both free and premium versions, though the free version lacks much of the paid functionality like multi-device login. NordPass was developed by the same cybersecurity team that created NordVPN, a VPN service provider.

Jean-Pierre Hubaux is a Swiss-Belgian computer scientist specialised in security and privacy. He is a professor of computer science at EPFL and is the head of the Laboratory for Data Security at EPFL's School of Computer and Communication Sciences.

Hovav Shacham is a professor in computer security at the University of Texas at Austin. He has made many contributions to both cryptography and computer security.

References

  1. "Vitaly Shmatikov - The Mathematics Genealogy Project". www.mathgenealogy.org. Retrieved 2024-02-28.
  2. "Vitaly Shmatikov". scholar.google.com. Retrieved 2024-02-28.
  3. Schneier, Bruce. "Why 'Anonymous' Data Sometimes Isn't". Wired. ISSN 1059-1028. Retrieved 2024-02-28.
  4. Singer, Natasha (2015-01-29). "With a Few Bits of Data, Researchers Identify 'Anonymous' People". Bits Blog. Retrieved 2024-02-28.
  5. "Vitaly Shmatikov and Arvind Narayanan Receive Inaugural "Test of Time" Award from the IEEE Symposium on Security and Privacy | Department of Computer Science". www.cs.cornell.edu. Retrieved 2024-02-28.
  6. "Artificial intelligence may put private data at risk | Cornell Chronicle". news.cornell.edu. Retrieved 2024-02-28.
  7. Newman, Lily Hay. "AI Can Recognize Your Face Even If You're Pixelated". Wired. ISSN 1059-1028. Retrieved 2024-02-28.
  8. "PET Award". petsymposium.org. Retrieved 2024-02-28.
  9. "Shmatikov". dli-cornell-tech. Retrieved 2024-02-28.