Post-Quantum Cryptography Standardization [1] is a program and competition by NIST to update its standards to include post-quantum cryptography. [2] It was announced at PQCrypto 2016. [3] By the initial submission deadline at the end of 2017, 23 signature schemes and 59 encryption/KEM schemes had been submitted, [4] of which 69 total were deemed complete and proper and participated in the first round. Seven of these, of which three are signature schemes, advanced to the third round, which was announced on July 22, 2020.
Academic research on the potential impact of quantum computing dates back to at least 2001. [5] A report published by NIST in April 2016 cites experts who acknowledge the possibility of quantum technology rendering the commonly used RSA algorithm insecure by 2030. [6] As a result, NIST pursued the standardization of quantum-secure cryptographic primitives. Since most symmetric primitives are relatively easy to modify in a way that makes them quantum resistant (for example, by doubling key lengths to compensate for the quadratic speedup of Grover's algorithm), efforts have focused on public-key cryptography, namely digital signatures and key encapsulation mechanisms. In December 2016 NIST initiated a standardization process by announcing a call for proposals. [7]
The competition is now in its third round out of an expected four, where in each round some algorithms are discarded and others are studied more closely. NIST hopes to publish the standardization documents by 2024, but may speed up the process if major breakthroughs in quantum computing are made.
It is currently undecided whether the future standards will be published as FIPS or as NIST Special Publication (SP).
Under consideration were: [8]
(strikethrough means it has been withdrawn)
Type | PKE/KEM | Signature | Signature & PKE/KEM |
---|---|---|---|
Lattice | | | |
Code-based | | | |
Hash-based | | | |
Multivariate | | | |
Braid group | | | |
Supersingular elliptic curve isogeny | | | |
Satirical submission | | | |
Other | | | |
Candidates moving on to the second round were announced on January 30, 2019. They are: [32]
Type | PKE/KEM | Signature |
---|---|---|
Lattice | CRYSTALS-Kyber, FrodoKEM, LAC, NewHope, NTRU, NTRU Prime, Round5, SABER, Three Bears | CRYSTALS-Dilithium, Falcon, qTESLA |
Code-based | BIKE, Classic McEliece, HQC, LEDAcrypt, NTS-KEM, ROLLO, RQC | |
Hash-based | | SPHINCS+ |
Multivariate | | GeMSS, LUOV, MQDSS, Rainbow |
Supersingular elliptic curve isogeny | SIKE | |
Zero-knowledge proofs | | Picnic |
On July 22, 2020, NIST announced seven finalists ("first track"), as well as eight alternate algorithms ("second track"). The first track contains the algorithms which appear to have the most promise, and which will be considered for standardization at the end of the third round. Algorithms in the second track could still become part of the standard after the third round ends. [52] NIST expects some of the alternate candidates to be considered in a fourth round. NIST also suggests it may re-open the signature category for new scheme proposals in the future. [53]
On June 7–9, 2021, NIST held the third PQC standardization conference virtually. [54] The conference included updates from the candidate teams and discussions of implementations, performance, and security issues of the candidates. Some attention was also given to intellectual property concerns.
Finalists:

Type | PKE/KEM | Signature |
---|---|---|
Lattice | CRYSTALS-Kyber, NTRU, SABER | CRYSTALS-Dilithium, Falcon |
Code-based | Classic McEliece | |
Multivariate | | Rainbow |
Alternate candidates:

Type | PKE/KEM | Signature |
---|---|---|
Lattice | FrodoKEM, NTRU Prime | |
Code-based | BIKE, HQC | |
Hash-based | | SPHINCS+ |
Multivariate | | GeMSS |
Supersingular elliptic curve isogeny | SIKE | |
Zero-knowledge proofs | | Picnic |
After NIST's announcement regarding the finalists and the alternate candidates, various intellectual property concerns were voiced, notably surrounding lattice-based schemes such as Kyber and NewHope. NIST holds signed statements from the submitting groups clearing any legal claims, but there is still a concern that third parties could raise claims. NIST has stated that it will take such considerations into account while picking the winning algorithms. [55]
During this round, some candidates were shown to be vulnerable to certain attack vectors, forcing them to adapt accordingly.
On July 5, 2022, NIST announced the first group of winners from its six-year competition. [59] [60]
Type | PKE/KEM | Signature |
---|---|---|
Lattice | CRYSTALS-Kyber | CRYSTALS-Dilithium, Falcon |
Hash-based | | SPHINCS+ |
On July 5, 2022, NIST announced four candidates for PQC Standardization Round 4. [61]
Type | PKE/KEM |
---|---|
Code-based | BIKE, Classic McEliece, HQC |
Supersingular elliptic curve isogeny | SIKE |
For its separate call for additional digital signature proposals, NIST received 50 submissions and deemed 40 to be complete and proper according to the submission requirements. [64] Under consideration are: [65]
(strikethrough means it has been withdrawn)
Type | Signature |
---|---|
Lattice | |
Code-based | |
MPC-in-the-Head | |
Multivariate | |
Supersingular elliptic curve isogeny | |
Symmetric-based | |
Other | |
Elliptic-curve cryptography (ECC) is an approach to public-key cryptography based on the algebraic structure of elliptic curves over finite fields. ECC allows smaller keys compared to non-EC cryptography to provide equivalent security.
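As a hedged illustration of the underlying arithmetic, the following toy sketch implements point addition and scalar multiplication on a small curve over GF(97). The curve, prime, and base point are made-up demonstration values, not a standardized curve such as P-256 or Curve25519.

```python
# Toy elliptic-curve arithmetic over a small prime field (demonstration only).

p = 97                      # small prime modulus (toy value)
a, b = 2, 3                 # curve y^2 = x^3 + 2x + 3 over GF(97)

def ec_add(P, Q):
    """Add two points on the curve; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def ec_mul(k, P):
    """Scalar multiplication k*P by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

G = (3, 6)             # on the curve: 6^2 = 36 and 3^3 + 2*3 + 3 = 36 (mod 97)
print(ec_mul(5, G))    # 5*G, the kind of operation underlying ECDH key pairs
```

The hardness of recovering k from k*G (the elliptic-curve discrete logarithm problem) is what classical ECC security rests on, and it is exactly what Shor's algorithm would break.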
The Advanced Encryption Standard (AES), the symmetric block cipher ratified as a standard by National Institute of Standards and Technology of the United States (NIST), was chosen using a process lasting from 1997 to 2000 that was markedly more open and transparent than its predecessor, the Data Encryption Standard (DES). This process won praise from the open cryptographic community, and helped to increase confidence in the security of the winning algorithm from those who were suspicious of backdoors in the predecessor, DES.
Daniel Julius Bernstein is an American mathematician, cryptologist, and computer scientist. He is a visiting professor at CASA at Ruhr University Bochum, as well as a research professor of Computer Science at the University of Illinois at Chicago. Before this, he was a visiting professor in the department of mathematics and computer science at the Eindhoven University of Technology.
NTRU is an open-source public-key cryptosystem that uses lattice-based cryptography to encrypt and decrypt data. It consists of two algorithms: NTRUEncrypt, which is used for encryption, and NTRUSign, which is used for digital signatures. Unlike other popular public-key cryptosystems, it is resistant to attacks using Shor's algorithm. NTRUEncrypt was patented, but it was placed in the public domain in 2017. NTRUSign is patented, but it can be used by software under the GPL.
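NTRU's arithmetic takes place in a truncated polynomial ring. Below is a minimal sketch of multiplication (cyclic convolution) in Z[x]/(x^N − 1) with coefficients reduced mod q, using toy values of N and q rather than standardized NTRU parameters.

```python
# Multiplication in the truncated polynomial ring Z[x]/(x^N - 1), mod q.

N, q = 7, 41   # toy parameters; real NTRU uses much larger N and q

def ring_mul(f, g):
    """Multiply two polynomials mod (x^N - 1) and mod q.

    f and g are coefficient lists of length N, lowest degree first.
    x^N wraps around to 1, so exponent arithmetic is taken mod N.
    """
    h = [0] * N
    for i in range(N):
        for j in range(N):
            h[(i + j) % N] = (h[(i + j) % N] + f[i] * g[j]) % q
    return h

f = [1, 0, -1, 1, 0, 0, 1]   # small ternary coefficients, as NTRU uses
g = [1, 1, 0, 0, -1, 0, 1]
print(ring_mul(f, g))
```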
In cryptography, the McEliece cryptosystem is an asymmetric encryption algorithm developed in 1978 by Robert McEliece. It was the first such scheme to use randomization in the encryption process. The algorithm has never gained much acceptance in the cryptographic community, but is a candidate for "post-quantum cryptography", as it is immune to attacks using Shor's algorithm and – more generally – measuring coset states using Fourier sampling.
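To make the randomized-encryption idea concrete, here is a hedged toy sketch using a bare [7,4] Hamming code in place of the scrambled Goppa code a real McEliece instance would hide: the ciphertext is the encoded message plus a random correctable error.

```python
import numpy as np

# Toy illustration of the McEliece principle: c = mG + e over GF(2).
# Real McEliece disguises a Goppa code's generator matrix; this sketch uses
# a plain [7,4] Hamming code (corrects one error) purely for demonstration.

G = np.array([[1, 0, 0, 0, 1, 1, 0],     # systematic generator matrix [I | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

rng = np.random.default_rng()
m = np.array([1, 0, 1, 1])               # 4-bit message
e = np.zeros(7, dtype=int)
e[rng.integers(7)] = 1                   # random single-bit error

c = (m @ G + e) % 2                      # ciphertext: codeword plus error
print("ciphertext:", c)

# The legitimate receiver corrects the error via syndrome decoding.
P = G[:, 4:]                             # G = [I | P]  =>  H = [P^T | I]
H = np.concatenate([P.T, np.eye(3, dtype=int)], axis=1)
s = (H @ c) % 2                          # syndrome pinpoints the flipped bit
for i in range(7):
    if np.array_equal(H[:, i], s):
        c[i] ^= 1                        # correct the error
        break
print("decoded message:", c[:4])         # systematic code: first 4 bits = m
```

An attacker who sees only a scrambled generator matrix faces the hard problem of decoding a random-looking linear code, which is why the scheme resists Shor-style attacks.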
Brian A. LaMacchia is a computer security specialist.
Paulo Licciardi Barreto is a Brazilian-American cryptographer and one of the designers of the Whirlpool hash function and the block ciphers Anubis and KHAZAD, together with Vincent Rijmen. He has also co-authored a number of research works on elliptic curve cryptography and pairing-based cryptography, including the eta pairing technique, identity-based cryptographic protocols, and the family of Barreto–Naehrig (BN) and Barreto–Lynn–Scott (BLS) pairing-friendly elliptic curves. More recently he has been focusing his research on post-quantum cryptography, being one of the discoverers of quasi-dyadic codes and quasi-cyclic moderate-density parity-check (QC-MDPC) codes to instantiate the McEliece and Niederreiter cryptosystems and related schemes.
NTRUSign, also known as the NTRU Signature Algorithm, is an NTRU public-key cryptography digital signature algorithm based on the GGH signature scheme. The original version of NTRUSign was Polynomial Authentication and Signature Scheme (PASS), and was published at CrypTEC'99. The improved version of PASS was named as NTRUSign, and was presented at the rump session of Asiacrypt 2001 and published in peer-reviewed form at the RSA Conference 2003. The 2003 publication included parameter recommendations for 80-bit security. A subsequent 2005 publication revised the parameter recommendations for 80-bit security, presented parameters that gave claimed security levels of 112, 128, 160, 192 and 256 bits, and described an algorithm to derive parameter sets at any desired security level. NTRU Cryptosystems, Inc. have applied for a patent on the algorithm.
Multivariate cryptography is the generic term for asymmetric cryptographic primitives based on multivariate polynomials over a finite field. In certain cases those polynomials could be defined over both a ground and an extension field. If the polynomials have degree two, we speak of multivariate quadratics. Solving systems of multivariate polynomial equations is proven to be NP-complete, which is why such schemes are often considered good candidates for post-quantum cryptography. Multivariate cryptography has been very productive in terms of design and cryptanalysis. Overall, the situation is now more stable and the strongest schemes have withstood the test of time. It is commonly accepted that multivariate cryptography has turned out to be more successful as an approach to building signature schemes, primarily because multivariate schemes provide the shortest signatures among post-quantum algorithms.
In cryptography, the unbalanced oil and vinegar (UOV) scheme is a modified version of the oil and vinegar scheme designed by J. Patarin. Both are digital signature protocols and forms of multivariate cryptography. The security of this signature scheme is based on an NP-hard mathematical problem: to create and validate signatures, a system of quadratic equations must be solved, and solving m equations in n variables is NP-hard. While the problem is easy if m is much larger or much smaller than n, it is thought to be difficult in the average case when m and n are nearly equal, even when using a quantum computer, which is what makes it useful for cryptographic purposes. Multiple signature schemes have been devised based on multivariate equations with the goal of achieving quantum resistance.
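As a hedged sketch of why the legitimate signer can nevertheless sign efficiently: an oil-and-vinegar polynomial contains no oil-times-oil cross terms, so fixing the vinegar variables at random collapses each quadratic equation into a linear one in the oil variables. The notation below is illustrative, not a full scheme specification.

```latex
% Shape of each secret polynomial p_k in oil variables o and vinegar
% variables v: note the absence of o_i o_j terms.
\[
p_k(\mathbf{o}, \mathbf{v}) \;=\;
\sum_{i,j} \alpha^{(k)}_{ij}\, o_i v_j
\;+\; \sum_{i,j} \beta^{(k)}_{ij}\, v_i v_j
\;+\; \sum_i \gamma^{(k)}_i\, o_i
\;+\; \sum_j \delta^{(k)}_j\, v_j
\;+\; \eta^{(k)}
\]
% To sign a hash value (t_1, ..., t_m): choose the vinegar values v at
% random; every v_i v_j and v_j term becomes a constant, so each equation
% p_k(o, v) = t_k is linear in o and the system is solvable by Gaussian
% elimination. Without the secret change of variables, an attacker faces
% a generic multivariate quadratic system.
```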
Lattice-based cryptography is the generic term for constructions of cryptographic primitives that involve lattices, either in the construction itself or in the security proof. Lattice-based constructions support important standards of post-quantum cryptography. Unlike more widely used and known public-key schemes such as the RSA, Diffie-Hellman or elliptic-curve cryptosystems — which could, theoretically, be defeated using Shor's algorithm on a quantum computer — some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.
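A minimal runnable sketch of the learning-with-errors (LWE) mechanism that underlies many of these constructions, in the style of Regev's single-bit encryption. All parameters are toy values chosen only so the decryption noise stays below q/4; real schemes such as Kyber use structured variants with carefully analyzed parameter sets.

```python
import numpy as np

# Toy Regev-style LWE encryption of a single bit (demonstration parameters).
n, m, q = 8, 16, 97
rng = np.random.default_rng(1)

# Key generation: secret s; public key is (A, b = A s + e) with small error e.
s = rng.integers(0, q, size=n)
A = rng.integers(0, q, size=(m, n))
e = rng.integers(-1, 2, size=m)              # small errors in {-1, 0, 1}
b = (A @ s + e) % q

def encrypt(bit):
    """Encrypt one bit by combining a random subset of public-key rows."""
    r = rng.integers(0, 2, size=m)           # random 0/1 selection vector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q         # bit encoded as 0 or ~q/2
    return u, v

def decrypt(u, v):
    """v - <u, s> is near 0 for bit 0 and near q/2 for bit 1 (mod q)."""
    d = (v - u @ s) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy LWE round-trips both bits correctly")
```

The accumulated noise here is at most m = 16, safely below q/4 ≈ 24, which is why decryption distinguishes the two encodings; choosing such noise/modulus trade-offs securely is the core of real lattice parameter selection.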
The NIST hash function competition was an open competition held by the US National Institute of Standards and Technology (NIST) to develop a new hash function called SHA-3 to complement the older SHA-1 and SHA-2. The competition was formally announced in the Federal Register on November 2, 2007. "NIST is initiating an effort to develop one or more additional hash algorithms through a public competition, similar to the development process for the Advanced Encryption Standard (AES)." The competition ended on October 2, 2012, when NIST announced that Keccak would be the new SHA-3 hash algorithm.
SHA-3 is the latest member of the Secure Hash Algorithm family of standards, released by NIST on August 5, 2015. Although part of the same series of standards, SHA-3 is internally different from the MD5-like structure of SHA-1 and SHA-2.
Post-quantum cryptography (PQC), sometimes referred to as quantum-proof, quantum-safe, or quantum-resistant, is the development of cryptographic algorithms that are thought to be secure against cryptanalytic attack by a quantum computer. The problem with the popular algorithms currently in use is that their security relies on one of three hard mathematical problems: the integer factorization problem, the discrete logarithm problem, or the elliptic-curve discrete logarithm problem. All of these problems could be easily solved on a sufficiently powerful quantum computer running Shor's algorithm, or even by faster and less demanding alternatives.
BLISS is a digital signature scheme proposed by Léo Ducas, Alain Durmus, Tancrède Lepoint and Vadim Lyubashevsky in their 2013 paper "Lattice Signatures and Bimodal Gaussians".
Hash-based cryptography is the generic term for constructions of cryptographic primitives based on the security of hash functions. It is of interest as a type of post-quantum cryptography.
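A minimal sketch of the oldest such construction, Lamport's one-time signature, the conceptual ancestor of schemes like SPHINCS+: signing reveals one of two hash preimages per message bit, so each key pair must be used only once.

```python
import hashlib, secrets

# Toy Lamport one-time signature over SHA-256 (illustrative only).
H = lambda data: hashlib.sha256(data).digest()
BITS = 256

# Key generation: two random secrets per bit; public key is their hashes.
sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
pk = [(H(a), H(b)) for a, b in sk]

def bits_of(msg):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]

def sign(msg):
    """Reveal the secret matching each bit of H(msg)."""
    return [sk[i][bit] for i, bit in enumerate(bits_of(msg))]

def verify(msg, sig):
    """Hash each revealed secret and compare against the public key."""
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits_of(msg)))

sig = sign(b"hello post-quantum world")
print(verify(b"hello post-quantum world", sig))   # True
print(verify(b"tampered message", sig))           # False
```

Security reduces entirely to the preimage resistance of the hash function, which Grover's algorithm only weakens quadratically; practical schemes add Merkle trees and few-time tweaks to remove the one-time restriction.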
In post-quantum cryptography, NewHope is a key-agreement protocol by Erdem Alkim, Léo Ducas, Thomas Pöppelmann, and Peter Schwabe that is designed to resist quantum computer attacks.
Kyber is a key encapsulation mechanism (KEM) designed to be resistant to cryptanalytic attacks by future powerful quantum computers. It is used to establish a shared secret between two communicating parties without an attacker in the transmission system being able to decrypt it (IND-CCA2 security). This asymmetric cryptosystem uses a variant of the learning with errors lattice problem as its basic trapdoor function. It won the NIST competition for the first post-quantum cryptography (PQ) standard. NIST calls its draft standard Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM). However, at least for Kyber512, there are claims that NIST's security calculations were amiss.
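The KEM flow itself is simple, as the sketch below shows. Only the three-function interface here matches a real KEM; the function bodies are deliberately insecure placeholders (not Kyber's lattice arithmetic) so that the flow runs end to end.

```python
import secrets

# Sketch of the KEM interface (keygen / encapsulate / decapsulate).
# The bodies are INSECURE placeholders; ML-KEM replaces them with
# module-lattice arithmetic and IND-CCA2 hardening.

def keygen():
    sk = secrets.token_bytes(32)    # placeholder secret key
    pk = sk                         # placeholder; a real pk is derived via lattice math
    return pk, sk

def encapsulate(pk):
    shared = secrets.token_bytes(32)                        # fresh shared secret
    ciphertext = bytes(a ^ b for a, b in zip(shared, pk))   # placeholder "encryption"
    return ciphertext, shared

def decapsulate(sk, ciphertext):
    return bytes(a ^ b for a, b in zip(ciphertext, sk))

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver     # both parties now hold the same secret,
                                    # typically fed into a symmetric cipher
```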
Falcon is a post-quantum signature scheme selected by NIST for standardization at the conclusion of the third round of the post-quantum standardisation process. It was designed by Thomas Prest, Pierre-Alain Fouque, Jeffrey Hoffstein, Paul Kirchner, Vadim Lyubashevsky, Thomas Pornin, Thomas Ricosset, Gregor Seiler, William Whyte and Zhenfei Zhang. It relies on the hash-and-sign technique of the Gentry–Peikert–Vaikuntanathan framework, instantiated over NTRU lattices. The name Falcon is an acronym for Fast Fourier lattice-based compact signatures over NTRU.
An extendable-output function (XOF) is an extension of a cryptographic hash that allows its output to be arbitrarily long. In particular, the sponge construction makes any sponge hash a natural XOF: the squeeze operation can be repeated, and a regular hash function with a fixed-size result is obtained from a sponge mechanism by stopping the squeezing phase after the fixed number of bits has been produced.
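Python's standard library exposes the SHA-3 family's XOFs directly, so the arbitrary-length property is easy to see:

```python
import hashlib

# SHAKE128/SHAKE256 (standardized alongside SHA-3) are XOFs: the caller
# chooses the digest length at squeeze time.
xof = hashlib.shake_128(b"post-quantum cryptography")
print(xof.hexdigest(16))   # 16 bytes of output
print(xof.hexdigest(64))   # 64 bytes; the first 16 match the shorter call
```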