Ring learning with errors signature

Digital signatures are a means to protect digital information from intentional modification and to authenticate the source of digital information. Public key cryptography provides a rich set of different cryptographic algorithms that create digital signatures. However, the primary public key signatures currently in use (RSA and elliptic curve signatures) will become completely insecure if scientists are ever able to build a moderately sized quantum computer. [1] Post-quantum cryptography is a class of cryptographic algorithms designed to resist attack by a quantum computer. Several post-quantum digital signature algorithms based on hard problems in lattices have been created to replace the commonly used RSA and elliptic curve signatures. A subset of these lattice-based schemes is based on a problem known as ring learning with errors. Ring learning with errors based digital signatures are among the post-quantum signatures with the smallest public key and signature sizes.

Background

Developments in quantum computing over the past decade and the optimistic prospects for real quantum computers within 20 years have begun to threaten the basic cryptography that secures the internet. [2] [3] A relatively small quantum computer capable of processing only ten thousand bits of information would easily break all of the widely used public key cryptography algorithms used to protect privacy and digitally sign information on the internet. [1] [4]

One of the most widely used public key algorithms for creating digital signatures is known as RSA. Its security is based on the classical difficulty of factoring the product of two large, unknown primes into those constituent primes. The integer factorization problem is believed to be intractable on any conventional computer if the primes are chosen at random and are sufficiently large. However, a quantum computer with roughly 6n logical qubits of memory and capable of executing a program known as Shor's algorithm will easily factor the product of two n-bit primes. [5] Shor's algorithm can also quickly break digital signatures based on what is known as the discrete logarithm problem and the more esoteric elliptic curve discrete logarithm problem. In effect, a relatively small quantum computer running Shor's algorithm could quickly break all of the digital signatures used to ensure the privacy and integrity of information on the internet today.

Even though we do not know when a quantum computer capable of breaking RSA and other digital signature algorithms will exist, there has been active research over the past decade to create cryptographic algorithms which remain secure even when an attacker has the resources of a quantum computer at their disposal. [1] [6] This new area of cryptography is called post-quantum or quantum-safe cryptography. [1] [6] This article is about one class of these algorithms: digital signatures based on the ring learning with errors problem. The use of the general learning with errors problem in cryptography was introduced by Oded Regev in 2005 and has been the source of several cryptographic designs. [7]

The creators of the ring learning with errors (RLWE) basis for cryptography believe that an important feature of these algorithms is their provable reduction to known hard problems. [8] [9] The signature described below has a provable reduction to the shortest vector problem in an ideal lattice. [10] This means that if an attack can be found on the RLWE cryptosystem then a whole class of presumed hard computational problems will have a solution. [11]

The first RLWE-based signature was developed by Lyubashevsky in his paper "Fiat-Shamir with Aborts: Applications to Lattice and Factoring-Based Signatures" [12] and refined in "Lattice Signatures Without Trapdoors" in 2011. [13] A number of refinements and variants have followed. This article highlights the fundamental mathematical structure of RLWE signatures and follows the original Lyubashevsky work and the work of Güneysu, Lyubashevsky and Pöppelmann (GLP). [10] This presentation is based on a 2017 update to the GLP scheme called GLYPH. [14]

An RLWE signature works in the quotient ring of polynomials modulo a degree-n polynomial Φ(x) with coefficients in the finite field Zq for an odd prime q (i.e. the ring Zq[x]/Φ(x)). [13] Multiplication and addition of polynomials work in the usual fashion, with the result of a multiplication reduced mod Φ(x). For this presentation a typical polynomial is expressed as:

a(x) = a_0 + a_1·x + a_2·x^2 + ... + a_(n-1)·x^(n-1)

The field Zq has its representative elements in the set { -(q-1)/2, ..., -1, 0, 1, ..., (q-1)/2 }. When n is a power of 2, the polynomial Φ(x) will be the cyclotomic polynomial x^n + 1. Other choices of n are possible, but the corresponding cyclotomic polynomials are more complicated or their security is not as well studied.
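
To make the ring arithmetic concrete, the following is a minimal Python sketch of addition and multiplication in Zq[x]/(x^n + 1), representing polynomials as lists of centered coefficients. The function names and representation are illustrative assumptions, not part of any published implementation.

```python
# Minimal sketch of arithmetic in Zq[x]/(x^n + 1); polynomials are length-n
# lists of integer coefficients in the centered set {-(q-1)/2, ..., (q-1)/2}.

def centered(v, q):
    """Reduce an integer mod q into the centered representative set."""
    v = v % q
    return v - q if v > (q - 1) // 2 else v

def ring_add(f, g, q):
    """Coefficient-wise addition mod q."""
    return [centered(a + b, q) for a, b in zip(f, g)]

def ring_mul(f, g, q):
    """Schoolbook multiplication, reduced mod x^n + 1 and mod q."""
    n = len(f)
    res = [0] * n
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            if i + j < n:
                res[i + j] += a * b
            else:
                res[i + j - n] -= a * b   # x^n = -1 in this ring
    return [centered(c, q) for c in res]
```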

Generating "small" polynomials.

A RLWE signature uses polynomials which are considered "small" with respect to a measure called the "infinity norm". The infinity norm of a polynomial is simply the largest absolute value of the coefficients of the polynomial when those coefficients are viewed as integers in Z rather than Zq. [10] The signature algorithm will create random polynomials which are small with respect to a particular infinity norm bound. This is easily done by randomly generating all of the coefficients of the polynomial (a_0, ..., a_(n-1)) in a manner which is guaranteed or very likely to be less than or equal to this bound. In the literature on ring learning with errors, there are two common ways to do this: [13]

  1. Uniform sampling: each coefficient is drawn uniformly at random from a small set { -b, ..., -1, 0, 1, ..., b } for a bound b much smaller than q.
  2. Discrete Gaussian sampling: each coefficient is drawn from a discrete Gaussian distribution over the integers centered at 0 with small standard deviation, so the coefficients are small with overwhelming probability.

In the RLWE signature GLYPH used as an example below, the coefficients of the "small" polynomials are generated with the uniform sampling method, and the value b is much smaller than the value q. [10]
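
A minimal sketch of the uniform sampling method in Python might look as follows; the function name and the use of the standard `secrets` module are assumptions made for illustration.

```python
import secrets

def sample_small_uniform(n, b):
    """Sample a "small" polynomial: n coefficients drawn uniformly and
    independently from {-b, ..., -1, 0, 1, ..., b} using a
    cryptographically strong randomness source."""
    return [secrets.randbelow(2 * b + 1) - b for _ in range(n)]
```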

Hashing to a "small" polynomial

Most RLWE signature algorithms also require the ability to cryptographically hash arbitrary bit strings into small polynomials according to some distribution. The example below uses a hash function, POLYHASH(ω), which accepts a bit string, ω, as input and outputs a polynomial with n coefficients such that exactly k of these coefficients have absolute value greater than zero and less than an integer bound b (see above).
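
The article does not fix a concrete construction for POLYHASH, so the sketch below is hypothetical: it expands the input with SHAKE-256 and places exactly k coefficients of value +1 or -1, one permissible choice of non-zero values below the bound b. The real GLYPH hash may differ in its details.

```python
import hashlib

def polyhash(omega, n, k):
    """Deterministically map a byte string omega to a polynomial with
    exactly k non-zero coefficients, each +1 or -1 (hypothetical
    construction, not the exact GLYPH hash)."""
    stream = hashlib.shake_256(omega).digest(8 * n)
    coeffs = [0] * n
    placed, i = 0, 0
    while placed < k:
        if i + 3 > len(stream):                   # out of bytes: extend stream
            stream += hashlib.shake_256(stream).digest(8 * n)
        pos = int.from_bytes(stream[i:i + 2], "big") % n
        sign = 1 if stream[i + 2] & 1 else -1
        i += 3
        if coeffs[pos] == 0:                      # skip already-chosen positions
            coeffs[pos] = sign
            placed += 1
    return coeffs
```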

Rejection sampling

A key feature of RLWE signature algorithms is the use of a technique known as rejection sampling. [13] [12] In this technique, if the infinity norm of a signature polynomial exceeds a fixed bound, β, that polynomial will be discarded and the signing process will begin again. This process will be repeated until the infinity norm of the signature polynomial is less than or equal to the bound. Rejection sampling ensures that the output signature is not exploitably correlated with the signer's secret key values.

In the example which follows, the bound β will be (b - k), where b is the bound of the uniform sampling described above and k is the number of non-zero coefficients allowed in an "accepted" polynomial. [10]
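
The rejection test itself is simple to express. The following sketch, with assumed helper names, checks the bound β = b - k on both signature polynomials:

```python
def infinity_norm(f):
    """Largest absolute value among the coefficients, viewed as integers."""
    return max(abs(c) for c in f)

def accepted(z1, z2, b, k):
    """Rejection-sampling test: accept the candidate signature polynomials
    only if both have infinity norm at most beta = b - k."""
    beta = b - k
    return infinity_norm(z1) <= beta and infinity_norm(z2) <= beta
```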

Other parameters

Following GLYPH and as noted above, the polynomials have maximum degree n-1 and therefore n coefficients. [10] Typical values for n are 512 and 1024. [10] The coefficients of these polynomials are from the field Zq where q is an odd prime congruent to 1 mod 4. For n = 1024, GLYPH sets q = 59393, b = 16383, and k, the number of non-zero coefficients in the output of POLYHASH, equal to 16. [14] (In the earlier GLP scheme, k was 32 for both values of n. [10]) The security of the signature scheme is closely tied to the relative sizes of n, q, b, and k. Details on setting these parameters can be found in the cited references. [13] [10] [14]

As noted above, the polynomial Φ(x) which defines the ring of polynomials used will be x^n + 1. Finally, a(x) will be a randomly chosen and fixed polynomial with coefficients from the set { -(q-1)/2, ..., (q-1)/2 }. The polynomial a(x) should be chosen in a "nothing up my sleeve" manner, such as by one-way hashing the output of a true random number generator (TRNG) or by using the digital expansion of a well known mathematical constant such as π or e. All signers and verifiers of signatures will know n, q, b, k, Φ(x), a(x) and β = b - k.
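
For concreteness, the GLYPH parameters quoted above, together with one hypothetical nothing-up-my-sleeve derivation of a(x) from a public seed, might be written as follows. This reuses the `centered` helper from the ring sketch; GLYPH does not mandate this exact expansion, and the small modulo bias is ignored for brevity.

```python
import hashlib

# GLYPH parameters for n = 1024, as quoted above.
N, Q, B, K = 1024, 59393, 16383, 16
BETA = B - K   # the rejection bound beta = b - k

def public_poly_a(seed, n=N, q=Q):
    """Derive the fixed public polynomial a(x) from a public seed in a
    nothing-up-my-sleeve manner (one illustrative choice)."""
    stream = hashlib.shake_256(seed).digest(4 * n)
    return [centered(int.from_bytes(stream[4 * i:4 * i + 4], "big"), q)
            for i in range(n)]
```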

Public key generation

An entity wishing to sign messages generates its public key through the following steps:

  1. Generate two small polynomials s(x) and e(x) with coefficients chosen uniformly from the set {-1, 0, 1} (following GLP and GLYPH, the secret key polynomials are ternary)
  2. Compute t(x) = a(x)·s(x) + e(x)
  3. Distribute t(x) as the entity's public key

The polynomials s(x) and e(x) serve as the private key and t(x) is the corresponding public key. The security of this signature scheme is based on the following problem: given a polynomial t(x), find small polynomials f1(x) and f2(x) such that a(x)·f1(x) + f2(x) = t(x).

If this problem is difficult to solve, then the signature scheme will be difficult to forge. See the articles on ring learning with errors and ideal lattice cryptography for more details on the theoretical difficulty of this problem.
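
Using the helpers sketched earlier, key generation can be expressed as the following sketch; it is an illustration of the steps listed above, not a vetted implementation.

```python
def keygen(a, n=N, q=Q):
    """Sketch of GLYPH-style key generation (assumes the ring and sampling
    helpers from the earlier sketches are in scope)."""
    s = sample_small_uniform(n, 1)           # step 1: ternary secret s(x)
    e = sample_small_uniform(n, 1)           # step 1: ternary secret e(x)
    t = ring_add(ring_mul(a, s, q), e, q)    # step 2: t(x) = a(x)*s(x) + e(x)
    return (s, e), t                         # step 3: publish t, keep (s, e)
```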

Signature generation

Following GLYPH, [14] to sign a message m expressed as a bit string, the signing entity does the following:

  1. Generate two small polynomials y1(x) and y2(x) with coefficients from the set {-b, ..., -1, 0, 1, ..., b}
  2. Compute w(x) = a(x)·y1(x) + y2(x)
  3. Map w(x) into a bit string ω
  4. Compute c(x) = POLYHASH(ω | m) (This is a polynomial with k non-zero coefficients. The "|" denotes concatenation of strings)
  5. Compute z1(x) = s(x)·c(x) + y1(x)
  6. Compute z2(x) = e(x)·c(x) + y2(x)
  7. If the infinity norm of z1(x) or z2(x) exceeds β = (b - k), go to step 1. (This is the rejection sampling step noted above)
  8. The signature is the triple of polynomials c(x), z1(x) and z2(x)
  9. Transmit the message along with c(x), z1(x) and z2(x) to the verifier.
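
Putting the pieces together, signing with rejection sampling might be sketched as below. The polynomial-to-bytes helper `encode` is an assumption, since the scheme description does not fix how w(x) is mapped to the bit string ω.

```python
def encode(f):
    """Map a polynomial to a byte string (an assumed, illustrative encoding)."""
    return b"".join(c.to_bytes(4, "big", signed=True) for c in f)

def sign(message, a, sk, n=N, q=Q, b=B, k=K):
    """Sketch of GLYPH-style signing (assumes the earlier helper sketches)."""
    s, e = sk
    while True:
        y1 = sample_small_uniform(n, b)              # step 1
        y2 = sample_small_uniform(n, b)
        w = ring_add(ring_mul(a, y1, q), y2, q)      # step 2: w = a*y1 + y2
        c = polyhash(encode(w) + message, n, k)      # steps 3-4: POLYHASH(omega | m)
        z1 = ring_add(ring_mul(s, c, q), y1, q)      # step 5: z1 = s*c + y1
        z2 = ring_add(ring_mul(e, c, q), y2, q)      # step 6: z2 = e*c + y2
        if accepted(z1, z2, b, k):                   # step 7: rejection sampling
            return c, z1, z2                         # step 8: the signature
```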

Signature verification

Following GLYPH, [14] to verify a message m expressed as a bit string, the verifying entity must possess the signer's public key (t(x)), the signature (c(x), z1(x), z2(x)), and the message m. The verifier does the following:

  1. Verify that the infinity norms of z1(x) and z2(x) are ≤ β; if not, reject the signature.
  2. Compute w'(x) = a(x)·z1(x) + z2(x) - t(x)·c(x)
  3. Map w'(x) into a bit string ω'
  4. Compute c'(x) = POLYHASH(ω' | m)
  5. If c'(x) ≠ c(x) reject the signature, otherwise accept the signature as valid.
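
Verification mirrors signing. The sketch below reuses the same assumed helpers and the same encoding of w'(x) that was used for w(x) when signing.

```python
def verify(message, signature, a, t, n=N, q=Q, b=B, k=K):
    """Sketch of GLYPH-style verification (assumes the earlier sketches)."""
    c, z1, z2 = signature
    if not accepted(z1, z2, b, k):                   # step 1: norm bound
        return False
    tc = ring_mul(t, c, q)
    w_prime = ring_add(ring_add(ring_mul(a, z1, q), z2, q),
                       [-x for x in tc], q)          # step 2: w' = a*z1 + z2 - t*c
    return polyhash(encode(w_prime) + message, n, k) == c   # steps 3-5

# End-to-end check, assuming all of the sketches above are in scope:
# a = public_poly_a(b"example seed")
# sk, t = keygen(a)
# sig = sign(b"a message", a, sk)
# assert verify(b"a message", sig, a, t)
```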

Notice that:

a(x)·z1(x) + z2(x) - t(x)·c(x) = a(x)·[s(x)·c(x) + y1(x)] + z2(x) - [a(x)·s(x) + e(x)]·c(x)

= a(x)·y1(x) + z2(x) - e(x)·c(x)

= a(x)·y1(x) + e(x)·c(x) + y2(x) - e(x)·c(x)

= a(x)·y1(x) + y2(x) = w(x) (as defined above)

This brief derivation shows that w'(x) = w(x), and therefore c'(x) = c(x), whenever the signature has not been tampered with.

Further developments

The GLYPH signature scheme described in this article closely follows the work of Lyubashevsky, Güneysu and Pöppelmann from 2011 and 2012. A number of other variations of their work have followed. [15] [16]

Another approach to signatures based on lattices over rings is a variant of the patented NTRU family of lattice-based cryptography. The primary example of this approach is a signature known as the Bimodal Lattice Signature Scheme (BLISS). It was developed by Ducas, Durmus, Lepoint and Lyubashevsky and documented in their paper "Lattice Signatures and Bimodal Gaussians". [17] See the article on the BLISS signature scheme for more information.

References

  1. Dahmen-Lhuissier, Sabine. "ETSI - Quantum-Safe Cryptography". ETSI. Retrieved 2015-07-05.
  2. Shah, Agam. "Quantum computing breakthrough claim from IBM". Archived from the original on 2015-09-23. Retrieved 2015-06-01.
  3. Markoff, John (2015-03-04). "Researchers Report Milestone in Developing Quantum Computer". The New York Times. ISSN 0362-4331. Retrieved 2015-07-05.
  4. Beckman, David; Chari, Amalavoyal N.; Devabhaktuni, Srikrishna; Preskill, John (1996). "Efficient Networks for Quantum Factoring". Physical Review A. 54 (2): 1034–1063. arXiv:quant-ph/9602016. Bibcode:1996PhRvA..54.1034B. doi:10.1103/PhysRevA.54.1034. ISSN 1050-2947. PMID 9913575. S2CID 2231795.
  5. Smolin, John A.; Smith, Graeme; Vargo, Alexander (July 11, 2013). "Oversimplifying quantum factoring". Nature. 499 (7457): 163–165. arXiv:1301.7007. Bibcode:2013Natur.499..163S. doi:10.1038/nature12290. ISSN 0028-0836. PMID 23846653. S2CID 4422110.
  6. "Introduction". pqcrypto.org. Retrieved 2015-07-05.
  7. "The Learning with Errors Problem" (PDF). www.cims.nyu.edu. Retrieved 2015-05-24.
  8. Lyubashevsky, Vadim; Peikert, Chris; Regev, Oded (2010). "On Ideal Lattices and Learning with Errors over Rings". In Gilbert, Henri (ed.). Advances in Cryptology – EUROCRYPT 2010. Lecture Notes in Computer Science. Vol. 6110. pp. 1–23. CiteSeerX 10.1.1.297.6108. doi:10.1007/978-3-642-13190-5_1. ISBN 978-3-642-13189-9.
  9. "What does GCHQ's "cautionary tale" mean for lattice cryptography?". www.cc.gatech.edu. Archived from the original on 2015-07-06. Retrieved 2015-07-05.
  10. Güneysu, Tim; Lyubashevsky, Vadim; Pöppelmann, Thomas (2012). "Practical Lattice-Based Cryptography: A Signature Scheme for Embedded Systems". In Prouff, Emmanuel; Schaumont, Patrick (eds.). Cryptographic Hardware and Embedded Systems – CHES 2012. Lecture Notes in Computer Science. Vol. 7428. Springer Berlin Heidelberg. pp. 530–547. doi:10.1007/978-3-642-33027-8_31. ISBN 978-3-642-33026-1.
  11. Micciancio, Daniele (1998). "The shortest vector in a lattice is hard to approximate to within some constant". Proc. 39th Symposium on Foundations of Computer Science: 92–98.
  12. Lyubashevsky, Vadim (2009). "Fiat-Shamir with Aborts: Applications to Lattice and Factoring-Based Signatures". In Matsui, Mitsuru (ed.). Advances in Cryptology – ASIACRYPT 2009. Lecture Notes in Computer Science. Vol. 5912. Springer Berlin Heidelberg. pp. 598–616. doi:10.1007/978-3-642-10366-7_35. ISBN 978-3-642-10365-0.
  13. Lyubashevsky, Vadim (2011). "Lattice Signatures Without Trapdoors". Cryptology ePrint Archive.
  14. Chopra, Arjun (2017). "GLYPH: A New Instantiation of the GLP Digital Signature Scheme" (PDF). International Association for Cryptologic Research ePrint Archive. Archived from the original on 28 August 2017. Retrieved 26 August 2017.
  15. "Cryptology ePrint Archive: Report 2013/838". eprint.iacr.org. Retrieved 2016-01-17.
  16. "Cryptology ePrint Archive: Report 2015/755". eprint.iacr.org. Retrieved 2016-01-17.
  17. "Cryptology ePrint Archive: Report 2013/383". eprint.iacr.org. Retrieved 2016-01-17.