In post-quantum cryptography, ring learning with errors (RLWE) is a computational problem which serves as the foundation of new cryptographic algorithms, such as NewHope, designed to protect against cryptanalysis by quantum computers and also to provide the basis for homomorphic encryption. Public-key cryptography relies on the construction of mathematical problems that are believed to be hard to solve if no further information is available, but are easy to solve if some information used in the problem construction is known. Some problems of this sort that are currently used in cryptography are at risk of attack if sufficiently large quantum computers can ever be built, so resistant problems are sought.
RLWE is more properly called learning with errors over rings and is simply the larger learning with errors (LWE) problem specialized to polynomial rings over finite fields. [1] Because of the presumed difficulty of solving the RLWE problem even on a quantum computer, RLWE based cryptography may form the fundamental base for public-key cryptography in the future, just as the integer factorization and discrete logarithm problems have served as the base for public key cryptography since the early 1980s. [2] An important feature of basing cryptography on the ring learning with errors problem is the fact that the solution to the RLWE problem can be used to solve a version of the shortest vector problem (SVP) in a lattice (a polynomial-time reduction from this SVP problem to the RLWE problem has been presented [1]).
The security of modern cryptography, in particular public-key cryptography, is based on the assumed intractability of solving certain computational problems if the size of the problem is large enough and the instance of the problem to be solved is chosen randomly. The classic example that has been used since the 1970s is the integer factorization problem. It is believed that it is computationally intractable to factor the product of two prime numbers if those prime numbers are large enough and chosen at random. [3] As of 2015, research had led to the factorization of the product of two 384-bit primes but not the product of two 512-bit primes. Integer factorization forms the basis of the widely used RSA cryptographic algorithm.
The ring learning with errors (RLWE) problem is built on the arithmetic of polynomials with coefficients from a finite field. [1] A typical polynomial a(x) is expressed as:

a(x) = a_0 + a_1x + a_2x^2 + ... + a_{n-2}x^{n-2} + a_{n-1}x^{n-1}
Polynomials can be added and multiplied in the usual fashion. In the RLWE context the coefficients of the polynomials and all operations involving those coefficients will be done in a finite field, typically the field F_q = Z/qZ for a prime integer q. The set of polynomials over a finite field with the operations of addition and multiplication forms an infinite polynomial ring (F_q[x]). The RLWE context works with a finite quotient ring of this infinite ring. The quotient ring is typically the finite quotient (factor) ring formed by reducing all of the polynomials in F_q[x] modulo an irreducible polynomial Φ(x). This finite quotient ring can be written as F_q[x]/Φ(x), though many authors write Z_q[x]/Φ(x). [1]
If the degree of the polynomial Φ(x) is n, the quotient ring becomes the ring of polynomials of degree less than n modulo Φ(x) with coefficients from F_q. The values n and q, together with the polynomial Φ(x), partially define the mathematical context for the RLWE problem.
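As a concrete illustration, here is a minimal Python sketch of arithmetic in the quotient ring Z_q[x]/(x^n + 1), a common choice of Φ(x); the function names and the toy values of n and q are illustrative assumptions, not taken from any particular scheme:

```python
# Toy arithmetic in Z_q[x]/(x^n + 1).  Polynomials are lists of n
# coefficients; n and q are illustrative values, far too small for security.
n, q = 8, 17

def add(a, b):
    """Add two polynomials coefficient-wise, mod q."""
    return [(x + y) % q for x, y in zip(a, b)]

def mul(a, b):
    """Multiply two polynomials mod x^n + 1 and mod q.

    Reduction uses x^n = -1, so a product term x^(i+j) with i + j >= n
    wraps around to x^(i+j-n) with its sign flipped.
    """
    res = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = (i + j) % n
            sign = -1 if i + j >= n else 1
            res[k] = (res[k] + sign * ai * bj) % q
    return res
```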
Another concept necessary for the RLWE problem is the idea of "small" polynomials with respect to some norm. The typical norm used in the RLWE problem is known as the infinity norm (also called the uniform norm). The infinity norm of a polynomial is simply the largest coefficient of the polynomial when these coefficients are viewed as integers. Hence, ‖a(x)‖_∞ = b states that the infinity norm of the polynomial a(x) is b. Thus b is the largest coefficient of a(x).
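A small sketch of this norm, using the convention that coefficients in Z_q are read as centered integers in (−q/2, q/2]; the helper name is an illustrative assumption:

```python
def infinity_norm(a, q):
    """Infinity norm of a polynomial over Z_q: the largest coefficient
    in absolute value, with each coefficient viewed as a centered
    integer in (-q/2, q/2]."""
    def centered(c):
        c %= q
        return c - q if c > q // 2 else c
    return max(abs(centered(c)) for c in a)

# With q = 17 the coefficient 16 represents -1, so it contributes 1:
assert infinity_norm([16, 2, 0], 17) == 2
```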
The final concept necessary to understand the RLWE problem is the generation of random polynomials in Z_q[x]/Φ(x) and the generation of "small" polynomials. A random polynomial is easily generated by simply randomly sampling the n coefficients of the polynomial from F_q, where F_q is typically represented as the set {−(q−1)/2, ..., −1, 0, 1, ..., (q−1)/2}.
Randomly generating a "small" polynomial is done by generating the coefficients of the polynomial from F_q in a way that either guarantees or makes very likely small coefficients. When q is a prime integer, there are two common ways to do this: using uniform sampling, in which the coefficients of the small polynomial are uniformly sampled from a set of small coefficients {−b, ..., b}; and using discrete Gaussian sampling, in which the coefficients are randomly chosen according to a discrete Gaussian distribution centered on 0 with a small standard deviation.
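Both approaches might be sketched as follows; the bound b and standard deviation sigma are illustrative, and a production implementation would use a proper discrete Gaussian sampler rather than rounding a continuous one:

```python
import random

def random_poly(n, q):
    """Uniformly random element of Z_q[x]/Phi(x)."""
    return [random.randrange(q) for _ in range(n)]

def small_poly_uniform(n, b):
    """'Small' polynomial via uniform sampling from {-b, ..., b}."""
    return [random.randint(-b, b) for _ in range(n)]

def small_poly_gaussian(n, sigma):
    """'Small' polynomial with coefficients rounded from a Gaussian
    centered on 0; an illustration only, not a secure sampler."""
    return [round(random.gauss(0, sigma)) for _ in range(n)]
```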
The RLWE problem can be stated in two different ways: a "search" version and a "decision" version. Both begin with the same construction. Let a_i(x) be a set of random but known polynomials from F_q[x]/Φ(x) with coefficients from all of F_q, let e_i(x) be a set of small random and unknown polynomials relative to a bound b in the ring F_q[x]/Φ(x), let s(x) be a small unknown polynomial relative to a bound b in the ring F_q[x]/Φ(x), and compute b_i(x) = (a_i(x)·s(x)) + e_i(x).
The search version entails finding the unknown polynomial s(x) given the list of polynomial pairs (a_i(x), b_i(x)).
The decision version of the problem can be stated as follows: given a list of polynomial pairs (a_i(x), b_i(x)), determine whether the b_i(x) polynomials were constructed as b_i(x) = (a_i(x)·s(x)) + e_i(x) or were generated randomly from F_q[x]/Φ(x) with coefficients from all of F_q.
The difficulty of this problem is parameterized by the choice of the quotient polynomial (Φ(x)), its degree (n), the field (F_q), and the smallness bound (b). In many RLWE based public key algorithms the private key will be a pair of small polynomials s(x) and e(x). The corresponding public key will be a pair of polynomials: a(x), selected randomly from F_q[x]/Φ(x), and the polynomial t(x) = (a(x)·s(x)) + e(x). Given a(x) and t(x), it should be computationally infeasible to recover the polynomial s(x).
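Reusing the toy helpers from the sketches above (mul, add, random_poly, small_poly_uniform), key generation of this shape might look like the following; the parameter values are illustrative only:

```python
# Toy RLWE-style key generation, reusing the helpers sketched above.
n, q, b = 8, 17, 1   # illustrative parameters, far too small for security

s = small_poly_uniform(n, b)   # private: small secret polynomial s(x)
e = small_poly_uniform(n, b)   # private: small error polynomial e(x)
a = random_poly(n, q)          # public: uniformly random polynomial a(x)
t = add(mul(a, s), e)          # public: t(x) = a(x)*s(x) + e(x)

public_key = (a, t)
private_key = s
# The RLWE assumption is that (a, t) is computationally indistinguishable
# from a uniformly random pair, so s cannot feasibly be recovered from it.
```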
In cases where the polynomial Φ(x) is a cyclotomic polynomial, the difficulty of solving the search version of the RLWE problem is equivalent to finding a short vector (but not necessarily the shortest vector) in an ideal lattice formed from elements of Z[x]/Φ(x) represented as integer vectors. [1] This problem is commonly known as the approximate shortest vector problem (α-SVP), and it is the problem of finding a vector shorter than α times the shortest vector. The authors of the proof for this equivalence write: "... we give a quantum reduction from approximate SVP (in the worst case) on ideal lattices in R to the search version of ring-LWE, where the goal is to recover the secret s ∈ R_q (with high probability, for any s) from arbitrarily many noisy products." [1]
In that quote, the ring R is Z[x]/Φ(x) and the ring R_q is Z_q[x]/Φ(x).
The α-SVP in regular lattices is known to be NP-hard due to work by Daniele Micciancio in 2001, although not for the values of α required for a reduction to the general learning with errors problem. [5] However, there is not yet a proof to show that the difficulty of the α-SVP for ideal lattices is equivalent to the average α-SVP. Rather we have a proof that if there are any α-SVP instances that are hard to solve in ideal lattices then the RLWE problem will be hard in random instances. [1]
Regarding the difficulty of shortest vector problems in ideal lattices, researcher Michael Schneider writes, "So far there is no SVP algorithm making use of the special structure of ideal lattices. It is widely believed that solving SVP (and all other lattice problems) in ideal lattices is as hard as in regular lattices." [6] These problems on regular lattices are provably NP-hard. [5] There are, however, a minority of researchers who do not believe that ideal lattices share the same security properties as regular lattices. [7]
Peikert believes that these security equivalences make the RLWE problem a good basis for future cryptography. He writes: "There is a mathematical proof that the only way to break the cryptosystem (within some formal attack model) on its random instances is by being able to solve the underlying lattice problem in the worst case" (emphasis in the original). [8]
A major advantage that RLWE based cryptography has over the original learning with errors (LWE) based cryptography is found in the size of the public and private keys. RLWE key sizes are roughly the square root of LWE key sizes. [1] For 128 bits of security an RLWE cryptographic algorithm would use public keys around 7000 bits in length. [9] The corresponding LWE scheme would require public keys of 49 million bits for the same level of security. [1] On the other hand, RLWE keys are larger than the key sizes for currently used public key algorithms like RSA and Elliptic Curve Diffie-Hellman, which require public key sizes of 3072 bits and 256 bits, respectively, to achieve a 128-bit level of security. From a computational standpoint, however, RLWE algorithms have been shown to be the equal of or better than existing public key systems. [10]
Three groups of RLWE cryptographic algorithms exist: key exchanges, digital signatures, and homomorphic encryption. Each is discussed in turn below.
The fundamental idea of using LWE and Ring LWE for key exchange was proposed and filed at the University of Cincinnati in 2011 by Jintai Ding. The basic idea comes from the associativity of matrix multiplications, and the errors are used to provide the security. The paper [11] appeared in 2012 after a provisional patent application was filed in 2012.
In 2014, Peikert [12] presented a key transport scheme following the same basic idea as Ding's, which also makes use of Ding's idea of sending an additional 1-bit signal for rounding. An RLWE version of the classic MQV variant of a Diffie-Hellman key exchange was later published by Zhang et al. [13] The security of both key exchanges is directly related to the problem of finding approximate short vectors in an ideal lattice.
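To make the associativity idea concrete, the following sketch reuses the toy ring helpers and parameters from above and omits the rounding/reconciliation signal entirely; it shows only how the two parties arrive at values that agree up to small noise:

```python
# Toy approximate key agreement in the style described above.
a = random_poly(n, q)                          # shared public polynomial

s_alice, e_alice = small_poly_uniform(n, b), small_poly_uniform(n, b)
s_bob,   e_bob   = small_poly_uniform(n, b), small_poly_uniform(n, b)

p_alice = add(mul(a, s_alice), e_alice)        # sent Alice -> Bob
p_bob   = add(mul(a, s_bob), e_bob)            # sent Bob -> Alice

k_alice = mul(p_bob, s_alice)    # = a*s_alice*s_bob + e_bob*s_alice
k_bob   = mul(p_alice, s_bob)    # = a*s_alice*s_bob + e_alice*s_bob

# k_alice and k_bob differ only by the small cross terms, so a
# reconciliation step (e.g. Ding's 1-bit rounding signal, omitted here)
# can extract identical key bits on both sides.
```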
An RLWE version of the classic Feige-Fiat-Shamir identification protocol was created and converted to a digital signature in 2011 by Lyubashevsky. [14] The details of this signature were extended in 2012 by Güneysu, Lyubashevsky, and Pöppelmann and published in their paper "Practical Lattice-Based Cryptography: A Signature Scheme for Embedded Systems." [15] These papers laid the groundwork for a variety of recent signature algorithms, some based directly on the ring learning with errors problem and some which are not tied to the same hard RLWE problems. [16]
Homomorphic encryption is a type of encryption that allows computations to be performed on encrypted data without first having to decrypt it. The purpose of homomorphic encryption is to allow the computations on sensitive data to occur on computing devices that should not be trusted with the data. These computing devices are allowed to process only the ciphertext output by a homomorphic encryption scheme. In 2011, Brakerski and Vaikuntanathan published "Fully Homomorphic Encryption from Ring-LWE and Security for Key Dependent Messages", which builds a homomorphic encryption scheme directly on the RLWE problem. [17]
Shor's algorithm is a quantum algorithm for finding the prime factors of an integer. It was developed in 1994 by the American mathematician Peter Shor. It is one of the few known quantum algorithms with compelling potential applications and strong evidence of superpolynomial speedup compared to the best known classical (non-quantum) algorithms. On the other hand, factoring numbers of practical significance requires far more qubits than will be available in the near future. Another concern is that noise in quantum circuits may undermine results, requiring additional qubits for quantum error correction.
In computer science, a one-way function is a function that is easy to compute on every input, but hard to invert given the image of a random input. Here, "easy" and "hard" are to be understood in the sense of computational complexity theory, specifically the theory of polynomial time problems. Not being one-to-one is not considered sufficient for a function to be called one-way.
In geometry and group theory, a lattice in the real coordinate space R^n is an infinite set of points in this space with the properties that coordinate-wise addition or subtraction of two points in the lattice produces another lattice point, that the lattice points are all separated by some minimum distance, and that every point in the space is within some maximum distance of a lattice point. Closure under addition and subtraction means that a lattice must be a subgroup of the additive group of the points in the space, and the requirements of minimum and maximum distance can be summarized by saying that a lattice is a Delone set. More abstractly, a lattice can be described as a free abelian group of dimension n which spans the vector space R^n. For any basis of R^n, the subgroup of all linear combinations with integer coefficients of the basis vectors forms a lattice, and every lattice can be formed from a basis in this way. A lattice may be viewed as a regular tiling of a space by a primitive cell.
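A toy check of the closure property, with an arbitrarily chosen two-dimensional basis (an illustrative assumption):

```python
# A lattice is the set of all integer combinations of a basis.
basis = [(2, 0), (1, 2)]   # arbitrary toy basis for a lattice in R^2

def point(coeffs):
    """The lattice point coeffs[0]*basis[0] + coeffs[1]*basis[1]."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, basis)) for i in range(2))

p = point((1, 2))    # (4, 4)
r = point((-2, 1))   # (-3, 2)
# Closure under addition: the coordinate-wise sum is the lattice point
# whose coefficients are the sums of the original coefficients.
assert tuple(x + y for x, y in zip(p, r)) == point((-1, 3))
```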
The Paillier cryptosystem, invented by and named after Pascal Paillier in 1999, is a probabilistic asymmetric algorithm for public key cryptography. The problem of computing n-th residue classes is believed to be computationally difficult. The decisional composite residuosity assumption is the intractability hypothesis upon which this cryptosystem is based.
The NTRUEncrypt public key cryptosystem, also known as the NTRU encryption algorithm, is an NTRU lattice-based alternative to RSA and elliptic curve cryptography (ECC) and is based on the shortest vector problem in a lattice.
The Lenstra–Lenstra–Lovász (LLL) lattice basis reduction algorithm is a polynomial time lattice reduction algorithm invented by Arjen Lenstra, Hendrik Lenstra and László Lovász in 1982. Given a basis B = {b_1, b_2, ..., b_d} with n-dimensional integer coordinates, for a lattice L with d ≤ n, the LLL algorithm calculates an LLL-reduced (short, nearly orthogonal) lattice basis in time O(d^5 n log^3 B), where B is the largest length of the b_i under the Euclidean norm, that is, B = max(‖b_1‖_2, ..., ‖b_d‖_2).
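LLL generalizes the classical two-dimensional Lagrange–Gauss reduction, a much simpler procedure sketched below; this is an illustration of basis reduction, not the LLL algorithm itself:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lagrange_gauss(u, v):
    """Two-dimensional lattice basis reduction: repeatedly subtract the
    rounded projection of the longer vector onto the shorter one, a
    vector analogue of the Euclidean algorithm."""
    if dot(u, u) > dot(v, v):
        u, v = v, u
    while True:
        m = round(dot(u, v) / dot(u, u))       # nearest-integer projection
        v = tuple(b - m * a for a, b in zip(u, v))
        if dot(u, u) <= dot(v, v):
            return u, v
        u, v = v, u

print(lagrange_gauss((1, 0), (10, 1)))  # ((1, 0), (0, 1))
```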
Homomorphic encryption is a form of encryption that allows computations to be performed on encrypted data without first having to decrypt it. The resulting computations are left in an encrypted form which, when decrypted, result in an output that is identical to that produced had the operations been performed on the unencrypted data. Homomorphic encryption can be used for privacy-preserving outsourced storage and computation. This allows data to be encrypted and outsourced to commercial cloud environments for processing, all while encrypted.
In computational complexity theory, a computational hardness assumption is the hypothesis that a particular problem cannot be solved efficiently. It is not known how to prove (unconditional) hardness for essentially any useful problem. Instead, computer scientists rely on reductions to formally relate the hardness of a new or complicated problem to a computational hardness assumption about a problem that is better-understood.
CEILIDH is a public key cryptosystem based on the discrete logarithm problem in an algebraic torus. This idea was first introduced by Alice Silverberg and Karl Rubin in 2003; Silverberg named CEILIDH after her cat. The main advantage of the system is the reduced size of the keys for the same security over basic schemes.
Lattice-based cryptography is the generic term for constructions of cryptographic primitives that involve lattices, either in the construction itself or in the security proof. Lattice-based constructions support important standards of post-quantum cryptography. Unlike more widely used and known public-key schemes such as the RSA, Diffie-Hellman or elliptic-curve cryptosystems — which could, theoretically, be defeated using Shor's algorithm on a quantum computer — some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.
In computer science, lattice problems are a class of optimization problems related to mathematical objects called lattices. The conjectured intractability of such problems is central to the construction of secure lattice-based cryptosystems: lattice problems are an example of NP-hard problems which have been shown to be average-case hard, providing a test case for the security of cryptographic algorithms. In addition, some lattice problems which are worst-case hard can be used as a basis for extremely secure cryptographic schemes. The use of worst-case hardness in such schemes makes them among the very few schemes that are very likely secure even against quantum computers. For applications in such cryptosystems, lattices over vector spaces or free modules are generally considered.
In mathematics the Function Field Sieve is one of the most efficient algorithms to solve the Discrete Logarithm Problem (DLP) in a finite field. It has heuristic subexponential complexity. Leonard Adleman developed it in 1994 and then elaborated it together with M. D. Huang in 1999. Previous work includes the work of D. Coppersmith about the DLP in fields of characteristic two.
In cryptography, learning with errors (LWE) is a mathematical problem that is widely used to create secure encryption algorithms. It is based on the idea of representing secret information as a set of equations with errors. In other words, LWE is a way to hide the value of a secret by introducing noise to it. In more technical terms, it refers to the computational problem of inferring a linear n-ary function f over a finite ring from given samples y_i = f(x_i), some of which may be erroneous. The LWE problem is conjectured to be hard to solve, and thus to be useful in cryptography.
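A toy LWE sample generator might look like this; the parameters, and the bounded error used in place of a discrete Gaussian, are illustrative simplifications:

```python
import random

n, q = 4, 97                                     # toy dimensions, not secure
s = [random.randrange(q) for _ in range(n)]      # the hidden secret vector

def lwe_sample():
    """One noisy equation: b = <a, s> + e (mod q)."""
    a = [random.randrange(q) for _ in range(n)]  # public random vector
    e = random.randint(-2, 2)                    # small error ("noise")
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

samples = [lwe_sample() for _ in range(10)]
# Recovering s from such samples is the LWE problem; without the error
# terms e, plain Gaussian elimination over Z_q would recover s easily.
```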
Post-quantum cryptography (PQC), sometimes referred to as quantum-proof, quantum-safe, or quantum-resistant, is the development of cryptographic algorithms that are currently thought to be secure against a cryptanalytic attack by a quantum computer. Most widely-used public-key algorithms rely on the difficulty of one of three mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem. All of these problems could be easily solved on a sufficiently powerful quantum computer running Shor's algorithm or even faster and less demanding alternatives.
In discrete mathematics, ideal lattices are a special class of lattices and a generalization of cyclic lattices. Ideal lattices naturally occur in many parts of number theory, but also in other areas. In particular, they have a significant place in cryptography. Micciancio defined a generalization of cyclic lattices as ideal lattices. They can be used in cryptosystems to decrease by a square root the number of parameters necessary to describe a lattice, making them more efficient. Ideal lattices are a new concept, but similar lattice classes have been used for a long time. For example, cyclic lattices, a special case of ideal lattices, are used in NTRUEncrypt and NTRUSign.
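One way to see the square-root saving is that in a ring such as Z[x]/(x^n + 1), a single polynomial determines an entire n × n matrix. The sketch below (an illustration, with an arbitrary toy polynomial) builds that negacyclic matrix row by row:

```python
# In Z[x]/(x^n + 1), n integers (one polynomial) stand in for the n*n
# integers of a general lattice basis.
n = 4
a = [3, 1, 4, 1]   # coefficients of a toy polynomial a(x)

rows, row = [], list(a)
for _ in range(n):
    rows.append(list(row))
    # Multiplying by x rotates the coefficients and negates the one
    # that wraps around, since x^n = -1 in this ring.
    row = [-row[-1]] + row[:-1]

for r in rows:
    print(r)   # each row holds the coefficients of x^i * a(x)
```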
Supersingular isogeny Diffie–Hellman key exchange is an insecure proposal for a post-quantum cryptographic algorithm to establish a secret key between two parties over an untrusted communications channel. It is analogous to the Diffie–Hellman key exchange, but is based on walks in a supersingular isogeny graph and was designed to resist cryptanalytic attack by an adversary in possession of a quantum computer. Before it was broken, SIDH boasted one of the smallest key sizes of all post-quantum key exchanges; with compression, SIDH used 2688-bit public keys at a 128-bit quantum security level. SIDH also distinguishes itself from similar systems such as NTRU and Ring-LWE by supporting perfect forward secrecy, a property that prevents compromised long-term keys from compromising the confidentiality of old communication sessions. These properties seemed to make SIDH a natural candidate to replace Diffie–Hellman (DHE) and elliptic curve Diffie–Hellman (ECDHE), which are widely used in Internet communication. However, SIDH is vulnerable to a devastating key-recovery attack published in July 2022 and is therefore insecure. The attack does not require a quantum computer.
Digital signatures are a means to protect digital information from intentional modification and to authenticate the source of digital information. Public key cryptography provides a rich set of different cryptographic algorithms that create digital signatures. However, the primary public key signatures currently in use will become completely insecure if scientists are ever able to build a moderately sized quantum computer. Post-quantum cryptography is a class of cryptographic algorithms designed to be resistant to attack by a quantum computer. Several post-quantum digital signature algorithms based on hard problems in lattices are being created to replace the commonly used RSA and elliptic curve signatures. A subset of these lattice-based schemes are based on a problem known as ring learning with errors. Ring learning with errors based digital signatures are among the post-quantum signatures with the smallest public key and signature sizes.
In cryptography, a public key exchange algorithm is a cryptographic algorithm which allows two parties to create and share a secret key, which they can use to encrypt messages between themselves. The ring learning with errors key exchange (RLWE-KEX) is one of a new class of public key exchange algorithms that are designed to be secure against an adversary that possesses a quantum computer. This is important because some public key algorithms in use today will be easily broken by a quantum computer if such computers are implemented. RLWE-KEX is one of a set of post-quantum cryptographic algorithms which are based on the difficulty of solving certain mathematical problems involving lattices. Unlike older lattice based cryptographic algorithms, the RLWE-KEX is provably reducible to a known hard problem in lattices.
Short integer solution (SIS) and ring-SIS problems are two average-case problems that are used in lattice-based cryptography constructions. Lattice-based cryptography began in 1996 with a seminal work by Miklós Ajtai, who presented a family of one-way functions based on the SIS problem. He showed that it is secure in an average case if the shortest vector problem SVP_γ (where γ = n^c for some constant c > 0) is hard in a worst-case scenario.
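A sketch of what an SIS instance looks like; the dimensions, modulus, and bound beta are illustrative assumptions. Building the instance is easy, while finding a valid z is the hard problem:

```python
import random

n, m, q, beta = 4, 8, 97, 5
A = [[random.randrange(q) for _ in range(m)] for _ in range(n)]  # random matrix over Z_q

def is_sis_solution(z):
    """Check a candidate: z must be nonzero, short (infinity norm <= beta),
    and satisfy A*z = 0 (mod q)."""
    nonzero = any(z)
    short = max(abs(c) for c in z) <= beta
    in_kernel = all(sum(A[i][j] * z[j] for j in range(m)) % q == 0
                    for i in range(n))
    return nonzero and short and in_kernel
```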
HEAAN is an open source homomorphic encryption (HE) library which implements an approximate HE scheme proposed by Cheon, Kim, Kim and Song (CKKS). The first version of HEAAN was published on GitHub on 15 May 2016, and later a new version of HEAAN with a bootstrapping algorithm was released. Currently, the latest version is Version 2.1.
The abstract of Shor's original 1994 paper reads: "This paper gives Las Vegas algorithms for finding discrete logarithms and factoring integers on a quantum computer that take a number of steps which is polynomial in the input size, e.g., the number of digits of the integer to be factored. These two problems are generally considered hard on a classical computer and have been used as the basis of several proposed cryptosystems."