In discrete mathematics, ideal lattices are a special class of lattices and a generalization of cyclic lattices. [1] Ideal lattices naturally occur in many parts of number theory, but also in other areas. In particular, they have a significant place in cryptography, where Micciancio introduced them as a generalization of cyclic lattices. They can be used in cryptosystems to decrease by a square root the number of parameters necessary to describe a lattice, making them more efficient. Ideal lattices are a relatively new concept, but similar lattice classes have been used for a long time. For example, cyclic lattices, a special case of ideal lattices, are used in NTRUEncrypt and NTRUSign.
Ideal lattices also form the basis for quantum-computer-attack-resistant cryptography based on the Ring Learning with Errors problem. [2] These cryptosystems are provably secure under the assumption that the shortest vector problem (SVP) is hard in these ideal lattices.
In general terms, ideal lattices are lattices corresponding to ideals in rings of the form Z[x]/⟨f⟩ for some irreducible polynomial f of degree n. [1] All of the definitions of ideal lattices from prior work are instances of the following general notion: let R be a ring whose additive group is isomorphic to Z^n (i.e., it is a free Z-module of rank n), and let σ be an additive isomorphism mapping R to some lattice σ(R) in an n-dimensional real vector space (e.g., R^n). The family of ideal lattices for the ring R under the embedding σ is the set of all lattices σ(I), where I is an ideal in R. [3]
Let f ∈ Z[x] be a monic polynomial of degree n, and consider the quotient ring Z[x]/⟨f⟩.
Using the standard set of representatives {(g mod f) : g ∈ Z[x]} (the polynomials of degree less than n), and identifying polynomials with their coefficient vectors, the quotient ring Z[x]/⟨f⟩ is isomorphic (as an additive group) to the integer lattice Z^n, and any ideal I ⊆ Z[x]/⟨f⟩ defines a corresponding integer sublattice L(I) ⊆ Z^n.
An ideal lattice is an integer lattice L(B) ⊆ Z^n such that L(B) = {g mod f : g ∈ I} for some monic polynomial f of degree n and some ideal I ⊆ Z[x]/⟨f⟩.
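To make the identification concrete, here is a small Python sketch (my own illustration, not code from the literature): for a principal ideal I = ⟨g⟩ in Z[x]/⟨f⟩, the coefficient vectors of g, x·g, …, x^(n−1)·g reduced mod f span the corresponding sublattice of Z^n. The polynomials f and g below are arbitrary toy choices.

```python
def poly_mul_mod(a, b, f):
    """Multiply integer polynomials a, b (coefficient lists, lowest degree
    first) and reduce modulo the monic polynomial f of degree n."""
    n = len(f) - 1
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] += ai * bj
    # reduce: x^n = -(f_0 + f_1 x + ... + f_{n-1} x^{n-1}) since f is monic
    for d in range(len(prod) - 1, n - 1, -1):
        c, prod[d] = prod[d], 0
        for k in range(n):
            prod[d - n + k] -= c * f[k]
    return (prod + [0] * n)[:n]

def ideal_basis(g, f):
    """Basis of L(I) for the ideal I = <g>: the coefficient vectors of
    g, x*g, ..., x^(n-1)*g, each reduced modulo f."""
    n = len(f) - 1
    rows, cur = [], (g + [0] * n)[:n]
    for _ in range(n):
        rows.append(cur)
        cur = poly_mul_mod(cur, [0, 1], f)   # multiply by x and reduce mod f
    return rows

f = [1, 0, 0, 0, 1]        # f(x) = x^4 + 1 (monic, degree 4)
g = [2, 0, 1, 0]           # g(x) = x^2 + 2
for row in ideal_basis(g, f):
    print(row)
# prints [2, 0, 1, 0], [0, 2, 0, 1], [-1, 0, 2, 0], [0, -1, 0, 2]
```

Note how multiplication by x "wraps around" with a sign flip because x^4 = −1 in this ring; the four rows form a full-rank basis of the ideal lattice.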
It turns out that the relevant properties of f for the resulting hash function to be collision resistant are:
- f should be irreducible;
- the ring norm ‖g‖_f is not much bigger than ‖g‖_∞ for any polynomial g, in a quantitative sense.
The first property implies that every ideal of the ring Z[x]/⟨f⟩ defines a full-rank lattice in Z^n and plays a fundamental role in proofs.
Lemma: Every ideal I of Z[x]/⟨f⟩, where f is a monic, irreducible integer polynomial of degree n, is isomorphic to a full-rank lattice in Z^n.
Ding and Lindner [4] gave evidence that distinguishing ideal lattices from general ones can be done in polynomial time and showed that in practice randomly chosen lattices are never ideal. They only considered the case where the lattice has full rank, i.e. the basis consists of n linearly independent vectors. This is not a fundamental restriction because Lyubashevsky and Micciancio have shown that if a lattice is ideal with respect to an irreducible monic polynomial, then it has full rank, as given in the above lemma.
Algorithm: Identifying ideal lattices with full rank bases
Data: a full-rank basis B ∈ Z^{n×n}
Result: true and q, if B spans an ideal lattice with respect to q; otherwise false.
where M is a fixed transformation matrix appearing in the matrix product computed by the algorithm.
Using this algorithm, it can be seen that many lattices are not ideal lattices. For example, take two full-rank 2×2 bases that differ in a single entry: one spans an ideal lattice while the other does not. An explicit example of a basis that fails the test is given by Lyubashevsky and Micciancio. [5]
Performing the algorithm on this basis B: B is already in Hermite normal form, so the first step is not needed. Next, the determinant, the adjugate matrix, and finally the matrix product are computed. At this point the algorithm stops, because all but the last column of that product would have to be zero if B spanned an ideal lattice.
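The failure can also be checked directly from the definition. A full-rank lattice L(B) is an ideal lattice with respect to a monic polynomial f exactly when it is closed under the "multiply by x modulo f" map, i.e. when B⁻¹·C_f·B is an integer matrix, where C_f is the companion matrix of f. The following sketch (my own, with the hypothetical test bases B₁ = I and B₂ = diag(1, 2) and f = x² + 1) applies this criterion:

```python
from fractions import Fraction

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def mat_inv(A):
    """Invert a square matrix of Fractions via Gauss-Jordan elimination."""
    n = len(A)
    M = [list(map(Fraction, A[i])) + [Fraction(int(i == j)) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                M[r] = [a - M[r][col] * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

def is_ideal_wrt(B, f):
    """True iff the lattice generated by the columns of B is closed under
    the 'multiply by x mod f' map, i.e. inv(B) * C_f * B is integral,
    where C_f is the companion matrix of monic f = [f_0, ..., f_{n-1}, 1]."""
    n = len(B)
    C = [[Fraction(0)] * n for _ in range(n)]
    for i in range(1, n):
        C[i][i - 1] = Fraction(1)     # subdiagonal of the companion matrix
    for i in range(n):
        C[i][n - 1] -= f[i]           # last column holds -f_0, ..., -f_{n-1}
    Bq = [[Fraction(x) for x in row] for row in B]
    T = mat_mul(mat_inv(Bq), mat_mul(C, Bq))
    return all(x.denominator == 1 for row in T for x in row)

f = [1, 0, 1]                               # f(x) = x^2 + 1
print(is_ideal_wrt([[1, 0], [0, 1]], f))    # Z^2 is the whole ring: True
print(is_ideal_wrt([[1, 0], [0, 2]], f))    # not closed under mult. by x: False
```

The `Fraction` arithmetic keeps the integrality test exact; a floating-point version could misclassify bases with large entries.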
Micciancio [6] introduced the class of structured cyclic lattices, which correspond to ideals in polynomial rings Z[x]/⟨x^n − 1⟩, and presented the first provably secure one-way function based on the worst-case hardness of the restriction of Poly(n)-SVP to cyclic lattices. (The problem γ-SVP consists in computing a non-zero vector of a given lattice whose norm is no more than γ times larger than the norm of a shortest non-zero lattice vector.) At the same time, thanks to its algebraic structure, this one-way function enjoys high efficiency comparable to the NTRU scheme (quasi-linear evaluation time and storage cost). Subsequently, Lyubashevsky and Micciancio [5] and, independently, Peikert and Rosen [7] showed how to modify Micciancio's function to construct an efficient and provably secure collision resistant hash function. For this, they introduced the more general class of ideal lattices, which correspond to ideals in polynomial rings Z[x]/⟨f⟩. The collision resistance relies on the hardness of the restriction of Poly(n)-SVP to ideal lattices (called Poly(n)-Ideal-SVP). The average-case collision-finding problem is a natural computational problem called Ideal-SIS, which has been shown to be as hard as the worst-case instances of Ideal-SVP. Provably secure efficient signature schemes from ideal lattices have also been proposed, [1] [8] but constructing efficient provably secure public key encryption from ideal lattices was an interesting open problem.
The fundamental idea of using LWE and Ring LWE for key exchange was proposed and filed at the University of Cincinnati in 2011 by Jintai Ding, who provided a state-of-the-art description of a quantum-resistant key exchange using Ring LWE. The paper [9] appeared in 2012 after a provisional patent application was filed in 2012. In 2014, Peikert [10] presented a key transport scheme following the same basic idea as Ding's, also making use of Ding's new idea of sending an additional signal for rounding. A digital signature scheme using the same concepts had been developed several years earlier by Vadim Lyubashevsky in "Lattice Signatures Without Trapdoors." [11] Together, the work of Peikert and Lyubashevsky provides a suite of Ring-LWE based quantum-attack-resistant algorithms with the same security reductions.
The main usefulness of ideal lattices in cryptography stems from the fact that very efficient and practical collision resistant hash functions can be built based on the hardness of finding an approximate shortest vector in such lattices. [1] Peikert and Rosen, [7] as well as Lyubashevsky and Micciancio, independently constructed collision resistant hash functions based on ideal lattices (a generalization of cyclic lattices) and provided a fast and practical implementation. [3] These results paved the way for other efficient cryptographic constructions, including identification schemes and signatures.
Lyubashevsky and Micciancio [5] gave constructions of efficient collision resistant hash functions that can be proven secure based on the worst-case hardness of the shortest vector problem for ideal lattices. They defined hash function families as follows: given a ring R = Z_p[x]/⟨f⟩, where f is a monic, irreducible polynomial of degree n and p is an integer of order roughly n^2, generate m random elements a_1, …, a_m ∈ R, where m is a constant. The ordered m-tuple h = (a_1, …, a_m) determines the hash function. It maps elements in D^m, where D is a strategically chosen subset of R, to R. For an element b = (b_1, …, b_m) ∈ D^m, the hash is h(b) = a_1·b_1 + ⋯ + a_m·b_m. Here the size of the key (the hash function) is O(mn log p) = O(n log n), and each product a_i·b_i can be computed in time O(n log n log log n) by using the Fast Fourier Transform (FFT) [citation needed], for an appropriate choice of the polynomial f. Since m is a constant, hashing requires time O(n log n log log n). They proved that the hash function family is collision resistant by showing that if there is a polynomial-time algorithm that succeeds with non-negligible probability in finding distinct b, b′ ∈ D^m such that h(b) = h(b′) for a randomly chosen hash function h, then a certain problem called the “shortest vector problem” is solvable in polynomial time for every ideal of the ring Z[x]/⟨f⟩.
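A toy instantiation of this hash family (with hypothetical small parameters n = 8, m = 4, p = 67 and D = {0, 1}; real parameters are much larger, and the schoolbook multiplication below stands in for the FFT):

```python
import random

def poly_mul_mod(a, b, f, p):
    """Product a*b in Z_p[x]/<f>; coefficient lists, lowest degree first;
    f is monic of degree n, given as [f_0, ..., f_{n-1}, 1]."""
    n = len(f) - 1
    prod = [0] * (2 * n - 1)
    for i in range(n):
        for j in range(n):
            prod[i + j] = (prod[i + j] + a[i] * b[j]) % p
    for d in range(2 * n - 2, n - 1, -1):  # replace x^d via x^n = -sum f_k x^k
        c, prod[d] = prod[d], 0
        for k in range(n):
            prod[d - n + k] = (prod[d - n + k] - c * f[k]) % p
    return prod[:n]

def hash_lm(key, msg, f, p):
    """h(b_1, ..., b_m) = sum_i a_i * b_i in the ring Z_p[x]/<f>."""
    n = len(f) - 1
    acc = [0] * n
    for a_i, b_i in zip(key, msg):
        acc = [(x + y) % p for x, y in zip(acc, poly_mul_mod(a_i, b_i, f, p))]
    return acc

rng = random.Random(0)
n, m, p = 8, 4, 67                 # toy sizes; the theory wants p ~ n^2
f = [1] + [0] * (n - 1) + [1]      # f(x) = x^8 + 1, irreducible (8 = 2^3)
key = [[rng.randrange(p) for _ in range(n)] for _ in range(m)]  # the hash key
msg = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]  # D = {0, 1}
print(hash_lm(key, msg, f, p))     # the digest: n coefficients mod p
```

Note the key determines the function and the digest is a single ring element; the function is linear, which is what the security proof and the attacks below both exploit.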
Based on the work of Lyubashevsky and Micciancio in 2006, Micciancio and Regev [12] defined the following family of hash functions based on ideal lattices:
Here q and m are parameters, f is a vector in Z^n defining the structure, and A is a block-matrix with structured blocks A^(i) = F*·a^(i), where F is the matrix determined by f and F*·a denotes the matrix [a, F·a, …, F^{n−1}·a].
Finding short vectors in the associated lattice on the average (even with just inverse polynomial probability) is as hard as solving various lattice problems (such as approximate SVP and SIVP) in the worst case over ideal lattices, provided the vector f satisfies the following two properties:
- For any two unit vectors u, v, the vector [F*u]·v has small (say, polynomial in n, typically O(√n)) norm.
- The polynomial f(x) = x^n + f_n x^{n−1} + ⋯ + f_1 is irreducible over the integers, i.e., it does not factor into the product of integer polynomials of smaller degree.
The first property is satisfied by the vector f = (−1, 0, …, 0) corresponding to circulant matrices, because all the coordinates of [F*u]v are bounded by 1, and hence ‖[F*u]v‖ ≤ √n. However, the polynomial x^n − 1 corresponding to this f is not irreducible because it factors into (x − 1)(x^{n−1} + x^{n−2} + ⋯ + 1), and this is why collisions can be efficiently found. So f = (−1, 0, …, 0) is not a good choice to get collision resistant hash functions, but many other choices are possible. For example, some choices of f for which both properties are satisfied (and therefore result in collision resistant hash functions with worst-case security guarantees) are
- f = (1, …, 1) ∈ Z^n where n + 1 is prime, and
- f = (1, 0, …, 0) ∈ Z^n where n is a power of 2, corresponding to the polynomial x^n + 1.
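The collision-finding attack in the circulant case can be demonstrated directly: because x^n − 1 vanishes at x = 1, hashes of messages whose blocks are constant multiples of u = 1 + x + ⋯ + x^(n−1) collapse to a single linear equation modulo p, and the pigeonhole principle then yields a collision. A sketch with hypothetical toy parameters (n = 4, m = 3, p = 13, chosen only for illustration):

```python
import itertools
import random

def poly_mul_mod(a, b, f, p):
    """Product a*b in Z_p[x]/<f>, f monic given as [f_0, ..., f_{n-1}, 1]."""
    n = len(f) - 1
    prod = [0] * (2 * n - 1)
    for i in range(n):
        for j in range(n):
            prod[i + j] = (prod[i + j] + a[i] * b[j]) % p
    for d in range(2 * n - 2, n - 1, -1):
        c, prod[d] = prod[d], 0
        for k in range(n):
            prod[d - n + k] = (prod[d - n + k] - c * f[k]) % p
    return prod[:n]

def hash_lm(key, msg, f, p):
    """h(b_1, ..., b_m) = sum_i a_i * b_i in Z_p[x]/<f>."""
    n = len(f) - 1
    acc = [0] * n
    for a_i, b_i in zip(key, msg):
        acc = [(x + y) % p for x, y in zip(acc, poly_mul_mod(a_i, b_i, f, p))]
    return acc

n, m, p = 4, 3, 13
f_circ = [-1] + [0] * (n - 1) + [1]        # f(x) = x^n - 1: the circulant case
rng = random.Random(1)
key = [[rng.randrange(p) for _ in range(n)] for _ in range(m)]

# Since u*x = u in Z[x]/<x^n - 1>, we get a_i * (c*u) = c * a_i(1) * u, so
# hashing 'constant' blocks reduces to the single equation sum_i c_i*a_i(1) mod p.
A1 = [sum(a) % p for a in key]             # the values a_i(1) mod p

seen, coll = {}, None
for c in itertools.product(range(3), repeat=m):   # 3^m = 27 > p: pigeonhole
    s = sum(ci * Ai for ci, Ai in zip(c, A1)) % p
    if s in seen:
        coll = (seen[s], c)
        break
    seen[s] = c

c1, c2 = coll
msg1 = [[ci] * n for ci in c1]             # block b_i = c_i * u, entries in {0,1,2}
msg2 = [[ci] * n for ci in c2]
print(msg1 != msg2,
      hash_lm(key, msg1, f_circ, p) == hash_lm(key, msg2, f_circ, p))
# prints: True True  (two distinct short messages with the same digest)
```

The same search against x^n + 1 would fail, because x^n + 1 has no root in Z and evaluation-at-a-point does not give a homomorphism onto Z_p of this convenient form.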
Digital signature schemes are among the most important cryptographic primitives. They can be obtained by using the one-way functions based on the worst-case hardness of lattice problems. However, such direct constructions are impractical. A number of new digital signature schemes based on learning with errors, ring learning with errors and trapdoor lattices have been developed since the learning with errors problem was applied in a cryptographic context.
Lyubashevsky and Micciancio gave a direct construction of digital signatures based on the complexity of approximating the shortest vector in ideal (e.g., cyclic) lattices. [8] Their scheme [8] has worst-case security guarantees based on ideal lattices and is the most asymptotically efficient construction known to date, yielding signature generation and verification algorithms that run in almost linear time. [12]
One of the main open problems raised by their work is constructing a one-time signature with similar efficiency but based on a weaker hardness assumption. For instance, it would be desirable to provide a one-time signature with security based on the hardness of approximating the Shortest Vector Problem (SVP) in ideal lattices to within smaller factors. [8]
Their construction is based on a standard transformation from one-time signatures (i.e. signatures that allow one to securely sign a single message) to general signature schemes, together with a novel construction of a lattice-based one-time signature whose security is ultimately based on the worst-case hardness of approximating the shortest vector in all lattices corresponding to ideals in the ring Z[x]/⟨f⟩ for any irreducible polynomial f of degree n.
Key-Generation Algorithm:
Input: the security parameter and an irreducible polynomial f of degree n.
Signing Algorithm:
Input: a message of bounded norm; the signing key
Output:
Verification Algorithm:
Input: message; signature; verification key
Output: “ACCEPT” if both verification conditions hold; “REJECT” otherwise.
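A heavily simplified sketch of this style of one-time signature (my own toy rendering, omitting the norm checks and parameter constraints that the actual scheme requires for security): the secret key is a pair of short vectors (k, l), the verification key is their hashes (h(k), h(l)), a signature of a short ring element z is s with s_i = k_i·z + l_i, and verification uses the linearity of the hash.

```python
import random

def poly_mul_mod(a, b, f, p):
    """Product a*b in Z_p[x]/<f>, f monic given as [f_0, ..., f_{n-1}, 1]."""
    n = len(f) - 1
    prod = [0] * (2 * n - 1)
    for i in range(n):
        for j in range(n):
            prod[i + j] = (prod[i + j] + a[i] * b[j]) % p
    for d in range(2 * n - 2, n - 1, -1):
        c, prod[d] = prod[d], 0
        for k in range(n):
            prod[d - n + k] = (prod[d - n + k] - c * f[k]) % p
    return prod[:n]

def hash_lm(key, msg, f, p):
    """h(b_1, ..., b_m) = sum_i a_i * b_i in Z_p[x]/<f>."""
    n = len(f) - 1
    acc = [0] * n
    for a_i, b_i in zip(key, msg):
        acc = [(x + y) % p for x, y in zip(acc, poly_mul_mod(a_i, b_i, f, p))]
    return acc

def sign(sk_k, sk_l, z, f, p):
    """Component-wise s_i = k_i * z + l_i in R = Z_p[x]/<f>."""
    return [[(x + y) % p for x, y in zip(poly_mul_mod(k_i, z, f, p), l_i)]
            for k_i, l_i in zip(sk_k, sk_l)]

def verify(key, vk, z, s, f, p):
    """Check h(s) == h(k)*z + h(l); a real scheme also checks s is short."""
    hk, hl = vk
    rhs = [(x + y) % p for x, y in zip(poly_mul_mod(hk, z, f, p), hl)]
    return hash_lm(key, s, f, p) == rhs

rng = random.Random(2)
n, m, p = 8, 3, 257
f = [1] + [0] * (n - 1) + [1]                       # f(x) = x^8 + 1
key = [[rng.randrange(p) for _ in range(n)] for _ in range(m)]
sk_k = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]  # short secrets
sk_l = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]
vk = (hash_lm(key, sk_k, f, p), hash_lm(key, sk_l, f, p))        # public key
z = [rng.randrange(2) for _ in range(n)]            # message as a short element
s = sign(sk_k, sk_l, z, f, p)
print(verify(key, vk, z, s, f, p))                  # True, by linearity of h
```

Correctness is immediate: h(s) = Σ a_i(k_i z + l_i) = h(k)·z + h(l). The omitted shortness check on s is what prevents trivial forgeries in the real scheme.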
The hash function is quite efficient and can be computed asymptotically in O(n log n) time using the Fast Fourier Transform (FFT) over the complex numbers. However, in practice, this carries a substantial overhead. The SWIFFT family of hash functions defined by Micciancio and Regev [12] is essentially a highly optimized variant of the hash function above using the FFT in Z_p. The vector f is set to (1, 0, …, 0) ∈ Z^n for n equal to a power of 2, so that the corresponding polynomial x^n + 1 is irreducible. Let p be a prime number such that 2n divides p − 1, and let W be an invertible matrix over Z_p to be chosen later. The SWIFFT hash function maps a key consisting of m vectors chosen uniformly from Z_p^n and an input y ∈ D^m to W·A·y mod p, where A is the structured block-matrix as before. Multiplication by the invertible matrix W maps a uniformly chosen key to a uniformly chosen key. Moreover, W·A·y = W·A·y′ (mod p) if and only if A·y = A·y′ (mod p). Together, these two facts establish that finding collisions in SWIFFT is equivalent to finding collisions in the underlying ideal lattice function, and the claimed collision resistance property of SWIFFT is supported by the connection to worst-case lattice problems on ideal lattices.
The algorithm of the SWIFFT hash function is:
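The FFT idea that SWIFFT exploits can be illustrated on a toy scale: over Z_p with p = 257 and n a power of 2 such that 2n divides p − 1, evaluating polynomials at the roots of x^n + 1 turns multiplication in Z_p[x]/⟨x^n + 1⟩ into a componentwise product. The sketch below uses a hypothetical toy size n = 8 (SWIFFT itself uses n = 64 with a fast transform rather than this naive O(n²) evaluation):

```python
import random

p, n = 257, 8                            # toy size; SWIFFT itself uses n = 64
psi = pow(3, (p - 1) // (2 * n), p)      # 3 generates Z_257^*; psi has order 2n
roots = [pow(psi, 2 * j + 1, p) for j in range(n)]   # roots of x^n + 1 mod p

def ntt(a):
    """Evaluate a(x) at every root of x^n + 1 (naive O(n^2) transform)."""
    return [sum(c * pow(r, i, p) for i, c in enumerate(a)) % p for r in roots]

def negacyclic_mul(a, b):
    """Schoolbook multiplication in Z_p[x]/<x^n + 1>, for comparison."""
    res = [0] * n
    for i in range(n):
        for j in range(n):
            k, sgn = (i + j) % n, 1 if i + j < n else -1
            res[k] = (res[k] + sgn * a[i] * b[j]) % p
    return res

rng = random.Random(3)
a = [rng.randrange(p) for _ in range(n)]
b = [rng.randrange(p) for _ in range(n)]
lhs = [x * y % p for x, y in zip(ntt(a), ntt(b))]    # componentwise product
rhs = ntt(negacyclic_mul(a, b))                      # transform of ring product
print(lhs == rhs)                                    # True: ntt is a ring map
```

Because each root r satisfies r^n = −1 mod p, evaluation respects the quotient by x^n + 1, so the transform is a ring homomorphism; SWIFFT's optimized implementation computes the same transform with a fast butterfly network.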
The learning with errors (LWE) problem has been shown to be as hard as worst-case lattice problems and has served as the foundation for many cryptographic applications. However, these applications are inefficient because of an inherent quadratic overhead in the use of LWE. To get truly efficient LWE applications, Lyubashevsky, Peikert and Regev [3] defined an appropriate version of the LWE problem in a wide class of rings and proved its hardness under worst-case assumptions on ideal lattices in these rings. They called their LWE version ring-LWE.
Let f(x) = x^n + 1 ∈ Z[x], where the security parameter n is a power of 2, making f(x) irreducible over the rationals. (This particular f(x) comes from the family of cyclotomic polynomials, which play a special role in this work.)
Let R = Z[x]/⟨f(x)⟩ be the ring of integer polynomials modulo f(x). Elements of R (i.e., residues modulo f(x)) are typically represented by integer polynomials of degree less than n. Let q be a sufficiently large public prime modulus (bounded by a polynomial in n), and let R_q = Z_q[x]/⟨f(x)⟩ be the ring of integer polynomials modulo both f(x) and q. Elements of R_q may be represented by polynomials of degree less than n whose coefficients are from {0, …, q − 1}.
In the above-described ring, the R-LWE problem may be described as follows. Let s ∈ R_q be a uniformly random ring element, which is kept secret. Analogously to standard LWE, the goal of the attacker is to distinguish arbitrarily many (independent) ‘random noisy ring equations’ from truly uniform ones. More specifically, the noisy equations are of the form (a, b ≈ a·s), where a is uniformly random in R_q and the product a·s is perturbed by some ‘small’ random error term, chosen from a certain distribution over R.
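A toy sampler for such noisy ring equations (my own sketch; the uniform-on-a-small-interval error and the parameters n = 16, q = 12289 are illustrative stand-ins for the Gaussian error distribution and parameter conditions of the actual problem):

```python
import random

def negacyclic_mul(a, b, p):
    """a*b in Z_p[x]/<x^n + 1> (schoolbook negacyclic convolution)."""
    n = len(a)
    res = [0] * n
    for i in range(n):
        for j in range(n):
            k, sgn = (i + j) % n, 1 if i + j < n else -1
            res[k] = (res[k] + sgn * a[i] * b[j]) % p
    return res

def rlwe_sample(s, p, rng):
    """One 'noisy ring equation' (a, b = a*s + e) in R_q = Z_p[x]/<x^n + 1>."""
    n = len(s)
    a = [rng.randrange(p) for _ in range(n)]          # uniformly random a
    e = [rng.randrange(-2, 3) % p for _ in range(n)]  # toy 'small' error term
    b = [(x + y) % p for x, y in zip(negacyclic_mul(a, s, p), e)]
    return a, b

rng = random.Random(4)
n, q = 16, 12289           # n a power of 2; 12289 = 12*1024 + 1 is a toy prime
s = [rng.randrange(-1, 2) % q for _ in range(n)]      # secret ring element
a, b = rlwe_sample(s, q, rng)
print(len(a), len(b))      # one sample: a pair of n-coefficient ring elements
```

Anyone holding s can subtract a·s from b and recover the small error, while (conjecturally) the pairs (a, b) look uniform without s; that gap is exactly what ring-LWE encryption exploits.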
They gave a quantum reduction from approximate SVP (in the worst case) on ideal lattices in R to the search version of ring-LWE, where the goal is to recover the secret s (with high probability) from arbitrarily many noisy products. This result follows the general outline of Regev's iterative quantum reduction for general lattices, [13] but ideal lattices introduce several new technical roadblocks in both the ‘algebraic’ and ‘geometric’ components of the reduction. They [3] used algebraic number theory, in particular, the canonical embedding of a number field and the Chinese Remainder Theorem to overcome these obstacles. They obtained the following theorem:
Theorem. Let K be an arbitrary number field of degree n. For an appropriate error rate and a sufficiently large (rational) integer modulus q (satisfying a condition relating it to the error rate and n), there is a probabilistic polynomial-time quantum reduction from worst-case approximate SVP on ideal lattices in K to search ring-LWE, with an approximation factor polynomial in n.
In 2013, Güneysu, Lyubashevsky, and Pöppelmann proposed a digital signature scheme based on the Ring Learning with Errors problem. [14] In 2014, Peikert presented a Ring Learning with Errors Key Exchange (RLWE-KEX) in his paper, "Lattice Cryptography for the Internet." [10] This was further developed by the work of Singh. [15]
Stehle, Steinfeld, Tanaka and Xagawa [16] defined a structured variant of the LWE problem (Ideal-LWE) to describe an efficient public key encryption scheme based on the worst-case hardness of approximate SVP in ideal lattices. This is the first CPA-secure public key encryption scheme whose security relies on the hardness of the worst-case instances of approximate Ideal-SVP against subexponential quantum attacks. It achieves asymptotically optimal efficiency: the public/private key length is quasi-linear in n bits, and the amortized encryption/decryption cost is quasi-constant bit operations per message bit (encrypting quasi-linearly many bits at once). The security assumption here is that approximate Ideal-SVP cannot be solved by any subexponential-time quantum algorithm. It is noteworthy that this is stronger than standard public key cryptography security assumptions. On the other hand, contrary to most of public key cryptography, lattice-based cryptography allows security against subexponential quantum attacks.
Most cryptosystems based on general lattices rely on the average-case hardness of Learning with errors (LWE). Their scheme is based on a structured variant of LWE that they call Ideal-LWE. They needed to introduce some techniques to circumvent two main difficulties that arise from the restriction to ideal lattices. Firstly, the previous cryptosystems based on unstructured lattices all make use of Regev's worst-case to average-case classical reduction from the Bounded Distance Decoding problem (BDD) to LWE (this is the classical step in the quantum reduction from SVP to LWE). This reduction exploits the lack of structure of the considered lattices, and does not seem to carry over to the structured lattices involved in Ideal-LWE. In particular, the probabilistic independence of the rows of the LWE matrices allows a single row to be considered in isolation. Secondly, the other ingredient used in previous cryptosystems, namely Regev's reduction from the computational variant of LWE to its decisional variant, also seems to fail for Ideal-LWE: it relies on the probabilistic independence of the columns of the LWE matrices.
To overcome these difficulties, they avoided the classical step of the reduction. Instead, they used the quantum step to construct a new quantum average-case reduction from SIS (the average-case collision-finding problem) to LWE. It also works from Ideal-SIS to Ideal-LWE. Combined with the reduction from worst-case Ideal-SVP to average-case Ideal-SIS, they obtained a quantum reduction from Ideal-SVP to Ideal-LWE. This shows the hardness of the computational variant of Ideal-LWE. Because they did not obtain the hardness of the decisional variant, they used a generic hardcore function to derive pseudorandom bits for encryption. This is why they needed to assume the exponential hardness of SVP.
A fully homomorphic encryption (FHE) scheme is one which allows for computation over encrypted data, without first needing to decrypt. The problem of constructing a fully homomorphic encryption scheme was first put forward by Rivest, Adleman and Dertouzos [17] in 1978, shortly after the invention of RSA by Rivest, Shamir and Adleman. [18]
An encryption scheme E is homomorphic for circuits in a class C if, for any circuit C ∈ C,
given the public key pk, ciphertexts c_i = Enc(pk, x_i), and c = Eval(pk, C, c_1, …, c_t),
it holds that Dec(sk, c) = C(x_1, …, x_t).
E is fully homomorphic if it is homomorphic for all circuits of size polynomial in the scheme's security parameter.
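The defining equation Dec(sk, Eval(C, Enc(x))) = C(x) can be illustrated with a deliberately trivial scheme that is homomorphic only for addition (a one-time pad modulo q; purely illustrative, not secure under key reuse and not fully homomorphic):

```python
import random

q = 2 ** 16
rng = random.Random(5)

def keygen(t):
    """One fresh pad per ciphertext (one-time keys)."""
    return [rng.randrange(q) for _ in range(t)]

def enc(keys, xs):
    """Enc(k, x) = x + k mod q, applied component-wise."""
    return [(x + k) % q for x, k in zip(xs, keys)]

def eval_add(cts):
    """Eval for the circuit C(x_1, ..., x_t) = x_1 + ... + x_t."""
    return sum(cts) % q

def dec_sum(keys, ct):
    """Decrypt the evaluated ciphertext by removing the combined pad."""
    return (ct - sum(keys)) % q

xs = [11, 22, 33]
keys = keygen(len(xs))
cts = enc(keys, xs)
print(dec_sum(keys, eval_add(cts)) == sum(xs) % q)   # True: Dec(Eval(C, Enc(x))) = C(x)
```

A fully homomorphic scheme must support multiplication as well as addition, i.e. arbitrary circuits, which is what makes the problem hard and what Gentry's construction below achieves.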
In 2009, Gentry [19] proposed the first solution to the problem of constructing a fully homomorphic encryption scheme. His scheme was based on ideal lattices.
In mathematics, rings are algebraic structures that generalize fields: multiplication need not be commutative and multiplicative inverses need not exist. Informally, a ring is a set equipped with two binary operations satisfying properties analogous to those of addition and multiplication of integers. Ring elements may be numbers such as integers or complex numbers, but they may also be non-numerical objects such as polynomials, square matrices, functions, and power series.
A commitment scheme is a cryptographic primitive that allows one to commit to a chosen value while keeping it hidden to others, with the ability to reveal the committed value later. Commitment schemes are designed so that a party cannot change the value or statement after they have committed to it: that is, commitment schemes are binding. Commitment schemes have important applications in a number of cryptographic protocols including secure coin flipping, zero-knowledge proofs, and secure computation.
The NTRUEncrypt public key cryptosystem, also known as the NTRU encryption algorithm, is an NTRU lattice-based alternative to RSA and elliptic curve cryptography (ECC) and is based on the shortest vector problem in a lattice.
Lattice-based cryptography is the generic term for constructions of cryptographic primitives that involve lattices, either in the construction itself or in the security proof. Lattice-based constructions support important standards of post-quantum cryptography. Unlike more widely used and known public-key schemes such as the RSA, Diffie-Hellman or elliptic-curve cryptosystems — which could, theoretically, be defeated using Shor's algorithm on a quantum computer — some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.
In computer science, lattice problems are a class of optimization problems related to mathematical objects called lattices. The conjectured intractability of such problems is central to the construction of secure lattice-based cryptosystems: lattice problems are an example of NP-hard problems which have been shown to be average-case hard, providing a test case for the security of cryptographic algorithms. In addition, some lattice problems which are worst-case hard can be used as a basis for extremely secure cryptographic schemes. The use of worst-case hardness in such schemes makes them among the very few schemes that are very likely secure even against quantum computers. For applications in such cryptosystems, lattices over vector spaces or free modules are generally considered.
In cryptography, learning with errors (LWE) is a mathematical problem that is widely used to create secure encryption algorithms. It is based on the idea of representing secret information as a set of equations with errors. In other words, LWE is a way to hide the value of a secret by introducing noise to it. In more technical terms, it refers to the computational problem of inferring a linear n-ary function f over a finite ring from given samples, some of which may be erroneous. The LWE problem is conjectured to be hard to solve, and thus to be useful in cryptography.
In cryptography, SWIFFT is a collection of provably secure hash functions. It is based on the concept of the fast Fourier transform (FFT). SWIFFT is not the first hash function based on FFT, but it sets itself apart by providing a mathematical proof of its security. It also uses the LLL basis reduction algorithm. It can be shown that finding collisions in SWIFFT is at least as difficult as finding short vectors in cyclic/ideal lattices in the worst case. By giving a security reduction to the worst-case scenario of a difficult mathematical problem, SWIFFT gives a much stronger security guarantee than most other cryptographic hash functions.
Post-quantum cryptography (PQC), sometimes referred to as quantum-proof, quantum-safe, or quantum-resistant, is the development of cryptographic algorithms that are thought to be secure against a cryptanalytic attack by a quantum computer. The problem with popular algorithms currently used in the market is that their security relies on one of three hard mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem. All of these problems could be easily solved on a sufficiently powerful quantum computer running Shor's algorithm or even faster and less demanding alternatives.
In mathematics, an algebraic number field is an extension field K of the field of rational numbers Q such that the field extension K/Q has finite degree. Thus K is a field that contains Q and has finite dimension when considered as a vector space over Q.
Network coding has been shown to use bandwidth optimally in a network, maximizing information flow, but the scheme is inherently vulnerable to pollution attacks by malicious nodes in the network. A node injecting garbage can quickly affect many receivers. The pollution of network packets spreads quickly, since the output of an honest node is corrupted if at least one of the incoming packets is corrupted.
Digital signatures are a means to protect digital information from intentional modification and to authenticate the source of digital information. Public key cryptography provides a rich set of different cryptographic algorithms that create digital signatures. However, the primary public key signatures currently in use will become completely insecure if scientists are ever able to build a moderately sized quantum computer. Post-quantum cryptography is a class of cryptographic algorithms designed to be resistant to attack by a quantum computer. Several post-quantum digital signature algorithms based on hard problems in lattices are being created to replace the commonly used RSA and elliptic curve signatures. A subset of these lattice-based schemes are based on a problem known as Ring learning with errors. Ring learning with errors based digital signatures are among the post-quantum signatures with the smallest public key and signature sizes.
In post-quantum cryptography, ring learning with errors (RLWE) is a computational problem which serves as the foundation of new cryptographic algorithms, such as NewHope, designed to protect against cryptanalysis by quantum computers and also to provide the basis for homomorphic encryption. Public-key cryptography relies on construction of mathematical problems that are believed to be hard to solve if no further information is available, but are easy to solve if some information used in the problem construction is known. Some problems of this sort that are currently used in cryptography are at risk of attack if sufficiently large quantum computers can ever be built, so resistant problems are sought. Homomorphic encryption is a form of encryption that allows computation on ciphertext, such as arithmetic on numeric values stored in an encrypted database.
In cryptography, a public key exchange algorithm is a cryptographic algorithm which allows two parties to create and share a secret key, which they can use to encrypt messages between themselves. The ring learning with errors key exchange (RLWE-KEX) is one of a new class of public key exchange algorithms that are designed to be secure against an adversary that possesses a quantum computer. This is important because some public key algorithms in use today will be easily broken by a quantum computer if such computers are implemented. RLWE-KEX is one of a set of post-quantum cryptographic algorithms which are based on the difficulty of solving certain mathematical problems involving lattices. Unlike older lattice based cryptographic algorithms, the RLWE-KEX is provably reducible to a known hard problem in lattices.
Short integer solution (SIS) and ring-SIS problems are two average-case problems that are used in lattice-based cryptography constructions. Lattice-based cryptography began in 1996 with a seminal work by Miklós Ajtai, who presented a family of one-way functions based on the SIS problem. He showed that it is secure in an average case if the shortest vector problem (approximated to within a factor of n^c for some constant c) is hard in a worst-case scenario.
HEAAN is an open source homomorphic encryption (HE) library which implements an approximate HE scheme proposed by Cheon, Kim, Kim and Song (CKKS). The first version of HEAAN was published on GitHub on 15 May 2016, and later a new version of HEAAN with a bootstrapping algorithm was released. Currently, the latest version is Version 2.1.