| General | |
| --- | --- |
| Designers | Daniel R. L. Brown, Matt Campagna, Rene Struik |
| First published | 2008 |
| Derived from | MuHASH |
| Digest sizes | 224, 256, 384 or 512 bits |
| Best public cryptanalysis | Second pre-image attack |
The elliptic curve only hash (ECOH) algorithm was submitted as a candidate for SHA-3 in the NIST hash function competition. However, it was rejected early in the competition after a second pre-image attack was found.
The NIST hash function competition was an open competition held by the US National Institute of Standards and Technology (NIST) to develop a new hash function, SHA-3, to complement the older SHA-1 and SHA-2. The competition was formally announced in the Federal Register on November 2, 2007: "NIST is initiating an effort to develop one or more additional hash algorithms through a public competition, similar to the development process for the Advanced Encryption Standard (AES)." The competition ended on October 2, 2012, when NIST announced that Keccak would be the new SHA-3 hash algorithm.
In cryptography, a preimage attack on cryptographic hash functions tries to find a message that has a specific hash value. A cryptographic hash function should resist attacks on its preimage.
ECOH is based on the MuHASH hash algorithm, which has not yet been successfully attacked. However, MuHASH is too inefficient for practical use, so changes had to be made. The main difference is that where MuHASH applies a random oracle, ECOH applies a padding function. Assuming random oracles, finding a collision in MuHASH implies solving the discrete logarithm problem. MuHASH is thus a provably secure hash, i.e. finding a collision is known to be at least as hard as solving some known hard mathematical problem.
In cryptography, a random oracle is an oracle that responds to every unique query with a (truly) random response chosen uniformly from its output domain. If a query is repeated it responds the same way every time that query is submitted.
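The two defining properties (a fresh uniform output for each new query, the identical output when a query is repeated) can be made concrete with a minimal Python sketch that lazily samples a lookup table; the class name and output size here are illustrative choices, not part of any standard.

```python
import os

class RandomOracle:
    """Lazily sampled random function: a fresh uniform value for each new
    query, and the identical value whenever a query is repeated."""
    def __init__(self, out_bytes: int = 32):
        self.out_bytes = out_bytes
        self.table = {}

    def query(self, message: bytes) -> bytes:
        if message not in self.table:
            self.table[message] = os.urandom(self.out_bytes)  # uniform sample
        return self.table[message]

oracle = RandomOracle()
assert oracle.query(b"abc") == oracle.query(b"abc")  # repeated queries agree
```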
In cryptography, padding refers to a number of distinct practices which all include adding data to the beginning, middle, or end of a message prior to encryption. In classical cryptography, padding may include adding nonsense phrases to a message to obscure the fact that many messages end in predictable ways, e.g. "sincerely yours".
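Putting the two notions above together, the MuHASH idea itself fits in a short sketch: hash each (index, block) pair into a group and multiply the results, so replacing one block only requires dividing out the old factor and multiplying in the new one. The code below is a toy under stated assumptions: SHA-256 stands in for the random oracle, and multiplication modulo a small prime stands in for a group where the discrete logarithm is hard; all names and constants are illustrative.

```python
import hashlib

# Toy modulus standing in for a group with a hard discrete logarithm;
# real MuHASH uses a far larger, carefully chosen group.
P = (1 << 127) - 1  # illustrative Mersenne prime

def h(index: int, block: bytes) -> int:
    """Stand-in for the random oracle, mapping (index, block) into [1, P-1]."""
    d = hashlib.sha256(index.to_bytes(8, "big") + block).digest()
    return int.from_bytes(d, "big") % (P - 1) + 1

def muhash(message: bytes, block_size: int = 16) -> int:
    blocks = [message[i:i + block_size] for i in range(0, len(message), block_size)]
    acc = 1
    for i, block in enumerate(blocks):
        acc = acc * h(i, block) % P  # multiplicative combination of all blocks
    return acc

# Incrementality: replacing block i means multiplying the old digest by
# inverse(h(i, old_block)) * h(i, new_block) mod P, with no full rehash.
```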
ECOH does not use random oracles, and its security is not strictly directly related to the discrete logarithm problem, yet it is still based on mathematical functions. ECOH is related to Semaev's problem of finding low degree solutions to the summation polynomial equations over a binary field, called the Summation Polynomial Problem. No efficient algorithm to solve this problem has been given so far. Although the problem has not been proven to be NP-hard, it is assumed that no such algorithm exists. Under certain assumptions, finding a collision in ECOH may also be viewed as an instance of the subset sum problem. Besides solving the Summation Polynomial Problem, there is another way to find second pre-images and thus collisions: Wagner's generalized birthday attack.
In computer science, the subset sum problem is an important decision problem in complexity theory and cryptography. There are several equivalent formulations of the problem. One of them is: given a set of integers, is there a non-empty subset whose sum is zero? For example, given the set {−7, −3, −2, 5, 8}, the answer is yes because the subset {−3, −2, 5} sums to zero. The problem is NP-complete, meaning roughly that while it is easy to confirm whether a proposed solution is valid, it may inherently be prohibitively difficult to determine in the first place whether any solution exists.
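A brute-force decision procedure makes this asymmetry concrete: verifying a candidate subset is a single sum, while searching all subsets takes exponential time. A minimal Python sketch, using the example set from above:

```python
from itertools import combinations

def zero_sum_subset(values):
    """Brute-force search for a non-empty subset summing to zero.
    Checking one candidate is trivial; trying all 2^n - 1 of them is not."""
    for r in range(1, len(values) + 1):
        for subset in combinations(values, r):
            if sum(subset) == 0:
                return subset
    return None

print(zero_sum_subset([-7, -3, -2, 5, 8]))  # -> (-3, -2, 5)
```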
ECOH is a good example of a hash function based on mathematical functions (following the provable security approach) rather than on classical ad hoc mixing of bits to obtain the hash.
Given a message $M$, ECOH divides it into blocks $M_0, \ldots, M_{n-1}$. If the last block is incomplete, it is padded with a single 1 and then an appropriate number of 0s. Let furthermore $P$ be a function that maps a message block and an integer to an elliptic curve point. Then, using the mapping $P$, each block is transformed to an elliptic curve point $P_i = P(M_i, i)$, and these points are added together with two more points. One additional point, $X_1$, contains the padding and depends only on the message length. The second additional point, $X_2$, depends on the message length and the XOR of all message blocks. The result is truncated to get the hash $H$.
In mathematics, an elliptic curve is a plane algebraic curve defined by an equation of the form $y^2 = x^3 + ax + b$.
In mathematics and computer science, truncation is limiting the number of digits right of the decimal point.
The two extra points $X_1$ and $X_2$ are computed from the message length and the XOR checksum of the blocks. The point $Q = \sum_i P_i + X_1 + X_2$ adds all the elliptic curve points and the two extra points together. Finally, $Q$ is passed through an output transformation function $f$ to get the hash result $H = f(Q)$. To read more about this algorithm, see "ECOH: the Elliptic Curve Only Hash".
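To make the data flow of this construction concrete, here is a toy sketch in which integers modulo a prime stand in for elliptic curve points and modular addition stands in for point addition. Every name, size, and encoding below (point_map, the exact inputs to X1 and X2, the truncation) is an illustrative assumption, deliberately not the real ECOH specification.

```python
import hashlib

N = (1 << 61) - 1   # toy "group order"; modular addition replaces point addition
BLOCK = 8           # bytes per message block, illustrative

def point_map(data: bytes, index: int) -> int:
    """Stand-in for the function P mapping (block, integer) to a curve point."""
    d = hashlib.sha256(data + index.to_bytes(8, "big")).digest()
    return int.from_bytes(d, "big") % N

def ecoh_like(message: bytes, out_bits: int = 32) -> int:
    # Split into blocks and pad the last one with a single 1 bit, then 0s.
    blocks = [message[i:i + BLOCK] for i in range(0, len(message), BLOCK)]
    if not blocks or len(blocks[-1]) == BLOCK:
        blocks.append(b"")
    blocks[-1] += b"\x80" + b"\x00" * (BLOCK - len(blocks[-1]) - 1)

    checksum = 0                      # XOR of all message blocks
    for b in blocks:
        checksum ^= int.from_bytes(b, "big")

    q = sum(point_map(b, i) for i, b in enumerate(blocks)) % N
    length = len(message).to_bytes(8, "big")
    q = (q + point_map(length, len(blocks))) % N                 # X1: length only
    q = (q + point_map(length + checksum.to_bytes(BLOCK, "big"),
                       len(blocks) + 1)) % N                     # X2: length + XOR
    return q & ((1 << out_bits) - 1)  # truncate to the digest size
```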
Four ECOH algorithms were proposed: ECOH-224, ECOH-256, ECOH-384 and ECOH-512. The number represents the size of the message digest. They differ in the length of parameters, the block size, and the elliptic curve used. The first two use the elliptic curve B-283, over the binary field defined by $x^{283} + x^{12} + x^7 + x^5 + 1$, with parameters (128, 64, 64). ECOH-384 uses the curve B-409, over the field defined by $x^{409} + x^{87} + 1$, with parameters (192, 64, 64). ECOH-512 uses the curve B-571, over the field defined by $x^{571} + x^{10} + x^5 + x^2 + 1$, with parameters (256, 128, 128). ECOH can hash messages of bit length up to $2^{128}$.
The ECOH hash functions are based on concrete mathematical functions. They were designed such that the problem of finding collisions should be reducible to a known hard mathematical problem (the subset sum problem). This means that if one could find collisions, one would be able to solve the underlying mathematical problem, which is assumed to be hard and unsolvable in polynomial time. Functions with these properties are known as provably secure and are quite rare among hash functions. Nevertheless, a second pre-image (and thus a collision) was later found, because the assumptions made in the proof were too strong.
One way of finding collisions or second pre-images is solving the Semaev summation polynomials. For a given elliptic curve $E$, there exist polynomials $f_n(x_1, \ldots, x_n)$ that are symmetric in the variables and that vanish exactly when evaluated at abscissae of points whose sum is $0$ in $E$. So far, no efficient algorithm to solve this problem exists, and it is assumed to be hard (but not proven to be NP-hard).
More formally: let $F$ be a finite field, $E$ be an elliptic curve with Weierstrass equation having coefficients in $F$, and $O$ be the point at infinity. It is known that there exists a multivariable polynomial $f_n(x_1, \ldots, x_n)$ that vanishes at $(x_1, \ldots, x_n)$ if and only if there exist $y_1, \ldots, y_n$ such that $(x_1, y_1) + \cdots + (x_n, y_n) = O$ on $E$. This polynomial has degree $2^{n-2}$ in each variable. The Summation Polynomial Problem is to find low degree solutions of these polynomial equations.
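For concreteness, the first cases can be written out. For a curve in short Weierstrass form $y^2 = x^3 + Ax + B$ over a field of characteristic not 2 or 3 (ECOH itself uses binary fields, where the formulas differ), the second and third summation polynomials are

$$f_2(x_1, x_2) = x_1 - x_2,$$
$$f_3(x_1, x_2, x_3) = (x_1 - x_2)^2 x_3^2 - 2\bigl((x_1 + x_2)(x_1 x_2 + A) + 2B\bigr) x_3 + (x_1 x_2 - A)^2 - 4B(x_1 + x_2),$$

and $f_3$ indeed has degree $2^{3-2} = 2$ in each variable, matching the general bound.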
The problem of finding collisions in ECOH is similar to the subset sum problem. Solving a subset sum problem is almost as hard as the discrete logarithm problem, and it is generally assumed that this is not doable in polynomial time. However, a significantly loose heuristic must be assumed; more specifically, one of the parameters involved in the computation is not necessarily random but has a particular structure. If one adopts this loose heuristic, then finding an internal ECOH collision may be viewed as an instance of the subset sum problem.
A second pre-image attack exists in the form of a generalized birthday attack.
Description of the attack: this is Wagner's generalized birthday attack. It requires $2^{143}$ time for ECOH-224 and ECOH-256, $2^{206}$ time for ECOH-384, and $2^{287}$ time for ECOH-512. The attack sets the checksum block to a fixed value and uses a collision search on the elliptic curve points. For this attack we have a message $M$ and try to find an $M'$ that hashes to the same value. We first fix the message length and split $M'$ into six blocks, so $M' = M_0' \| M_1' \| M_2' \| M_3' \| M_4' \| M_5'$. Let $K$ be a natural number. We choose $K$ different values for $(M_0', M_1')$ and define $M_2'$ by $M_2' = M_0' \oplus M_1'$. We compute the $K$ corresponding sums of elliptic curve points $P(M_0', 0) + P(M_1', 1) + P(M_2', 2)$ and store them in a list. We then choose $K$ different random values for $(M_3', M_4')$, define $M_5' = M_3' \oplus M_4'$, compute $Q - P(M_3', 3) - P(M_4', 4) - P(M_5', 5) - X_1 - X_2$, and store the results in a second list. Note that the target $Q$ is known. $X_1$ only depends on the length of the message, which we have fixed. $X_2$ depends on the length and the XOR of all message blocks, but we choose the message blocks such that this XOR is always zero. Thus, $X_2$ is fixed for all our tries.
If $K$ is larger than the square root of the number of points on the elliptic curve, then we expect one collision between the two lists. This gives us a message $M'$ with $P(M_0', 0) + P(M_1', 1) + P(M_2', 2) + P(M_3', 3) + P(M_4', 4) + P(M_5', 5) + X_1 + X_2 = Q$. This means that $M'$ leads to the target value $Q$ and is thus a second pre-image, which was the goal. The workload here is $2K$ partial hash computations. For more information, see "A Second Pre-image Attack Against Elliptic Curve Only Hash (ECOH)".
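The two-list step is an ordinary meet-in-the-middle search, sketched below over the same kind of toy additive group used in the construction sketch above (modular addition in place of point addition). The list size, block width, and helper names are illustrative assumptions; in the real attack $K$ is chosen near the square root of the group order, which is exactly what makes the stated workloads so large.

```python
import hashlib, os

N = (1 << 61) - 1  # toy group: integers mod N under addition

def point_map(data: bytes, index: int) -> int:
    d = hashlib.sha256(data + index.to_bytes(8, "big")).digest()
    return int.from_bytes(d, "big") % N

def second_preimage_search(target_q: int, k: int = 1 << 12):
    """Two-list collision search for six blocks whose mapped points sum to
    target_q, with the XOR of all six blocks forced to zero. Here target_q
    plays the role of Q - X1 - X2, both fixed by the attack setup. A match
    is only expected once k approaches sqrt(N); small k keeps the demo fast."""
    left = {}
    for _ in range(k):
        m0, m1 = os.urandom(8), os.urandom(8)
        m2 = bytes(a ^ b for a, b in zip(m0, m1))     # m0 ^ m1 ^ m2 == 0
        s = (point_map(m0, 0) + point_map(m1, 1) + point_map(m2, 2)) % N
        left[s] = (m0, m1, m2)
    for _ in range(k):
        m3, m4 = os.urandom(8), os.urandom(8)
        m5 = bytes(a ^ b for a, b in zip(m3, m4))     # m3 ^ m4 ^ m5 == 0
        s = (target_q - point_map(m3, 3) - point_map(m4, 4)
             - point_map(m5, 5)) % N
        if s in left:                                  # the two halves meet
            return left[s] + (m3, m4, m5)
    return None
```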
The official comments on ECOH included a proposal called ECOH2 that doubles the elliptic curve size, in an effort to stop the Halcrow–Ferguson second pre-image attack, with a prediction of improved or similar performance.
In mathematics, a plane real algebraic curve is the set of points on the Euclidean plane whose coordinates are zeros of some polynomial in two variables. More generally an algebraic curve is similar but may be embedded in a higher dimensional space or defined over some more general field.
In computer science, a one-way function is a function that is easy to compute on every input, but hard to invert given the image of a random input. Here, "easy" and "hard" are to be understood in the sense of computational complexity theory, specifically the theory of polynomial time problems. Not being one-to-one is not considered sufficient for a function to be called one-way.
In number theory, the local zeta function of a variety $V$ over a finite field with $q$ elements is defined as $Z(V, s) = \exp\left(\sum_{m=1}^{\infty} \frac{N_m}{m} (q^{-s})^m\right)$, where $N_m$ is the number of points of $V$ over the degree-$m$ extension of the base field.
In cryptography, the Elliptic Curve Digital Signature Algorithm (ECDSA) offers a variant of the Digital Signature Algorithm (DSA) which uses elliptic curve cryptography.
Hyperelliptic curve cryptography is similar to elliptic curve cryptography (ECC) insofar as the Jacobian of a hyperelliptic curve is an abelian group in which to do arithmetic, just as we use the group of points on an elliptic curve in ECC.
Collision resistance is a property of cryptographic hash functions: a hash function H is collision resistant if it is hard to find two inputs that hash to the same output; that is, two inputs a and b such that H(a) = H(b), and a ≠ b.
In cryptography, a one-way compression function is a function that transforms two fixed-length inputs into a fixed-length output. The transformation is "one-way", meaning that it is difficult given a particular output to compute inputs which compress to that output. One-way compression functions are not related to conventional data compression algorithms, which instead can be inverted exactly or approximately to the original data.
In mathematics and computing, universal hashing refers to selecting a hash function at random from a family of hash functions with a certain mathematical property. This guarantees a low number of collisions in expectation, even if the data is chosen by an adversary. Many universal families are known, and their evaluation is often very efficient. Universal hashing has numerous uses in computer science, for example in implementations of hash tables, randomized algorithms, and cryptography.
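A classic example is the Carter–Wegman family $h_{a,b}(x) = ((ax + b) \bmod p) \bmod m$, which keeps the collision probability of any fixed pair of distinct keys near $1/m$ over the random draw of $(a, b)$. A minimal Python sketch, with an illustrative prime:

```python
import random

P = (1 << 61) - 1  # prime larger than any key to be hashed, illustrative

def make_universal_hash(m: int):
    """Draw h(x) = ((a*x + b) mod P) mod m at random from the family.
    For distinct keys x != y, Pr[h(x) == h(y)] is roughly 1/m."""
    a = random.randrange(1, P)   # a != 0
    b = random.randrange(0, P)
    return lambda x: (a * x + b) % P % m

h = make_universal_hash(1024)    # one random function with 1024 buckets
print(h(42), h(42), h(43))       # deterministic per draw, random across draws
```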
In cryptography, the Merkle–Damgård construction or Merkle–Damgård hash function is a method of building collision-resistant cryptographic hash functions from collision-resistant one-way compression functions. This construction was used in the design of many popular hash algorithms such as MD5, SHA-1 and SHA-2.
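The construction itself is short enough to sketch: chain a compression function over length-padded blocks (the length padding is the so-called Merkle–Damgård strengthening). Below, a toy compression function is built from SHA-256 purely for illustration (a real design supplies its own primitive), and block and state sizes are illustrative.

```python
import hashlib

BLOCK = 64  # bytes per message block, illustrative

def compress(state: bytes, block: bytes) -> bytes:
    """Toy one-way compression function: 32-byte state + 64-byte block -> 32 bytes."""
    return hashlib.sha256(state + block).digest()

def md_hash(message: bytes, iv: bytes = b"\x00" * 32) -> bytes:
    # Merkle-Damgard strengthening: pad with 0x80, zeros, then the 8-byte
    # message length, so messages of different lengths cannot collide trivially.
    padded = message + b"\x80"
    padded += b"\x00" * (-(len(padded) + 8) % BLOCK)
    padded += len(message).to_bytes(8, "big")
    state = iv
    for i in range(0, len(padded), BLOCK):
        state = compress(state, padded[i:i + BLOCK])  # chain state through blocks
    return state
```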
The Rabin fingerprinting scheme is a method for implementing fingerprints using polynomials over a finite field. It was proposed by Michael O. Rabin.
An important aspect in the study of elliptic curves is devising effective ways of counting points on the curve. There have been several approaches to do so, and the algorithms devised have proved to be useful tools in the study of various fields such as number theory, and more recently in cryptography and digital signature authentication. While in number theory they have important consequences in the solving of Diophantine equations, with respect to cryptography, they enable us to make effective use of the difficulty of the discrete logarithm problem (DLP) for the group $E(\mathbb{F}_q)$ of points on an elliptic curve over a finite field $\mathbb{F}_q$, where $q = p^k$ and $p$ is a prime. The DLP, as it has come to be known, is a widely used approach to public key cryptography, and the difficulty in solving this problem determines the level of security of the cryptosystem. Algorithms for counting points on curves over fields of large characteristic, in particular $p > 3$, differ from those for fields of small characteristic, where more efficient algorithms based on $p$-adic methods exist.
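For intuition, the naive approach simply tests every x-coordinate; it takes time linear in $p$ (exponential in $\log p$) and is exactly what Schoof-style algorithms improve upon. A small Python sketch over an illustrative toy curve:

```python
def count_points(a: int, b: int, p: int) -> int:
    """Naively count points on y^2 = x^3 + a*x + b over F_p (p an odd prime),
    including the point at infinity. Demo only: O(p) work."""
    roots = {}                      # map t -> list of y with y^2 = t (mod p)
    for y in range(p):
        roots.setdefault(y * y % p, []).append(y)
    total = 1                       # count the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        total += len(roots.get(rhs, []))
    return total

# Hasse's bound promises |count - (p + 1)| <= 2*sqrt(p):
print(count_points(2, 3, 97))
```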
In cryptography, Very Smooth Hash (VSH) is a provably secure cryptographic hash function invented in 2005 by Scott Contini, Arjen Lenstra and Ron Steinfeld. Provably secure means that finding collisions is as difficult as some known hard mathematical problem. Unlike other provably secure collision-resistant hashes, VSH is efficient and usable in practice. Asymptotically, it only requires a single multiplication per log(n) message-bits and uses RSA-type arithmetic. Therefore, VSH can be useful in embedded environments where code space is limited.
In cryptography, the fast syndrome-based hash functions (FSB) are a family of cryptographic hash functions introduced in 2003 by Daniel Augot, Matthieu Finiasz, and Nicolas Sendrier. Unlike most other cryptographic hash functions in use today, FSB can to a certain extent be proven to be secure. More exactly, it can be proven that breaking FSB is at least as difficult as solving a certain NP-complete problem known as regular syndrome decoding so FSB is provably secure. Though it is not known whether NP-complete problems are solvable in polynomial time, it is often assumed that they are not.
In cryptography, cryptographic hash functions can be divided into two main categories. In the first category are those functions whose designs are based on a mathematical problem and whose security thus follows from rigorous mathematical proofs, complexity theory and formal reduction. These functions are called provably secure cryptographic hash functions. However, this does not mean that such a function could not be broken. Constructing them is very difficult, and only a few examples have been introduced. Their practical use is limited.
In cryptography, SWIFFT is a collection of provably secure hash functions. It is based on the concept of the fast Fourier transform (FFT). SWIFFT is not the first hash function based on FFT, but it sets itself apart by providing a mathematical proof of its security. It also uses the LLL basis reduction algorithm. It can be shown that finding collisions in SWIFFT is at least as difficult as finding short vectors in cyclic/ideal lattices in the worst case. By giving a security reduction to the worst-case scenario of a difficult mathematical problem, SWIFFT gives a much stronger security guarantee than most other cryptographic hash functions.
In mathematics elliptic curve primality testing techniques, or elliptic curve primality proving (ECPP), are among the quickest and most widely used methods in primality proving. It is an idea put forward by Shafi Goldwasser and Joe Kilian in 1986 and turned into an algorithm by A. O. L. Atkin the same year. The algorithm was altered and improved by several collaborators subsequently, and notably by Atkin and François Morain, in 1993. The concept of using elliptic curves in factorization had been developed by H. W. Lenstra in 1985, and the implications for its use in primality testing followed quickly.
Ideal lattices are a special class of lattices and a generalization of cyclic lattices. Ideal lattices naturally occur in many parts of number theory, but also in other areas. In particular, they have a significant place in cryptography. Micciancio defined a generalization of cyclic lattices as ideal lattices. They can be used in cryptosystems to decrease by a square root the number of parameters necessary to describe a lattice, making them more efficient. Ideal lattices are a new concept, but similar lattice classes have been used for a long time. For example cyclic lattices, a special case of ideal lattices, are used in NTRUEncrypt and NTRUSign.
Network coding has been shown to optimally use bandwidth in a network, maximizing information flow, but the scheme is inherently vulnerable to pollution attacks by malicious nodes in the network. A node injecting garbage can quickly affect many receivers. The pollution of network packets spreads quickly since the output of an honest node is corrupted if at least one of the incoming packets is corrupted. An attacker can easily corrupt a packet even if it is encrypted, by either forging the signature or by producing a collision under the hash function. This gives an attacker access to the packets and the ability to corrupt them. Denis Charles, Kamal Jain and Kristin Lauter designed a new homomorphic signature scheme for use with network coding to prevent pollution attacks. The homomorphic property of the signatures allows nodes to sign any linear combination of the incoming packets without contacting the signing authority. In this scheme it is computationally infeasible for a node to sign a linear combination of the packets without disclosing what linear combination was used in the generation of the packet. Furthermore, the signature scheme can be proven secure under well known cryptographic assumptions: the hardness of the discrete logarithm problem and the computational elliptic curve Diffie–Hellman problem.