Verifiable random function

In cryptography, a verifiable random function (VRF) is a public-key pseudorandom function that provides proofs that its outputs were calculated correctly. The owner of the secret key can compute the function value as well as an associated proof for any input value. Everyone else, using the proof and the associated public key (or verification key [1]), can check that this value was indeed calculated correctly, yet this information cannot be used to find the secret key. [2]
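This three-part interface (secret evaluation, proof, public verification) can be made concrete with a toy sketch in the spirit of the RSA-FDH-VRF described in the technical report by Goldberg et al. [2]: the proof is an RSA-style root of the hashed input, and the output is a hash of that proof. All parameters here are illustrative and far too small to be secure.

```python
import hashlib

# Toy RSA parameters (illustrative only; real deployments use ~2048-bit moduli).
p, q = 1009, 1013
n = p * q
phi = (p - 1) * (q - 1)
e = 65537                 # public verification exponent
d = pow(e, -1, phi)       # secret proving exponent

def h(x: bytes) -> int:
    # Stand-in for a full-domain hash into Z_n.
    return int.from_bytes(hashlib.sha256(x).digest(), "big") % n

def prove(x: bytes):
    # Secret-key holder computes the proof pi and the output beta = hash(pi).
    pi = pow(h(x), d, n)
    beta = hashlib.sha256(pi.to_bytes(32, "big")).hexdigest()
    return beta, pi

def verify(x: bytes, beta: str, pi: int) -> bool:
    # Anyone holding (n, e) can check the proof and recompute the output.
    return (pow(pi, e, n) == h(x)
            and beta == hashlib.sha256(pi.to_bytes(32, "big")).hexdigest())
```

Note that the secret exponent d never leaves the prover: verification uses only the public pair (n, e) and the proof.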

A verifiable random function can be viewed as a public-key analogue of a keyed cryptographic hash [2] and as a cryptographic commitment to an exponentially large number of seemingly random bits. [3] The concept of a verifiable random function is closely related to that of a verifiable unpredictable function (VUF), whose outputs are hard to predict but do not necessarily seem random. [3] [4]

The concept of a VRF was introduced by Micali, Rabin, and Vadhan in 1999. [4] [5] Since then, verifiable random functions have found widespread use in cryptocurrencies, as well as in proposals for protocol design and cybersecurity.

Constructions

In 1999, Micali, Rabin, and Vadhan introduced the concept of a VRF and proposed the first such one. [4] The original construction was rather inefficient: it first produces a verifiable unpredictable function, then uses a hard-core bit to transform it into a VRF; moreover, the inputs have to be mapped to primes in a complicated manner: namely, by using a prime sequence generator that generates primes with overwhelming probability using a probabilistic primality test. [3] [4] The verifiable unpredictable function thus proposed, which is provably secure if a variant of the RSA problem is hard, is defined as follows: The public key PK is (m, r, Q, coins), where m is the product of two random primes, r is a number randomly selected from Z*_m, coins is a randomly selected set of bits, and Q is a function selected randomly from the polynomials of a fixed degree over a finite field. The secret key is PK together with the factorization of m. Given an input x and the secret key SK, the VUF uses the prime sequence generator to pick a corresponding prime p_x (the generator requires the auxiliary inputs Q and coins), and then computes and outputs f_SK(x) = r^(1/p_x) mod m, the p_x-th root of r modulo m, which is easily done with knowledge of the factorization of m. [4]
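The root-extraction idea behind this VUF can be illustrated with a toy sketch. The prime sequence generator is replaced here by a simplistic hash-then-next-prime mapping, which is an assumption of this sketch rather than the paper's construction, and the parameters are far too small to be secure.

```python
import hashlib

def is_prime(n: int) -> bool:
    # Trial division; fine for toy-sized numbers.
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def next_prime(n: int) -> int:
    n += 1
    while not is_prime(n):
        n += 1
    return n

# Toy parameters: m is the product of two primes; r is an element of Z*_m.
p, q = 1019, 1031
m = p * q
phi = (p - 1) * (q - 1)   # secret, derived from the factorization of m
r = 12345

def input_to_prime(x: bytes) -> int:
    # Public stand-in for the paper's prime sequence generator. Offsetting by
    # 600 keeps every generated prime above phi's largest prime factor (509),
    # so the exponent inverse below always exists.
    return next_prime(600 + int.from_bytes(hashlib.sha256(x).digest()[:2], "big"))

def evaluate(x: bytes) -> int:
    # f(x) = r^(1/p_x) mod m; taking the p_x-th root needs the factorization of m.
    px = input_to_prime(x)
    return pow(r, pow(px, -1, phi), m)

def verify(x: bytes, y: int) -> bool:
    # Public check: the output raised to the p_x-th power must recover r.
    return pow(y, input_to_prime(x), m) == r
```

Here evaluate corresponds to the key holder's computation and verify to the public check; anyone can recompute p_x from x and confirm that y^p_x ≡ r (mod m) without learning the factorization.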

In 2005, an efficient and practical verifiable random function was proposed by Dodis and Yampolskiy. [3] [6] When the input is from a small domain (the authors then extend it to a larger domain), the function can be defined as follows:

F_SK(x) = e(g, g)^(1/(x+SK)), with proof pi_SK(x) = g^(1/(x+SK)),

where e(·,·) is a bilinear map. To verify whether F_SK(x) was computed correctly or not, one can check whether e(g^x · PK, pi_SK(x)) = e(g, g) and e(g, pi_SK(x)) = F_SK(x). [3] [6] To extend this to a larger domain, the authors use a tree construction and a universal hash function. [3] This is secure if it is hard to break the "q-Diffie–Hellman inversion assumption", which states that no efficient algorithm given (g, g^x, ..., g^(x^q)) can compute g^(1/x), and the "q-decisional bilinear Diffie–Hellman inversion assumption", which states that no efficient algorithm given (g, g^x, ..., g^(x^q)) as input can distinguish e(g, g)^(1/x) from a random element of the target group. [3] [6]
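The exponent algebra behind this construction can be sanity-checked numerically even without a pairing library: the proof is well defined because (x + SK) · (1/(x + SK)) = 1 in the exponent. The following sketch works in a prime-order subgroup of Z*_p rather than a pairing group, so it only illustrates the arithmetic, not the public verifiability that the bilinear map provides.

```python
# Toy subgroup parameters: q divides p - 1, so g generates an order-q subgroup.
q = 101                      # toy group order (prime)
p = 607                      # prime with p - 1 = 6 * q
g = pow(2, (p - 1) // q, p)  # generator of the order-q subgroup

sk = 33                      # secret key
x = 7                        # input

inv = pow((x + sk) % q, -1, q)   # the exponent 1/(x + sk) mod q
pi = pow(g, inv, p)              # proof-like value g^(1/(x + sk))

# Mirrors the pairing check e(g^x * PK, pi) = e(g, g): raising pi to the
# (x + sk)-th power must recover g.
assert pow(pi, (x + sk) % q, p) == g
```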

In 2015, Hofheinz and Jager constructed a VRF which is provably secure given any member of the "(n − 1)-linear assumption family", which includes the decision linear assumption. [7] This is the first such VRF constructed that does not depend on a "Q-type complexity assumption". [7]

In 2019, Bitansky showed that VRFs exist if non-interactive witness-indistinguishable proofs (weaker versions of non-interactive zero-knowledge proofs for NP problems that hide only the witness the prover uses [1] [8]), non-interactive cryptographic commitments, and single-key constrained pseudorandom functions (pseudorandom functions that allow the user to evaluate the function only on a preset constrained subset of possible inputs [9]) also exist. [1]

When an oblivious pseudorandom function is based on asymmetric cryptography, possession of the public key can allow the client to verify the output of the function by checking a digital signature or a zero-knowledge proof.

In 2020, Esgin et al. proposed a post-quantum secure VRF based on lattice-based cryptography. [10]

Uses and applications

VRFs provide deterministic pre-commitments for low-entropy inputs that must be resistant to brute-force pre-image attacks. [11] VRFs can be used to defend against offline enumeration attacks (such as dictionary attacks) on data stored in hash-based data structures. [2]
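The enumeration problem can be illustrated with a short sketch: a plain hash of a low-entropy value is brute-forceable offline, while a secretly keyed function is not. Here HMAC stands in for a VRF, an assumption of this sketch; unlike a VRF, HMAC offers no publicly verifiable proof of correct computation.

```python
import hashlib
import hmac

candidates = [b"alice", b"bob", b"carol"]   # small, guessable input space

# Unkeyed hash: an attacker enumerates the candidates offline and wins.
target = hashlib.sha256(b"bob").digest()
assert [c for c in candidates if hashlib.sha256(c).digest() == target] == [b"bob"]

# Keyed function (HMAC standing in for a VRF): without the secret key, hashing
# the candidates no longer matches the stored value, so enumeration fails.
key = b"server-secret-key"
stored = hmac.new(key, b"bob", hashlib.sha256).digest()
assert all(hashlib.sha256(c).digest() != stored for c in candidates)
```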

In protocol design

VRFs have been used in the construction of various cryptographic protocols. [12]

VRFs can also be used to implement random oracles. [13]

In Internet security

DNSSEC is a system that prevents attackers from tampering with Domain Name System messages, but it suffers from the vulnerability of zone enumeration. The proposed NSEC5 system, which uses VRFs, provably prevents this type of attack. [14]

References

  1. Bitansky, Nir (2020-04-01). "Verifiable Random Functions from Non-interactive Witness-Indistinguishable Proofs". Journal of Cryptology. 33 (2): 459–493. doi:10.1007/s00145-019-09331-1. ISSN 1432-1378. S2CID 253636177.
  2. Goldberg, Sharon; Vcelak, Jan; Papadopoulos, Dimitrios; Reyzin, Leonid (5 March 2018). Verifiable Random Functions (VRFs) (PDF) (Technical report). Retrieved 15 August 2021.
  3. Dodis, Yevgeniy; Yampolskiy, Aleksandr (16 November 2004). "A Verifiable Random Function With Short Proofs and Keys" (PDF). 8th International Workshop on Theory and Practice in Public Key Cryptography. Springer, Berlin, Heidelberg (published 2005). pp. 416–431. ISBN 978-3-540-30580-4. Retrieved 26 August 2021.
  4. Micali, Silvio; Rabin, Michael O.; Vadhan, Salil P. (1999). "Verifiable random functions" (PDF). Proceedings of the 40th IEEE Symposium on Foundations of Computer Science. pp. 120–130. doi:10.1109/SFFCS.1999.814584. ISBN 0-7695-0409-4.
  5. Potter, John (9 September 2021). "How Can Value Investors Profit in the Crypto Ecosystem?". finance.yahoo.com. Retrieved 19 September 2021.
  6. Nountu, Thierry Mefenza (28 November 2017). Pseudo-Random Generators and Pseudo-Random Functions: Cryptanalysis and Complexity Measures (doctoral thesis).
  7. Hofheinz, Dennis; Jager, Tibor (30 October 2015). Verifiable Random Functions from Standard Assumptions. Theory of Cryptography Conference (published 19 December 2015). pp. 336–362. CiteSeerX 10.1.1.738.9975. doi:10.1007/978-3-662-49096-9_14. ISBN 978-3-662-49096-9.
  8. Barak, Boaz; Ong, Shien Jin; Vadhan, Salil (2007-01-01). "Derandomization in Cryptography" (PDF). SIAM Journal on Computing. 37 (2): 380–400. doi:10.1137/050641958. ISSN 0097-5397. Retrieved 2 September 2021.
  9. Boneh, Dan; Waters, Brent (2013). "Constrained Pseudorandom Functions and Their Applications". In Sako, Kazue; Sarkar, Palash (eds.). Advances in Cryptology – ASIACRYPT 2013. Lecture Notes in Computer Science. Vol. 8270. Berlin, Heidelberg: Springer. pp. 280–300. doi:10.1007/978-3-642-42045-0_15. ISBN 978-3-642-42045-0. Retrieved 2 September 2021.
  10. Esgin, Muhammed F.; Kuchta, Veronika; Sakzad, Amin; Steinfeld, Ron; Zhang, Zhenfei; Sun, Shifeng; Chu, Shumo (24 March 2021). "Practical Post-Quantum Few-Time Verifiable Random Function with Applications to Algorand". Cryptology ePrint Archive. Retrieved 26 August 2021.
  11. Schorn, Eric (2020-02-24). "Reviewing Verifiable Random Functions". NCC Group Research. Retrieved 2021-09-04.
  12. Micali, Silvio; Reyzin, Leonid (2001). "Soundness in the Public-Key Model". In Kilian, Joe (ed.). Advances in Cryptology – CRYPTO 2001. Lecture Notes in Computer Science. Vol. 2139. Berlin, Heidelberg: Springer. pp. 542–565. doi:10.1007/3-540-44647-8_32. ISBN 978-3-540-44647-7.
  13. Dodis, Yevgeniy (2002). "Efficient Construction of (Distributed) Verifiable Random Functions". In Desmedt, Yvo G. (ed.). Public Key Cryptography – PKC 2003. Lecture Notes in Computer Science. Vol. 2567. Berlin, Heidelberg: Springer. pp. 1–17. doi:10.1007/3-540-36288-6_1. ISBN 978-3-540-36288-3.
  14. Goldberg, Sharon. "NSEC5: Provably Preventing DNSSEC Zone Enumeration". www.cs.bu.edu. Retrieved 2021-08-26.