Key finding attacks

Key finding attacks are attacks on computer systems that make use of cryptography, in which computer memory or non-volatile storage is searched for private cryptographic keys that can be used to decrypt or sign data. The term is generally used in the context of attacks that search memory much more efficiently than simply testing each sequence of bytes to determine whether it provides the correct answer. They are often used in combination with cold boot attacks to extract key material from computers.

Approaches

In their seminal paper [1] on key finding attacks, Shamir and van Someren proposed two different approaches to key finding: statistical or entropic key finding and analytical key finding. The former relies on detecting differences in the statistical properties of the data that make up cryptographic keys, while the latter relies on determining specific byte patterns that must necessarily exist in the target key material and looking for these patterns.

Statistical key finding

In general, for most cryptographic systems, the cryptographic keys should be as random as possible. For most symmetric ciphers the keys can, and should, be a truly random set of bits. For most asymmetric ciphers the private keys are either numbers chosen at random subject to certain constraints (such as primality or being generators in a group), or are the result of computations based on a set of random numbers with some constraints. In either case the key material exhibits high entropy. In contrast, most uncompressed data in a computer's memory has relatively low entropy. As a result, if a key is known to exist in memory in its raw form, it is likely to stand out against the background of non-key data by virtue of its high entropy, and an attacker need only test for matching keys in areas of memory or storage that have high entropy.

[Figure: High entropy keys stand out visually against background low-entropy data.]

The contrast between the low entropy of most data and the high entropy of key data is sufficiently stark to be apparent by visual inspection. The image above shows an example of this.
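The following sketch shows how such a statistical search might be automated. It is a minimal illustration of the idea rather than a reconstruction of any particular tool: the dump file name, window size, step, and entropy threshold are all illustrative assumptions.

```python
# A minimal sketch of statistical key finding, assuming a memory image is
# available as a flat file ("memory.dump" is a hypothetical path). It slides
# a window across the data and flags regions whose byte entropy is close to
# the maximum possible for this window size (log2(64) = 6 bits/byte).
import math
from collections import Counter

def shannon_entropy(window: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(window)
    total = len(window)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def find_high_entropy_regions(data: bytes, window=64, step=16, threshold=5.5):
    """Yield offsets of windows entropic enough to be candidate key material."""
    for offset in range(0, len(data) - window + 1, step):
        h = shannon_entropy(data[offset:offset + window])
        if h >= threshold:
            yield offset, h

if __name__ == "__main__":
    with open("memory.dump", "rb") as f:
        image = f.read()
    for offset, h in find_high_entropy_regions(image):
        print(f"offset {offset:#010x}: {h:.2f} bits/byte")
```

Only the offsets this scan yields then need to be tested as candidate keys, rather than every position in the image.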

Analytical key finding

While statistical key finding can be effective for reducing the amount of memory that needs to be searched, it still requires high-entropy areas to be tested to check if they contain the correct key material. In certain cases, particularly in the context of public key encryption systems, it is possible to determine patterns that must occur in the key material and then limit the search to areas where these patterns are found.

Shamir and van Someren [1] demonstrated one example of this analytical approach for finding private RSA keys where the public key is known and has a small public exponent. In the RSA system the public key is a pair (e, n), where n = pq with p and q being two large primes. The corresponding private key is (d, n) (or sometimes (d, p, q) or some variant thereof) where e·d ≡ 1 (mod φ(n)), which is to say that e multiplied by d is equivalent to 1, modulo φ(n), where φ represents Euler's totient function and φ(n) is the size of the multiplicative group modulo n. In the case of an RSA key:

φ(n) = (p − 1)(q − 1)
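As a concrete illustration of these relationships, the following toy example uses deliberately small textbook primes; real keys use primes hundreds of digits long.

```python
# A toy illustration of the RSA key relations just described.
p, q = 61, 53
n = p * q                   # public modulus n = pq
phi = (p - 1) * (q - 1)     # Euler's totient: φ(n) = (p-1)(q-1)
e = 17                      # small public exponent
d = pow(e, -1, phi)         # private exponent: e·d ≡ 1 (mod φ(n))
assert (e * d) % phi == 1   # here d = 2753
```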

Finding the value of φ(n) from n allows for the factorization of n, and the security of the RSA cryptosystem rests on the difficulty of doing so. As such, an attacker cannot determine d exactly, given e and n. An attacker can, however, know a fair amount about what d looks like, given the knowledge that p and q are typically chosen to be the same length in bits and are both 'close' to the square root of n. Thus an attacker can approximate a guess of:

φ(n) ≈ n − 2√n + 1

and typically this approximation will be correct in the more significant half of the bits of its binary representation. The relationship of e and d means that:

e·d = 1 + k·φ(n)

where the exact value of k is unknown but 0 < k < e. Using this fact and the approximation of φ(n) above, the attacker can enumerate a set of possible values for the top half of the binary representation of d for each possible value of k. These binary patterns can be tested many orders of magnitude faster than performing a trial decryption. Furthermore, in the common case of e = 3 it can be shown that k = 2, which allows the top half of the bits of d to be determined exactly and searched for directly.
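A minimal sketch of this enumeration, assuming only the public values n and e are known, might look as follows. It reuses the toy key from the example above; with realistic key sizes, the yielded top-half patterns would be the byte strings to scan memory for, and in the e = 3 case the single candidate with k = 2 would match exactly.

```python
# Sketch of the analytical search: for each candidate k, form
# d = (1 + k·φ(n)) / e using the estimate φ(n) ≈ n − 2√n + 1,
# and keep only the upper half of the bits, which is the part
# of the approximation that can be trusted.
from math import isqrt

def candidate_top_halves(n: int, e: int):
    """Yield (k, top half of the candidate d) for every possible k."""
    phi_hat = n - 2 * isqrt(n) + 1        # estimate of φ(n)
    half = n.bit_length() // 2
    for k in range(1, e):
        d_hat = (1 + k * phi_hat) // e
        yield k, d_hat >> half            # only the top half is trusted

# Toy check with the small key above (p = 61, q = 53, e = 17):
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))
hits = [k for k, top in candidate_top_halves(n, e)
        if top == d >> (n.bit_length() // 2)]
print("values of k whose pattern matches the real d:", hits)
```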

Application

Key finding attacks have been used in conjunction with cold boot attacks to extract keys from machines after they have been switched off. [2] Heninger and Shacham showed that keys can be extracted even when the data in memory has been corrupted by having the power removed. [3]

Statistical key finding was used by Nicko van Someren to locate the signature verification keys used by Microsoft to validate the signatures on MS-CAPI plug-ins. One of these keys was later discovered to be referred to as the NSAKEY by Microsoft, sparking some controversy. [4]

Mitigations

Key finding attacks can be mitigated in several ways. For analytic attacks, randomized key blinding will prevent the expected patterns from being found in memory as well as protecting against some other sorts of side-channel attack. Statistical attacks can be made less effective by storing other sorts of high-entropy or compressed data in memory and key material can be spread over a larger block of memory when not in use to reduce the concentration of entropy in one place.
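One way to realise the last idea is to hold a key as a set of XOR shares scattered through memory and recombine them only at the moment of use, so that no single share reveals the key's byte pattern. The sketch below is a minimal illustration of this approach under stated assumptions (the share count is arbitrary), not a hardened implementation.

```python
# Minimal sketch: split a key into XOR shares that can be stored in
# separate allocations; all shares are needed to reconstruct the key.
import os

def split_key(key: bytes, shares: int = 8) -> list[bytes]:
    """Split key into XOR shares; any subset short of all of them is random."""
    parts = [os.urandom(len(key)) for _ in range(shares - 1)]
    last = bytes(key)
    for part in parts:
        last = bytes(a ^ b for a, b in zip(last, part))
    return parts + [last]

def join_key(parts: list[bytes]) -> bytes:
    """XOR the shares back together to recover the key just before use."""
    key = bytes(len(parts[0]))
    for part in parts:
        key = bytes(a ^ b for a, b in zip(key, part))
    return key

key = os.urandom(32)
assert join_key(split_key(key)) == key
```

Note that each share is itself high entropy, so this does not hide the key from a statistical scan; rather, it defeats pattern-based analytical searches and avoids keeping the usable key in any one contiguous region.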

References

  1. Shamir, Adi; van Someren, Nicko (1998). "Playing Hide and Seek with Stored Keys". Lecture Notes in Computer Science. pp. 118–124. CiteSeerX 10.1.1.40.4467.
  2. Halderman, J. Alex; Schoen, Seth D.; Heninger, Nadia; Clarkson, William; Paul, William; Calandrino, Joseph A.; Feldman, Ariel J.; Appelbaum, Jacob; Felten, Edward W. (2008). "Lest We Remember: Cold Boot Attacks on Encryption Keys". USENIX Security Symposium.
  3. Heninger, Nadia; Shacham, Hovav (2009). "Reconstructing RSA Private Keys from Random Key Bits". Proceedings of Crypto 2009. pp. 1–17. CiteSeerX 10.1.1.215.6281.
  4. "Microsoft/NSA Info". 2000-06-17. Archived from the original on 2000-06-17. Retrieved 2016-10-12.