In cryptographic protocol design, cryptographic agility or crypto-agility is the ability to switch between multiple cryptographic primitives.
A cryptographically agile system implementing a particular standard can choose which combination of primitives to use. The primary goal of cryptographic agility is to enable rapid adoption of new cryptographic primitives and algorithms without making disruptive changes to the system's infrastructure.
Cryptographic agility acts as a safety measure or an incident response mechanism for when a cryptographic primitive of a system is discovered to be vulnerable. [1] A security system is considered crypto-agile if its cryptographic algorithms or parameters can be replaced with ease and the replacement is at least partly automated. [2] [3] The impending arrival of a quantum computer that can break existing asymmetric cryptography is raising awareness of the importance of cryptographic agility. [4] [5]
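As a minimal sketch of what such a design can look like in code (assuming a Python code base; the registry and function names below are illustrative, not taken from any particular standard), hashing is dispatched through a central table of approved primitives, so retiring or adding an algorithm is a configuration change rather than an edit to every call site:

```python
# Illustrative sketch: callers name the primitive they want; the mapping from
# name to implementation lives in one place and can be changed without
# touching the calling code.
import hashlib

# Central registry of approved hash primitives. Deprecating an algorithm or
# introducing a new one only requires updating this table.
HASH_REGISTRY = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,
}

def digest(data: bytes, algorithm: str = "sha256") -> bytes:
    """Hash data with whichever registered primitive is currently approved."""
    try:
        return HASH_REGISTRY[algorithm](data).digest()
    except KeyError:
        raise ValueError(f"{algorithm!r} is not an approved primitive")

print(digest(b"example message", "sha3_256").hex())
```

The same pattern extends to ciphers, signatures, and key-exchange mechanisms; the essential point is that algorithm identity is configuration data rather than hard-coded logic.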
The X.509 public key certificate illustrates crypto-agility: a certificate carries cryptographic parameters including the key type, key length, and hash algorithm. X.509 version 3 certificates using an RSA key type, a 1024-bit key length, and the SHA-1 hash algorithm were found by NIST to have a key length that made them vulnerable to attacks, prompting the transition to SHA-2. [6]
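As a hedged illustration of how those parameters can be audited, the sketch below uses the third-party Python `cryptography` package to read the key type, key size, and signature hash out of a certificate; the file name is a placeholder, and the fields printed are exactly the ones a migration such as SHA-1 to SHA-2 has to change:

```python
# Illustrative sketch using the third-party `cryptography` package
# (pip install cryptography). "server.pem" is a placeholder path.
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa

with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

public_key = cert.public_key()
if isinstance(public_key, rsa.RSAPublicKey):
    print("key type: RSA,", public_key.key_size, "bits")      # e.g. 1024 or 2048
print("signature hash:", cert.signature_hash_algorithm.name)  # e.g. 'sha1' or 'sha256'
print("X.509 version:", cert.version.name)                    # e.g. 'v3'
```

An inventory built this way is typically the first step of a migration, since it shows which certificates still rely on deprecated parameters.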
With the rise of secure transport-layer communication at the end of the 1990s, cryptographic primitives and algorithms have seen increasingly widespread use; by 2019, for example, more than 80% of all websites employed some form of security measures. [7] Furthermore, cryptographic techniques are widely incorporated to protect applications and business transactions.
However, as cryptographic algorithms are deployed, research into their security intensifies, and new attacks against cryptographic primitives (old and new alike) are discovered. Crypto-agility tries to tackle the resulting threat to information security by allowing swift deprecation of vulnerable primitives and their replacement with new ones.
This threat is not merely theoretical; many algorithms that were once considered secure (DES, 512-bit RSA, RC4) are now known to be vulnerable, some even to amateur attackers. On the other hand, newer algorithms (AES, elliptic-curve cryptography) are often both more secure and faster than the ones they supersede. Systems designed to meet crypto-agility criteria are expected to be less affected should current primitives be found vulnerable, and may enjoy better latency or battery usage by adopting new and improved primitives.
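One common way to make such deprecation swift is a central algorithm policy that every component consults. The sketch below is purely illustrative (the allow-list contents and function name are assumptions, not any product's API): retiring RC4 or short RSA keys becomes a one-line policy edit rather than a hunt through the code base.

```python
# Illustrative policy check: primitives are accepted or rejected against a
# central allow-list, so deprecating a broken algorithm is a policy update.
ALLOWED_CIPHERS = {"AES-128-GCM", "AES-256-GCM", "CHACHA20-POLY1305"}
MIN_RSA_BITS = 2048

def check_policy(cipher: str, rsa_bits: int) -> None:
    if cipher not in ALLOWED_CIPHERS:
        raise ValueError(f"cipher {cipher!r} is deprecated or unapproved")
    if rsa_bits < MIN_RSA_BITS:
        raise ValueError(f"RSA-{rsa_bits} is below the {MIN_RSA_BITS}-bit minimum")

check_policy("AES-256-GCM", 3072)        # passes silently
try:
    check_policy("RC4", 512)             # both parameters violate the policy
except ValueError as err:
    print("rejected:", err)
```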
For example, quantum computing, if it becomes practical, is expected to defeat existing public-key cryptographic algorithms. The overwhelming majority of existing public-key infrastructure relies on the computational hardness of problems such as integer factorization and discrete logarithms (which includes elliptic-curve cryptography as a special case). Quantum computers running Shor's algorithm can solve these problems exponentially faster than the best-known algorithms for conventional computers. [8] Post-quantum cryptography is the subfield of cryptography that aims to replace quantum-vulnerable algorithms with new ones that are believed to be hard to break even for a quantum computer. The main families of post-quantum alternatives to factoring and discrete logarithms include lattice-based cryptography, multivariate cryptography, hash-based cryptography, and code-based cryptography.
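A crypto-agile design also eases the transition itself, for example through hybrid key establishment, in which a classical exchange and a post-quantum KEM both contribute to the session key so that it remains secure as long as either primitive holds. The sketch below is a rough illustration only: it uses the third-party Python `cryptography` package for the classical X25519 part, while the post-quantum KEM is a purely hypothetical stub that a real deployment would replace with a standardized scheme such as ML-KEM.

```python
# Illustrative hybrid key derivation: combine a classical X25519 shared secret
# with a (stubbed) post-quantum KEM secret and derive one session key from both.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def pq_kem_shared_secret() -> bytes:
    """Hypothetical stand-in for a post-quantum KEM; returns a 32-byte secret."""
    return os.urandom(32)

# Classical part: ephemeral X25519 Diffie-Hellman (both key pairs generated
# locally here purely for demonstration).
client_key = X25519PrivateKey.generate()
server_key = X25519PrivateKey.generate()
classical_secret = client_key.exchange(server_key.public_key())

# Post-quantum part (stubbed).
pq_secret = pq_kem_shared_secret()

# The session key depends on both secrets, so breaking one primitive is not enough.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid handshake",
).derive(classical_secret + pq_secret)
print("session key:", session_key.hex())
```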
System evolution and crypto-agility are not the same. System evolution progresses on the basis of emerging business and technical requirements. Crypto-agility is related instead to computing infrastructure and requires consideration by security experts, system designers, and application developers. [9]
Best practices for dealing with crypto-agility include: [10]