In mathematics, computer science, telecommunication, information theory, and searching theory, error-correcting codes with feedback are error-correcting codes designed to work in the presence of feedback from the receiver to the sender. [1]
Alice (the sender) wishes to send a value x to Bob (the receiver). The communication channel between Alice and Bob is imperfect, and can introduce errors.
An error-correcting code is a way of encoding x as a message such that Bob will successfully understand the value x as intended by Alice, even if the message Alice sends and the message Bob receives differ. In an error-correcting code with feedback, the channel is two-way: Bob can send feedback to Alice about the message he received.
In an error-correcting code with noiseless feedback, the feedback received by the sender is always free of errors. In an error-correcting code with noisy feedback, errors can occur in the feedback as well as in the message.
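As a toy illustration of how noiseless feedback can be exploited, consider the following minimal Python sketch (the strategy and all names below are illustrative, not a scheme described in this article): Bob echoes every received bit over the error-free feedback channel, and Alice simply resends until the echo matches.

```python
import random

def noisy_forward(bit, p, rng):
    """Forward channel from Alice to Bob: flips the bit with probability p."""
    return bit ^ (rng.random() < p)

def send_with_noiseless_feedback(message, p, rng):
    """Naive strategy: Bob echoes each received bit over the noiseless
    feedback channel, and Alice resends the bit until the echo matches.
    Every bit is eventually delivered correctly, at the cost of a variable
    number of forward-channel uses."""
    delivered, uses = [], 0
    for bit in message:
        while True:
            uses += 1
            received = noisy_forward(bit, p, rng)
            if received == bit:          # Alice compares the echoed bit with what she sent
                delivered.append(received)
                break
    return delivered, uses

rng = random.Random(3)
print(send_with_noiseless_feedback([1, 0, 1, 1], p=0.2, rng=rng))
```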
An error-correcting code with noiseless feedback is equivalent to an adaptive search strategy with errors. [1]
In 1956, Claude Shannon introduced the discrete memoryless channel with noiseless feedback. In 1961, Alfréd Rényi introduced the Bar-Kochba game (also known as Twenty questions) with a given percentage of wrong answers, and calculated the minimum number of randomly chosen questions needed to determine the answer.
In his 1964 dissertation, Elwyn Berlekamp considered error-correcting codes with noiseless feedback. [2] [3] In Berlekamp's scenario, the receiver chose a subset of possible messages and asked the sender whether the given message was in this subset, receiving a 'yes' or 'no' answer. Based on this answer, the receiver then chose a new subset and repeated the process. The game is further complicated by noise: some of the answers will be wrong.
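A minimal Python simulation of this kind of question-and-answer search with at most e wrong answers is sketched below (the bookkeeping is a straightforward strategy chosen for clarity, not Berlekamp's optimal one, and all names are illustrative): each candidate value accumulates a count of answers it contradicts, and since the true value can be contradicted by at most e lies, any candidate exceeding that bound is eliminated.

```python
import random

def ulam_game(n, e, secret, rng):
    """Identify `secret` in range(n) by subset questions despite up to `e`
    wrong answers. Each candidate keeps a count of answers it contradicts;
    the true secret can contradict at most `e` answers."""
    contradicted = {x: 0 for x in range(n)}
    lies_left = e
    questions = 0
    while True:
        alive = [x for x, c in contradicted.items() if c <= e]
        if len(alive) == 1:
            return alive[0], questions
        questions += 1
        subset = set(alive[: len(alive) // 2])    # "Is the secret in this subset?"
        answer = secret in subset
        if lies_left > 0 and rng.random() < 0.3:  # the answerer may lie, but at most e times
            answer = not answer
            lies_left -= 1
        for x in alive:                           # candidates inconsistent with the answer
            if (x in subset) != answer:           # accumulate one more contradiction
                contradicted[x] += 1

rng = random.Random(0)
print(ulam_game(n=64, e=2, secret=37, rng=rng))   # (37, <number of questions asked>)
```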
In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels. Many communication channels are subject to channel noise, and thus errors may be introduced during transmission from the source to a receiver. Error detection techniques allow detecting such errors, while error correction enables reconstruction of the original data in many cases.
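For example, a single parity bit is one of the simplest error-detecting codes; the sketch below (illustrative only) detects any single bit flip but cannot correct it.

```python
def add_parity(bits):
    """Append an even-parity bit: the extra bit makes the number of 1s even,
    so any single bit flip in transit is detectable (but not correctable)."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(parity_ok(word))        # True: no error detected
word[2] ^= 1                  # simulate one bit flipped by channel noise
print(parity_ok(word))        # False: the error is detected
```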
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.
Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of von Neumann entropy and the general computational term.
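For reference, the technical definition mentioned above is the standard one (recalled here from general knowledge rather than from this article): the von Neumann entropy of a quantum state with density matrix \(\rho\) is

\[
S(\rho) = -\operatorname{Tr}(\rho \log \rho),
\]

which equals the Shannon entropy of the eigenvalues of \(\rho\).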
Chaffing and winnowing is a cryptographic technique to achieve confidentiality without using encryption when sending data over an insecure channel. The name is derived from agriculture: after grain has been harvested and threshed, it remains mixed together with inedible fibrous chaff. The chaff and grain are then separated by winnowing, and the chaff is discarded. The cryptographic technique was conceived by Ron Rivest and published in an on-line article on 18 March 1998. Although it bears similarities to both traditional encryption and steganography, it cannot be classified under either category.
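A minimal sketch of the idea using a shared message-authentication key (the packet format and parameters here are illustrative, not Rivest's exact construction): the sender interleaves authentic packets ("wheat") with bogus ones ("chaff"), and only a receiver holding the key can winnow them apart.

```python
import hmac, hashlib, os

KEY = os.urandom(16)   # authentication key shared by sender and receiver

def mac(seq, bit, key=KEY):
    return hmac.new(key, f"{seq}:{bit}".encode(), hashlib.sha256).digest()

def chaff(message_bits):
    """Sender: for each real (seq, bit, MAC) packet, also emit a chaff packet
    carrying the opposite bit under a random, invalid MAC."""
    packets = []
    for seq, bit in enumerate(message_bits):
        packets.append((seq, bit, mac(seq, bit)))
        packets.append((seq, 1 - bit, os.urandom(32)))   # chaff
    return packets

def winnow(packets):
    """Receiver: keep only packets whose MAC verifies under the shared key."""
    wheat = {}
    for seq, bit, tag in packets:
        if hmac.compare_digest(tag, mac(seq, bit)):
            wheat[seq] = bit
    return [wheat[i] for i in sorted(wheat)]

# An eavesdropper without KEY cannot tell wheat from chaff.
print(winnow(chaff([1, 0, 1, 1])))   # [1, 0, 1, 1]
```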
A binary symmetric channel is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit, and the receiver will receive a bit. The bit will be "flipped" with a "crossover probability" of p, and otherwise is received correctly. This model can be applied to varied communication channels such as telephone lines or disk drive storage.
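A short simulation of this model (illustrative only):

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: each bit is flipped independently with
    crossover probability p, otherwise it is received correctly."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(42)
sent = [1, 0, 1, 1, 0, 0, 1, 0]
print(sent, bsc(sent, p=0.1, rng=rng))
```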
Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data.
In cryptography, an oblivious transfer (OT) protocol is a type of protocol in which a sender transfers one of potentially many pieces of information to a receiver, but remains oblivious as to what piece has been transferred.
Elwyn Ralph Berlekamp was a professor of mathematics and computer science at the University of California, Berkeley. Berlekamp was widely known for his work in computer science, coding theory and combinatorial game theory.
A cryptosystem is considered to have information-theoretic security if the system is secure against adversaries with unlimited computing resources and time. In contrast, a system which depends on the computational cost of cryptanalysis to be secure is called computationally, or conditionally, secure.
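The classic example of an information-theoretically secure scheme is the one-time pad, sketched below (a standard construction, stated here for illustration):

```python
import secrets

def otp(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a uniformly random key as long as the message.
    If the key is truly random and never reused, the ciphertext reveals nothing
    about the plaintext, even to an adversary with unlimited computing power."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

key = secrets.token_bytes(5)
ciphertext = otp(b"hello", key)
print(ciphertext, otp(ciphertext, key))   # decryption is the same XOR operation
```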
In coding theory, an erasure code is a forward error correction (FEC) code under the assumption of bit erasures, which transforms a message of k symbols into a longer message with n symbols such that the original message can be recovered from a subset of the n symbols. The fraction r = k/n is called the code rate. The fraction k′/k, where k′ denotes the number of symbols required for recovery, is called reception efficiency.
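A toy erasure code with k = 2 and n = 3 (rate r = 2/3), in which the third symbol is the XOR of the first two, illustrates the definition (a sketch only; practical erasure codes such as Reed–Solomon are far more general):

```python
def encode(a, b):
    """k = 2 data symbols -> n = 3 coded symbols; the third is the XOR parity."""
    return [a, b, a ^ b]

def decode(received):
    """Recover (a, b) from any 2 of the 3 symbols; None marks an erased symbol."""
    a, b, p = received
    if a is None:
        a = b ^ p
    elif b is None:
        b = a ^ p
    return a, b

a, b = 0b1010, 0b0110
coded = encode(a, b)
print(decode([a, None, coded[2]]))   # (10, 6): the erased symbol is recovered
```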
In data communications, flow control is the process of managing the rate of data transmission between two nodes to prevent a fast sender from overwhelming a slow receiver. Flow control should be distinguished from congestion control, which is used for controlling the flow of data when congestion has actually occurred. Flow control mechanisms can be classified by whether or not the receiving node sends feedback to the sending node.
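A minimal sketch of feedback-based (sliding window) flow control, with illustrative names and parameters:

```python
from collections import deque

def window_flow_control(items, window=4, consumed_per_tick=1):
    """The sender keeps at most `window` unacknowledged items in flight; the
    receiver consumes a limited amount per tick, and its acknowledgements
    (feedback) re-open the sender's window. Returns the number of ticks used."""
    in_flight = deque()
    sent = acked = ticks = 0
    while acked < len(items):
        ticks += 1
        while sent < len(items) and len(in_flight) < window:      # send while window allows
            in_flight.append(items[sent])
            sent += 1
        for _ in range(min(consumed_per_tick, len(in_flight))):   # receiver consumes, then ACKs
            in_flight.popleft()
            acked += 1
    return ticks

print(window_flow_control(list(range(10)), window=4, consumed_per_tick=2))   # 5
```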
In coding theory, decoding is the process of translating received messages into codewords of a given code. There have been many common methods of mapping messages to codewords. These are often used to recover messages sent over a noisy channel, such as a binary symmetric channel.
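One of the most common mappings is minimum-distance decoding, shown here with a length-3 repetition code (a small sketch with illustrative names):

```python
def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def nearest_codeword(received, codebook):
    """Minimum-distance decoding: return the codeword that differs from the
    received word in the fewest positions."""
    return min(codebook, key=lambda c: hamming(received, c))

# Length-3 repetition code for one bit: 0 -> 000, 1 -> 111
codebook = [(0, 0, 0), (1, 1, 1)]
print(nearest_codeword((1, 0, 1), codebook))   # (1, 1, 1): a single flip is corrected
```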
In information theory, the noisy-channel coding theorem establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
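For the binary symmetric channel, that computable maximum rate is C = 1 − H(p), where H is the binary entropy function; a small computation of this standard formula:

```python
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0), bsc_capacity(0.11), bsc_capacity(0.5))   # 1.0, ~0.5, 0.0
```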
A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error-correcting codes and related subjects.
Hybrid automatic repeat request is a combination of high-rate forward error correction (FEC) and automatic repeat request (ARQ) error control. In standard ARQ, redundant bits are added to the data to be transmitted using an error-detecting (ED) code such as a cyclic redundancy check (CRC). Receivers detecting a corrupted message will request a new message from the sender. In hybrid ARQ, the original data is encoded with an FEC code, and the parity bits are either immediately sent along with the message or only transmitted upon request when a receiver detects an erroneous message. The ED code may be omitted when a code is used that can perform both forward error correction and error detection, such as a Reed–Solomon code. The FEC code is chosen to correct an expected subset of all errors that may occur, while the ARQ method is used as a fall-back to correct errors that are uncorrectable using only the redundancy sent in the initial transmission. As a result, hybrid ARQ performs better than ordinary ARQ in poor signal conditions, but in its simplest form this comes at the expense of significantly lower throughput in good signal conditions. There is typically a signal-quality crossover point below which simple hybrid ARQ is better and above which basic ARQ is better.
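A toy simulation of type-I hybrid ARQ is sketched below (the 3x repetition code, CRC-32, and parameters are stand-ins chosen for brevity, not what any real system uses): the frame carries a CRC for error detection, is FEC-encoded before transmission, and is retransmitted whenever the CRC check fails after decoding.

```python
import random, zlib

def to_bits(data):
    return [(byte >> i) & 1 for byte in data for i in range(8)]

def from_bits(bits):
    return bytes(sum(bit << i for i, bit in enumerate(bits[j:j + 8]))
                 for j in range(0, len(bits), 8))

def harq_type1(payload, p, rng):
    """Type-I hybrid ARQ sketch: CRC-32 for error detection, a 3x repetition
    code as the FEC, and retransmission whenever the CRC check fails.
    Returns (recovered payload, number of transmissions)."""
    frame = payload + zlib.crc32(payload).to_bytes(4, "big")
    attempts = 0
    while True:
        attempts += 1
        coded = [b for bit in to_bits(frame) for b in (bit, bit, bit)]             # FEC encode
        noisy = [b ^ (rng.random() < p) for b in coded]                            # binary symmetric channel
        votes = [int(sum(noisy[i:i + 3]) >= 2) for i in range(0, len(noisy), 3)]   # majority decode
        candidate = from_bits(votes)
        body, crc = candidate[:-4], candidate[-4:]
        if zlib.crc32(body).to_bytes(4, "big") == crc:                             # ED check passed -> ACK
            return body, attempts
        # otherwise the receiver NAKs and the whole frame is retransmitted

rng = random.Random(1)
print(harq_type1(b"hello", p=0.05, rng=rng))
```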
In coding theory and information theory, a binary erasure channel (BEC) is a communications channel model. A transmitter sends a bit, and the receiver either receives the bit correctly, or with some probability receives a message that the bit was not received ("erased").
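A minimal simulation of the binary erasure channel (illustrative only):

```python
import random

def bec(bits, p, rng):
    """Binary erasure channel: each bit is independently erased (delivered as
    None) with probability p, and received intact otherwise."""
    return [None if rng.random() < p else b for b in bits]

print(bec([1, 0, 1, 1, 0], p=0.3, rng=random.Random(7)))
```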
Entanglement distillation is the transformation of N copies of an arbitrary entangled state into some number of approximately pure Bell pairs, using only local operations and classical communication.
In the theory of quantum communication, the entanglement-assisted stabilizer formalism is a method for protecting quantum information with the help of entanglement shared between a sender and receiver before they transmit quantum data over a quantum communication channel. It extends the standard stabilizer formalism by including shared entanglement. The advantage of entanglement-assisted stabilizer codes is that the sender can exploit the error-correcting properties of an arbitrary set of Pauli operators. The sender's Pauli operators do not necessarily have to form an Abelian subgroup of the Pauli group over qubits. The sender can make clever use of her shared ebits so that the global stabilizer is Abelian and thus forms a valid quantum error-correcting code.
In the theory of quantum communication, the entanglement-assisted classical capacity of a quantum channel is the highest rate at which classical information can be transmitted from a sender to receiver when they share an unlimited amount of noiseless entanglement. It is given by the quantum mutual information of the channel, which is the input-output quantum mutual information maximized over all pure bipartite quantum states with one system transmitted through the channel. This formula is the natural generalization of Shannon's noisy channel coding theorem, in the sense that this formula is equal to the capacity, and there is no need to regularize it. An additional feature that it shares with Shannon's formula is that a noiseless classical or quantum feedback channel cannot increase the entanglement-assisted classical capacity. The entanglement-assisted classical capacity theorem is proved in two parts: the direct coding theorem and the converse theorem. The direct coding theorem demonstrates that the quantum mutual information of the channel is an achievable rate, by a random coding strategy that is effectively a noisy version of the super-dense coding protocol. The converse theorem demonstrates that this rate is optimal by making use of the strong subadditivity of quantum entropy.
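In the usual notation (the standard Bennett–Shor–Smolin–Thapliyal statement, recalled here for concreteness), the entanglement-assisted classical capacity of a channel \(\mathcal{N}\) is

\[
C_E(\mathcal{N}) \;=\; \max_{\varphi_{AA'}} I(A;B)_{\rho},
\qquad
\rho_{AB} = (\mathrm{id}_A \otimes \mathcal{N}_{A' \to B})(\varphi_{AA'}),
\]

where the maximum is over pure bipartite input states \(\varphi_{AA'}\) and \(I(A;B)_{\rho} = S(A)_{\rho} + S(B)_{\rho} - S(AB)_{\rho}\) is the quantum mutual information.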
Quantum secret sharing (QSS) is a quantum cryptographic scheme for secure communication that extends beyond simple quantum key distribution. It modifies the classical secret sharing (CSS) scheme by using quantum information and the no-cloning theorem to attain the ultimate security for communications.