The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory. IEC 80000-13 defines the shannon as the information content associated with an event when the probability of the event occurring is 1/2. It is understood as such within the realm of information theory, and is conceptually distinct from the bit, a term used in data processing and storage to denote a single instance of a binary signal. A sequence of n binary symbols (such as contained in computer memory or a binary data transmission) is properly described as consisting of n bits, but the information content of those n symbols may be more or less than n shannons depending on the a priori probability of the actual sequence of symbols. [1]
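As a rough numeric illustration of this distinction (the 0.9 symbol probability below is an arbitrary assumption, not a figure from the text), the information content of a specific 16-symbol binary sequence can be computed from the a priori probability of each symbol:

```python
import math

# Information content of a specific 16-symbol binary sequence versus the 16 Sh maximum.
# Assume each symbol is independently "1" with probability 0.9 (an arbitrary choice).
p_one = 0.9
all_ones = -16 * math.log2(p_one)                            # ~2.4 Sh: a likely sequence carries little information
mixed = -8 * math.log2(p_one) - 8 * math.log2(1 - p_one)     # ~27.8 Sh: an unlikely sequence carries more
print(round(all_ones, 2), round(mixed, 2))
```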
The shannon also serves as a unit of information entropy, defined as the expected value of the information content of an event (i.e., the probability-weighted average of the information content of all possible events). Unlike information content, the entropy of a source with a given number of possible outcomes has an upper bound, reached when those outcomes are equiprobable; the maximum entropy of n bits is n Sh. A further quantity measured in shannons is channel capacity, usually expressed as an information rate: the maximum expected information content that can be transferred over a channel with negligible probability of error.
Nevertheless, the term bits of information, or simply bits, is heard more often than shannons, even in the fields of information and communication theory; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, [2] whereas "bits" can also refer to the number of binary symbols involved, as the term is used in fields such as data processing.
The shannon is connected through constants of proportionality to two other units of information: [3] 1 Sh ≈ 0.693 nat ≈ 0.301 Hart.
The hartley, a seldom-used unit, is named after Ralph Hartley, an electronics engineer interested in the capacity of communications channels. Although more limited in scope, his early work, preceding Shannon's, also makes him recognized as a pioneer of information theory. Just as the shannon describes the maximum possible information capacity of a binary symbol, the hartley describes the information that can be contained in a 10-ary symbol, that is, a digit value in the range 0 to 9 when the a priori probability of each value is 1/10. The conversion factor quoted above is given by log10(2).
In mathematical expressions, the nat is a more natural unit of information, but 1 nat does not correspond to a case in which all possibilities are equiprobable, unlike the shannon and the hartley. In each case, formulae for the quantification of information capacity or entropy involve taking the logarithm of an expression involving probabilities. If base-2 logarithms are employed, the result is expressed in shannons; if base-10 (common) logarithms, in hartleys; and if natural logarithms (base e), in nats. For instance, the information capacity of a 16-bit sequence (achieved when all 65536 possible sequences are equally probable) is given by log(65536): log10(65536) ≈ 4.82 Hart, loge(65536) ≈ 11.09 nat, or log2(65536) = 16 Sh.
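The same arithmetic can be checked directly; the following minimal Python snippet simply evaluates the three logarithms quoted above:

```python
import math

sequences = 65536                    # all possible 16-bit sequences, assumed equiprobable
print(math.log2(sequences))          # 16.0   shannons
print(math.log10(sequences))         # ~4.82  hartleys
print(math.log(sequences))           # ~11.09 nats
```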
In information theory and derivative fields such as coding theory, one cannot quantify the 'information' in a single message (sequence of symbols) out of context; rather, reference is made to the model of a channel (such as its bit error rate) or to the underlying statistics of an information source. There are thus various measures of or related to information, all of which may use the shannon as a unit.[citation needed]
For instance, in the above example, a 16-bit channel could be said to have a channel capacity of 16 Sh, but when connected to a particular information source that only sends one of 8 possible messages, one would compute the entropy of its output as no more than 3 Sh. And if one had already been informed, through a side channel, which set of 4 possible messages the message belongs to, then one could calculate the mutual information of the new message (having 8 possible states) as no more than 2 Sh. Although there are infinitely many possibilities for a real number chosen between 0 and 1, the so-called differential entropy can be used to quantify the information content of an analog signal, for example in relation to the enhancement of a signal-to-noise ratio or the confidence of a hypothesis test.[citation needed]
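A minimal numeric restatement of the channel example above, assuming (as stated) that all 8 messages are equiprobable:

```python
import math

h_source = math.log2(8)      # entropy of a source emitting one of 8 equiprobable messages: 3 Sh
h_residual = math.log2(4)    # uncertainty left once a side channel narrows it to a set of 4: 2 Sh
print(h_source, h_residual)  # 3.0 2.0 -> the new message conveys at most 2 Sh beyond the side channel
```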
The bit is the most basic unit of information in computing and digital communication. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either "1" or "0", but other representations such as true/false, yes/no, on/off, or +/− are also widely used.
In computer science and information theory, a Huffman code is a particular type of optimal prefix code that is commonly used for lossless data compression. The process of finding or using such a code is Huffman coding, an algorithm developed by David A. Huffman while he was a Sc.D. student at MIT, and published in the 1952 paper "A Method for the Construction of Minimum-Redundancy Codes".
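The following short Python sketch is one conventional way to build such a code with a binary heap; the symbol frequencies and example text are illustrative assumptions, not drawn from Huffman's paper:

```python
import heapq
from collections import Counter

def huffman_code(frequencies):
    """Build a prefix code (symbol -> bit string) from a {symbol: count} mapping."""
    if len(frequencies) == 1:                      # degenerate case: a single symbol
        return {next(iter(frequencies)): "0"}
    # Each heap entry is (weight, tie_breaker, partial_code); the tie_breaker
    # keeps heapq from ever having to compare two dicts of equal weight.
    heap = [(weight, i, {symbol: ""})
            for i, (symbol, weight) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, low = heapq.heappop(heap)           # the two least-weight subtrees
        w2, _, high = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in low.items()}
        merged.update({s: "1" + c for s, c in high.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "this is an example of huffman coding"
code = huffman_code(Counter(text))
encoded = "".join(code[ch] for ch in text)
print(code)
print(len(text) * 8, "bits as 8-bit characters vs", len(encoded), "bits Huffman-coded")
```

The repeated merging of the two least-weight subtrees is the greedy step of the algorithm: it yields code lengths that minimize the expected number of bits per symbol for the given frequencies.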
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and put on a firm footing by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.
In information theory, the entropy of a random variable quantifies the average level of uncertainty or information associated with the variable's potential states or possible outcomes. This measures the expected amount of information needed to describe the state of the variable, considering the distribution of probabilities across all potential states. Given a discrete random variable X, which takes values in the set 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is H(X) = −Σx∈𝒳 p(x) log p(x), where Σ denotes the sum over the variable's possible values. The choice of base for log, the logarithm, varies for different applications. Base 2 gives the unit of bits, while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable.
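A minimal sketch of this definition (the example distributions are arbitrary), with the logarithm base selecting the unit:

```python
import math

def entropy(probabilities, base=2):
    """Shannon entropy of a discrete distribution; base 2 gives Sh, e gives nat, 10 gives Hart."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))               # 2.0 Sh: four equiprobable outcomes
print(entropy([0.7, 0.1, 0.1, 0.1]))                   # ~1.36 Sh: a skewed distribution carries less
print(entropy([0.25, 0.25, 0.25, 0.25], base=math.e))  # ~1.39 nat: the same quantity in nats
```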
In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per time unit that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley.
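The theorem states that the capacity is C = B log2(1 + S/N) for bandwidth B and signal-to-noise ratio S/N. The snippet below evaluates it for illustrative numbers (a 3 kHz channel at 30 dB SNR), which are assumptions of this example rather than figures from the text:

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N), in shannons (bits) per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (30 / 10)                       # a 30 dB signal-to-noise ratio as a linear power ratio (1000)
print(shannon_hartley_capacity(3000, snr))  # ~29900 bit/s for a 3 kHz channel
```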
A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. A channel is used for information transfer of, for example, a digital bit stream, from one or several senders to one or several receivers. A channel has a certain capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second.
Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel.
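As a concrete illustration, capacity can be computed for the textbook binary symmetric channel (the channel model and crossover probabilities below are assumptions of this example, not taken from the text), for which C = 1 − Hb(p):

```python
import math

def binary_entropy(p):
    """H_b(p) in shannons."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover_p):
    """Capacity of a binary symmetric channel: C = 1 - H_b(p), in Sh per channel use."""
    return 1 - binary_entropy(crossover_p)

print(bsc_capacity(0.0))     # 1.0  -> a noiseless binary channel carries one shannon per use
print(bsc_capacity(0.11))    # ~0.5 -> noise roughly halves the usable capacity
print(bsc_capacity(0.5))     # 0.0  -> a channel that flips half its bits carries no information
```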
In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable. It can be thought of as an alternative way of expressing probability, much like odds or log-odds, but which has particular mathematical advantages in the setting of information theory.
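For an event of probability p the information content is −log2(p) shannons; the following minimal sketch evaluates it at a few arbitrary probabilities:

```python
import math

def self_information(p):
    """I(x) = -log2 p(x), the information content of an event with probability p, in shannons."""
    return -math.log2(p)

print(self_information(0.5))     # 1.0 Sh: one fair-coin outcome
print(self_information(1 / 8))   # 3.0 Sh: one of eight equally likely outcomes
print(self_information(0.99))    # ~0.014 Sh: a nearly certain event is barely informative
```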
In information theory, Shannon's source coding theorem establishes the statistical limits to possible data compression for data whose source is an independent identically-distributed random variable, and the operational meaning of the Shannon entropy.
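A small worked check of that limit, using an illustrative four-symbol source with dyadic probabilities (the distribution and code are assumptions of this example):

```python
import math

# An illustrative four-symbol source with dyadic probabilities and a matching prefix code
# (codewords 0, 10, 110, 111). For such a source the entropy bound is met exactly.
probabilities = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code_lengths = {"a": 1, "b": 2, "c": 3, "d": 3}

entropy_per_symbol = -sum(p * math.log2(p) for p in probabilities.values())           # 1.75 Sh
average_code_length = sum(probabilities[s] * code_lengths[s] for s in probabilities)  # 1.75 bits
print(entropy_per_symbol, average_code_length)
```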
In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X, and its maximum possible value log(|A_X|), where A_X is the alphabet of X. Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.
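A minimal sketch of this measure for a small example alphabet (the distributions are illustrative assumptions):

```python
import math

def redundancy(probabilities):
    """R = 1 - H(X) / log2(n) for an n-symbol alphabet: the fraction of 'wasted' capacity."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return 1 - h / math.log2(len(probabilities))

print(redundancy([0.25, 0.25, 0.25, 0.25]))   # 0.0   -> equiprobable symbols, no redundancy
print(redundancy([0.7, 0.1, 0.1, 0.1]))       # ~0.32 -> a skewed source can be compressed
```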
The natural unit of information, sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to those for the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.
In information theory, the noisy-channel coding theorem establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.
In information theory, the binary entropy function, denoted H(p) or Hb(p), is defined as the entropy of a Bernoulli process with probability p of one of two values, and is given by the formula: Hb(p) = −p log2(p) − (1 − p) log2(1 − p).
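A few sample evaluations of this formula (the probabilities are arbitrary sample points) show its symmetric shape:

```python
import math

def binary_entropy(p):
    """The binary entropy function H_b(p), in shannons."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(p, round(binary_entropy(p), 3))    # symmetric about p = 0.5, where it peaks at 1.0 Sh
```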
The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, or more correctly the shannon, based on the binary logarithm. Although "bit" is more frequently used in place of "shannon", its name does not distinguish it from the bit as used in data processing to refer to a binary value or stream regardless of its entropy. Other units include the nat, based on the natural logarithm, and the hartley, based on the base-10 or common logarithm.
ISO/IEC 80000, Quantities and units, is an international standard describing the International System of Quantities (ISQ). It was developed and promulgated jointly by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). It serves as a style guide for using physical quantities and units of measurement, formulas involving them, and their corresponding units, in scientific and educational documents for worldwide use. The ISO/IEC 80000 family of standards was completed with the publication of the first edition of Part 1 in November 2009.
Information is an abstract concept that refers to something which has the power to inform. At the most fundamental level, it pertains to the interpretation of that which may be sensed, or of its abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artifacts such as analogue signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form. Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation.
The hartley, also called a ban or a dit, is a logarithmic unit that measures information or entropy, based on base-10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit, assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.
The IEC 80000-13 standard recommends the use of the shannon (Sh) as the information unit in place of the bit, to distinguish the amount of information from the quantity of data that may be used to represent this information. Thus, according to that standard, H(X) would actually be expressed in shannons. The entropy of a single binary symbol (one bit) lies between 0 and 1 Sh.