Anti-information

Information is that which reduces uncertainty, wholly or in part. Conversely, anti-information is that which increases uncertainty. It is negative information.

Noise on a noisy communication channel is an example of anti-information. According to Shannon's noisy-channel coding theorem, the entropy of the noise must be subtracted to obtain the channel capacity that remains available for reliable communication.
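
To make the subtraction concrete: for a channel whose output is the input plus independent additive noise, Y = X + Z, the mutual information between input and output decomposes as

    I(X; Y) = H(Y) - H(Y | X) = H(Y) - H(Z),

so the capacity C = max I(X; Y), taken over input distributions, is reduced by exactly the entropy H(Z) of the noise.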

The gambling industry has made a business out of selling anti-information: people are willing to pay for an increase in uncertainty, because it enables them to savor the information that they subsequently receive when the uncertainty is finally resolved.[citation needed]

The term anti-information was introduced by Koos Verhoeff in the 1970s while teaching informatics at the Erasmus University Rotterdam.

Related Research Articles

Communication: Act of conveying intended meaning

Communication is "an apparent answer to the painful divisions between self and other, private and public, and inner thought and outer word." As this definition indicates, communication is difficult to define in a consistent manner, because the term is commonly used either to refer to a very wide range of different behaviors or to limit what can be included in the category of communication. John Peters argues that the difficulty of defining communication emerges from the fact that communication is both a universal phenomenon and a specific discipline of institutional academic study.

Digital data: Discrete, discontinuous representation of information

Digital data, in information theory and information systems, is information represented as a string of discrete symbols each of which can take on one of only a finite number of values from some alphabet, such as letters or digits. An example is a text document, which consists of a string of alphanumeric characters. The most common form of digital data in modern information systems is binary data, which is represented by a string of binary digits (bits) each of which can have one of two values, either 0 or 1.
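
As a minimal illustration in Python (the two-character string is arbitrary), text becomes digital data by mapping each character to a fixed-width string of bits:

    text = "Hi"
    # Encode each character as a byte, then render each byte as 8 binary digits.
    bits = "".join(format(byte, "08b") for byte in text.encode("ascii"))
    print(bits)  # 0100100001101001: each character becomes eight bits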

Information theory is the scientific study of the quantification, storage, and communication of digital information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.
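
The central quantity in this program of quantification is the entropy of a discrete random variable X, measured in bits when the logarithm is taken base 2:

    H(X) = - Σ_x p(x) log₂ p(x)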

Quantum information: Information held in the state of a quantum system

Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers both to the technical definition in terms of von Neumann entropy and to the general computational term.
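
The von Neumann entropy of a quantum state with density matrix ρ is

    S(ρ) = - Tr(ρ log ρ),

which reduces to the Shannon entropy of the eigenvalues of ρ when the state is diagonal in some basis.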

In general terms, throughput is the rate of production or the rate at which something is processed.

In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per unit time that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded and that the Gaussian noise process is characterized by a known power or power spectral density. The theorem is named after Claude Shannon and Ralph Hartley.
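
In its usual form, the theorem states that for bandwidth B in hertz and signal-to-noise power ratio S/N, the capacity in bits per second is

    C = B log₂(1 + S/N)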

Communication channel: Transmission channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. A channel is used to convey an information signal, for example a digital bit stream, from one or several senders to one or several receivers. A channel has a certain capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second.
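
A small sketch in Python of how such a capacity figure is computed from the Shannon–Hartley formula; the bandwidth and signal-to-noise values below are illustrative assumptions (a telephone-grade voice channel), not measurements:

    import math

    def channel_capacity(bandwidth_hz, snr_linear):
        """Shannon-Hartley capacity in bits per second for an AWGN channel."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative figures: a 3 kHz voice channel at 30 dB signal-to-noise ratio.
    snr_db = 30.0
    snr_linear = 10 ** (snr_db / 10)  # 30 dB corresponds to a power ratio of 1000
    print(f"{channel_capacity(3000.0, snr_linear):.0f} bit/s")  # about 29902 bit/s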

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature. The modifiers denote specific characteristics: the noise is additive because it is added to any noise intrinsic to the information system, white because it has uniform power across the frequency band, and Gaussian because its samples follow a normal distribution in the time domain.
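
A minimal sketch of the model in Python, using NumPy to draw zero-mean Gaussian samples and add them to a clean signal; the signal shape and noise variance are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # A clean signal: one cycle of a unit-amplitude sine wave.
    t = np.linspace(0.0, 1.0, 1000)
    signal = np.sin(2 * np.pi * t)

    # AWGN: zero-mean Gaussian samples added to the signal ("additive").
    noise_power = 0.1  # illustrative noise variance
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    noisy_signal = signal + noise

    # The samples are independent draws, so the noise is spectrally flat ("white").
    print(f"noise mean ~ {noise.mean():.3f}, variance ~ {noise.var():.3f}")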

The last mile or last kilometer is a phrase widely used in the telecommunications, cable television and internet industries to refer to the final leg of the telecommunications networks that deliver telecommunication services to retail end-users (customers). More specifically, the last mile describes the portion of the telecommunications network chain that physically reaches the end-user's premises. Examples are the copper wire subscriber lines connecting landline telephones to the local telephone exchange; coaxial cable service drops carrying cable television signals from utility poles to subscribers' homes; and cell towers linking local cell phones to the cellular network. The word "mile" is used metaphorically; the length of the last mile link may be more or less than a mile. Because the last mile of a network is, conversely, the first mile from the user's premises to the outside world when the user is sending data, the term first mile is also used.

Coding theory: Study of the properties of codes and their fitness

Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data.
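
As a toy instance of the error-correction idea, here is a minimal Python sketch of a 3-fold repetition code, which adds redundancy deliberately and corrects any single bit flip per block by majority vote (an illustration, not an efficient code):

    def encode(bits):
        """3-fold repetition code: transmit every bit three times."""
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        """Majority vote over each block of three corrects one flip per block."""
        return [int(sum(received[i:i + 3]) >= 2)
                for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1]
    codeword = encode(message)  # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    codeword[4] ^= 1            # a single transmission error in the second block
    assert decode(codeword) == message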

A cognitive radio (CR) is a radio that can be programmed and configured dynamically to use the best wireless channels in its vicinity to avoid user interference and congestion. Such a radio automatically detects available channels in wireless spectrum, then accordingly changes its transmission or reception parameters to allow more concurrent wireless communications in a given spectrum band at one location. This process is a form of dynamic spectrum management.
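
A toy sketch of the underlying selection step in Python; the channel names and sensed occupancy figures are hypothetical placeholders, not real spectrum data:

    # Toy dynamic channel selection: sense occupancy, transmit on the quietest
    # channel. The sensed values stand in for measured duty cycles.
    sensed_occupancy = {"ch1": 0.90, "ch2": 0.15, "ch3": 0.60, "ch4": 0.05}

    best_channel = min(sensed_occupancy, key=sensed_occupancy.get)
    print(f"switching transmission to {best_channel}")  # ch4, the least occupied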

The uncertainty reduction theory, also known as initial interaction theory, developed in 1975 by Charles Berger and Richard Calabrese, is a communication theory from the post-positivist tradition. It is one of the few communication theories that specifically looks at the initial interaction between people prior to the actual communication process. The theory asserts that, when interacting, people need information about the other party in order to reduce their uncertainty. By gaining this information, people are able to predict the other's behavior and resulting actions, which, according to the theory, is crucial in the development of any relationship.

In telecommunications, interference is anything that modifies a signal in a disruptive manner as it travels along a communication channel between its source and receiver. The term is often used to refer to the addition of unwanted signals to a useful signal. Common examples include co-channel interference (crosstalk), adjacent-channel interference, electromagnetic interference, and intersymbol interference.

In computing, telecommunication, information theory, and coding theory, an error correction code, sometimes error-correcting code (ECC), is used for controlling errors in data over unreliable or noisy communication channels. The central idea is that the sender encodes the message with redundant information in the form of an ECC. The redundancy allows the receiver to detect a limited number of errors that may occur anywhere in the message, and often to correct these errors without retransmission. The American mathematician Richard Hamming pioneered this field in the 1940s and invented the first error-correcting code in 1950: the Hamming (7,4) code.
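
Since the blurb names the Hamming (7,4) code, here is a minimal sketch of it in Python, using the classic generator and parity-check matrices over GF(2); the example message and error position are arbitrary:

    import numpy as np

    # Hamming (7,4) in the classic ordering: data bits sit at codeword
    # positions 3, 5, 6, 7 and parity bits at positions 1, 2, 4.
    G = np.array([[1, 1, 0, 1],   # p1 = d1 + d2 + d4
                  [1, 0, 1, 1],   # p2 = d1 + d3 + d4
                  [1, 0, 0, 0],   # d1
                  [0, 1, 1, 1],   # p3 = d2 + d3 + d4
                  [0, 1, 0, 0],   # d2
                  [0, 0, 1, 0],   # d3
                  [0, 0, 0, 1]])  # d4
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def encode(data4):
        """Encode 4 data bits into a 7-bit Hamming codeword."""
        return G @ np.asarray(data4) % 2

    def correct(word7):
        """Flip a single-bit error; the syndrome spells out its position."""
        s = H @ word7 % 2
        pos = s[0] + 2 * s[1] + 4 * s[2]
        if pos:
            word7[pos - 1] ^= 1
        return word7

    word = encode([1, 0, 1, 1])
    word[2] ^= 1                        # corrupt one bit in transit
    data = correct(word)[[2, 4, 5, 6]]  # read back the data-bit positions
    print(data)                         # [1 0 1 1]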

Shannon–Weaver model

The Shannon–Weaver model of communication has been called the "mother of all models." Social scientists use the term to refer to an integrated model of the concepts of information source, message, transmitter, signal, channel, noise, receiver, information destination, probability of error, encoding, decoding, information rate, and channel capacity. However, some consider the name to be misleading, asserting that the most significant ideas were developed by Shannon alone.

In computing, computer performance is the amount of useful work accomplished by a computer system. Outside of specific contexts, computer performance is estimated in terms of accuracy, efficiency and speed of executing computer program instructions. When it comes to high computer performance, one or more of the following factors might be involved: short response time, high throughput, low utilization of computing resources, high availability, fast data compression and decompression, high bandwidth, and short data transmission time.
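
As one concrete example of such a factor, a minimal Python sketch that measures throughput as operations completed per second of wall-clock time (the summation is an arbitrary stand-in for useful work):

    import time

    n_ops = 1_000_000
    start = time.perf_counter()
    total = sum(range(n_ops))  # the workload being measured
    elapsed = time.perf_counter() - start
    print(f"{n_ops / elapsed:,.0f} ops/s over {elapsed:.3f} s")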

Information: Facts provided or learned about something or someone

Information, in a general sense, is processed, organised and structured data. It provides context for data and enables decision making. For example, a single customer’s sale at a restaurant is data – this becomes information when the business is able to identify the most popular or least popular dish.
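
A minimal Python sketch of that restaurant example; the dish names and sales are made up for illustration:

    from collections import Counter

    # Raw data: individual dish sales. Information: the most popular dish.
    sales = ["pasta", "pizza", "pasta", "salad", "pasta", "pizza"]
    dish, count = Counter(sales).most_common(1)[0]
    print(f"most popular dish: {dish} ({count} sales)")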

Interpersonal communication: Exchange of information between two or more people who are interdependent

Interpersonal communication is an exchange of information between two or more people. It is also an area of research that seeks to understand how humans use verbal and nonverbal cues to accomplish a number of personal and relational goals.

Models of communication: Conceptual models used to explain the human communication process

Models of communication are conceptual models used to explain the human communication process. The first major model for communication was developed in 1948 by Claude Shannon and published with an introduction by Warren Weaver for Bell Laboratories. According to this basic concept, communication is the process of sending and receiving messages or transferring information from one party (sender) to another (receiver).

Squeezed states of light: Quantum states that light can occupy

In quantum physics, light is in a squeezed state if its electric field strength ε for some phases has a quantum uncertainty smaller than that of a coherent state. The term squeezing thus refers to a reduced quantum uncertainty. To obey Heisenberg's uncertainty relation, a squeezed state must also have phases at which the electric field uncertainty is anti-squeezed, i.e. larger than that of a coherent state. Since 2019, the gravitational-wave observatories LIGO and Virgo have employed squeezed laser light, which has significantly increased the rate of observed gravitational-wave events.
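
In one common convention for the field quadratures X₁ and X₂ (conventions vary between texts), Heisenberg's relation reads

    ΔX₁ · ΔX₂ ≥ 1/4,

with a coherent state saturating it at ΔX₁ = ΔX₂ = 1/2; a squeezed state pushes one quadrature's uncertainty below 1/2 at the price of anti-squeezing the other above it.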