A Mathematical Theory of Communication

[Cover of the 1949 full book edition]

Author: Claude E. Shannon
Language: English
Subject: Communication theory
Publication date: 1948
Publication place: United States

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948. [1] [2] [3] [4] It was renamed The Mathematical Theory of Communication in the 1949 book of the same name, [5] a small but significant title change after realizing the generality of this work. It has tens of thousands of citations, being one of the most influential and cited scientific papers of all time, [6] as it gave rise to the field of information theory, with Scientific American referring to the paper as the "Magna Carta of the Information Age", [7] while the electrical engineer Robert G. Gallager called the paper a "blueprint for the digital era". [8] Historian James Gleick rated the paper as the most important development of 1948, placing the transistor second in the same time period, with Gleick emphasizing that the paper by Shannon was "even more profound and more fundamental" than the transistor. [9]

It is also noted that "as did relativity and quantum theory, information theory radically changed the way scientists look at the universe". [10] The paper also formally introduced the term "bit" and serves as its theoretical foundation. [11]

Publication

The article was the founding work of the field of information theory. It was later published in 1949 as a book titled The Mathematical Theory of Communication ( ISBN   0-252-72546-8), which was published as a paperback in 1963 ( ISBN   0-252-72548-4). The book contains an additional article by Warren Weaver, providing an overview of the theory for a more general audience. [12]

Contents

[Figure: Shannon's diagram of a general communications system, showing the process by which a message sent becomes the message received (possibly corrupted by noise).]

This work is known for introducing the concepts of channel capacity as well as the noisy channel coding theorem.

Shannon's article laid out the basic elements of communication:

  - An information source, which produces the message to be communicated
  - A transmitter, which operates on the message to produce a signal suitable for transmission over the channel
  - A channel, the medium used to transmit the signal, to which noise may be added
  - A receiver, which reconstructs the message from the received signal
  - A destination, the person or machine for whom or which the message is intended

It also developed the concepts of information entropy and redundancy, proved the source coding theorem, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information. It was also in this paper that the Shannon–Fano coding technique was proposed – a technique developed in conjunction with Robert Fano.
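
As an illustrative sketch (not drawn from the paper itself), the entropy of a discrete source can be computed directly from its symbol probabilities; the function name and example distributions below are arbitrary choices for illustration.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits, ignoring zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy_bits([0.5, 0.5]))   # 1.0
print(entropy_bits([0.9, 0.1]))   # ~0.469
```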

Related Research Articles

Andrey Markov – Russian mathematician (1856–1922)

Andrey Andreyevich Markov was a Russian mathematician best known for his work on stochastic processes. A primary subject of his research later became known as the Markov chain. He was also a strong, close to master-level, chess player.

The bit is the most basic unit of information in computing and digital communication. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either "1" or "0", but other representations such as true/false, yes/no, on/off, or +/− are also widely used.

Basic English is a controlled language based on standard English, but with a greatly simplified vocabulary and grammar. It was created by the linguist and philosopher Charles Kay Ogden as an international auxiliary language, and as an aid for teaching English as a second language. It was presented in Ogden's 1930 book Basic English: A General Introduction with Rules and Grammar.

Claude Shannon – American mathematician and information theorist (1916–2001)

Claude Elwood Shannon was an American mathematician, electrical engineer, computer scientist, cryptographer and inventor known as the "father of information theory" and as the "father of the Information Age". Shannon was the first to describe the Boolean gates that are essential to all digital electronic circuits, and was one of the founding fathers of artificial intelligence. Shannon is credited with laying the foundations of the Information Age.

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and put on a firm footing by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.

Quantum information – Information held in the state of a quantum system

Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of von Neumann entropy and the general computational term.

Nyquist–Shannon sampling theorem – Sufficiency theorem for reconstructing signals from samples

The Nyquist–Shannon sampling theorem is an essential principle for digital signal processing linking the frequency range of a signal and the sample rate required to avoid a type of distortion called aliasing. The theorem states that the sample rate must be at least twice the bandwidth of the signal to avoid aliasing. In practice, it is used to select band-limiting filters to keep aliasing below an acceptable amount when an analog signal is sampled or when sample rates are changed within a digital signal processing function.
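
A small numerical illustration of the folding behaviour the theorem describes, assuming ideal, noise-free sampling; the function name and frequencies below are illustrative only.

```python
def aliased_frequency(f_signal_hz, f_sample_hz):
    """Apparent frequency of a sampled sinusoid: the true frequency folded
    into the range [0, f_sample/2] (ideal, noiseless sampling assumed)."""
    f = f_signal_hz % f_sample_hz
    return min(f, f_sample_hz - f)

# Sampling at 8 kHz: a 3 kHz tone sits below the 4 kHz Nyquist limit and is preserved,
# while a 5 kHz tone folds back and masquerades as 3 kHz.
print(aliased_frequency(3000, 8000))  # 3000
print(aliased_frequency(5000, 8000))  # 3000
```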

Image compression – Reduction of image size to save storage and transmission costs

Image compression is a type of data compression applied to digital images, to reduce their cost for storage or transmission. Algorithms may take advantage of visual perception and the statistical properties of image data to provide superior results compared with generic data compression methods which are used for other digital data.

In the field of data compression, Shannon–Fano coding, named after Claude Shannon and Robert Fano, is one of two related techniques for constructing a prefix code based on a set of symbols and their probabilities.
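
The splitting idea can be sketched in a few lines of Python. This is a minimal illustration of the recursive "divide the sorted symbols into two nearly equal-probability halves" procedure, not a reference implementation, and the example probabilities are arbitrary.

```python
def shannon_fano(symbol_probs):
    """Assign Shannon–Fano codewords: sort symbols by decreasing probability, split the
    list where the two halves' total probabilities are as equal as possible, recurse."""
    items = sorted(symbol_probs.items(), key=lambda kv: kv[1], reverse=True)
    codes = {sym: "" for sym, _ in items}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        running, cut, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)   # |left total - right total|
            if diff < best_diff:
                best_diff, cut = diff, i
        left, right = group[:cut], group[cut:]
        for sym, _ in left:
            codes[sym] += "0"
        for sym, _ in right:
            codes[sym] += "1"
        split(left)
        split(right)

    split(items)
    return codes

# Arbitrary example distribution; yields the prefix code {A: 00, B: 01, C: 10, D: 110, E: 111}.
print(shannon_fano({"A": 0.35, "B": 0.17, "C": 0.17, "D": 0.16, "E": 0.15}))
```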

Richard Hamming – American mathematician and information theorist

Richard Wesley Hamming was an American mathematician whose work had many implications for computer engineering and telecommunications. His contributions include the Hamming code, the Hamming window, Hamming numbers, sphere-packing, Hamming graph concepts, and the Hamming distance.

Communication theory – Proposed description of communication phenomena

Communication theory is a proposed description of communication phenomena, the relationships among them, a storyline describing these relationships, and an argument for these three elements. Communication theory provides a way of talking about and analyzing key events, processes, and commitments that together form communication. Theory can be seen as a way to map the world and make it navigable; communication theory gives us tools to answer empirical, conceptual, or practical communication questions.

John Tukey – American mathematician

John Wilder Tukey was an American mathematician and statistician, best known for the development of the fast Fourier transform (FFT) algorithm and the box plot. The Tukey range test, the Tukey lambda distribution, the Tukey test of additivity, and the Teichmüller–Tukey lemma all bear his name. He is also credited with coining the term bit and with the first published use of the word software.

"Communication Theory of Secrecy Systems" is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory. It is one of the foundational treatments of modern cryptography. His work has been described as a "turning point, and marked the closure of classical cryptography and the beginning of modern cryptography." It has also been described as turning cryptography from an "art to a science". It is also a proof that all theoretically unbreakable ciphers must have the same requirements as the one-time pad.

In information theory, the noisy-channel coding theorem establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
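
As one concrete instance of that "computable maximum rate", the capacity of a binary symmetric channel with crossover probability p is C = 1 − H(p) bits per channel use. The sketch below (illustrative only, with arbitrary example values) evaluates it.

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover_p):
    """Capacity of a binary symmetric channel: C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(crossover_p)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel
print(bsc_capacity(0.11))  # ~0.5: about half a bit per use survives an 11% flip rate
print(bsc_capacity(0.5))   # 0.0: pure noise carries nothing
```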

In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.
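The simplest forward error-correcting code is the repetition code, sketched below purely as an illustration; the triple-repetition length and example bits are arbitrary choices.

```python
from collections import Counter

def encode_repetition(bits, n=3):
    """Send each bit n times (the simplest forward error-correcting code)."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(received, n=3):
    """Majority vote over each block of n received bits."""
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

sent = encode_repetition([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
noisy = sent.copy()
noisy[1] ^= 1                         # flip one bit in transit
print(decode_repetition(noisy))       # [1, 0, 1]: the single error is corrected
```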

Shannon–Weaver model – Linear model of communication

The Shannon–Weaver model is one of the first and most influential models of communication. It was initially published in the 1948 paper "A Mathematical Theory of Communication" and explains communication in terms of five basic components: a source, a transmitter, a channel, a receiver, and a destination. The source produces the original message. The transmitter translates the message into a signal, which is sent using a channel. The receiver translates the signal back into the original message and makes it available to the destination. For a landline phone call, the person calling is the source. They use the telephone as a transmitter, which produces an electric signal that is sent through the wire as a channel. The person receiving the call is the destination and their telephone is the receiver.
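
As a toy illustration only (the byte encoding, noise model, and function names below are assumptions for the sketch, not part of the model's definition), the five components can be strung together as a pipeline.

```python
import random

def transmitter(message: str) -> list[int]:
    """Encode the source's message into a signal (here: a list of byte values)."""
    return list(message.encode("utf-8"))

def channel(signal: list[int], noise_rate: float = 0.0) -> list[int]:
    """Carry the signal, possibly corrupting some symbols with random noise."""
    return [b if random.random() > noise_rate else random.randrange(256) for b in signal]

def receiver(signal: list[int]) -> str:
    """Reconstruct the message from the received signal for the destination."""
    return bytes(signal).decode("utf-8", errors="replace")

# source -> transmitter -> channel -> receiver -> destination
print(receiver(channel(transmitter("hello, world"), noise_rate=0.0)))  # "hello, world"
```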

The pragmatic theory of information is derived from Charles Sanders Peirce's general theory of signs and inquiry. Peirce explored a number of ideas about information throughout his career. One set of ideas concerns the "laws of information", having to do with the logical properties of information. Another set of ideas, about "time and thought", has to do with the dynamic properties of inquiry. All of these ideas contribute to the pragmatic theory of inquiry. Peirce set forth many of these ideas very early in his career, periodically returning to them on scattered occasions until the end, and they appear to be implicit in much of his later work on the logic of science and the theory of signs, but he never developed their implications to the fullest extent. The 20th-century thinker Ernst Ulrich von Weizsäcker and his wife Christine von Weizsäcker reviewed the pragmatics of information; their work is reviewed by Gennert.

Ralph Beebe Blackman was an American mathematician and engineer who was among the pioneers of the Information Age along with Claude E. Shannon, Hendrik Wade Bode, and John Tukey.

The Bell Labs Technical Journal was the in-house scientific journal for scientists of Bell Labs, published yearly by the IEEE society.

In the field of data compression, Shannon coding, named after its creator, Claude Shannon, is a lossless data compression technique for constructing a prefix code based on a set of symbols and their probabilities. It is suboptimal in the sense that it does not achieve the lowest possible expected codeword length, as Huffman coding does, and it is never better than, though sometimes equal to, Shannon–Fano coding.
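
A minimal sketch of Shannon coding under these definitions: each symbol gets a codeword of length ⌈−log2 p⌉ bits, taken from the binary expansion of the cumulative probability of the more probable symbols. The example distribution is arbitrary.

```python
import math

def shannon_code(symbol_probs):
    """Shannon coding: sort by decreasing probability; symbol i gets a codeword of
    length ceil(-log2 p_i), read from the binary expansion of the cumulative
    probability of all more-probable symbols."""
    items = sorted(symbol_probs.items(), key=lambda kv: kv[1], reverse=True)
    codes, cumulative = {}, 0.0
    for sym, p in items:
        length = max(1, math.ceil(-math.log2(p)))
        frac, bits = cumulative, []
        for _ in range(length):
            frac *= 2
            bits.append("1" if frac >= 1 else "0")
            frac -= int(frac)
        codes[sym] = "".join(bits)
        cumulative += p
    return codes

# Dyadic example: yields the prefix code {'a': '0', 'b': '10', 'c': '110', 'd': '111'}.
print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
```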

References

  1. Shannon, Claude Elwood (July 1948). "A Mathematical Theory of Communication" (PDF). Bell System Technical Journal. 27 (3): 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x. hdl:11858/00-001M-0000-002C-4314-2. Archived from the original (PDF) on 1998-07-15. "The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey."
  2. Shannon, Claude Elwood (October 1948). "A Mathematical Theory of Communication". Bell System Technical Journal. 27 (4): 623–656. doi:10.1002/j.1538-7305.1948.tb00917.x. hdl:11858/00-001M-0000-002C-4314-2.
  3. Ash, Robert B. (1966). Information Theory: Tracts in Pure & Applied Mathematics. New York: John Wiley & Sons Inc. ISBN 0-470-03445-9.
  4. Yeung, Raymond W. (2008). "The Science of Information". Information Theory and Network Coding. Springer. pp. 1–4. doi:10.1007/978-0-387-79234-7_1. ISBN 978-0-387-79233-0.
  5. Shannon, Claude Elwood; Weaver, Warren (1949). The Mathematical Theory of Communication (PDF). University of Illinois Press. ISBN 0-252-72548-4. Archived from the original (PDF) on 1998-07-15.
  6. Yan, Zheng (2020). Publishing Journal Articles: A Scientific Guide for New Authors Worldwide. Cambridge: Cambridge University Press. p. 7. ISBN 978-1-108-27742-6.
  7. Goodman, Rob; Soni, Jimmy (2018). "Genius in Training". Alumni Association of the University of Michigan. Retrieved 2023-10-31.
  8. "Claude Shannon: Reluctant Father of the Digital Age". MIT Technology Review. 2001-07-01. Retrieved 2024-06-26.
  9. Gleick, James (2011). The Information: A History, a Theory, a Flood (1st ed.). New York: Vintage Books. pp. 3–4. ISBN 978-1-4000-9623-7.
  10. Watson, Peter (2018). Convergence: The Idea at the Heart of Science. New York: Simon & Schuster. p. 392. ISBN 978-1-4767-5434-5.
  11. Nicolelis, Miguel A. L. (2020). The True Creator of Everything: How the Human Brain Shaped the Universe as We Know It. New Haven: Yale University Press. p. 34. ISBN 978-0-300-24463-2. OCLC 1090423259.
  12. "The Mathematical Theory of Communication" (PDF). Monoskop Digital Libraries. Retrieved 2024-05-28.