IEEE Transactions on Information Theory

Related Research Articles

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A cryptosystem is considered to have information-theoretic security if the system is secure against adversaries with unlimited computing resources and time. In contrast, a system which depends on the computational cost of cryptanalysis to be secure is called computationally, or conditionally, secure.
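
The classic example of an information-theoretically secure scheme is the one-time pad, whose perfect secrecy was proved by Shannon. Below is a minimal sketch; the function and variable names are illustrative, not from any particular library:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR the plaintext with a uniformly random key of equal length."""
    assert len(key) == len(plaintext), "one-time pad key must match message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Because every ciphertext is equally likely under a uniform, single-use key,
# the ciphertext reveals nothing about the plaintext, regardless of the
# adversary's computing power or time.
message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh key, never reused
ciphertext = otp_encrypt(message, key)
recovered = otp_encrypt(ciphertext, key)  # XOR is its own inverse
assert recovered == message
```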

Snake-in-the-box

The snake-in-the-box problem in graph theory and computer science deals with finding a certain kind of path along the edges of a hypercube. This path starts at one corner and travels along the edges to as many corners as it can reach. After it gets to a new corner, the previous corner and all of its neighbors must be marked as unusable. The path should never travel to a corner which has been marked unusable.
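
A brute-force search makes the marking rule concrete. The sketch below is a toy illustration (the search is exponential in the dimension, so only small cubes are practical), with hypercube corners labelled by the integers 0 .. 2^n - 1:

```python
def longest_snake(n: int):
    """Depth-first search for a longest snake in the n-dimensional hypercube.
    Two corners are adjacent when their labels differ in exactly one bit.
    By symmetry of the hypercube, the search can start at corner 0."""
    def neighbors(v):
        return [v ^ (1 << i) for i in range(n)]

    best = []

    def extend(path, blocked):
        nonlocal best
        if len(path) > len(best):
            best = path[:]
        head = path[-1]
        for nxt in neighbors(head):
            if nxt in blocked:
                continue
            # Moving to nxt marks the previous corner (head) and all of its
            # neighbours as unusable for the rest of the path.
            newly_blocked = {head, *neighbors(head)} - blocked
            blocked |= newly_blocked
            path.append(nxt)
            extend(path, blocked)
            path.pop()
            blocked -= newly_blocked

    extend([0], {0})
    return best

snake = longest_snake(3)
print(len(snake) - 1, snake)  # longest snake in the 3-cube has 4 edges, e.g. [0, 1, 3, 7, 6]
```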

In computer networking, linear network coding is a technique in which intermediate nodes transmit data from source nodes to sink nodes by forwarding linear combinations of the packets they receive.
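
The textbook butterfly-network example reduces to a single XOR at the bottleneck node; here is a minimal sketch over GF(2), with packet contents and names chosen purely for illustration:

```python
def combine(p1: bytes, p2: bytes) -> bytes:
    """Linear combination over GF(2): bitwise XOR of two equal-length packets."""
    return bytes(a ^ b for a, b in zip(p1, p2))

packet_a = b"\x01\x02\x03\x04"
packet_b = b"\x10\x20\x30\x40"

coded = combine(packet_a, packet_b)   # sent once over the shared bottleneck link
# Sink 1 receives packet_a directly plus the coded packet, and solves for b:
recovered_b = combine(coded, packet_a)
# Sink 2 receives packet_b directly plus the coded packet, and solves for a:
recovered_a = combine(coded, packet_b)
assert recovered_a == packet_a and recovered_b == packet_b
```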

In coding theory, fountain codes are a class of erasure codes with the property that a potentially limitless sequence of encoding symbols can be generated from a given set of source symbols such that the original source symbols can ideally be recovered from any subset of the encoding symbols of size equal to or only slightly larger than the number of source symbols. The term fountain or rateless refers to the fact that these codes do not exhibit a fixed code rate.
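
A random linear fountain over GF(2) shows the "rateless" idea in miniature. This is a toy sketch, not the Tornado, LT, or Raptor constructions used in practice; the symbol values, the coin-flip degree distribution, and the function names are all illustrative:

```python
import itertools
import random

def fountain_encode(source, rng):
    """Emit one encoding symbol: the XOR of a uniformly random nonempty
    subset of the source symbols, together with the subset used."""
    k = len(source)
    subset = [i for i in range(k) if rng.random() < 0.5] or [rng.randrange(k)]
    value = 0
    for i in subset:
        value ^= source[i]
    return subset, value

def fountain_decode(k, received):
    """Recover the k source symbols by Gaussian elimination over GF(2);
    returns None until the received equations reach full rank."""
    pivots = {}
    for subset, value in received:
        mask = sum(1 << i for i in subset)
        while mask:
            top = mask.bit_length() - 1
            if top not in pivots:
                pivots[top] = (mask, value)
                break
            pmask, pvalue = pivots[top]
            mask ^= pmask
            value ^= pvalue
    if len(pivots) < k:
        return None
    out = [0] * k
    for i in sorted(pivots):            # back-substitute, lowest pivot first
        mask, value = pivots[i]
        for j in range(i):
            if mask & (1 << j):
                value ^= out[j]
        out[i] = value
    return out

rng = random.Random(0)
source = [rng.randrange(256) for _ in range(8)]              # k = 8 source symbols
stream = (fountain_encode(source, rng) for _ in itertools.count())
received, decoded = [], None
while decoded is None:       # slightly more than k symbols usually suffice
    received.append(next(stream))
    decoded = fountain_decode(len(source), received)
print(len(received), decoded == source)
```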

Model selection is the task of selecting the best model from among various candidates on the basis of a performance criterion. In the context of machine learning, this may be the selection of a statistical model from a set of candidate models, given data. In the simplest cases, a pre-existing set of data is considered. However, the task can also involve the design of experiments such that the data collected is well-suited to the problem of model selection. Given candidate models of similar predictive or explanatory power, the simplest model is most likely to be the best choice.
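
As a concrete illustration, the Akaike information criterion (AIC) is one commonly used performance criterion. The sketch below fits polynomial models of increasing complexity and keeps the lowest-scoring one; the polynomial setting, noise level, and helper names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.1, size=x.size)  # true model: degree 2

def aic(y_true, y_pred, n_params):
    """Gaussian-likelihood AIC up to a constant: n*ln(RSS/n) + 2k (smaller is better)."""
    n = y_true.size
    rss = np.sum((y_true - y_pred) ** 2)
    return n * np.log(rss / n) + 2 * n_params

scores = {}
for degree in range(6):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    scores[degree] = aic(y, y_hat, degree + 1)

best_degree = min(scores, key=scores.get)
print(best_degree, scores)   # the quadratic model should win: good fit, few parameters
```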

Matching pursuit

Matching pursuit (MP) is a sparse approximation algorithm which finds the "best matching" projections of multidimensional data onto the span of an over-complete dictionary $D$. The basic idea is to approximately represent a signal $f$ from a Hilbert space $H$ as a weighted sum of finitely many functions (atoms) $g_{\gamma_n}$ taken from $D$. An approximation with $N$ atoms has the form
$$ f(t) \approx \hat{f}_N(t) := \sum_{n=1}^{N} a_n\, g_{\gamma_n}(t), $$
where the $a_n$ are scalar weights.
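
A minimal sketch of the greedy iteration, assuming a real-valued finite-dimensional setting with unit-norm atoms stored as the columns of a matrix; the dictionary size and sparsity pattern below are illustrative:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching pursuit: at each step pick the atom with the largest
    inner product with the current residual and subtract its projection.
    `dictionary` must have unit-norm atoms as its columns."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        best = np.argmax(np.abs(correlations))
        coeffs[best] += correlations[best]
        residual -= correlations[best] * dictionary[:, best]
    return coeffs, residual

rng = np.random.default_rng(1)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)          # normalize atoms to unit norm
true = np.zeros(256)
true[[10, 100, 200]] = [1.5, -2.0, 0.7]
f = D @ true                            # a signal that is exactly 3-sparse in D
a, r = matching_pursuit(f, D, n_atoms=10)
print(np.linalg.norm(r))                # residual norm shrinks as atoms are added
```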

Babak Hassibi is an Iranian-American electrical engineer, computer scientist, and applied mathematician who is the inaugural Mose and Lillian S. Bohn Professor of Electrical Engineering and Computing and Mathematical Sciences at the California Institute of Technology (Caltech). From 2011 to 2016 he was the Gordon M. Binder/Amgen Professor of Electrical Engineering, and from 2008 to 2015 he was Executive Officer of Electrical Engineering as well as Associate Director of Information Science and Technology.

Michael Luby (information theorist and cryptographer)

Michael George Luby is a mathematician and computer scientist, CEO of BitRipple, Senior Research Scientist at the International Computer Science Institute (ICSI), former VP Technology at Qualcomm, co-founder and former Chief Technology Officer of Digital Fountain. In coding theory he is known for leading the invention of the Tornado codes and the LT codes. In cryptography he is known for his contributions showing that any one-way function can be used as the basis for private cryptography, and for his analysis, in collaboration with Charles Rackoff, of the Feistel cipher construction. His distributed algorithm to find a maximal independent set in a computer network has also been influential.

Sparse approximation theory deals with sparse solutions for systems of linear equations. Techniques for finding these solutions and exploiting them in applications have found wide use in image processing, signal processing, machine learning, medical imaging, and more.

Quantum block codes are useful in quantum computing and in quantum communications. The encoding circuit for a large block code typically has high complexity, although those for modern codes have lower complexity.

TriX is a serialization format for RDF graphs. It is an XML format for serializing Named Graphs and RDF Datasets which offers a compact and readable alternative to the XML-based RDF/XML syntax. It was jointly created by HP Labs and Nokia.

A beta encoder is an analog-to-digital conversion (A/D) system in which a real number in the unit interval is represented by a finite expansion in base beta, with beta being a real number between 1 and 2. Beta encoders are an alternative to traditional approaches to pulse-code modulation.
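
A minimal sketch of the greedy beta-expansion that such an encoder computes; the particular beta, input value, and bit budget are illustrative, and practical beta encoders exploit the redundancy of beta < 2 to tolerate imprecise comparator thresholds:

```python
def beta_encode(x: float, beta: float, n_bits: int):
    """Greedy beta-expansion of x in [0, 1): multiply the remainder by beta
    at each step and emit a 1 whenever the result crosses the threshold 1."""
    bits, r = [], x
    for _ in range(n_bits):
        r *= beta
        bit = 1 if r >= 1.0 else 0
        bits.append(bit)
        r -= bit
    return bits

def beta_decode(bits, beta: float) -> float:
    """Reconstruct x as the sum of b_i * beta**(-i); the truncation error
    decays like beta**(-n_bits)."""
    return sum(b * beta ** -(i + 1) for i, b in enumerate(bits))

x, beta = 0.613, 1.8
bits = beta_encode(x, beta, n_bits=30)
print(bits[:8], abs(x - beta_decode(bits, beta)))  # error on the order of 1.8**-30
```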

In the theory of quantum communication, the entanglement-assisted classical capacity of a quantum channel is the highest rate at which classical information can be transmitted from a sender to receiver when they share an unlimited amount of noiseless entanglement. It is given by the quantum mutual information of the channel, which is the input-output quantum mutual information maximized over all pure bipartite quantum states with one system transmitted through the channel. This formula is the natural generalization of Shannon's noisy channel coding theorem, in the sense that this formula is equal to the capacity, and there is no need to regularize it. An additional feature that it shares with Shannon's formula is that a noiseless classical or quantum feedback channel cannot increase the entanglement-assisted classical capacity. The entanglement-assisted classical capacity theorem is proved in two parts: the direct coding theorem and the converse theorem. The direct coding theorem demonstrates that the quantum mutual information of the channel is an achievable rate, by a random coding strategy that is effectively a noisy version of the super-dense coding protocol. The converse theorem demonstrates that this rate is optimal by making use of the strong subadditivity of quantum entropy.
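
Written out in the usual notation (which the summary above does not display), the capacity formula is
$$ C_E(\mathcal{N}) \;=\; \max_{\varphi_{AA'}} I(A;B)_{\rho}, \qquad \rho_{AB} = (\mathrm{id}_A \otimes \mathcal{N}_{A' \to B})(\varphi_{AA'}), $$
where $\varphi_{AA'}$ ranges over pure bipartite input states, $I(A;B)_{\rho} = S(\rho_A) + S(\rho_B) - S(\rho_{AB})$ is the quantum mutual information, and $S$ denotes the von Neumann entropy.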

Katalin Marton (Hungarian mathematician, 1941–2019)

Katalin Marton was a Hungarian mathematician, born in Budapest.

Directed information is an information-theoretic measure that quantifies the information flow from the random string $X^n = (X_1, \ldots, X_n)$ to the random string $Y^n = (Y_1, \ldots, Y_n)$. The term directed information was coined by James Massey and is defined as
$$ I(X^n \to Y^n) := \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}). $$
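
For comparison, the chain rule expands ordinary mutual information over the same strings as
$$ I(X^n; Y^n) = \sum_{i=1}^{n} I(X^n; Y_i \mid Y^{i-1}), $$
so directed information replaces the full string $X^n$ in each term by its causal prefix $X^i$, and consequently $I(X^n \to Y^n) \le I(X^n; Y^n)$.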

Nilanjana Datta is an Indian-born British mathematician. She is a Professor in Quantum Information Theory in the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge, and a Fellow of Pembroke College.

Eric Michael Rains is an American mathematician specializing in coding theory and special functions, especially applications from and to noncommutative algebraic geometry.

Mark McMahon Wilde is an American quantum information scientist. He is an Associate Professor in the School of Electrical and Computer Engineering at Cornell University, and he is also a Fields Member in the School of Applied and Engineering Physics and the Department of Computer Science at Cornell.

John Cronan Kieffer is an American mathematician best known for his work in information theory, ergodic theory, and stationary process theory.
