John Kieffer

John Cronan Kieffer (born 1945) is an American mathematician best known for his work in information theory, ergodic theory, and stationary process theory.

Education

Kieffer received his elementary and high school education in St. Louis, Missouri, a bachelor's degree in applied mathematics in 1967 from the University of Missouri–Rolla, and a master's degree in mathematics in 1968 from the University of Illinois Urbana-Champaign. In 1970, under Robert B. Ash, he received a Ph.D. in mathematics from the University of Illinois Urbana-Champaign with the thesis A Generalization of the Shannon-McMillan Theorem and Its Application to Information Theory. [1] [2]

Work history

In 1970 Kieffer became an assistant professor at Missouri University of Science and Technology, where he eventually became a full professor. [3] In 1986 he became a full professor at the University of Minnesota Twin Cities. [4] Kieffer held visiting appointments at Stanford University, the University of Illinois Urbana-Champaign, ETH Zürich, and the University of Arizona. He has supervised six Ph.D. theses. [1]

Professional activities

During the 1980s, Kieffer was Associate Editor of the IEEE Transactions on Information Theory. [5] In 2004, Kieffer was co-editor of a special issue of the IEEE Transactions on Information Theory entitled "Problems on Sequences: Information Theory and Computer Science Interface". [6] He is a Life Fellow of the Institute of Electrical and Electronics Engineers "for contributions to information theory, particularly coding theory and quantization". [7]

Key works

1. Key works on grammar-based coding:

2. Key works on channel coding:

3. Key works on quantization:

4. Key works on ergodic theory:

5. Key works on stationary process theory:

Inventions

Impact

Kieffer has over 70 journal publications in the mathematical sciences. [11] His research work has attracted over 3000 Google Scholar citations, [12] over 500 MathSciNet citations [13] and over 1000 IEEE Xplore citations. [3] Some of these works have been cited as prior art on various United States patents. [14] In 1998, the IEEE Transactions on Information Theory published a special issue consisting of articles that survey research in information theory during 1948–1998. Two of these articles include discussions of Kieffer's work, namely, the article Lossy Source Coding [15] by Toby Berger and Jerry Gibson, and the article Quantization [16] by Robert M. Gray and David Neuhoff. In addition, the textbook Transmitting and Gaining Data [17] by Rudolf Ahlswede presents several aspects of Kieffer's work.

Related Research Articles

In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Typically, a device that performs data compression is referred to as an encoder, and one that performs the reversal of the process (decompression) as a decoder.
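
As a minimal illustration of the lossless case (not a method from Kieffer's work; the example string is made up), the following Python sketch removes the redundancy in runs of repeated symbols and recovers the original input exactly:

  # Minimal run-length coder: lossless, so decode(encode(s)) == s.
  def rle_encode(s):
      out, i = [], 0
      while i < len(s):
          j = i
          while j < len(s) and s[j] == s[i]:
              j += 1
          out.append((s[i], j - i))      # (symbol, run length)
          i = j
      return out

  def rle_decode(pairs):
      return "".join(sym * count for sym, count in pairs)

  assert rle_decode(rle_encode("aaabbbbcc")) == "aaabbbbcc"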

Image compression

Image compression is a type of data compression applied to digital images, to reduce their cost for storage or transmission. Algorithms may take advantage of visual perception and the statistical properties of image data to provide superior results compared with generic data compression methods which are used for other digital data.

Quantization (signal processing)

Quantization, in mathematics and digital signal processing, is the process of mapping input values from a large set to output values in a (countable) smaller set, often with a finite number of elements. Rounding and truncation are typical examples of quantization processes. Quantization is involved to some degree in nearly all digital signal processing, as the process of representing a signal in digital form ordinarily involves rounding. Quantization also forms the core of essentially all lossy compression algorithms.
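
As a hedged sketch (the step size of 0.25 and the sample values are made up for illustration), a uniform scalar quantizer in Python maps each real-valued input to the nearest multiple of a fixed step, so an uncountable range of inputs is represented by a countable set of levels:

  # Uniform mid-tread quantizer: round each input to the nearest
  # multiple of `step`, giving a countable set of output levels.
  def quantize(x, step=0.25):
      return step * round(x / step)

  samples = [0.07, 0.36, -0.41, 1.13]
  print([quantize(x) for x in samples])   # [0.0, 0.25, -0.5, 1.25]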

A cryptosystem is considered to have information-theoretic security if the system is secure against adversaries with unlimited computing resources and time. In contrast, a system which depends on the computational cost of cryptanalysis to be secure is called computationally, or conditionally, secure.
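
The one-time pad is the standard example of an information-theoretically secure system; the brief Python sketch below is illustrative only, with a made-up message:

  import secrets

  # One-time pad: with a uniformly random key as long as the message and
  # used only once, the ciphertext reveals nothing about the plaintext.
  def xor_bytes(a, b):
      return bytes(x ^ y for x, y in zip(a, b))

  message = b"attack at dawn"
  key = secrets.token_bytes(len(message))   # fresh random key, same length
  ciphertext = xor_bytes(message, key)
  assert xor_bytes(ciphertext, key) == message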

Snake-in-the-box

The snake-in-the-box problem in graph theory and computer science deals with finding a certain kind of path along the edges of a hypercube. This path starts at one corner and travels along the edges to as many corners as it can reach. After it gets to a new corner, the previous corner and all of its neighbors must be marked as unusable. The path should never travel to a corner which has been marked unusable.
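
A hedged Python sketch of a brute-force depth-first search for such a path (feasible only in low dimensions; efficient constructions for large hypercubes are a separate research topic):

  # Exhaustive search for a longest snake (induced path) in the n-cube.
  # Vertices are integers 0..2**n - 1; edges join vertices differing in one bit.
  def longest_snake(n):
      powers = [1 << i for i in range(n)]
      best = []

      def extend(path):
          nonlocal best
          if len(path) > len(best):
              best = list(path)
          tail = path[-1]
          for p in powers:
              w = tail ^ p
              # w must be new and adjacent to no path vertex except the tail
              if w in path or any(u != tail and (w ^ u) in powers for u in path):
                  continue
              path.append(w)
              extend(path)
              path.pop()

      extend([0])           # the hypercube is vertex-transitive, so start at 0
      return best

  print(len(longest_snake(4)) - 1)   # 7 edges: the longest snake in the 4-cube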

Abraham Lempel (1936–2023)

Abraham Lempel was an Israeli computer scientist and one of the fathers of the LZ family of lossless data compression algorithms.

Grammar-based codes or grammar-based compression are compression algorithms based on the idea of constructing a context-free grammar (CFG) for the string to be compressed. Examples include universal lossless data compression algorithms. To compress a data sequence, a grammar-based code first transforms it into a context-free grammar. The problem of finding a smallest grammar for an input sequence is known to be NP-hard, so many grammar-transform algorithms have been proposed from theoretical and practical viewpoints. Generally, the produced grammar is further compressed by statistical encoders such as arithmetic coding.
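
A toy grammar transform in this spirit (a simplified Re-Pair-style pass, not the specific algorithms of Kieffer and Yang): repeatedly replace the most frequent adjacent pair of symbols with a fresh nonterminal until no pair occurs twice.

  from collections import Counter

  # Toy grammar transform: build a small CFG whose start rule, together
  # with the productions in `rules`, derives exactly the input string.
  def grammar_transform(text):
      seq, rules, counter = list(text), {}, 0
      while len(seq) > 1:
          pair, freq = Counter(zip(seq, seq[1:])).most_common(1)[0]
          if freq < 2:                       # no repeated pair left: stop
              break
          nt = f"N{counter}"
          counter += 1
          rules[nt] = pair                   # new production  nt -> pair
          new_seq, i = [], 0
          while i < len(seq):
              if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                  new_seq.append(nt)
                  i += 2
              else:
                  new_seq.append(seq[i])
                  i += 1
          seq = new_seq
      return seq, rules

  start, rules = grammar_transform("abababab")
  print(start, rules)   # ['N1', 'N1'] with N0 -> ('a', 'b'), N1 -> ('N0', 'N0')

In an actual grammar-based code, the resulting grammar would then be serialized and passed to a statistical encoder such as an arithmetic coder.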

Babak Hassibi is an Iranian-American electrical engineer, computer scientist, and applied mathematician who is the inaugural Mose and Lillian S. Bohn Professor of Electrical Engineering and Computing and Mathematical Sciences at the California Institute of Technology (Caltech). From 2011 to 2016 he was the Gordon M. Binder/Amgen Professor of Electrical Engineering. From 2008 to 2015 he was the Executive Officer of Electrical Engineering and Associate Director of Information Science and Technology.

Andreas J. Winter is a German mathematician and mathematical physicist at the Universitat Autònoma de Barcelona (UAB) in Spain. He received his Ph.D. in 1999 under Rudolf Ahlswede and Friedrich Götze at the Universität Bielefeld in Germany before moving to the University of Bristol and then to the Centre for Quantum Technologies (CQT) at the National University of Singapore. In 2013 he was appointed ICREA Research Professor at UAB.

Rudolf Ahlswede

Rudolf F. Ahlswede was a German mathematician. Born in Dielmissen, Germany, he studied mathematics, physics, and philosophy. He wrote his Ph.D. thesis in 1966, at the University of Göttingen, with the topic "Contributions to the Shannon information theory in case of non-stationary channels". He dedicated himself in his further career to information theory and became one of the leading representatives of this area worldwide.

A beta encoder is an analog-to-digital conversion (A/D) system in which a real number in the unit interval is represented by a finite expansion in a non-integer base beta, where beta is a real number strictly between 1 and 2. Beta encoders are an alternative to traditional approaches to pulse-code modulation.
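
A hedged Python sketch of the underlying greedy beta expansion (the choice beta = 1.8 and the bit budget are arbitrary illustrative parameters): each bit is produced by scaling and thresholding, and the reconstruction error shrinks as more bits are kept.

  # Greedy beta expansion: represent x in [0, 1) by bits b_i so that
  # x is approximately sum_i b_i * beta**(-i), with 1 < beta < 2.
  def beta_encode(x, beta=1.8, n_bits=20):
      bits, u = [], x
      for _ in range(n_bits):
          u *= beta
          bit = int(u >= 1.0)                # threshold comparison
          u -= bit
          bits.append(bit)
      return bits

  def beta_decode(bits, beta=1.8):
      return sum(b * beta ** -(i + 1) for i, b in enumerate(bits))

  bits = beta_encode(0.3)
  print(abs(beta_decode(bits) - 0.3))        # small reconstruction error

Because many different bit sequences decode to nearly the same value when beta is less than 2, the representation is redundant, and that redundancy is what beta encoders exploit to tolerate imprecise circuit components.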

In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It gives the theoretical limits for losslessly coding two correlated sources with separate encoders and a joint decoder.
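
For reference, the Slepian–Wolf theorem states that lossless recovery of the pair (X, Y) is possible with separate encoders whenever the rate pair (R1, R2) satisfies (in LaTeX notation):

  R_1 \ge H(X \mid Y), \qquad R_2 \ge H(Y \mid X), \qquad R_1 + R_2 \ge H(X, Y)

where H denotes entropy; remarkably, the total rate H(X, Y) suffices even though the two encoders do not communicate with each other.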

Yasuo Matsuyama

Yasuo Matsuyama is a Japanese researcher in machine learning and human-aware information processing.

In mathematics, Ingleton's inequality is an inequality that is satisfied by the rank function of any representable matroid. In this sense it is a necessary condition for representability of a matroid over a finite field. Let M be a matroid and let ρ be its rank function; Ingleton's inequality states that for any subsets X1, X2, X3 and X4 in the support of M,

ρ(X1) + ρ(X2) + ρ(X1 ∪ X2 ∪ X3) + ρ(X1 ∪ X2 ∪ X4) + ρ(X3 ∪ X4) ≤ ρ(X1 ∪ X2) + ρ(X1 ∪ X3) + ρ(X1 ∪ X4) + ρ(X2 ∪ X3) + ρ(X2 ∪ X4).

Audio coding format

An audio coding format is a content representation format for storage or transmission of digital audio. Examples of audio coding formats include MP3, AAC, Vorbis, FLAC, and Opus. A specific software or hardware implementation capable of audio compression and decompression to/from a specific audio coding format is called an audio codec; an example is LAME, one of several different codecs that implement encoding and decoding of audio in the MP3 audio coding format in software.

Katalin Marton (1941–2019)

Katalin Marton was a Hungarian mathematician, born in Budapest.

Nilanjana Datta is an Indian-born British mathematician. She is a Professor in Quantum Information Theory in the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge, and a Fellow of Pembroke College.

Mark McMahon Wilde is an American quantum information scientist. He is an Associate Professor in the School of Electrical and Computer Engineering at Cornell University, and he is also a Fields Member in the School of Applied and Engineering Physics and the Department of Computer Science at Cornell.

Can Emre Koksal is an electrical engineer, computer scientist, academic, and entrepreneur. He is the Founder and CEO of Datanchor, and a professor of Electrical and Computer Engineering at Ohio State University.

Mikael Skoglund is an academic born in 1969 in Kungälv, Sweden. He is a professor of communication theory and the Head of the Division of Information Science and Engineering in the Department of Intelligent Systems at the KTH Royal Institute of Technology. His research covers source-channel coding, signal processing, information theory, privacy, and security, with a particular focus on how information theory applies to wireless communications.

References

  1. John Kieffer at the Mathematics Genealogy Project
  2. Kieffer, John Cronan (1970). A Generalization of the Shannon-McMillan Theorem and Its Application to Information Theory (Thesis). University of Illinois. Retrieved August 22, 2022.
  3. 1 2 "John C. Kieffer Biography". IEEE Xplore. Retrieved August 21, 2022.
  4. "John Kieffer Emeritus Professor". University of Minnesota. Retrieved August 22, 2022.
  5. "John Kieffer Associate Editor". IEEE Information Theory Society. Retrieved August 22, 2022.
  6. Kieffer, J. C.; Szpankowski, W.; Yang, E.-H. (2004). "Problems on Sequences: Information Theory and Computer Science Interface". IEEE Transactions on Information Theory. 50 (7). IEEE Xplore: 1385–1392. doi:10.1109/TIT.2004.830747. Retrieved August 22, 2022.
  7. "John Kieffer Life Fellow". IEEE. Retrieved August 25, 2022.
  8. Kieffer, J. C.; Yang, E.-H.; Nelson, G.; Cosman, P. (2000), "Universal lossless compression via multilevel pattern matching", IEEE Trans. Inf. Theory, 46 (4): 1227–1245, doi:10.1109/18.850665, S2CID 8191526
  9. Charikar, M.; Lehman, E.; Liu, D.; Panigrahy, R.; Prabhakaran, M.; Sahai, A.; Shelat, A. (2005), "The Smallest Grammar Problem", IEEE Trans. Inf. Theory, 51 (7): 2554–2576, doi:10.1109/tit.2005.850116, S2CID 6900082
  10. Bannai, H. (2016), "Grammar Compression", Encyclopedia of Algorithms, Springer New York, pp. 861–866, doi:10.1007/978-1-4939-2864-4_635, ISBN 978-1-4939-2863-7
  11. "John Kieffer Journal Publication List". University of Minnesota. Retrieved August 25, 2022.
  12. John Kieffer publications indexed by Google Scholar
  13. "John Kieffer MathSciNet Citations". American Mathematical Society. Retrieved August 22, 2022.
  14. Yang, En-Hui; Kieffer, J. C. (May 2000). "Patents Citing Kieffer's Work". IEEE Transactions on Information Theory. 46 (3). IEEE Xplore: 755–777. doi:10.1109/18.841161. Retrieved August 21, 2022.
  15. Berger, T.; Gibson, J. D. (1998), "Lossy source coding", IEEE Transactions on Information Theory, 44 (6): 2693–2723, doi:10.1109/18.720552
  16. Gray, R. M.; Neuhoff, D. L. (1998), "Quantization", IEEE Transactions on Information Theory, 44 (6): 2325–2383, doi:10.1109/18.720541, S2CID 212653679
  17. Ahlswede, R. (2015), Transmitting and Gaining Data, Foundations in Signal Processing, Communications and Networking, vol. 11, Springer International Publishing, doi:10.1007/978-3-319-12523-7, ISBN 978-3-319-12522-0, S2CID 124806197