Chris Wallace (computer scientist)

Christopher Stewart Wallace (26 October 1933 – 7 August 2004) was an Australian computer scientist and physicist.

Wallace is notable for having devised the minimum message length (MML) principle of inductive inference, introduced with David Boulton in 1968,[1] the Wallace tree fast multiplier, and a family of fast pseudo-random number generators.[2][3]

He was appointed Foundation Chair of Information Science at Monash University in 1968, at the age of 34 (the department was later renamed Computer Science), and Professor Emeritus in 1996. Wallace was a fellow of the Australian Computer Society, and in 1995 he was appointed a fellow of the ACM "For research in a number of areas in Computer Science including fast multiplication algorithm, minimum message length principle and its applications, random number generation, computer architecture, numerical solution of ODE's, and contribution to Australian Computer Science".[4]

Wallace received his PhD in physics from the University of Sydney in 1959. He was married to Judy Ogilvie, the first secretary and programme librarian of SILLIAC, one of Australia's first computers, which was launched at the University of Sydney on 12 September 1956.[5] He also engineered one of the world's first local area networks in the mid-1960s.[6]

Related Research Articles

<span class="mw-page-title-main">Kolmogorov complexity</span> Measure of algorithmic complexity

In algorithmic information theory, the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. It is named after Andrey Kolmogorov, who first published on the subject in 1963; the notion generalizes classical information theory.
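
In symbols, for a fixed universal machine U and with |p| the length in bits of program p, a standard formulation is

    K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}

and the invariance theorem guarantees that switching to another universal machine changes K by at most an additive constant independent of x.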

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and Claude Shannon in the 1940s. A branch of applied mathematics, it lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.
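
Its central quantity is Shannon entropy: for a discrete random variable X with probability mass function p,

    H(X) = -\sum_{x} p(x) \log_2 p(x)

measured in bits (shannons) when the logarithm is taken to base 2.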

<span class="mw-page-title-main">Statistical inference</span> Process of using data analysis

Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates, under the assumption that the observed data set is sampled from a larger population.

In philosophy, Occam's razor is the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements. It is also known as the principle of parsimony or the law of parsimony. Attributed to William of Ockham, a 14th-century English philosopher and theologian, it is frequently cited as Entia non sunt multiplicanda praeter necessitatem, which translates as "Entities must not be multiplied beyond necessity", although Occam never used these exact words. Popularly, the principle is sometimes paraphrased as "The simplest explanation is usually the best one."

Hypercomputation or super-Turing computation is a set of hypothetical models of computation that can provide outputs that are not Turing-computable. For example, a machine that could solve the halting problem would be a hypercomputer; so too would one that could correctly evaluate every statement in Peano arithmetic.

Minimum message length (MML) is a Bayesian, information-theoretic method for statistical model comparison and selection. It provides a formal information-theoretic restatement of Occam's razor: even when models are equal in their measure of fit-accuracy to the observed data, the one generating the most concise explanation of the data is more likely to be correct. MML was invented by Chris Wallace, first appearing in the seminal 1968 paper "An Information Measure for Classification".[1] MML is intended not just as a theoretical construct, but as a technique that may be deployed in practice. It differs from the related concept of Kolmogorov complexity in that it does not require use of a Turing-complete language to model data.
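
In its two-part form, the idea can be stated as follows (a standard formulation; by Shannon's coding theorem, an event of probability P can be encoded in -\log P units of information): the explanation of data D via hypothesis H has total length

    I(H, D) = I(H) + I(D \mid H) = -\log P(H) - \log P(D \mid H)

and MML selects the hypothesis H minimizing this total, which is equivalent to maximizing the posterior probability P(H)\,P(D \mid H).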

<span class="mw-page-title-main">Manuel Blum</span> Venezuelan computer scientist

Manuel Blum is a Venezuelan-born American computer scientist who received the Turing Award in 1995 "In recognition of his contributions to the foundations of computational complexity theory and its application to cryptography and program checking".

Minimum Description Length (MDL) is a model selection principle where the shortest description of the data is the best model. MDL methods learn through a data compression perspective and are sometimes described as mathematical applications of Occam's razor. The MDL principle can be extended to other forms of inductive inference and learning, for example to estimation and sequential prediction, without explicitly identifying a single model of the data.
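
As a minimal illustration of the compression view (an illustrative Python sketch, not any particular MDL implementation), the following compares two code lengths for a binary sequence: a zero-parameter fair-coin code, and a one-parameter Bernoulli code charged the standard asymptotic penalty of (1/2) log2 n bits for its estimated parameter:

    import math

    def bernoulli_mdl(data):
        """Two-part code length (bits) for a binary sequence under a
        Bernoulli model with one estimated parameter."""
        n = len(data)
        theta = sum(data) / n
        # Data cost: n * H(theta) bits, with the convention 0 * log 0 = 0.
        h = 0.0
        for p in (theta, 1 - theta):
            if p > 0:
                h -= p * math.log2(p)
        data_bits = n * h
        # Parameter cost: (1/2) log2 n bits per free parameter.
        param_bits = 0.5 * math.log2(n)
        return param_bits + data_bits

    data = [1] * 70 + [0] * 30            # a biased sequence
    fair_bits = len(data)                 # fair-coin model: 1 bit/symbol, no parameters
    biased_bits = bernoulli_mdl(data)
    print(f"fair coin model : {fair_bits:.1f} bits")
    print(f"Bernoulli model : {biased_bits:.1f} bits")

For the biased sequence above, the Bernoulli code is shorter overall despite its parameter cost, so MDL prefers it.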

Inductive reasoning refers to any method of reasoning in which broad generalizations or principles are derived from a body of observations. It stands in contrast to deductive reasoning: the conclusion of a deductive argument is certain given that the premises are correct, whereas the truth of the conclusion of an inductive argument is at best probable, based upon the evidence given.

Ray Solomonoff was an American mathematician who invented algorithmic probability and the general theory of inductive inference that bears his name, and was a founder of algorithmic information theory. He was an originator of the branch of artificial intelligence based on machine learning, prediction, and probability, and circulated the first report on non-semantic machine learning in 1956.

<span class="mw-page-title-main">Algorithmic probability</span>

In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a mathematical method of assigning a prior probability to a given observation. It was invented by Ray Solomonoff in the 1960s. It is used in inductive inference theory and analyses of algorithms. In his general theory of inductive inference, Solomonoff uses the method together with Bayes' rule to obtain probabilities of prediction for an algorithm's future outputs.
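
Concretely, for a fixed universal prefix machine U, the algorithmic probability of a string x sums over all programs p that output x, weighting each by 2^{-|p|}:

    m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}

so that shorter programs dominate the sum, encoding a formal preference for simple explanations.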

Solomonoff's theory of inductive inference is a mathematical theory of induction introduced by Ray Solomonoff, based on probability theory and theoretical computer science. In essence, Solomonoff's induction derives the posterior probability of any computable theory, given a sequence of observed data. This posterior probability is derived from Bayes' rule and some universal prior, that is, a prior that assigns a positive probability to any computable theory.
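
In a standard formulation, prediction then follows directly from the universal prior m: the probability that a sequence x_1 \ldots x_n continues with x_{n+1} is

    P(x_{n+1} \mid x_1 \ldots x_n) = \frac{m(x_1 \ldots x_n x_{n+1})}{m(x_1 \ldots x_n)}

although m itself is only lower semi-computable, so the predictor can be approximated but never computed exactly.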

<span class="mw-page-title-main">Christopher Strachey</span> British computer scientist (1916–1975)

Christopher S. Strachey was a British computer scientist. He was one of the founders of denotational semantics, and a pioneer in programming language design and computer time-sharing. He has also been credited as possibly being the first developer of a video game. He was a member of the Strachey family, prominent in government, arts, administration, and academia.

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure. Among its results, it is shown within algorithmic information theory that computational incompressibility "mimics" (except for a constant that only depends on the chosen universal programming language) the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."

The natural unit of information, sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.
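
The conversion follows from the change of logarithm base:

    1 \text{ nat} = \frac{1}{\ln 2} \text{ bits} \approx 1.4427 \text{ bits}

and, as a worked check, an event of probability 1/e carries -\ln(1/e) = 1 nat of information.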

<span class="mw-page-title-main">Postmodernism Generator</span> Computer program

The Postmodernism Generator is a computer program that automatically produces "close imitations" of postmodernist writing. It was written in 1996 by Andrew C. Bulhak of Monash University using the Dada Engine, a system for generating random text from recursive grammars. A free version is also hosted online. The essays are produced from a formal grammar defined by a recursive transition network.
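
The following toy Python sketch shows the underlying idea of generating text from a recursive grammar; the grammar and symbol names here are invented for illustration and are not Bulhak's actual Dada Engine grammar:

    import random

    # A toy recursive grammar (illustrative only, not the Dada Engine's).
    GRAMMAR = {
        "SENTENCE": [["NP", "VP", "."]],
        "NP": [["the", "ADJ", "NOUN"], ["NOUN"]],
        "VP": [["VERB", "NP"], ["VERB"]],
        "ADJ": [["postmodern"], ["recursive"], ["generated"]],
        "NOUN": [["text"], ["grammar"], ["discourse"]],
        "VERB": [["deconstructs"], ["produces"], ["subverts"]],
    }

    def expand(symbol):
        """Recursively expand a symbol via a randomly chosen production."""
        if symbol not in GRAMMAR:      # terminal: emit the word itself
            return [symbol]
        words = []
        for part in random.choice(GRAMMAR[symbol]):
            words.extend(expand(part))
        return words

    print(" ".join(expand("SENTENCE")))

Each nonterminal is expanded by a randomly chosen production until only terminal words remain, which is essentially how a recursive transition network yields endless grammatical variation.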

The free energy principle is a theoretical framework suggesting that the brain reduces surprise or uncertainty by making predictions based on internal models and updating them using sensory input. It highlights the brain's objective of aligning its internal model with the external world to enhance prediction accuracy. This principle integrates Bayesian inference with active inference, where actions are guided by predictions and sensory feedback refines them. It has wide-ranging implications for comprehending brain function, perception, and action.
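
In its variational form (a standard result from Bayesian inference, not a formulation specific to any one presentation of the principle), the free energy of a recognition density q(s) over hidden states s, given observations o and a generative model p(o, s), is

    F = \mathbb{E}_{q(s)}[\ln q(s) - \ln p(o, s)] = D_{\mathrm{KL}}(q(s) \,\|\, p(s \mid o)) - \ln p(o)

so minimizing F simultaneously tightens a bound on the model evidence and draws q(s) toward the true posterior p(s \mid o).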

Inductive probability attempts to give the probability of future events based on past events. It is the basis for inductive reasoning, and gives the mathematical basis for learning and the perception of patterns. It is a source of knowledge about the world.

Universality probability is an abstruse probability measure in computational complexity theory that concerns universal Turing machines.

References

  1. Wallace, C. S.; Boulton, D. M. (August 1968). "An Information Measure for Classification". The Computer Journal. 11 (2): 185–194. doi:10.1093/comjnl/11.2.185.
  2. Brent, Richard P. (September 2008). "Some Comments on C. S. Wallace's Random Number Generators" (PDF). The Computer Journal. 51 (5): 579–584. arXiv:1005.2314. doi:10.1093/comjnl/bxm122. S2CID 16287927.
  3. "MDMC Software - Random Number Generators". Monash Data Mining Centre. Archived from the original on 22 July 2013.
  4. "ACM Fellows: Chris S Wallace". Association for Computing Machinery. Archived from the original on 19 August 2006.
  5. Jacobs, Marie (Winter 2006). "Love at First Byte". Sydney Alumni Magazine. University of Sydney. Archived from the original on 25 August 2006.
  6. Bennett, John M. (Spring 1998). "Letters to the Editor". Resurrection: The Bulletin of the Computer Conservation Society (19).