Speed prior

The speed prior is a complexity measure similar to Kolmogorov complexity, except that it is based on computation speed as well as program length.[1] The speed prior complexity of a program is its size in bits plus the logarithm of the maximum time we are willing to run it to get a prediction.
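
In symbols, the measure described above can be sketched as follows (a simplified reading, not Schmidhuber's exact definition; U is a fixed universal computer, |p| the length of program p in bits, and t the number of steps we allow it to run):

    L_S(x) = \min_{p,\,t \,:\, U(p)\ \text{outputs}\ x\ \text{within}\ t\ \text{steps}} \big( |p| + \log_2 t \big)

so that the corresponding (unnormalised) prior weight of x is 2^{-L_S(x)}, i.e. the best achievable value of 2^{-|p|} / t.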

Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, meaning there is no single higher-level instruction that defines the various possible interactions.

Kolmogorov complexity

In algorithmic information theory, the Kolmogorov complexity of an object, such as a piece of text, is the length of the shortest computer program that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as descriptive complexity, Kolmogorov–Chaitin complexity, algorithmic complexity, algorithmic entropy, or program-size complexity. It is named after Andrey Kolmogorov, who first published on the subject in 1963.
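
Kolmogorov complexity is uncomputable, but any lossless compressor yields an upper bound: the compressed size, plus the fixed size of a decompressor, is the length of one program that reproduces the object. A minimal Python sketch using the standard zlib module (the example inputs are arbitrary):

    import os
    import zlib

    def compression_upper_bound(data: bytes) -> int:
        """Length in bytes of a zlib encoding of `data`.

        This only upper-bounds the Kolmogorov complexity (up to the constant
        cost of a decompressor); the true shortest program may be far smaller.
        """
        return len(zlib.compress(data, 9))

    regular = b"ab" * 500        # highly regular: compresses to a few bytes
    random_ = os.urandom(1000)   # incompressible with high probability
    print(compression_upper_bound(regular))
    print(compression_upper_bound(random_))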

Computation is any type of calculation that includes both arithmetical and non-arithmetical steps and follows a well-defined model, for example an algorithm.

When compared to traditional measures, use of the Speed Prior has the disadvantage of leading to less optimal predictions, and the advantage of providing computable predictions.

See also

Computational complexity theory focuses on classifying computational problems according to their inherent difficulty, and on relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by the mechanical application of mathematical steps, such as an algorithm.

Minimum message length (MML) is a formal, information-theoretic restatement of Occam's razor: even when models are equal in their goodness of fit to the observed data, the one generating the shortest overall message is more likely to be correct. MML was invented by Chris Wallace, first appearing in the seminal paper "An information measure for classification" (Wallace & Boulton, 1968).
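
The "overall message" in MML is conventionally a two-part code; schematically, for a hypothesis H (model plus parameters) and data D,

    \mathrm{MsgLen}(H, D) = -\log_2 P(H) - \log_2 P(D \mid H),

and MML prefers the hypothesis that minimises this total: the first term encodes the model, the second encodes the data using that model.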

The minimum description length (MDL) principle is a formalization of Occam's razor in which the best hypothesis for a given set of data is the one that leads to the best compression of the data. MDL was introduced by Jorma Rissanen in 1978. It is an important concept in information theory and computational learning theory.
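
In its crude two-part form, the MDL selection rule can be written as follows, where L(\cdot) denotes code length in bits (refined MDL replaces the two-part code with one-part universal codes, which this sketch omits):

    H^{*} = \arg\min_{H \in \mathcal{H}} \big[ L(H) + L(D \mid H) \big]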

Related Research Articles

Algorithm: an unambiguous specification of how to solve a class of problems

In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems. Algorithms can perform calculation, data processing, and automated reasoning tasks.

Theoretical computer science: subfield of computer science and of mathematics

Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on the more mathematical topics of computing, and includes the theory of computation.

In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a mathematical method of assigning a prior probability to a given observation. It was invented by Ray Solomonoff in the 1960s. It is used in inductive inference theory and analyses of algorithms. In his general theory of inductive inference, Solomonoff uses the prior obtained by this formula, in Bayes' rule for prediction.
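
Schematically, the algorithmic prior of a finite string x weights every program that reproduces x by the probability of guessing that program bit by bit; with U a universal (prefix or monotone) machine and |p| the program length in bits, one common form is

    M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-|p|},

where U(p) = x\ast means that the output of p begins with x.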

In physics and cosmology, digital physics is a collection of theoretical perspectives based on the premise that the universe is describable by information. It is a form of digital ontology about physical reality. According to this theory, the universe can be conceived of as either the output of a deterministic or probabilistic computer program, a vast digital computation device, or mathematically isomorphic to such a device.

Ray Solomonoff's theory of universal inductive inference is a theory of prediction based on logical observations, such as predicting the next symbol based upon a given series of symbols. The only assumption that the theory makes is that the environment follows some unknown but computable probability distribution. It is a mathematical formalization of Occam's razor and the Principle of Multiple Explanations.
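
Combined with Bayes' rule, the prior M above gives the scheme's prediction for the next symbol; schematically, for an observed sequence x_{1:n} and a candidate continuation a,

    P(a \mid x_{1:n}) = \frac{M(x_{1:n} a)}{M(x_{1:n})}.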

Combinatorial optimization: subset of mathematical optimization

In operations research, applied mathematics, and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects. In many such problems, exhaustive search is not tractable (a brute-force sketch follows below). Combinatorial optimization operates on the domain of optimization problems in which the set of feasible solutions is discrete or can be reduced to a discrete set, and in which the goal is to find the best solution. Some common problems involving combinatorial optimization are the travelling salesman problem ("TSP") and the minimum spanning tree problem ("MST").
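
As a small illustration of why exhaustive search quickly becomes intractable, the Python sketch below enumerates every tour of a toy TSP instance; the distance matrix is made up, and the number of distinct tours grows factorially with the number of cities:

    from itertools import permutations

    # Hypothetical symmetric distances between 4 cities (illustrative values).
    DIST = [
        [0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0],
    ]

    def tour_length(order):
        """Total length of the closed tour 0 -> order[0] -> ... -> 0."""
        path = (0,) + order + (0,)
        return sum(DIST[a][b] for a, b in zip(path, path[1:]))

    def brute_force_tsp(n):
        """Try every ordering of the remaining cities: (n-1)! tours."""
        return min(permutations(range(1, n)), key=tour_length)

    best = brute_force_tsp(len(DIST))
    print(best, tour_length(best))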

Matrix chain multiplication is an optimization problem that can be solved using dynamic programming. Given a sequence of matrices, the goal is to find the most efficient way to multiply these matrices. The problem is not actually to perform the multiplications, but merely to decide the sequence of the matrix multiplications involved.
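
A compact dynamic-programming sketch of the standard O(n^3) solution in Python; dims lists the matrix dimensions, so matrix i has shape dims[i] x dims[i+1] (the example dimensions are arbitrary):

    def matrix_chain_order(dims):
        """Minimum number of scalar multiplications needed to multiply
        A_0 ... A_{n-1}, where A_i has shape dims[i] x dims[i+1]."""
        n = len(dims) - 1
        # cost[i][j] = cheapest way to compute the product A_i ... A_j
        cost = [[0] * n for _ in range(n)]
        for length in range(2, n + 1):          # length of the sub-chain
            for i in range(n - length + 1):
                j = i + length - 1
                cost[i][j] = min(
                    cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                    for k in range(i, j)
                )
        return cost[0][n - 1]

    # Example: (10x30)(30x5)(5x60) is cheapest as (A0 A1) A2 = 4500 multiplications.
    print(matrix_chain_order([10, 30, 5, 60]))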

In computational complexity theory, Blum's speedup theorem, first stated by Manuel Blum in 1967, is a fundamental theorem about the complexity of computable functions.

Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."

Klee's measure problem: problem in computational geometry

In computational geometry, Klee's measure problem is the problem of determining how efficiently the measure of a union of (multidimensional) rectangular ranges can be computed. Here, a d-dimensional rectangular range is defined to be a Cartesian product of d intervals of real numbers, which is a subset of R^d.
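
In one dimension the problem reduces to measuring a union of intervals, which a single sorting sweep solves in O(n log n), as in this small Python sketch (the intervals are illustrative):

    def union_length(intervals):
        """Total length covered by a union of 1-D intervals (a, b)."""
        total, cover_end = 0.0, float("-inf")
        for a, b in sorted(intervals):
            if b <= cover_end:
                continue                     # entirely inside the covered region
            total += b - max(a, cover_end)   # count only the uncovered part
            cover_end = b
        return total

    # [0,2], [1,3] and [5,6] overlap to cover a total length of 4.
    print(union_length([(0, 2), (1, 3), (5, 6)]))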

In information theory and computer science, the Damerau–Levenshtein distance is a string metric for measuring the edit distance between two sequences. Informally, the Damerau–Levenshtein distance between two words is the minimum number of operations (insertions, deletions or substitutions of a single character, or transpositions of two adjacent characters) required to change one word into the other.
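
A short Python sketch of the restricted variant known as optimal string alignment, which allows the four edit operations but never edits the same substring twice; the unrestricted Damerau–Levenshtein distance can be slightly smaller:

    def osa_distance(s, t):
        """Optimal-string-alignment edit distance between strings s and t."""
        m, n = len(s), len(t)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i                           # delete all of s[:i]
        for j in range(n + 1):
            d[0][j] = j                           # insert all of t[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                sub = 0 if s[i - 1] == t[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,        # deletion
                              d[i][j - 1] + 1,        # insertion
                              d[i - 1][j - 1] + sub)  # substitution / match
                if i > 1 and j > 1 and s[i - 1] == t[j - 2] and s[i - 2] == t[j - 1]:
                    d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
        return d[m][n]

    print(osa_distance("ca", "abc"))  # 3 under this variant; the unrestricted distance is 2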

Active networking is a communication pattern that allows packets flowing through a telecommunications network to dynamically modify the operation of the network.

In computational complexity theory, a computational resource is a resource used by some computational models in the solution of computational problems.

In computational complexity theory, the element distinctness problem or element uniqueness problem is the problem of determining whether all the elements of a list are distinct.
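
A minimal Python sketch: hashing gives an expected-linear-time check, while the classical comparison-based bound for the problem is Theta(n log n):

    def all_distinct(xs):
        """True iff every element of xs occurs exactly once (hash-based)."""
        return len(set(xs)) == len(xs)

    print(all_distinct([3, 1, 4, 1, 5]))  # False: 1 repeats
    print(all_distinct([3, 1, 4, 5]))     # True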

Christopher Cherniak is an American neuroscientist, a member of the University of Maryland Philosophy Department. Cherniak’s research trajectory started in theory of knowledge and led into computational neuroanatomy and genomics. The underlying linkage between the areas concerns models of the agent: The work began with more realistic, bounded-resource models of rationality. From this epistemology in turn stemmed a research program concerning optimal-wiring models of global brain and genome anatomy, a structuralist approach.

Lance Jeremy Fortnow is a computer scientist known for major results in computational complexity and interactive proof systems. He currently chairs the School of Computer Science at Georgia Tech.

The rectilinear Steiner tree problem, minimum rectilinear Steiner tree problem (MRST), or rectilinear Steiner minimum tree problem (RSMT) is a variant of the geometric Steiner tree problem in the plane, in which the Euclidean distance is replaced with the rectilinear distance. The problem may be formally stated as follows: given n points in the plane, it is required to interconnect them all by a shortest network which consists only of vertical and horizontal line segments. It can be shown that such a network is a tree whose vertices are the input points plus some extra points.
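
The rectilinear (Manhattan) distance that replaces the Euclidean metric here is simply

    d\big((x_1, y_1), (x_2, y_2)\big) = |x_1 - x_2| + |y_1 - y_2|.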

References

  1. Schmidhuber, J. (2002). "The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions." In J. Kivinen and R. H. Sloan, editors, Proceedings of the 15th Annual Conference on Computational Learning Theory (COLT 2002), Lecture Notes in Artificial Intelligence, pp. 216–228. Springer.