In theoretical computer science, a pointer machine is an atomistic abstract computational machine whose storage structure is a graph. A pointer algorithm is an algorithm restricted to the pointer machine model. [1]
Some particular types of pointer machines are called a linking automaton, a KU-machine, an SMM, an atomistic LISP machine, a tree-pointer machine, etc. [2]
Pointer machines do not have arithmetic instructions. Computation proceeds only by reading input symbols, modifying the storage structure (the pattern of nodes and pointers) and performing various tests on it, and outputting symbols based on those tests. In this sense, the model is similar to the Turing machine.
Both Gurevich and Ben-Amram list a number of very similar "atomistic" models of "abstract machines"; [3] [2] Ben-Amram believes that the "atomistic" models must be distinguished from "high-level" models. Two atomistic models, Schönhage's storage modification machine (SMM) and the Kolmogorov–Uspenskii machine (KUM), are presented below. Ben-Amram also presents further varieties that are not discussed in this article.
Schönhage's storage modification machines (SMM): The following presentation follows van Emde Boas. [6]
The machine consists of a fixed alphabet of input symbols, a fixed program, and a mutable directed graph with its arrows labelled by alphabet symbols. The graph is the machine's storage. Each node of the graph has exactly one outgoing arrow labelled with each symbol, although some of these may loop back into the original node. One fixed node of the graph is identified as the start or "active" node.
Each word over the alphabet can then be translated to a path through the graph; for example, 10011 translates to taking edge 1 from the start node, then edge 0 from the resulting node, then edge 0, then edge 1, then edge 1. Thus a word identifies a node, namely the final node of the path, but this identification changes as the graph changes during the computation.
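The path-following semantics can be illustrated with a short sketch (a Python illustration for this article only, not part of the SMM literature); the names Node, follow and start are assumptions made for the example, and a two-symbol alphabet {0, 1} is assumed to match the word 10011 above.

class Node:
    def __init__(self):
        # Exactly one outgoing edge per alphabet symbol; a fresh node's
        # edges all loop back to the node itself.
        self.out = {'0': self, '1': self}

def follow(start, word):
    # Return the node reached by walking the edges named by word.
    node = start
    for symbol in word:
        node = node.out[symbol]
    return node

start = Node()                            # the fixed "active" node
assert follow(start, "10011") is start    # every edge still loops back to start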
The machine can receive instructions which change the layout of the graph. The basic instructions, sketched in code after this list, are:
(1) new w : creates a new node at the end of the path w, with all of the new node's edges directed to the next-to-last node of the path.
(2) set w to v : (re)directs an edge to a different node. Here w and v represent words; the instruction changes the destination of the last edge in the path w to the node reached by the path v.
(3) if v = w then z : conditional instruction that compares the paths represented by the words v and w to see whether they end at the same node; if so, jump to instruction z, else continue. This serves the same purpose as the if command of any imperative programming language.
(4) read and write : input/output instructions that access a read-only input tape and a write-only output tape, both containing symbols of the alphabet.
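The following self-contained sketch (again in Python, with hypothetical names new, set_to, same_node and follow) illustrates how the three structural instructions act on the storage graph; it is an illustration of the semantics described above, not Schönhage's own formulation.

ALPHABET = ('0', '1')

class Node:
    def __init__(self, target=None):
        # One outgoing edge per symbol; a fresh node points every edge
        # at target (or at itself when no target is given).
        self.out = {s: (target or self) for s in ALPHABET}

def follow(start, word):
    node = start
    for symbol in word:
        node = node.out[symbol]
    return node

def new(start, w):
    # "new w": create a node at the end of path w; its edges all point to
    # the next-to-last node of the path (w is assumed to be non-empty).
    parent = follow(start, w[:-1])
    fresh = Node(target=parent)
    parent.out[w[-1]] = fresh
    return fresh

def set_to(start, w, v):
    # "set w to v": redirect the last edge of path w to the node at path v.
    follow(start, w[:-1]).out[w[-1]] = follow(start, v)

def same_node(start, v, w):
    # The test of "if v = w then z": do the two paths end at the same node?
    return follow(start, v) is follow(start, w)

start = Node()
new(start, "1")                       # the word "1" now names a fresh node
set_to(start, "0", "1")               # edge '0' from the start node joins it
print(same_node(start, "0", "1"))     # True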
Knuth noted that the SMM model coincides with a type of "linking automaton" briefly explained in volume one of The Art of Computer Programming. [4]
Kolmogorov–Uspenskii machine (KUM): The KUM differs from the SMM in allowing only invertible pointers: for every pointer from a node x to a node y, an inverse pointer from y to x must be present, labeled by the same symbol. In other words, the storage graph is undirected. Since outgoing pointers must be labeled by distinct symbols of the alphabet, both KUM and SMM graphs have O(1) out-degree; the invertibility of KUM pointers restricts the in-degree to O(1) as well. This addresses some concerns for physical (as opposed to purely informational) realism.
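A small illustration of the invertibility requirement (the names KUMNode and link are hypothetical, written in Python like this article's other sketches): installing a pointer also installs its inverse under the same label, so the storage behaves as an undirected graph and the in-degree, like the out-degree, is bounded by the alphabet size.

class KUMNode:
    def __init__(self):
        self.out = {}              # symbol -> neighbouring node

def link(x, y, symbol):
    # Install the pointer x --symbol--> y together with its mandatory inverse.
    x.out[symbol] = y
    y.out[symbol] = x

a, b = KUMNode(), KUMNode()
link(a, b, '1')
assert a.out['1'] is b and b.out['1'] is a    # undirected in effect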
There are other, minor differences between the two models, such as the form of the program: a state table instead of a list of instructions.
Use of the model in complexity theory: van Emde Boas (1990) expresses concern that this form of abstract model is:
Gurevich also expresses concern:
Schönhage demonstrates the real-time equivalence of two types of random-access machine with the SMM. [4]
Algorithms in the SMM model: Schönhage demonstrates that the SMM can perform integer multiplication in linear time. [4]
Potential uses for the model: Gurevich wonders whether or not a parallel KU machine "resembles somewhat the human brain". [8]
Parallel computing: All the models mentioned above are sequential. A parallel (atomistic) pointer machine model has been proposed by Cook and Dymond; [9] a high-level (non-atomistic) parallel pointer machine model has also been used. [10]
Register machine—generic register-based abstract machine computational model
Turing machine—generic tape-based abstract machine computational model
Most references and a bibliography are to be found at the article Register machine. The following are particular to this article:
In mathematics and computer science, an algorithm is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes and deduce valid inferences, eventually achieving automation. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".
In computability theory, the Church–Turing thesis is a thesis about the nature of computable functions. It states that a function on the natural numbers can be calculated by an effective method if and only if it is computable by a Turing machine. The thesis is named after American mathematician Alonzo Church and the British mathematician Alan Turing. Before the precise definition of computable function, mathematicians often used the informal term effectively calculable to describe functions that are computable by paper-and-pencil methods. In the 1930s, several independent attempts were made to formalize the notion of computability:
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm.
A Turing machine is a mathematical model of computation describing an abstract machine that manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, it is capable of implementing any computer algorithm.
In computer science, an abstract machine is a theoretical model that allows for a detailed and precise analysis of how a computer system functions. It is similar to a mathematical function in that it receives inputs and produces outputs based on predefined rules. Abstract machines differ from literal machines in that they are expected to perform correctly, independently of hardware. Abstract machines are "machines" because they allow step-by-step execution of programmes; they are "abstract" because they ignore many aspects of actual (hardware) machines. A typical abstract machine consists of a definition in terms of input, output, and the set of allowable operations used to turn the former into the latter. Abstract machines can be used for purely theoretical reasons as well as models of real-world computer systems. In the theory of computation, abstract machines such as finite state machines, Mealy machines, push-down automata, and Turing machines are often used in thought experiments regarding computability or to analyse the complexity of algorithms; this use is fundamental to the field of computational complexity theory.
Hypercomputation or super-Turing computation is a set of models of computation that can provide outputs that are not Turing-computable. For example, a machine that could solve the halting problem would be a hypercomputer; so too would one that can correctly evaluate every statement in Peano arithmetic.
Solomonoff's theory of inductive inference is a mathematical theory of induction introduced by Ray Solomonoff, based on probability theory and theoretical computer science. In essence, Solomonoff's induction derives the posterior probability of any computable theory, given a sequence of observed data. This posterior probability is derived from Bayes' rule and some universal prior, that is, a prior that assigns a positive probability to any computable theory.
In mathematical logic and theoretical computer science, a register machine is a generic class of abstract machines used in a manner similar to a Turing machine. All the models are Turing equivalent.
In computer science, a random-access machine (RAM) is an abstract machine in the general class of register machines. The RAM is very similar to the counter machine but with the added capability of 'indirect addressing' of its registers. Like the counter machine, the RAM has its instructions in the finite-state portion of the machine.
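For illustration only (the helper names below are hypothetical and do not belong to any formal RAM definition), the difference between direct and indirect addressing can be sketched as follows:

registers = [0] * 8

def load_direct(r):
    # Direct addressing: the operand r names the register itself.
    return registers[r]

def load_indirect(r):
    # Indirect addressing: register r holds the address of another register.
    return registers[registers[r]]

registers[3] = 42
registers[1] = 3
print(load_direct(3), load_indirect(1))    # 42 42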
In computer science, interactive computation is a mathematical model for computation that involves input/output communication with the external world during computation.
Computable functions are the basic objects of study in computability theory. Computable functions are the formalized analogue of the intuitive notion of algorithms, in the sense that a function is computable if there exists an algorithm that can do the job of the function, i.e. given an input of the function domain it can return the corresponding output. Computable functions are used to discuss computability without referring to any concrete model of computation such as Turing machines or register machines. Any definition, however, must make reference to some specific model of computation but all valid definitions yield the same class of functions. Particular models of computability that give rise to the set of computable functions are the Turing-computable functions and the general recursive functions.
In computer science, st-connectivity or STCON is a decision problem asking, for vertices s and t in a directed graph, if t is reachable from s.
A Post–Turing machine is a "program formulation" of a type of Turing machine, comprising a variant of Emil Post's Turing-equivalent model of computation. Post's model and Turing's model, though very similar to one another, were developed independently. Turing's paper was received for publication in May 1936, followed by Post's in October. A Post–Turing machine uses a binary alphabet, an infinite sequence of binary storage locations, and a primitive programming language with instructions for bi-directional movement among the storage locations and alteration of their contents one at a time. The names "Post–Turing program" and "Post–Turing machine" were used by Martin Davis in 1973–1974. Later in 1980, Davis used the name "Turing–Post program".
A Turing machine is a hypothetical computing device, first conceived by Alan Turing in 1936. Turing machines manipulate symbols on a potentially infinite strip of tape according to a finite table of rules, and they provide the theoretical underpinnings for the notion of a computer algorithm.
Algorithm characterizations are attempts to formalize the word algorithm. Algorithm does not have a generally accepted formal definition. Researchers are actively working on this problem. This article will present some of the "characterizations" of the notion of "algorithm" in more detail.
In theoretical computer science the random-access stored-program (RASP) machine model is an abstract machine used for the purposes of algorithm development and algorithm complexity theory.
A counter machine is an abstract machine used in formal logic and theoretical computer science to model computation. It is the most primitive of the four types of register machines. A counter machine comprises a set of one or more unbounded registers, each of which can hold a single non-negative integer, and a list of arithmetic and control instructions for the machine to follow. The counter machine is typically used in the process of designing parallel algorithms in relation to the mutual exclusion principle. When used in this manner, the counter machine models the discrete time-steps of a computational system in relation to memory accesses. By modeling computations in relation to the memory accesses for each respective computational step, parallel algorithms may be designed in such a manner as to avoid interlocking, the simultaneous writing operation by two threads to the same memory address.
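As a hedged illustration, the sketch below interprets a counter machine program over unbounded registers using an increment instruction and a combined decrement-or-jump-if-zero instruction, one of the standard minimal instruction sets; the instruction names and program encoding are assumptions made for this example.

def run(program, registers):
    # program: list of (op, reg, target) triples; ops are INC, DECJZ, HALT.
    # DECJZ decrements register reg, or jumps to target if it is already zero.
    pc = 0
    while True:
        op, reg, target = program[pc]
        if op == "HALT":
            return registers
        if op == "INC":
            registers[reg] += 1
            pc += 1
        elif op == "DECJZ":
            if registers[reg] == 0:
                pc = target
            else:
                registers[reg] -= 1
                pc += 1

# Example: move register 1 into register 0 (computes r0 := r0 + r1, r1 := 0).
program = [
    ("DECJZ", 1, 3),     # 0: if r1 == 0 jump to HALT, else r1 -= 1
    ("INC",   0, None),  # 1: r0 += 1
    ("DECJZ", 2, 0),     # 2: r2 stays 0, so this is an unconditional jump to 0
    ("HALT",  None, None),
]
print(run(program, {0: 2, 1: 3, 2: 0}))    # {0: 5, 1: 0, 2: 0}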
There are many variants of the counter machine, among them those of Hermes, Ershov, Péter, Minsky, Lambek, Shepherdson and Sturgis, and Schönhage. These are explained below.
Yuri Gurevich, Professor Emeritus at the University of Michigan, is an American computer scientist and mathematician and the inventor of abstract state machines.