The following are examples to supplement the article Turing machine.
The following table is Turing's very first example (Alan Turing 1937):
m-configuration (state) | Tape symbol | Tape operations | Final m-configuration (state) |
---|---|---|---|
b | blank | P0, R | c |
c | blank | R | e |
e | blank | P1, R | f |
f | blank | R | b |
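For readers who want to experiment, a minimal Python sketch of this table is given below. It is not Turing's notation: the dictionary-backed tape, the string state names, the eight-step run length, and the rendering of a blank square as "." are all choices made here; each row is treated as a pair (operation sequence, final m-configuration).

```python
# A sketch of the four-row table above: each m-configuration maps to
# (operations, next m-configuration). 'P0'/'P1' print a figure on the
# scanned square, 'R' moves one square to the right. Every row of the
# table assumes the scanned square is blank, as in Turing's example.

table = {
    'b': (['P0', 'R'], 'c'),
    'c': (['R'],       'e'),
    'e': (['P1', 'R'], 'f'),
    'f': (['R'],       'b'),
}

def run(steps=8):
    tape = {}            # square index -> printed figure; absent = blank
    head = 0
    state = 'b'
    for _ in range(steps):
        ops, next_state = table[state]
        for op in ops:
            if op == 'R':
                head += 1            # move one square to the right
            else:
                tape[head] = op[1]   # 'P0' prints 0, 'P1' prints 1
        state = next_state
    return ' '.join(tape.get(i, '.') for i in range(max(tape) + 1))

print(run())   # -> 0 . 1 . 0 . 1  (figures printed on alternate squares)
```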
With regard to what actions the machine actually does, Turing (1936) (Undecidable p. 121) states that, for a configuration described in the first two columns, the operations of the third column are carried out successively, and the machine then passes into the m-configuration described in the last column.
He makes this very clear when he reduces the above table to a single instruction called "b" (Undecidable p. 120), but his instruction consists of 3 lines. Instruction "b" has three different symbol possibilities {None, 0, 1}. Each possibility is followed by a sequence of actions until we arrive at the rightmost column, where the final m-configuration is "b":
Current m-configuration (instruction) | Tape symbol | Operations on the tape | Final m-configuration (instruction) |
---|---|---|---|
b | None | P0 | b |
b | 0 | R, R, P1 | b |
b | 1 | R, R, P0 | b |
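The one-instruction form can be sketched the same way; here the single m-configuration "b" simply dispatches on the scanned symbol. The representation below (a Python dict keyed by the scanned symbol, with None standing for a blank square) is a convenience of this illustration, not Turing's own notation.

```python
# The single m-configuration 'b' dispatches on the scanned symbol.
# Operation sequences are as in the table; 'R' moves right, 'P0'/'P1'
# print a figure on the scanned square.

b = {None: ['P0'], '0': ['R', 'R', 'P1'], '1': ['R', 'R', 'P0']}

tape, head = {}, 0
for _ in range(4):                        # four passes through instruction "b"
    for op in b[tape.get(head)]:
        if op == 'R':
            head += 1
        else:
            tape[head] = op[1]            # 'P0' -> '0', 'P1' -> '1'

print(' '.join(tape.get(i, '.') for i in range(max(tape) + 1)))
# -> 0 . 1 . 0 . 1  (the same figures on alternate squares)
```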
As observed by a number of commentators (e.g., Post (1936), Post (1947), Kleene (1952), Wang (1954)), including Turing (1937) himself, the Turing instructions are not atomic; further simplifications of the model can be made without reducing its computational power. See more at Post–Turing machine.
As stated in the article Turing machine, Turing proposed that his table be further atomized by allowing only a single print/erase followed by a single tape movement L/R/N. He gives us this example of the first little table converted (Undecidable, p. 127):
Current m-configuration (Turing state) | Tape symbol | Print-operation | Tape-motion | Final m-configuration (Turing state) |
---|---|---|---|---|
q1 | blank | P0 | R | q2 |
q2 | blank | P blank, i.e. E | R | q3 |
q3 | blank | P1 | R | q4 |
q4 | blank | P blank, i.e. E | R | q1 |
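This atomized table lends itself to an even simpler interpreter, since every rule is now a strict 5-tuple. The sketch below assumes a dictionary-backed tape and renders the blank as "."; both are conveniences of this illustration rather than Turing's conventions.

```python
# Each rule is a strict 5-tuple:
# (state, scanned symbol) -> (symbol to print, move, next state).
# Printing the blank is the same as erasing ("P blank, i.e. E").

BLANK = '.'                               # the blank square, rendered as '.'
rules = {
    ('q1', BLANK): ('0',   'R', 'q2'),
    ('q2', BLANK): (BLANK, 'R', 'q3'),    # "P blank", i.e. E
    ('q3', BLANK): ('1',   'R', 'q4'),
    ('q4', BLANK): (BLANK, 'R', 'q1'),
}

tape, head, state = {}, 0, 'q1'
for _ in range(8):                        # eight atomic steps
    write, move, state = rules[(state, tape.get(head, BLANK))]
    tape[head] = write
    head += 1 if move == 'R' else -1      # only R occurs in this table

print(' '.join(tape.get(i, BLANK) for i in range(head)))
# -> 0 . 1 . 0 . 1 .  (the same figures on alternate squares as before)
```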
Turing's statement still implies five atomic operations. At a given instruction (m-configuration) the machine:

1. observes the tape symbol under the head;
2. based on the observed symbol, selects the appropriate instruction sequence to use;
3. prints a symbol, erases, or does nothing;
4. moves the tape left, right, or not at all;
5. goes to the final m-configuration for that symbol.
Because a Turing machine's actions are not atomic, a simulation of the machine must atomize each 5-tuple into a sequence of simpler actions. One possibility, used in the following examples of the "behaviors" of his machine, is to expand each 5-tuple into a short sequence of Post–Turing-style atomic instructions: test the scanned symbol and branch on it, print or erase, shift the tape one square, and jump to the instruction sequence for the final m-configuration.
So-called "canonical" finite state machines do the symbol tests "in parallel"; see more at microprogramming.
In the following example of what the machine does, note a peculiarity of Turing's model: the machine prints on alternate squares only, skipping every other square. The printed-on squares are called F-squares; the blank squares in between may be used for "markers" and are called "E-squares", as in "liable to erasure". The F-squares, in turn, are his "figure squares" and will only bear the symbols 1 or 0, symbols he called "figures" (as in "binary numbers").
In this example the tape starts out "blank", and the "figures" are then printed on it. For brevity only the TABLE-states are shown here:
Sequence | Instruction identifier | Tape, one column per square ("." denotes a blank square; the head remains over a fixed column while the tape shifts beneath it) |
---|---|---|
1 | 1 | . | . | . | . | . | . | . | . | . | . | . | . | . | . | . | . | . | . |
2 | 2 | . | . | . | . | . | 0 | . | . | . | . | . | . | . | . | . | . | . | . |
3 | 3 | . | . | . | . | . | . | 0 | . | . | . | . | . | . | . | . | . | . | . |
4 | 4 | . | . | . | . | . | 1 | . | 0 | . | . | . | . | . | . | . | . | . | . |
5 | 1 | . | . | . | . | . | . | 1 | . | 0 | . | . | . | . | . | . | . | . | . |
6 | 2 | . | . | . | . | . | 0 | . | 1 | . | 0 | . | . | . | . | . | . | . | . |
7 | 3 | . | . | . | . | . | . | 0 | . | 1 | . | 0 | . | . | . | . | . | . | . |
8 | 4 | . | . | . | . | . | 1 | . | 0 | . | 1 | . | 0 | . | . | . | . | . | . |
9 | 1 | . | . | . | . | . | . | 1 | . | 0 | . | 1 | . | 0 | . | . | . | . | . |
10 | 2 | . | . | . | . | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 | . | . | . | . |
11 | 3 | . | . | . | . | . | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 | . | . | . |
12 | 4 | . | . | . | . | . | 1 | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 | . | . |
13 | 1 | . | . | . | . | . | . | 1 | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 | . |
14 | 2 | . | . | . | . | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 | . | 1 | . | 0 |
The same "run" with all the intermediate tape-printing and movements is shown here:
A close look at the table reveals certain problems with Turing's own example—not all the symbols are accounted for.
For example, suppose his tape was not initially blank. What would happen? The Turing machine would read values different from those intended.
The next example is a copy subroutine, a very important subroutine used in the "multiply" routine.
The example Turing machine handles a string of 0s and 1s, with 0 represented by the blank symbol. Its task is to double any series of 1s encountered on the tape by writing a 0 between them. For example, when the head reads "111", it will write a 0, then "111". The output will be "1110111".
In order to accomplish its task, this Turing machine will need only 5 states of operation, called {s1, s2, s3, s4, s5}. Each state performs four actions: it reads the symbol under the head, prints a symbol or erases (or does nothing) according to what it read, moves the tape left or right (or not at all), and passes to the next state.
Initial m-configuration (current instruction) | Tape symbol | Print operation | Tape motion | Final m-configuration (next instruction) |
---|---|---|---|---|
s1 | 0 | N | N | H |
s1 | 1 | E | R | s2 |
s2 | 0 | E | R | s3 |
s2 | 1 | P1 | R | s2 |
s3 | 0 | P1 | L | s4 |
s3 | 1 | P1 | R | s3 |
s4 | 0 | E | L | s5 |
s4 | 1 | P1 | L | s4 |
s5 | 0 | P1 | R | s1 |
s5 | 1 | P1 | L | s5 |
H | — | — | — | — |
Print operation: P1 prints the symbol 1, E erases (prints a blank), and N does nothing. Tape motion: L moves left, R moves right, and N means no motion. H denotes the halt state.
A "run" of the machine sequences through 16 machine-configurations (aka Turing states):
Sequence | Instruction identifier | Tape, one column per square (the head remains over a fixed column while the tape shifts beneath it) |
---|---|---|
1 | s1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 |
2 | s2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
3 | s2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
4 | s3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
5 | s4 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
6 | s5 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
7 | s5 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
8 | s1 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 |
9 | s2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
10 | s3 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 |
11 | s3 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 |
12 | s4 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 |
13 | s4 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
14 | s5 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
15 | s1 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 |
16 | H | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 |
The behavior of this machine can be described as a loop: it starts out in s1, replaces the first 1 with a 0, then uses s2 to move to the right, skipping over 1s and the first 0 encountered. s3 then skips over the next sequence of 1s (initially there are none) and replaces the first 0 it finds with a 1. s4 moves back to the left, skipping over 1s until it finds a 0 and switches to s5. s5 then moves to the left, skipping over 1s until it finds the 0 that was originally written by s1.
It replaces that 0 with a 1, moves one position to the right and enters s1 again for another round of the loop.
This continues until s1 finds a 0 (this is the 0 in the middle of the two strings of 1s) at which time the machine halts.
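The loop described above can be checked with a short simulation. The sketch below reads the state table in the conventional way, with L and R moving the head (the run displayed above keeps the head over a fixed column and moves the tape instead, which is the mirror-image but equivalent view); the helper name copy_run and the dictionary-backed tape are conveniences of this illustration.

```python
# The copy subroutine's state table, read as
# (state, scanned symbol) -> (symbol to write, head move, next state).
# '0' doubles as the blank symbol; 'N' (do nothing) is rendered as
# rewriting the symbol that is already there.

rules = {
    ('s1', '1'): ('0', 'R', 's2'), ('s1', '0'): ('0', 'N', 'H'),
    ('s2', '1'): ('1', 'R', 's2'), ('s2', '0'): ('0', 'R', 's3'),
    ('s3', '1'): ('1', 'R', 's3'), ('s3', '0'): ('1', 'L', 's4'),
    ('s4', '1'): ('1', 'L', 's4'), ('s4', '0'): ('0', 'L', 's5'),
    ('s5', '1'): ('1', 'L', 's5'), ('s5', '0'): ('1', 'R', 's1'),
}

def copy_run(ones):
    tape = {i: '1' for i in range(ones)}          # the input block of 1s
    head, state = 0, 's1'
    while state != 'H':
        write, move, state = rules[(state, tape.get(head, '0'))]
        tape[head] = write
        head += {'R': 1, 'L': -1, 'N': 0}[move]
    return ''.join(tape.get(i, '0') for i in range(min(tape), max(tape) + 1))

print(copy_run(3))   # -> 1110111 : the block of three 1s has been copied
```

Called with copy_run(2) it returns 11011, and with copy_run(3) it returns 1110111, matching the example output stated above.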
Another description sees the problem as one of keeping track of how many "1"s there are. We cannot use one state for each possible number (a state for each of 0, 1, 2, 3, 4, 5, 6, and so on), because that would require infinitely many states to represent all the natural numbers, and the state machine is finite; the count therefore has to be tracked on the tape in some way.
The basic way it works is by copying each "1" to the other side, moving back and forth; the machine's state is enough to remember which part of the trip it is on. In more detail, it carries each "1" across to the other side, recognizing the separating "0" in the middle and recognizing the "0" on the far side to know it has reached the end. It comes back using the same method, detecting the middle "0" and then the "0" on the original side. This "0" on the original side is the key to the puzzle of how it keeps track of the number of 1s.
The trick is that before carrying a "1" across, the machine marks that digit as "taken" by replacing it with a "0". When it returns, it fills that "0" back in with a "1", then moves on to the next one, marking it with a "0" and repeating the cycle, carrying that "1" across, and so on. With each trip across and back, the marker "0" moves one step closer to the centre. This is how the machine keeps track of how many "1"s it has taken across.
When it returns, the marker "0" looks to the machine like the end of the collection of "1"s: any "1"s that have already been taken across are invisible to it (they lie on the other side of the marker "0"), so it is as if it were working on N−1 "1"s, somewhat as in a proof by mathematical induction.
A full "run" showing the results of the intermediate "motions". To see it better click on the image, then click on the higher resolution download:
The following table of instructions for a 3-state busy beaver was derived from Peterson (1988), page 198, Figure 7.15. Peterson moves the head; in the following model the tape moves.
Tape symbol | Current state A: write symbol | Current state A: move tape | Current state A: next state | Current state B: write symbol | Current state B: move tape | Current state B: next state | Current state C: write symbol | Current state C: move tape | Current state C: next state |
---|---|---|---|---|---|---|---|---|---|
0 | 1 | R | B | 1 | L | A | 1 | L | B |
1 | 1 | L | C | 1 | R | B | 1 | N | HALT |
The "state" drawing of the 3-state busy beaver shows the internal sequences of events required to actually perform "the state". As noted above Turing (1937) makes it perfectly clear that this is the proper interpretation of the 5-tuples that describe the instruction (Undecidable, p. 119). For more about the atomization of Turing 5-tuples see Post–Turing machine:
The following table shows the "compressed" run — just the Turing states:
Sequence | Instruction identifier | Tape, one column per square (the head remains over a fixed column while the tape shifts beneath it) |
---|---|---|
1 | A | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
2 | B | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
3 | A | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
4 | C | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 | B | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
6 | A | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
7 | B | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
8 | B | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 |
9 | B | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 |
10 | B | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 |
11 | B | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 |
12 | A | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 |
13 | C | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 |
14 | H | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 |
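The compressed run can be checked with a short simulation of the instruction table above. The sketch below assumes the machine starts in state A on an all-0 tape and reads the moves as head movements (the mirror image of the tape-movement convention used in the displayed runs); mirroring changes neither the number of steps nor the number of 1s printed.

```python
# The 3-state busy beaver table, read as
# (state, scanned symbol) -> (symbol to write, head move, next state).

rules = {
    ('A', 0): (1, +1, 'B'), ('A', 1): (1, -1, 'C'),
    ('B', 0): (1, -1, 'A'), ('B', 1): (1, +1, 'B'),
    ('C', 0): (1, -1, 'B'), ('C', 1): (1,  0, 'H'),
}

tape, head, state, steps = {}, 0, 'A', 0
while state != 'H':
    write, move, state = rules[(state, tape.get(head, 0))]
    tape[head] = write
    head += move
    steps += 1

print(steps, sum(tape.values()))   # -> 13 6 : halts after 13 steps with six 1s
```

Thirteen steps take the machine from the first row of the table above to the halting row, leaving six 1s on the tape.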
The full "run" of the 3-state busy beaver. The resulting Turing-states (what Turing called the "m-configurations" — "machine-configurations") are shown highlighted in grey in column A, and also under the machine's instructions (columns AF-AU)):
In mathematics and computer science, an algorithm is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes and deduce valid inferences, achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".
In complexity theory and computability theory, an oracle machine is an abstract machine used to study decision problems. It can be visualized as a Turing machine with a black box, called an oracle, which is able to solve certain problems in a single operation. The problem can be of any complexity class. Even undecidable problems, such as the halting problem, can be used.
A Turing machine is a mathematical model of computation describing an abstract machine that manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, it is capable of implementing any computer algorithm.
The Post correspondence problem is an undecidable decision problem that was introduced by Emil Post in 1946. Because it is simpler than the halting problem and the Entscheidungsproblem it is often used in proofs of undecidability.
In theoretical computer science, the busy beaver game aims at finding a terminating program of a given size that produces the most output possible. Since an endlessly looping program producing infinite output or running for infinite time is easily conceived, such programs are excluded from the game. This is sometimes also formulated as finding the machine that runs for the longest time, but both games are similarly difficult.
In computer science, a universal Turing machine (UTM) is a Turing machine capable of computing any computable sequence, as described by Alan Turing in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem". Common sense might say that a universal machine is impossible, but Turing proves that it is possible. He suggested that we may compare a man in the process of computing a real number to a machine which is only capable of a finite number of conditions q1, q2, ..., qI, which will be called "m-configurations". He then described the operation of such a machine and argued:
It is my contention that these operations include all those which are used in the computation of a number.
Computability is the ability to solve a problem in an effective manner. It is a key topic of the field of computability theory within mathematical logic and the theory of computation within computer science. The computability of a problem is closely linked to the existence of an algorithm to solve the problem.
In mathematical logic and theoretical computer science, a register machine is a generic class of abstract machines used in a manner similar to a Turing machine. All models of register machines are Turing equivalent.
In computer science, a random-access machine is a model of computation that describes an abstract machine in the general class of register machines. The RA-machine is very similar to the counter machine but with the added capability of "indirect addressing" of its registers. The "registers" are intuitively equivalent to the main memory of a common computer, except for the additional ability of registers to store natural numbers of any size. Like the counter machine, the RA-machine contains its execution instructions in the finite-state portion of the machine.
A Post–Turing machine is a "program formulation" of a type of Turing machine, comprising a variant of Emil Post's Turing-equivalent model of computation. Post's model and Turing's model, though very similar to one another, were developed independently. Turing's paper was received for publication in May 1936, followed by Post's in October. A Post–Turing machine uses a binary alphabet, an infinite sequence of binary storage locations, and a primitive programming language with instructions for bi-directional movement among the storage locations and alteration of their contents one at a time. The names "Post–Turing program" and "Post–Turing machine" were used by Martin Davis in 1973–1974. Later in 1980, Davis used the name "Turing–Post program".
Turing's proof is a proof by Alan Turing, first published in November 1936 with the title "On Computable Numbers, with an Application to the Entscheidungsproblem". It was the second proof of the negation of Hilbert's Entscheidungsproblem; that is, the conjecture that some purely mathematical yes–no questions can never be answered by computation; more technically, that some decision problems are "undecidable" in the sense that there is no single algorithm that infallibly gives a correct "yes" or "no" answer to each instance of the problem. In Turing's own words: "what I shall prove is quite different from the well-known results of Gödel ... I shall now show that there is no general method which tells whether a given formula U is provable in K [Principia Mathematica]".
Hao Wang's basic machine B is an extremely simple computational model equivalent to the Turing machine. It is "the first formulation of a Turing-machine theory in terms of computer-like models". With only 4 sequential instructions it is very similar to, but even simpler than, the 7 sequential instructions of the Post–Turing machine. In the same paper, Wang introduced a variety of equivalent machines, including what he called the W-machine, which is the B-machine with an "erase" instruction added to the instruction set.
Algorithm characterizations are attempts to formalize the word algorithm. Algorithm does not have a generally accepted formal definition. Researchers are actively working on this problem. This article will present some of the "characterizations" of the notion of "algorithm" in more detail.
There are many variants of the counter machine, among them those of Hermes, Ershov, Péter, Minsky, Lambek, Shepherdson and Sturgis, and Schönhage.
In computer science, a computation history is a sequence of steps taken by an abstract machine in the process of computing its result. Computation histories are frequently used in proofs about the capabilities of certain machines, and particularly about the undecidability of various formal languages.
Description numbers are numbers that arise in the theory of Turing machines. They are very similar to Gödel numbers, and are also occasionally called "Gödel numbers" in the literature. Given some universal Turing machine, every Turing machine can, given its encoding on that machine, be assigned a number. This is the machine's description number. These numbers play a key role in Alan Turing's proof of the undecidability of the halting problem, and are very useful in reasoning about Turing machines as well.
In computability theory, the mortality problem is a decision problem related to the halting problem. For Turing machines, the halting problem can be stated as follows: given a Turing machine and a word, decide whether the machine halts when run on that word. The mortality problem instead asks whether a given Turing machine eventually halts when started from every possible configuration, that is, from any state, any tape contents, and any head position.
The history of the Church–Turing thesis ("thesis") involves the history of the development of the study of the nature of functions whose values are effectively calculable; or, in more modern terms, functions whose values are algorithmically computable. It is an important topic in modern mathematical theory and computer science, particularly associated with the work of Alonzo Church and Alan Turing.
For complete references see Turing machine.