Turing machine examples


The following are examples to supplement the article Turing machine.


Turing's very first example

The following table is Turing's very first example (Alan Turing 1937):

"1. A machine can be constructed to compute the sequence 0 1 0 1 0 1..." (0 <blank> 1 <blank> 0...) (Undecidable p. 119)
Configuration (first two columns) | Behavior (last two columns)
m-configuration (state) | Tape symbol | Tape operations | Final m-configuration (state)
b | blank | P0, R | c
c | blank | R | e
e | blank | P1, R | f
f | blank | R | b

With regard to what actions the machine actually does, Turing (1936) (Undecidable p. 121) states the following:

"This [example] table (and all succeeding tables of the same kind) is to be understood to mean that for a configuration described in the first two columns the operations in the third column are carried out successively, and the machine then goes over into the m-configuration in the final column." (Undecidable p. 121)

He makes this very clear when he reduces the above table to a single instruction called "b" (Undecidable p. 120), although that instruction consists of three lines. Instruction "b" has three possible symbols: {None, 0, 1}. Each possibility is followed by a sequence of actions until we arrive at the rightmost column, where the final m-configuration is "b":

Current m-configuration (instruction) | Tape symbol | Operations on the tape | Final m-configuration (instruction)
b | None | P0 | b
b | 0 | R, R, P1 | b
b | 1 | R, R, P0 | b

As observed by a number of commentators, including Turing (1937) himself (e.g., Post (1936), Post (1947), Kleene (1952), Wang (1954)), the Turing instructions are not atomic; further simplifications of the model can be made without reducing its computational power. See more at Post–Turing machine.

As stated in the article Turing machine, Turing proposed that his table be further atomized by allowing only a single print/erase followed by a single tape movement L/R/N. He gives us this example of the first little table converted (Undecidable, p. 127):

Current m-configuration (Turing state) | Tape symbol | Print-operation | Tape-motion | Final m-configuration (Turing state)
q1 | blank | P0 | R | q2
q2 | blank | P blank, i.e. E | R | q3
q3 | blank | P1 | R | q4
q4 | blank | P blank, i.e. E | R | q1

Turing's statement still implies five atomic operations. At a given instruction (m-configuration) the machine:

  1. observes the tape-symbol underneath the head
  2. based on the observed symbol goes to the appropriate instruction-sequence to use
  3. prints symbol Sj or erases or does nothing
  4. moves tape left, right or not at all
  5. goes to the final m-configuration for that symbol
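
For concreteness, here is a minimal Python sketch of these five operations driving the four-row table above. It is an illustration only: the dictionary encoding, the use of +1 for a rightward move, and the underscore shown for blank squares are choices made for this example, not Turing's notation.

```python
# Minimal sketch: Turing's atomized 4-state machine (q1..q4), which prints
# 0 and 1 on alternate squares of an initially blank tape.
# The tape is a dict from square index to symbol; absent squares are blank (None).

# (state, observed symbol) -> (symbol to print or None, head motion, next state)
TABLE = {
    ("q1", None): ("0",  +1, "q2"),   # P0, R
    ("q2", None): (None, +1, "q3"),   # E (square stays blank), R
    ("q3", None): ("1",  +1, "q4"),   # P1, R
    ("q4", None): (None, +1, "q1"),   # E (square stays blank), R
}

def run(steps=8):
    tape, head, state = {}, 0, "q1"
    for _ in range(steps):
        symbol = tape.get(head)                               # 1. observe the scanned square
        print_sym, move, next_state = TABLE[(state, symbol)]  # 2. pick the matching row
        if print_sym is not None:                             # 3. print the symbol (or leave blank)
            tape[head] = print_sym
        head += move                                          # 4. move (here always one square right)
        state = next_state                                    # 5. go to the final m-configuration
    return "".join(tape.get(i, "_") for i in range(max(tape) + 1))

print(run())   # -> 0_1_0_1  (figures on alternate squares, blanks shown as "_")
```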

Because a Turing machine's actions are not atomic, a simulation of the machine must atomize each 5-tuple into a sequence of simpler actions. One possibility used in the following examples of "behaviors" of his machine is as follows:

(qi) Test the tape-symbol under the head: if the symbol is S0 go to qi.01, if the symbol is S1 go to qi.11, if the symbol is S2 go to qi.21, etc.
(qi.01) Print symbol Sj0, or erase, or do nothing, then go to qi.02
(qi.02) Move tape left, right, or not at all, then go to qm0
(qi.11) Print symbol Sj1, or erase, or do nothing, then go to qi.12
(qi.12) Move tape left, right, or not at all, then go to qm1
(qi.21) Print symbol Sj2, or erase, or do nothing, then go to qi.22
(qi.22) Move tape left, right, or not at all, then go to qm2
(etc.; all symbols must be accounted for)
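
The decomposition can be written out mechanically. The following Python sketch is only an illustration: the sub-state names follow the list above, but the data encoding and the helper name atomize are choices made for this example.

```python
# Sketch: expand one 5-tuple instruction into the sub-states listed above:
# a symbol test (qi), then a print sub-state (qi.k1) and a move sub-state (qi.k2)
# for each possible scanned symbol Sk.

def atomize(qi, rules):
    """rules maps each possible scanned symbol to (print_action, move_action, next_state)."""
    sub_states = {}
    # (qi): test the scanned symbol and branch to the matching print sub-state
    sub_states[qi] = {sym: f"{qi}.{k}1" for k, sym in enumerate(rules)}
    for k, (sym, (prt, mov, nxt)) in enumerate(rules.items()):
        sub_states[f"{qi}.{k}1"] = (prt, f"{qi}.{k}2")   # print / erase / do nothing
        sub_states[f"{qi}.{k}2"] = (mov, nxt)            # move L / R / N, then go on
    return sub_states

# Example: atomize instruction q1 of the table above (only the blank symbol occurs).
print(atomize("q1", {"blank": ("P0", "R", "q2")}))
# -> {'q1': {'blank': 'q1.01'}, 'q1.01': ('P0', 'q1.02'), 'q1.02': ('R', 'q2')}
```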

So-called "canonical" finite state machines do the symbol tests "in parallel"; see more at microprogramming.

In the following example of what the machine does, we will note some peculiarities of Turing's models:

"The convention of writing the figures only on alternate squares is very useful: I shall always make use of it." (Undecidable p. 121)

Thus when printing he skips every other square. The printed-on squares are called F-squares; the blank squares in between may be used for "markers" and are called "E-squares", as in "liable to erasure". The F-squares in turn are his "Figure squares" and will only bear the symbols 1 or 0, symbols he called "figures" (as in "binary numbers").

In this example the tape starts out "blank", and the "figures" are then printed on it. For brevity only the TABLE-states are shown here:

Sequence | Instruction identifier | Tape
1 | 1 | ..................
2 | 2 | .....0............
3 | 3 | ......0...........
4 | 4 | .....1.0..........
5 | 1 | ......1.0.........
6 | 2 | .....0.1.0........
7 | 3 | ......0.1.0.......
8 | 4 | .....1.0.1.0......
9 | 1 | ......1.0.1.0.....
10 | 2 | .....0.1.0.1.0....
11 | 3 | ......0.1.0.1.0...
12 | 4 | .....1.0.1.0.1.0..
13 | 1 | ......1.0.1.0.1.0.
14 | 2 | .....0.1.0.1.0.1.0

The same "run" with all the intermediate tape-printing and movements is shown here:

[Figure: Turing's first machine, full run with intermediate tape printings and movements]

A close look at the table reveals certain problems with Turing's own example: not all the symbols are accounted for.

For example, suppose his tape was not initially blank. What would happen? The Turing machine would read values other than those intended.

A copy subroutine

This is a very important subroutine used in the "multiply" routine.

The example Turing machine handles a string of 0s and 1s, with 0 represented by the blank symbol. Its task is to double any series of 1s encountered on the tape by writing a 0 between them. For example, when the head reads "111", it will write a 0, then "111". The output will be "1110111".

In order to accomplish its task, this Turing machine will need only 5 states of operation, which are called {s1, s2, s3, s4, s5}. Each state does 4 actions:

  1. Read the symbol under the head
  2. Write the output symbol decided by the state
  3. Move the tape to the left or to the right decided by the state
  4. Switch to the following state decided by the current state
Initial m-configuration (current instruction) | Tape symbol | Print operation | Tape motion | Final m-configuration (next instruction)
s1 | 0 | N | N | H
s1 | 1 | E | R | s2
s2 | 0 | E | R | s3
s2 | 1 | P1 | R | s2
s3 | 0 | P1 | L | s4
s3 | 1 | P1 | R | s3
s4 | 0 | E | L | s5
s4 | 1 | P1 | L | s4
s5 | 0 | P1 | R | s1
s5 | 1 | P1 | L | s5
H | (halt) | | |

Print operation: prints symbol S (e.g. P1), erases (E), or does nothing (N)
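
The table above can be run directly. The following Python sketch is an illustration only (the dictionary encoding, the helper name copy_run, and the use of the character "0" for the blank are choices made for this example, not part of the article's formalism); it reproduces the doubling behaviour described earlier.

```python
# Sketch: table-driven run of the copy subroutine above.
# Blank squares are represented by "0", so the erase operation E writes "0".

# (current instruction, scanned symbol) -> (symbol to write, head motion, next instruction)
TABLE = {
    ("s1", "0"): ("0",  0, "H"),    # N, no motion, halt
    ("s1", "1"): ("0", +1, "s2"),   # E, R
    ("s2", "0"): ("0", +1, "s3"),   # E, R
    ("s2", "1"): ("1", +1, "s2"),   # P1, R
    ("s3", "0"): ("1", -1, "s4"),   # P1, L
    ("s3", "1"): ("1", +1, "s3"),   # P1, R
    ("s4", "0"): ("0", -1, "s5"),   # E, L
    ("s4", "1"): ("1", -1, "s4"),   # P1, L
    ("s5", "0"): ("1", +1, "s1"),   # P1, R
    ("s5", "1"): ("1", -1, "s5"),   # P1, L
}

def copy_run(ones="111"):
    tape = {i: sym for i, sym in enumerate(ones)}   # head starts on the leftmost 1
    head, state = 0, "s1"
    while state != "H":
        write, move, state = TABLE[(state, tape.get(head, "0"))]
        tape[head] = write
        head += move
    return "".join(tape.get(i, "0") for i in range(min(tape), max(tape) + 1)).strip("0")

print(copy_run("111"))   # -> 1110111
print(copy_run("11"))    # -> 11011 (the input used in the run table below)
```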


A "run" of the machine sequences through 16 machine-configurations (aka Turing states):

Sequence | Instruction identifier | Tape
1 | s1 | 00001100000
2 | s2 | 00000100000
3 | s2 | 00000010000
4 | s3 | 00000001000
5 | s4 | 00001010000
6 | s5 | 00010100000
7 | s5 | 00101000000
8 | s1 | 00010110000
9 | s2 | 00001001000
10 | s3 | 00000100100
11 | s3 | 00000010010
12 | s4 | 00001100100
13 | s4 | 00011001000
14 | s5 | 00110010000
15 | s1 | 00011011000
16 | H | 00011011000

The behavior of this machine can be described as a loop: it starts out in s1, replaces the first 1 with a 0, then uses s2 to move to the right, skipping over 1s and the first 0 encountered. s3 then skips over the next sequence of 1s (initially there are none) and replaces the first 0 it finds with a 1. s4 moves back to the left, skipping over 1s until it finds a 0 and switches to s5. s5 then moves to the left, skipping over 1s until it finds the 0 that was originally written by s1.

It replaces that 0 with a 1, moves one position to the right and enters s1 again for another round of the loop.

This continues until s1 finds a 0 (this is the 0 in the middle of the two strings of 1s) at which time the machine halts.

Alternative description

Another description sees the problem as how to keep track of how many "1"s there are. We can't use one state for each possible number (a state for each of 0, 1, 2, 3, 4, 5, 6, etc.), because then we would need infinitely many states to represent all the natural numbers, and the state machine is finite; we have to keep track of the count using the tape in some way.

The basic way it works is by copying each "1" to the other side, moving back and forth; it is intelligent enough to remember which part of the trip it is on. In more detail, it carries each "1" across to the other side, recognizing the separating "0" in the middle and recognizing the "0" on the other side to know it has reached the end. It comes back using the same method, detecting the middle "0" and then the "0" on the original side. This "0" on the original side is the key to the puzzle of how it keeps track of the number of 1s.

The trick is that before carrying the "1", it marks that digit as "taken" by replacing it with a "0". When it returns, it fills that "0" back in with a "1", then moves on to the next one, marking it with a "0" and repeating the cycle, carrying that "1" across and so on. With each trip across and back, the marker "0" moves one step closer to the centre. This is how it keeps track of how many "1"s it has taken across.

When it returns, the marker "0" looks to the machine like the end of the collection of "1"s: any "1"s that have already been taken across are invisible to it (they lie on the other side of the marker "0"), so it is as if it is working on N-1 "1"s, similar to a proof by mathematical induction.

A full "run" showing the results of the intermediate "motions". To see it better click on the image, then click on the higher resolution download:

[Figure: Turing machine copy example, full run]

3-state Busy Beaver

The following Turing table of instructions was derived from Peterson (1988) page 198, Figure 7.15. Peterson moves the head; in the following model the tape moves.

Tape symbol | Current state A (write symbol, move tape, next state) | Current state B (write symbol, move tape, next state) | Current state C (write symbol, move tape, next state)
0 | 1, R, B | 1, L, A | 1, L, B
1 | 1, L, C | 1, R, B | 1, N, HALT
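
A short Python sketch of this table follows (an illustration, not Peterson's code). Because the tape rather than the head moves in this model, a tape motion of R is simulated by decreasing the head index by one, and L by increasing it:

```python
# Sketch: run the 3-state busy beaver table above.
# The TAPE moves in this model, so "move tape R" is simulated as the head index
# decreasing by one, "L" as increasing by one, and "N" as staying put.
MOVE = {"R": -1, "L": +1, "N": 0}

# (current state, scanned symbol) -> (write symbol, tape motion, next state)
TABLE = {
    ("A", "0"): ("1", "R", "B"), ("A", "1"): ("1", "L", "C"),
    ("B", "0"): ("1", "L", "A"), ("B", "1"): ("1", "R", "B"),
    ("C", "0"): ("1", "L", "B"), ("C", "1"): ("1", "N", "HALT"),
}

def busy_beaver():
    tape, head, state, steps = {}, 0, "A", 0    # tape starts all blank ("0")
    while state != "HALT":
        write, motion, state = TABLE[(state, tape.get(head, "0"))]
        tape[head] = write
        head += MOVE[motion]
        steps += 1
    return steps, sum(sym == "1" for sym in tape.values())

print(busy_beaver())   # -> (13, 6): halts after 13 steps with six 1s, as in the run below
```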

The "state" drawing of the 3-state busy beaver shows the internal sequences of events required to actually perform "the state". As noted above Turing (1937) makes it perfectly clear that this is the proper interpretation of the 5-tuples that describe the instruction (Undecidable, p. 119). For more about the atomization of Turing 5-tuples see Post–Turing machine:

[Figure: State diagram of the 3-state busy beaver]

The following table shows the "compressed" run, listing just the Turing states:

Sequence | Instruction identifier | Tape
1 | b | 00000000000000
2 | B | 00000001000000
3 | A | 00000110000000
4 | C | 00001100000000
5 | B | 00011100000000
6 | A | 00111100000000
7 | B | 00011111000000
8 | B | 00001111100000
9 | B | 00000111110000
10 | B | 00000011111000
11 | B | 00000001111100
12 | A | 00000111111000
13 | C | 00001111110000
14 | H | 00001111110000

The full "run" of the 3-state busy beaver. The resulting Turing-states (what Turing called the "m-configurations" "machine-configurations") are shown highlighted in grey in column A, and also under the machine's instructions (columns AF-AU)):

[Figure: Turing machine example, 3-state busy beaver, full run]


References

For complete references see Turing machine.

  • Alan Turing, 1937, "On Computable Numbers, with an Application to the Entscheidungsproblem", pp. 116ff, with brief commentary by Davis on p. 115.
  • Alan Turing, 1937, "On Computable Numbers, with an Application to the Entscheidungsproblem. A Correction", pp. 152–154.