
Universal Turing machine

In computer science, a universal Turing machine (UTM) is a Turing machine that can simulate an arbitrary Turing machine on arbitrary input. The universal machine achieves this by reading both the description of the machine to be simulated and that machine's input from its own tape. Alan Turing introduced this machine in 1936–1937. This model is considered by some (for example, Martin Davis (2000)) to be the origin of the stored-program computer, used by John von Neumann (1946) for the "Electronic Computing Instrument" that now bears von Neumann's name: the von Neumann architecture. It is also known as a universal computing machine, universal machine (UM), or simply machine U.

In terms of computational complexity, a multi-tape universal Turing machine need only be slower by a logarithmic factor than the machines it simulates.

Introduction

Every Turing machine computes a certain fixed partial computable function from the input strings over its alphabet. In that sense it behaves like a computer with a fixed program. However, we can encode the action table of any Turing machine in a string. Thus we can construct a Turing machine that expects on its tape a string describing an action table followed by a string describing the input tape, and computes the tape that the encoded Turing machine would have computed. Turing described such a construction in complete detail in his 1936 paper:

"It is possible to invent a single machine which can be used to compute any computable sequence. If this machine U is supplied with a tape on the beginning of which is written the S.D ["standard description" of an action table] of some computing machine M, then U will compute the same sequence as M." [1]


Stored-program computer

Davis makes a persuasive argument that Turing's conception of what is now known as "the stored-program computer", of placing the "action table"—the instructions for the machine—in the same "memory" as the input data, strongly influenced John von Neumann's conception of the first American discrete-symbol (as opposed to analog) computer—the EDVAC. Davis quotes Time magazine to this effect, that "everyone who taps at a keyboard... is working on an incarnation of a Turing machine," and that "John von Neumann [built] on the work of Alan Turing" (Davis 2000:193 quoting Time magazine of 29 March 1999).

Davis makes a case that Turing's Automatic Computing Engine (ACE) computer "anticipated" the notions of microprogramming (microcode) and RISC processors (Davis 2000:188). Knuth cites Turing's work on the ACE computer as designing "hardware to facilitate subroutine linkage" (Knuth 1973:225); Davis also references this work as Turing's use of a hardware "stack" (Davis 2000:237 footnote 18).

Just as the Turing machine encouraged the construction of computers, the UTM encouraged the development of the fledgling computer sciences. An early, if not the very first, assembler was proposed "by a young hot-shot programmer" for the EDVAC (Davis 2000:192). Von Neumann's "first serious program ... [was] to simply sort data efficiently" (Davis 2000:184). Knuth observes that the subroutine return embedded in the program itself rather than in special registers is attributable to von Neumann and Goldstine.[2] Knuth furthermore states that

"The first interpretive routine may be said to be the "Universal Turing Machine" ... Interpretive routines in the conventional sense were mentioned by John Mauchly in his lectures at the Moore School in 1946 ... Turing took part in this development also; interpretive systems for the Pilot ACE computer were written under his direction" (Knuth 1973:226).

Davis briefly mentions operating systems and compilers as outcomes of the notion of program-as-data (Davis 2000:185).

Some, however, might raise issues with this assessment. At the time (mid-1940s to mid-1950s) a relatively small cadre of researchers were intimately involved with the architecture of the new "digital computers". Hao Wang (1954), a young researcher at this time, made the following observation:

"Turing's theory of computable functions antedated but has not much influenced the extensive actual construction of digital computers. These two aspects of theory and practice have been developed almost entirely independently of each other. The main reason is undoubtedly that logicians are interested in questions radically different from those with which the applied mathematicians and electrical engineers are primarily concerned. It cannot, however, fail to strike one as rather strange that often the same concepts are expressed by very different terms in the two developments." (Wang 1954, 1957:63)

Wang hoped that his paper would "connect the two approaches." Indeed, Minsky confirms this: "that the first formulation of Turing-machine theory in computer-like models appears in Wang (1957)" (Minsky 1967:200). Minsky goes on to demonstrate Turing equivalence of a counter machine.

With respect to the reduction of computers to simple Turing equivalent models (and vice versa), Minsky's designation of Wang as having made "the first formulation" is open to debate. While both Minsky's paper of 1961 and Wang's paper of 1957 are cited by Shepherdson and Sturgis (1963), they also cite and summarize in some detail the work of European mathematicians Kaphenst (1959), Ershov (1959), and Péter (1958). The names of mathematicians Hermes (1954, 1955, 1961) and Kaphenst (1959) appear in the bibliographies of both Shepherdson-Sturgis (1963) and Elgot-Robinson (1961). Two other names of importance are Canadian researchers Melzak (1961) and Lambek (1961). For much more see Turing machine equivalents; references can be found at Register machine.

Mathematical theory

With this encoding of action tables as strings it becomes possible in principle for Turing machines to answer questions about the behaviour of other Turing machines. Most of these questions, however, are undecidable, meaning that the function in question cannot be calculated mechanically. For instance, the problem of determining whether an arbitrary Turing machine will halt on a particular input, or on all inputs, known as the Halting problem, was shown to be, in general, undecidable in Turing's original paper. Rice's theorem shows that any non-trivial question about the output of a Turing machine is undecidable.

A universal Turing machine can calculate any recursive function, decide any recursive language, and accept any recursively enumerable language. According to the Church-Turing thesis, the problems solvable by a universal Turing machine are exactly those problems solvable by an algorithm or an effective method of computation, for any reasonable definition of those terms. For these reasons, a universal Turing machine serves as a standard against which to compare computational systems, and a system that can simulate a universal Turing machine is called Turing complete.

An abstract version of the universal Turing machine is the universal function, a computable function which can be used to calculate any other computable function. The utm theorem proves the existence of such a function.
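In the standard notation of computability theory, where φ_i denotes the partial computable function computed by the program with index i and ⟨i, x⟩ is a pairing of index and input, the theorem can be stated as below; this is the usual textbook formulation rather than wording taken from the sources cited here.

\exists\, u \;\; \forall i \; \forall x : \quad \varphi_u(\langle i, x \rangle) \simeq \varphi_i(x)

Here ≃ means that either both sides are undefined, or both are defined and equal: the single program u, given any program index i and input x, computes exactly what program i computes on x.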

Efficiency

Without loss of generality, the input of a Turing machine can be assumed to be over the alphabet {0, 1}; any other finite alphabet can be encoded over {0, 1}. The behavior of a Turing machine M is determined by its transition function, and this function can likewise be encoded as a string over the alphabet {0, 1}. The size of the alphabet of M, the number of tapes it has, and the size of its state space can be deduced from the transition function's table. The distinguished states and symbols can be identified by their position, e.g. the first two states can by convention be the start and stop states. Consequently, every Turing machine can be encoded as a string over the alphabet {0, 1}. Additionally, we adopt the convention that every invalid encoding maps to a trivial Turing machine that immediately halts, and that every Turing machine has infinitely many encodings, obtained by padding the encoding with an arbitrary number of (say) 1's at the end, much as comments work in a programming language. It should be no surprise that such an encoding can be achieved, given the existence of Gödel numberings and the computational equivalence between Turing machines and μ-recursive functions. Similarly, the construction associates to every binary string α a Turing machine Mα.
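As an illustration, the following Python sketch serializes a transition table into a string over {0, 1}. The concrete field layout (fixed-width binary state numbers, one bit per tape symbol, two bits per head move) is an arbitrary choice made for this example, not a canonical encoding; the padding function shows the "trailing 1's as comments" convention described above.

MOVE_CODE = {"L": "00", "R": "01", "N": "10"}   # illustrative 2-bit move encoding

def encode_machine(transitions, state_bits=4):
    # transitions: dict (state, read bit) -> (next state, write bit, move).
    # States are small integers; by positional convention, state 0 is the start
    # state and state 1 the halting state.
    groups = []
    for (s, r), (s2, w, m) in sorted(transitions.items()):
        groups.append(format(s, f"0{state_bits}b") + str(r) +
                      format(s2, f"0{state_bits}b") + str(w) + MOVE_CODE[m])
    return "".join(groups)

def pad(code, extra_ones):
    # Trailing 1's act like comments: the same machine has infinitely many encodings.
    return code + "1" * extra_ones

# A one-rule machine: in the start state, on reading 0, write 1, move right, halt.
code = encode_machine({(0, 0): (1, 1, "R")})
print(code)           # 000000001101  (fields: 0000 | 0 | 0001 | 1 | 01)
print(pad(code, 3))   # 000000001101111, a different encoding of the same machine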

Starting from the above encoding, in 1966 F. C. Hennie and R. E. Stearns showed that, given a Turing machine Mα that halts on input x within N steps, there exists a multi-tape universal Turing machine that halts on inputs α, x (given on different tapes) within C N log N steps, where C is a machine-specific constant that does not depend on the length of the input x, but does depend on Mα's alphabet size, number of tapes, and number of states. Effectively this is an O(N log N) simulation.[3]

Smallest machines

When Alan Turing came up with the idea of a universal machine he had in mind the simplest computing model powerful enough to calculate all possible functions that can be calculated. Claude Shannon first explicitly posed the question of finding the smallest possible universal Turing machine in 1956. He showed that two symbols were sufficient so long as enough states were used (or vice versa), and that it was always possible to exchange states for symbols.

Marvin Minsky discovered a 7-state 4-symbol universal Turing machine in 1962 using 2-tag systems. Other small universal Turing machines have since been found by Yurii Rogozhin and others by extending this approach of tag system simulation. If we denote by (m, n) the class of UTMs with m states and n symbols, the following tuples have been found: (15, 2), (9, 3), (6, 4), (5, 5), (4, 6), (3, 9), and (2, 18).[4][5][6] Rogozhin's (4, 6) machine uses only 22 instructions, and no standard UTM of lesser descriptional complexity is known.

However, generalizing the standard Turing machine model admits even smaller UTMs. One such generalization is to allow an infinitely repeated word on one or both sides of the Turing machine input; this extends the definition of universality and is known as "semi-weak" or "weak" universality, respectively. Small weakly universal Turing machines that simulate the Rule 110 cellular automaton have been given for the (6, 2), (3, 3), and (2, 4) state-symbol pairs.[7] The proof of universality for Wolfram's 2-state 3-symbol Turing machine further extends the notion of weak universality by allowing certain non-periodic initial configurations. Other variants on the standard Turing machine model that yield small UTMs include machines with multiple tapes or multidimensional tapes, and machines coupled with a finite automaton.

Example of universal-machine coding

For those who would undertake the challenge of designing a UTM exactly as Turing specified, see the article by Davies in Copeland (2004:103ff). Davies corrects the errors in the original and shows what a sample run would look like. He claims to have successfully run a (somewhat simplified) simulation.

The following example is taken from Turing (1936). For more about this example see the page Turing machine examples.

Turing used seven symbols { A, C, D, R, L, N, ; } to encode each 5-tuple; as described in the article Turing machine, his 5-tuples are only of types N1, N2, and N3. The number of each "m-configuration" (instruction, state) is represented by "D" followed by a unary string of A's, e.g. "q3" = DAAA. In a similar manner he encodes the symbols: the blank as "D", the symbol "0" as "DC", the symbol "1" as "DCC", etc. The symbols "R", "L", and "N" remain as is.

After encoding, each 5-tuple is "assembled" into a string, in order, as shown in the following table:

Current m-config | Tape symbol | Print op | Tape motion | Final m-config | m-config code | Symbol code | Print-op code | Motion code | Final m-config code | 5-tuple assembled code
q1 | blank | P0 | R | q2 | DA    | D | DC  | R | DAA   | DADDCRDAA
q2 | blank | E  | R | q3 | DAA   | D | D   | R | DAAA  | DAADDRDAAA
q3 | blank | P1 | R | q4 | DAAA  | D | DCC | R | DAAAA | DAAADDCCRDAAAA
q4 | blank | E  | R | q1 | DAAAA | D | D   | R | DA    | DAAAADDRDA

Finally, the codes for all four 5-tuples are strung together into a single code, beginning with ";" and with ";" separating the 5-tuples, i.e.:

;DADDCRDAA;DAADDRDAAA;DAAADDCCRDAAAA;DAAAADDRDA
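As a check on the encoding described above, the following Python sketch mechanically rebuilds this string from the table. The helper names (q, SYM, five_tuple) are illustrative only; the conventions encoded are Turing's: state qn is "D" followed by n A's, the blank is "D", "0" is "DC", "1" is "DCC", and an erase (E) is treated as printing the blank.

def q(n):
    # m-configuration q_n is encoded as "D" followed by n copies of "A"
    return "D" + "A" * n

SYM = {"blank": "D", "0": "DC", "1": "DCC"}

def five_tuple(state, read, print_op, move, next_state):
    write = "blank" if print_op == "E" else print_op[1]   # "P0" -> "0", "P1" -> "1"
    return q(state) + SYM[read] + SYM[write] + move + q(next_state)

table = [(1, "blank", "P0", "R", 2),
         (2, "blank", "E",  "R", 3),
         (3, "blank", "P1", "R", 4),
         (4, "blank", "E",  "R", 1)]

print("".join(";" + five_tuple(*row) for row in table))
# -> ;DADDCRDAA;DAADDRDAAA;DAAADDCCRDAAAA;DAAAADDRDA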

This code he placed on alternate squares, the "F-squares", leaving the "E-squares" (those liable to erasure) empty. The final assembly of the code on the tape for the U-machine consists of placing two special symbols ("e") one after the other, then the code separated out on alternate squares, and lastly the double-colon symbol "::" (blanks shown here with "." for clarity):

ee.;.D.A.D.D.C.R.D.A.A.;.D.A.A.D.D.R.D.A.A.A.;.D.A.A.A.D.D.C.C.R.D.A.A.A.A.;.D.A.A.A.A.D.D.R.D.A.::......
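The layout on F-squares can likewise be reproduced with a few lines of Python; this is a sketch of the tape layout only, not of the U-machine itself, and the trailing run of blank squares shown above is omitted.

sd = ";DADDCRDAA;DAADDRDAAA;DAAADDCCRDAAAA;DAAAADDRDA"
# "ee" prefix, then each S.D symbol on an F-square with an empty E-square
# (shown as ".") before it, terminated by the double colon "::".
print("ee" + "".join("." + c for c in sd) + ".::")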

The U-machine's action table (state-transition table) is responsible for decoding the symbols. Turing's action table keeps track of its place with the markers "u", "v", "x", "y", "z", placing them in "E-squares" to the right of "the marked symbol". For example, to mark the current instruction, "z" is placed to the right of ";", while "x" keeps the place with respect to the current "m-configuration" DAA. The U-machine's action table will shuttle these symbols around (erasing them and placing them in different locations) as the computation progresses:

ee.; .D.A.D.D.C.R.D.A.A. ; zD.A.AxD.D.R.D.A.A.A.;.D.A.A.A.D.D.C.C.R.D.A.A.A.A.;.D.A.A.A.A.D.D.R.D.A.::......

Turing's action-table for his U-machine is very involved.

A number of other commentators (notably Penrose 1989) provide examples of ways to encode instructions for the universal machine. Like Penrose, most commentators use only binary symbols, i.e. only the symbols { 0, 1 } or { blank, mark | }. Penrose goes further and writes out his entire U-machine code (Penrose 1989:71–73). He asserts that it truly is a U-machine code, an enormous number that spans almost 2 full pages of 1's and 0's. For readers interested in simpler encodings for the Post-Turing machine, the discussion of Davis in Steen (Steen 1980:251ff) may be useful.

See also

  • Alternating Turing machine
  • Von Neumann universal constructor, an attempt to build a self-replicating Turing machine

References

  1. ^ Boldface replacing script. Turing 1936 in Davis 1965:127-128. An example of Turing's notion of S.D is given at the end of this article.
  2. ^ In particular: Burks, Goldstine, von Neumann (1946), Preliminary discussion of the logical design of an electronic computing instrument, reprinted in Bell and Newell 1971
  3. ^ Arora and Barak, 2009, Theorem 1.9
  4. ^ Rogozhin, 1996
  5. ^ Kudlek and Rogozhin, 2002
  6. ^ Neary and Woods, 2009
  7. ^ Neary and Woods, 2009b

General references

  • Arora, Sanjeev; Barak, Boaz, Computational Complexity: A Modern Approach, Cambridge University Press, 2009, ISBN 978-0-521-42426-4, section 1.4, "Machines as strings and the universal Turing machine", and section 1.7, "Proof of theorem 1.9"

Seminal papers

  • F. C. Hennie and R. E. Stearns. Two-tape simulation of multitape Turing machines. JACM, 13(4):533–546, 1966.

Other references

  • Copeland, Jack, ed. (2004), The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life plus The Secrets of Enigma, Oxford UK: Oxford University Press, ISBN 0-19-825079-7
  • Davis, Martin (1980), "What is Computation?", in Steen, Lynn Arthur, Mathematics Today: Twelve Informal Essays, New York NY: Vintage Books (Random House), ISBN 978-0-394-74503-9.
  • Davis, Martin (2000), Engines of Logic: Mathematicians and the origin of the Computer (1st ed.), New York NY: W. W. Norton & Company, ISBN 0-393-32229-7, (pb.)
  • Goldstine, Herman H., and von Neumann, John, "Planning and Coding of the Problems for an Electronic Computing Instrument", Rep. 1947, Institute for Advanced Study, Princeton. Reprinted on pp. 92–119 in Bell, C. Gordon and Newell, Allen (1971), Computer Structures: Readings and Examples, McGraw-Hill Book Company, New York. ISBN 0-07-004357-4.
  • Herken, Rolf (1995), The Universal Turing Machine – A Half-Century Survey, Springer Verlag, ISBN 3-211-82637-8
  • Knuth, Donald E. (first edition 1968), The Art of Computer Programming, Volume 1/Fundamental Algorithms (2nd ed., 1973), Addison-Wesley Publishing Company. The first of Knuth's series of three texts.
  • Kudlek, Manfred; Rogozhin, Yurii (2002), "A universal Turing machine with 3 states and 9 symbols", LNCS 2295: 311–318
  • Minsky, Marvin (1962), "Size and Structure of Universal Turing Machines using Tag Systems", Recursive Function Theory, Proc. Symp. Pure Mathematics (Providence RI: American Mathematical Society) 5: 229–238
  • Neary, Turlough; Woods, Damien (2009), "Four Small Universal Turing Machines", Fundamenta Informaticae 91 (1)
  • Neary, Turlough; Woods, Damien (2009b), "Small Weakly Universal Turing Machines", 17th International Symposium on Fundamentals of Computation Theory, Springer LNCS 5699, pp. 262–273
  • Penrose, Roger (1989), The Emperor's New Mind, Oxford UK: Oxford University Press, ISBN 0-19-851973-7, (hc.), (pb.)
  • Rogozhin, Yurii (1996), "Small Universal Turing Machines", Theoretical Computer Science 168 (2): 215–240, doi:10.1016/S0304-3975(96)00077-1
  • Shannon, Claude (1956), "A Universal Turing Machine with Two Internal States", Automata Studies, Princeton, NJ: Princeton University Press, pp. 157–165
  • Turing, Alan (1936), "On Computable Numbers, With an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society 42 (2), http://www.abelard.org/turpap2/tp2-ie.asp (and Turing, A.M. (1938), "On Computable Numbers, with an Application to the Entscheidungsproblem: A correction", Proceedings of the London Mathematical Society, 2 43 (6): 544–6, 1937, doi:10.1112/plms/s2-43.6.544). Reprinted in Martin Davis ed. (1965), The Undecidable, Raven Press, Hewlett NY, pp. 115–154; with corrections to Turing's UTM by Emil Post, cf. footnote 11 in Davis 1965:299.