Linear bounded automaton

In computer science, a linear bounded automaton (plural linear bounded automata, abbreviated LBA) is a restricted form of Turing machine.

Operation

A linear bounded automaton is a Turing machine that satisfies the following three conditions:

  1. Its input alphabet includes two special symbols, serving as the left and right endmarkers.
  2. Its transitions may not print other symbols over the endmarkers.
  3. Its transitions may neither move to the left of the left endmarker nor to the right of the right endmarker.

In other words: instead of having potentially infinite tape on which to compute, computation is restricted to the portion of the tape containing the input plus the two tape squares holding the endmarkers.

An alternative, less restrictive definition is as follows: like a Turing machine, an LBA has a tape of cells holding symbols from a finite alphabet, a head that can read or write one cell at a time and move in either direction, and a finite number of states; but the length of tape it may use is bounded by a linear function of the length of the input.

This limitation makes an LBA a somewhat more accurate model of a real-world computer than a Turing machine, whose definition assumes unlimited tape.

The stronger and the weaker definition lead to the same computational abilities of the respective automaton classes,[1]:225 by the same argument used to prove the linear speedup theorem: a tape of length kn can be simulated on a tape of length n by packing k adjacent tape symbols into one symbol of an enlarged alphabet.
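
To make the restriction concrete, here is a minimal Python sketch (our illustration, not drawn from the sources above) of an LBA-style procedure for the context-sensitive language { aⁿbⁿcⁿ : n ≥ 1 }. All rewriting happens in place on a tape consisting of the input plus the two endmarkers, so no cell outside that portion is ever touched:

    import re

    def lba_accepts_anbncn(word: str) -> bool:
        # Phase 1: one left-to-right sweep checks the shape a+ b+ c+ (a real
        # LBA does this with its finite control while scanning the input).
        if not re.fullmatch(r"a+b+c+", word):
            return False
        # Phase 2: the bounded tape is the input plus the endmarkers < and >.
        tape = ["<"] + list(word) + [">"]
        # Cross off one 'a', one 'b' and one 'c' per sweep, overwriting
        # cells in place -- the tape never grows past the endmarkers.
        while "a" in tape or "b" in tape or "c" in tape:
            for ch in "abc":
                if ch not in tape:
                    return False              # the three letter counts differ
                tape[tape.index(ch)] = "X"    # mark in place
        return True                           # only marked cells remain

    assert lba_accepts_anbncn("aabbcc") and not lba_accepts_anbncn("aabbc")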

LBA and context-sensitive languages

Linear bounded automata are acceptors for the class of context-sensitive languages.[1]:225–226 The only restriction placed on grammars for such languages is that no production maps a string to a shorter string. Thus no derivation of a string in a context-sensitive language can contain a sentential form longer than the string itself. Since there is a one-to-one correspondence between linear bounded automata and such grammars, no more tape than that occupied by the original string is necessary for the string to be recognized by the automaton.
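
As an illustration of this restriction, the following hypothetical Python snippet lists the productions of the classic noncontracting grammar for { aⁿbⁿcⁿ : n ≥ 1 } and checks that no production maps a string to a shorter string:

    # Productions of the classic noncontracting grammar for a^n b^n c^n,
    # written as (left-hand side, right-hand side) pairs; upper-case letters
    # are nonterminals, lower-case letters are terminals.
    productions = [
        ("S", "aSBC"), ("S", "aBC"),   # build up a's with matching B's and C's
        ("CB", "BC"),                  # move B's to the left of C's
        ("aB", "ab"), ("bB", "bb"),    # rewrite nonterminals into terminals
        ("bC", "bc"), ("cC", "cc"),
    ]

    # The defining restriction: every right-hand side is at least as long as
    # its left-hand side, so sentential forms never shrink in a derivation.
    assert all(len(lhs) <= len(rhs) for lhs, rhs in productions)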

History

In 1960, John Myhill introduced an automaton model today known as deterministic linear bounded automaton. [2] In 1963, Peter Landweber proved that the languages accepted by deterministic LBAs are context-sensitive. [3] In 1964, S.-Y. Kuroda introduced the more general model of (nondeterministic) linear bounded automata, and adapted Landweber's proof to show that the languages accepted by nondeterministic linear bounded automata are precisely the context-sensitive languages. [4] [5]

LBA problems

In his seminal paper, Kuroda also stated two research challenges, which subsequently became known as the "LBA problems". The first LBA problem is whether the class of languages accepted by LBA is equal to the class of languages accepted by deterministic LBA. This problem can be phrased succinctly in the language of computational complexity theory as:

First LBA problem: Is NSPACE(O(n)) = DSPACE(O(n))?

The second LBA problem is whether the class of languages accepted by LBA is closed under complement.

Second LBA problem: Is NSPACE(O(n)) = co-NSPACE(O(n))?

As Kuroda already observed, a negative answer to the second LBA problem would imply a negative answer to the first problem. But the second LBA problem has an affirmative answer, implied by the Immerman–Szelepcsényi theorem, proved 20 years after the problem was raised.[6][7] As of today, the first LBA problem still remains open. Savitch's theorem provides an initial insight: NSPACE(O(n)) ⊆ DSPACE(O(n²)).[8]

Related Research Articles

<span class="mw-page-title-main">Chomsky hierarchy</span> Hierarchy of classes of formal grammars

The Chomsky hierarchy, in the fields of formal language theory, computer science, and linguistics, is a containment hierarchy of classes of formal grammars. A formal grammar describes how to form strings from a language's vocabulary that are valid according to the language's syntax. The linguist Noam Chomsky theorized that four different classes of formal grammars exist that can generate increasingly complex languages, and each class can also generate all the languages of the classes below it.

A context-sensitive grammar (CSG) is a formal grammar in which the left-hand sides and right-hand sides of any production rules may be surrounded by a context of terminal and nonterminal symbols. Context-sensitive grammars are more general than context-free grammars, in the sense that there are languages that can be described by a CSG but not by a context-free grammar. Context-sensitive grammars are less general than unrestricted grammars. Thus, CSGs are positioned between context-free and unrestricted grammars in the Chomsky hierarchy.

In formal language theory, a context-sensitive language is a language that can be defined by a context-sensitive grammar. Context-sensitive grammars are one of the four types of grammars in the Chomsky hierarchy.

<span class="mw-page-title-main">Pushdown automaton</span> Type of automaton

In the theory of computation, a branch of theoretical computer science, a pushdown automaton (PDA) is a type of automaton that employs a stack.

<span class="mw-page-title-main">Automata theory</span> Study of abstract machines and automata

Automata theory is the study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a theory in theoretical computer science with close connections to mathematical logic. The word automata comes from the Greek word αὐτόματος, which means "self-acting, self-willed, self-moving". An automaton is an abstract self-propelled computing device which follows a predetermined sequence of operations automatically. An automaton with a finite number of states is called a finite automaton (FA) or finite-state machine (FSM). A finite-state machine consists of states and transitions: as the automaton reads an input symbol, it makes a transition to another state according to its transition function, which takes the previous state and the current input symbol as its arguments.
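
As a minimal illustration of the transition function just described (an assumed toy example, not part of the original article), the following Python finite-state machine accepts binary strings containing an even number of 1s:

    # Transition function as a table mapping (state, input symbol) to the
    # next state.
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }

    def accepts(word: str) -> bool:
        state = "even"                            # start state
        for symbol in word:
            state = transitions[(state, symbol)]  # one transition per symbol
        return state == "even"                    # accepting state

    assert accepts("1100") and not accepts("10")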

The space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input. It is the memory required by an algorithm until it executes completely. This includes the memory space used by its inputs, called input space, and any other (auxiliary) memory it uses during execution, which is called auxiliary space.
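
A small assumed example of the distinction: both Python functions below read an input of n numbers, so their input space is O(n), but their auxiliary space differs.

    def total(xs):
        # Input space O(n) for the list; auxiliary space O(1): one accumulator.
        acc = 0
        for x in xs:
            acc += x
        return acc

    def sorted_copy(xs):
        # Same O(n) input space, but auxiliary space O(n): sorted() allocates
        # a brand-new list as large as the input.
        return sorted(xs)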

<span class="mw-page-title-main">Complexity class</span> Set of problems in computational complexity theory

In computational complexity theory, a complexity class is a set of computational problems of related resource-based complexity. The two most commonly analyzed resources are time and memory.

Sheila Adele Greibach is a researcher in formal languages, automata, compiler theory, and computer science. She is an Emeritus Professor of Computer Science at the University of California, Los Angeles; her notable work includes collaborating with Seymour Ginsburg and Michael A. Harrison on context-sensitive parsing using the stack automaton model.

In computational complexity theory, Savitch's theorem, proved by Walter Savitch in 1970, gives a relationship between deterministic and non-deterministic space complexity. It states that for any function f(n) ≥ log n, NSPACE(f(n)) ⊆ DSPACE(f(n)²).

In computational complexity theory, non-deterministic space or NSPACE is the computational resource describing the memory space for a non-deterministic Turing machine. It is the non-deterministic counterpart of DSPACE.

In computational complexity theory, the space hierarchy theorems are separation results that show that both deterministic and nondeterministic machines can solve more problems in (asymptotically) more space, subject to certain conditions. For example, a deterministic Turing machine can solve more decision problems in space n log n than in space n. The somewhat weaker analogous theorems for time are the time hierarchy theorems.

In computational complexity theory, NL is the complexity class containing decision problems that can be solved by a nondeterministic Turing machine using a logarithmic amount of memory space.

In computational complexity theory, the Immerman–Szelepcsényi theorem states that nondeterministic space complexity classes are closed under complementation. It was proven independently by Neil Immerman and Róbert Szelepcsényi in 1987, for which they shared the 1995 Gödel Prize. In its general form the theorem states that NSPACE(s(n)) = co-NSPACE(s(n)) for any function s(n) ≥ log n. The result is equivalently stated as NL = co-NL; although this is the special case when s(n) = log n, it implies the general theorem by a standard padding argument. The result solved the second LBA problem.

In computer science, st-connectivity or STCON is a decision problem asking, for vertices s and t in a directed graph, if t is reachable from s.
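
For illustration, a minimal Python sketch (with an assumed adjacency-list representation) that decides STCON by breadth-first search. This straightforward algorithm uses linear space; the complexity-theoretic interest of STCON is that it is the canonical NL-complete problem, solvable nondeterministically in logarithmic space:

    from collections import deque

    def stcon(graph, s, t):
        # Breadth-first search from s; 'graph' maps a vertex to the list of
        # vertices its outgoing edges point to.
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                return True
            for v in graph.get(u, ()):
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return False

    assert stcon({1: [2], 2: [3]}, 1, 3) and not stcon({1: [2], 2: [3]}, 3, 1)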

In computer science, in particular in automata theory, a two-way finite automaton is a finite automaton that is allowed to re-read its input.

In computational complexity theory, NL-complete is a complexity class containing the languages that are complete for NL, the class of decision problems that can be solved by a nondeterministic Turing machine using a logarithmic amount of memory space. The NL-complete languages are the most "difficult" or "expressive" problems in NL. If a deterministic algorithm exists for solving any one of the NL-complete problems in logarithmic memory space, then NL = L.

In formal language theory, deterministic context-free languages (DCFL) are a proper subset of context-free languages. They are the context-free languages that can be accepted by a deterministic pushdown automaton. DCFLs are always unambiguous, meaning that they admit an unambiguous grammar. There are non-deterministic unambiguous CFLs, so DCFLs form a proper subset of unambiguous CFLs.

A read-only Turing machine or two-way deterministic finite-state automaton (2DFA) is a class of models of computability that behave like a standard Turing machine and can move in both directions across the input, but cannot write to the input tape. The machine in its bare form is equivalent to a deterministic finite automaton in computational power, and therefore can only parse regular languages.

<span class="mw-page-title-main">Structural complexity theory</span>

In computational complexity theory, structural complexity theory (or simply structural complexity) is the study of complexity classes, rather than the computational complexity of individual problems and algorithms. It involves the study of both the internal structures of various complexity classes and the relations between different complexity classes.

References

  1. John E. Hopcroft; Jeffrey D. Ullman (1979). Introduction to Automata Theory, Languages, and Computation. Addison-Wesley. ISBN 978-0-201-02988-8.
  2. John Myhill (June 1960). Linear Bounded Automata (WADD Technical Note). Wright-Patterson AFB, Wright Air Development Division, Ohio.
  3. P. S. Landweber (1963). "Three Theorems on Phrase Structure Grammars of Type 1". Information and Control. 6 (2): 131–136. doi:10.1016/s0019-9958(63)90169-4.
  4. Sige-Yuki Kuroda (June 1964). "Classes of languages and linear-bounded automata". Information and Control. 7 (2): 207–223. doi:10.1016/s0019-9958(64)90120-2.
  5. Willem J. M. Levelt (2008). An Introduction to the Theory of Formal Languages and Automata. John Benjamins Publishing. pp. 126–127. ISBN 978-90-272-3250-2.
  6. Immerman, Neil (1988). "Nondeterministic space is closed under complementation". SIAM Journal on Computing. 17 (5): 935–938. doi:10.1137/0217058. MR 0961049.
  7. Szelepcsényi, Róbert (1988). "The method of forcing for nondeterministic automata". Acta Informatica. 26 (3): 279–284. doi:10.1007/BF00299636. S2CID 10838178.
  8. Arora, Sanjeev; Barak, Boaz (2009). Computational Complexity: A Modern Approach. Cambridge University Press. ISBN 978-0-521-42426-4.