In computability theory, a truth-table reduction is a type of reduction from a decision problem A to a decision problem B. To solve an instance of A, the reduction describes the answer as a boolean formula, or truth table, of the answers to some finite number of queries to B.
Truth-table reductions are related to Turing reductions, and strictly weaker. (That is, not every Turing reduction between sets can be performed by a truth-table reduction, but every truth-table reduction can be performed by a Turing reduction.) A Turing reduction from a set B to a set A computes the membership of a single element in B by asking questions about the membership of various elements in A during the computation; it may adaptively determine which questions it asks based upon answers to previous questions. In contrast, a truth-table reduction or a weak truth-table reduction must present all of its (finitely many) oracle queries at the same time. In a truth-table reduction, the reduction also gives a boolean formula (a truth table) that, when given the answers to the queries, will produce the final answer of the reduction.
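To make the contrast concrete, the following sketch (illustrative only, not from the source; the toy oracle, the toy problems and the function names are all hypothetical) decides A = "the odd numbers" using an oracle for B = "the even numbers", once adaptively in the style of a Turing reduction and once non-adaptively in the style of a truth-table reduction.

```python
# Illustrative sketch, not from the source: the oracle and both reduction
# procedures are hypothetical toys.

def oracle_B(x):
    # Stand-in for the oracle: membership of x in B (here, evenness).
    return x % 2 == 0

def turing_style(n):
    # Adaptive (Turing reduction): later queries may depend on earlier answers.
    if oracle_B(n):              # first query
        return False             # n is in B, hence even, hence not in A
    return oracle_B(n + 1)       # second query, chosen because of the first answer

def truth_table_style(n):
    # Non-adaptive (truth-table reduction): every query and the boolean formula
    # combining the answers are fixed before the oracle is consulted.
    queries = [n]                                    # all queries, presented at once
    answers = tuple(oracle_B(q) for q in queries)
    table = {(True,): False, (False,): True}         # the truth table itself
    return table[answers]

print(turing_style(3), truth_table_style(3))         # True True
```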
Truth-table reductions appear in a paper by Emil Post published in 1944.[1]
A weak truth-table reduction is one where the reduction uses the oracle answers as a basis for further computation, which may depend on the given answers but may not ask further questions of the oracle. It is so named because it weakens the constraints placed on a truth-table reduction, and provides a weaker equivalence classification; as such, a "weak truth-table reduction" can actually be more powerful than a truth-table reduction as a "tool", and perform a reduction that is not performable by truth table. Equivalently, a weak truth-table reduction is a Turing reduction for which the use of the reduction is bounded by a computable function. For this reason, they are sometimes referred to as bounded Turing (bT) reductions rather than as weak truth-table (wtt) reductions.
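The following sketch, illustrative only and not from the source (all names are hypothetical), shows the general shape: the queries are fixed in advance with use bounded by a computable function of the input, but the answers may then feed an arbitrary further computation rather than a pre-computed boolean formula.

```python
# Illustrative sketch, not from the source; all names are hypothetical.

def wtt_style(n, oracle):
    queries = list(range(n + 1))             # use bounded by the computable function n + 1
    answers = [oracle(q) for q in queries]   # asked non-adaptively, all at once
    # Arbitrary further computation on the answers (no fixed truth table):
    return sum(answers) % 2 == 1

print(wtt_style(3, lambda q: q % 2 == 0))    # False
```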
As every truth-table reduction is a Turing reduction, if A is truth-table reducible to B (A≤ttB), then A is also Turing reducible to B (A≤TB). Considering also one-one reducibility, many-one reducibility and weak truth-table reducibility,

A ≤1 B ⇒ A ≤m B ⇒ A ≤tt B ⇒ A ≤wtt B ⇒ A ≤T B,

or in other words, one-one reducibility implies many-one reducibility, which implies truth-table reducibility, which in turn implies weak truth-table reducibility, which in turn implies Turing reducibility.
Furthermore, A is truth-table reducible to B if and only if A is Turing reducible to B via a total functional on 2^ω. The forward direction is trivial, and for the reverse direction suppose Γ is a total computable functional. To build the truth table for computing A(n), simply search for a number m such that Γ^σ(n) converges for all binary strings σ of length m. Such an m must exist by Kőnig's lemma, since Γ must be total on all paths through 2^{<ω}. Given such an m, it is a simple matter to find the unique truth table that gives Γ^σ(n) when applied to σ. The forward direction fails for weak truth-table reducibility.
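A minimal sketch of this search, not from the source: gamma(n, sigma) below is a hypothetical stand-in for Γ^σ(n), returning a bit when the computation converges using only the oracle bits listed in sigma and None otherwise.

```python
# Illustrative sketch, not from the source; gamma is a hypothetical stand-in for Γ.
from itertools import product

def truth_table_for(n, gamma):
    m = 0
    while True:
        table = {}
        for sigma in product((0, 1), repeat=m):   # every binary string of length m
            value = gamma(n, sigma)
            if value is None:                      # Γ^σ(n) has not converged on this σ
                break                              # so try a larger m
            table[sigma] = value
        else:
            return m, table    # one output bit for each length-m oracle string
        m += 1                 # totality on all paths guarantees the search halts (Kőnig's lemma)
```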
In the theory of computation, a branch of theoretical computer science, a pushdown automaton (PDA) is a type of automaton that employs a stack.
Computability theory, also known as recursion theory, is a branch of mathematical logic, computer science, and the theory of computation that originated in the 1930s with the study of computable functions and Turing degrees. The field has since expanded to include the study of generalized computability and definability. In these areas, computability theory overlaps with proof theory and effective descriptive set theory.
In mathematical logic, the arithmetical hierarchy, arithmetic hierarchy or Kleene–Mostowski hierarchy classifies certain sets based on the complexity of formulas that define them. Any set that receives a classification is called arithmetical. The arithmetical hierarchy was invented independently by Kleene (1943) and Mostowski (1946).
In computability theory and computational complexity theory, a many-one reduction is a reduction that converts instances of one decision problem A to another decision problem B using a computable function. The reduced instance is in the language B if and only if the initial instance is in the language A. Thus if we can decide whether instances are in the language B, we can decide whether instances are in the language A by applying the reduction and solving for B. Thus, reductions can be used to measure the relative computational difficulty of two problems. It is said that A reduces to B if, in layman's terms, B is at least as hard to solve as A. This means that any algorithm that solves B can also be used as part of a program that solves A.
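As a minimal sketch (not from the source; the toy problems and names are hypothetical), a many-one reduction is a single computable function f with x ∈ A ⇔ f(x) ∈ B. Here A = "odd numbers" is reduced to B = "even numbers" by f(x) = x + 1.

```python
# Illustrative sketch, not from the source; all names are hypothetical.

def f(x):
    return x + 1           # the reduction function

def decide_B(y):
    return y % 2 == 0      # a decider for B, assumed to be given

def decide_A(x):
    return decide_B(f(x))  # deciding A with one non-adaptive use of the decider for B

print(decide_A(7), decide_A(8))   # True False
```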
In computational complexity theory, the polynomial hierarchy is a hierarchy of complexity classes that generalize the classes NP and co-NP. Each class in the hierarchy is contained within PSPACE. The hierarchy can be defined using oracle machines or alternating Turing machines. It is a resource-bounded counterpart to the arithmetical hierarchy and analytical hierarchy from mathematical logic. The union of the classes in the hierarchy is denoted PH.
In computational complexity theory, the Cook–Levin theorem, also known as Cook's theorem, states that the Boolean satisfiability problem is NP-complete. That is, it is in NP, and any problem in NP can be reduced in polynomial time by a deterministic Turing machine to the Boolean satisfiability problem.
In computational complexity theory, an alternating Turing machine (ATM) is a non-deterministic Turing machine (NTM) with a rule for accepting computations that generalizes the rules used in the definition of the complexity classes NP and co-NP. The concept of an ATM was set forth by Chandra and Stockmeyer and independently by Kozen in 1976, with a joint journal publication in 1981.
In computer science and mathematical logic the Turing degree or degree of unsolvability of a set of natural numbers measures the level of algorithmic unsolvability of the set.
In computability theory Post's theorem, named after Emil Post, describes the connection between the arithmetical hierarchy and the Turing degrees.
In computability theory, a Turing reduction from a decision problem A to a decision problem B is an oracle machine that decides problem A given an oracle for B. It can be understood as an algorithm that could be used to solve A if it had available to it a subroutine for solving B. The concept can be analogously applied to function problems.
Game theory is the branch of mathematics in which games are studied: that is, models describing human behaviour. This is a glossary of some terms of the subject.
In computability theory, a function is called limit computable if it is the limit of a uniformly computable sequence of functions. The terms computable in the limit, limit recursive and recursively approximable are also used. One can think of limit computable functions as those admitting an eventually correct computable guessing procedure at their true value. A set is limit computable just when its characteristic function is limit computable.
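The classic example of such a guessing procedure is the halting set. The sketch below is illustrative only and not from the source; halts_within(e, s) is a hypothetical helper that simulates program e for s steps and reports whether it has halted.

```python
# Illustrative sketch, not from the source; halts_within is a hypothetical helper.

def guess(e, s, halts_within):
    # Stage-s guess at whether program e belongs to the halting set.
    return 1 if halts_within(e, s) else 0

# For each fixed e, the guesses guess(e, 0), guess(e, 1), ... change at most once
# and converge to the true answer, even though no single stage is guaranteed correct.
```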
In the mathematical discipline of set theory, there are many ways of describing specific countable ordinals. The smallest ones can be usefully and non-circularly expressed in terms of their Cantor normal forms. Beyond that, many ordinals of relevance to proof theory still have computable ordinal notations. However, it is not possible to decide effectively whether a given putative ordinal notation is a notation or not; various more-concrete ways of defining ordinals that definitely have notations are available.
In computability theory, many reducibility relations are studied. They are motivated by the question: given sets A and B of natural numbers, is it possible to effectively convert a method for deciding membership in B into a method for deciding membership in A? If the answer to this question is affirmative then A is said to be reducible to B.
A queue machine, queue automaton, or pullup automaton (PUA) is a finite-state machine with the ability to store and retrieve data from an infinite-memory queue. Its design is similar to a pushdown automaton but differs by replacing the stack with this queue. A queue machine is a model of computation equivalent to a Turing machine, and therefore it can process the same class of formal languages.
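A minimal sketch of that single structural difference, not from the source: the storage discipline is FIFO for a queue machine and LIFO for a pushdown automaton.

```python
# Illustrative sketch, not from the source.
from collections import deque

storage = deque()
storage.append('a')
storage.append('b')
front = storage.popleft()   # queue machine: reads 'a' from the front (FIFO)
# A pushdown automaton would instead use storage.pop(), reading 'b' from the top (LIFO).
```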
A read-only Turing machine or two-way deterministic finite-state automaton (2DFA) is a class of models of computability that behave like a standard Turing machine and can move in both directions across the input, except that it cannot write to its input tape. The machine in its bare form is equivalent to a deterministic finite automaton in computational power, and therefore can only parse a regular language.
In computational complexity theory, a log space transducer (LST) is a type of Turing machine used for log-space reductions.
A multitrack Turing machine is a specific type of multi-tape Turing machine.
In mathematics, a set of natural numbers is called a K-trivial set if its initial segments viewed as binary strings are easy to describe: the prefix-free Kolmogorov complexity is as low as possible, close to that of a computable set. Solovay proved in 1975 that a set can be K-trivial without being computable.
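The usual formal statement, given here from general knowledge rather than from the source (with K denoting prefix-free Kolmogorov complexity and A↾n the length-n initial segment of A), is that A is K-trivial when there is a constant b such that

```latex
\exists b \;\forall n \quad K(A \upharpoonright n) \le K(n) + b .
```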
In computability theory, enumeration reducibility is a specific type of reducibility. Roughly speaking, A is enumeration-reducible to B if an enumeration of B can be algorithmically converted to an enumeration of A. In particular, if B is computably enumerable, then A also is.
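A minimal sketch of that flavour of reduction, not from the source: the toy choice A = { b + 1 : b ∈ B } and all names below are hypothetical.

```python
# Illustrative sketch, not from the source; all names are hypothetical.

def enumerate_A(enumeration_of_B):
    for b in enumeration_of_B:    # consume the given enumeration of B, in any order
        yield b + 1               # emit the corresponding element of A

def some_enumeration_of_B():
    yield from (0, 2, 4)          # stand-in for an enumeration of B

print(list(enumerate_A(some_enumeration_of_B())))   # [1, 3, 5]
```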