Here are some of the more commonly known problems that are PSPACE-complete when expressed as decision problems. This list is in no way comprehensive.
Generalized versions of many games and puzzles
Type inhabitation problem for simply typed lambda calculus
Integer circuit evaluation [24]
In computational complexity theory, NP is a complexity class used to classify decision problems. NP is the set of decision problems for which the problem instances, where the answer is "yes", have proofs verifiable in polynomial time by a deterministic Turing machine, or alternatively the set of problems that can be solved in polynomial time by a nondeterministic Turing machine.
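As an illustration of the verifier characterization, the sketch below (the function name and representation are assumptions made here for illustration) checks a proposed certificate for SUBSET-SUM, a problem in NP, in polynomial time.

```python
def verify_subset_sum(numbers, target, certificate):
    """Polynomial-time verifier for SUBSET-SUM.

    `certificate` is a proposed witness: a list of indices into `numbers`.
    The check runs in time linear in the input, matching the verifier-based
    definition of NP.
    """
    if len(set(certificate)) != len(certificate):
        return False                      # indices must be distinct
    if any(i < 0 or i >= len(numbers) for i in certificate):
        return False                      # indices must be in range
    return sum(numbers[i] for i in certificate) == target


# A "yes" instance together with a valid certificate:
print(verify_subset_sum([3, 7, 1, 8], 11, [0, 3]))   # True (3 + 8 = 11)
```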
In theoretical computer science and formal language theory, a regular language is a formal language that can be defined by a regular expression, in the strict theoretical sense of that term (as opposed to the extended patterns supported by many modern regular-expression engines).
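For example, the binary strings containing an even number of 1s form a regular language; a minimal sketch using Python's re module (the pattern uses only concatenation, grouping, and Kleene star, so it is a regular expression in the strict sense):

```python
import re

# Binary strings with an even number of 1s: 1s must come in pairs.
EVEN_ONES = re.compile(r"^(0*10*1)*0*$")

for w in ["", "0", "11", "10100", "1"]:
    print(repr(w), bool(EVEN_ONES.match(w)))   # True, True, True, True, False
```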
In computational complexity theory, PSPACE is the set of all decision problems that can be solved by a Turing machine using a polynomial amount of space.
In computational complexity theory, a decision problem is PSPACE-complete if it can be solved using an amount of memory that is polynomial in the input length and if every other problem that can be solved in polynomial space can be transformed to it in polynomial time. The problems that are PSPACE-complete can be thought of as the hardest problems in PSPACE, the class of decision problems solvable in polynomial space, because a solution to any one such problem could easily be used to solve any other problem in PSPACE.
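The canonical PSPACE-complete problem is TQBF, the truth of fully quantified Boolean formulas. The recursive evaluator sketched below (representing the formula as a Python callable is an assumption made for illustration) may take exponential time, but its recursion depth equals the number of variables, so it uses only polynomial space:

```python
def eval_qbf(quantifiers, formula, assignment=()):
    """Evaluate a fully quantified Boolean formula.

    `quantifiers` is a sequence of 'exists' / 'forall', one per variable;
    `formula` is a callable taking a tuple of Booleans.  The recursion depth
    equals the number of variables, so the space used is polynomial in the
    input even though the running time may be exponential.
    """
    if len(assignment) == len(quantifiers):
        return formula(assignment)
    branches = (eval_qbf(quantifiers, formula, assignment + (b,))
                for b in (False, True))
    q = quantifiers[len(assignment)]
    return any(branches) if q == 'exists' else all(branches)


# forall x exists y: x != y  -- true
print(eval_qbf(['forall', 'exists'], lambda v: v[0] != v[1]))  # True
```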
In computational complexity theory, the complexity class EXPTIME (sometimes called EXP or DEXPTIME) is the set of all decision problems that are solvable by a deterministic Turing machine in exponential time, i.e., in O(2^p(n)) time, where p(n) is a polynomial function of n.
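For intuition, the naive satisfiability search below (an illustrative sketch, not drawn from the source) enumerates all 2^n assignments and therefore runs within an exponential time bound:

```python
from itertools import product

def brute_force_sat(n, formula):
    """Decide satisfiability by trying all 2**n assignments.

    The loop runs 2**n times, so (for a formula evaluable in polynomial
    time) the algorithm fits inside EXPTIME's O(2^p(n)) time bound.
    """
    return any(formula(bits) for bits in product((False, True), repeat=n))


# (x0 or x1) and (not x0): satisfiable with x0=False, x1=True
print(brute_force_sat(2, lambda v: (v[0] or v[1]) and not v[0]))  # True
```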
In computational complexity theory, EXPSPACE is the set of all decision problems solvable by a deterministic Turing machine in exponential space, i.e., in O(2^p(n)) space, where p(n) is a polynomial function of n. Some authors restrict p(n) to be a linear function, but most authors instead call the resulting class ESPACE. If we use a nondeterministic machine instead, we get the class NEXPSPACE, which is equal to EXPSPACE by Savitch's theorem.
In graph theory, an isomorphism of graphs G and H is a bijection f between the vertex sets of G and H such that any two vertices u and v of G are adjacent in G if and only if f(u) and f(v) are adjacent in H.
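A minimal sketch, assuming an adjacency-set representation, that checks whether a given bijection satisfies this definition:

```python
def is_isomorphism(g, h, f):
    """Check whether the bijection f (a dict V(g) -> V(h)) is an isomorphism.

    g and h map each vertex to the set of its neighbours.  The condition
    checked is exactly the definition: u and v are adjacent in g if and
    only if f(u) and f(v) are adjacent in h.
    """
    if sorted(f) != sorted(g) or sorted(f.values()) != sorted(h):
        return False                      # f must be a bijection V(g) -> V(h)
    return all((v in g[u]) == (f[v] in h[f[u]])
               for u in g for v in g)


g = {1: {2}, 2: {1, 3}, 3: {2}}           # path 1-2-3
h = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(is_isomorphism(g, h, {1: 'a', 2: 'b', 3: 'c'}))  # True
```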
In graph theory, a vertex cover of a graph is a set of vertices that includes at least one endpoint of every edge of the graph.
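A minimal sketch, assuming an edge-list representation, that checks the defining property directly:

```python
def is_vertex_cover(edges, cover):
    """True if every edge has at least one endpoint in `cover`."""
    c = set(cover)
    return all(u in c or v in c for u, v in edges)


edges = [(1, 2), (2, 3), (3, 4)]
print(is_vertex_cover(edges, {2, 3}))   # True
print(is_vertex_cover(edges, {1, 4}))   # False: edge (2, 3) is uncovered
```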
In the theory of computation, a branch of theoretical computer science, a deterministic finite automaton (DFA)—also known as deterministic finite acceptor (DFA), deterministic finite-state machine (DFSM), or deterministic finite-state automaton (DFSA)—is a finite-state machine that accepts or rejects a given string of symbols, by running through a state sequence uniquely determined by the string. Deterministic refers to the uniqueness of the computation run. In search of the simplest models to capture finite-state machines, Warren McCulloch and Walter Pitts were among the first researchers to introduce a concept similar to finite automata in 1943.
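A small simulator sketch using a dictionary as the transition table; the particular automaton below (accepting binary strings with an even number of 1s) is chosen only for illustration:

```python
def run_dfa(transitions, start, accepting, word):
    """Simulate a DFA: the state sequence is uniquely determined by `word`."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting


# DFA over {0, 1} accepting binary strings with an even number of 1s.
delta = {('even', '0'): 'even', ('even', '1'): 'odd',
         ('odd', '0'): 'odd', ('odd', '1'): 'even'}
print(run_dfa(delta, 'even', {'even'}, '10100'))  # True  (two 1s)
print(run_dfa(delta, 'even', {'even'}, '1'))      # False (one 1)
```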
In computational complexity theory, P, also known as PTIME or DTIME(n^O(1)), is a fundamental complexity class. It contains all decision problems that can be solved by a deterministic Turing machine using a polynomial amount of computation time, or polynomial time.
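For example, deciding whether an undirected graph is connected is in P; the breadth-first search below (a routine sketch, with an adjacency-list representation assumed) runs in time polynomial in the size of the input:

```python
from collections import deque

def is_connected(adj):
    """Decide graph connectivity by BFS in polynomial (in fact linear) time."""
    if not adj:
        return True
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(adj)


print(is_connected({1: [2], 2: [1, 3], 3: [2]}))   # True
print(is_connected({1: [2], 2: [1], 3: []}))       # False
```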
In automata theory, an alternating finite automaton (AFA) is a nondeterministic finite automaton whose transitions are divided into existential and universal transitions. From a state with an existential transition the automaton accepts the remaining input if some choice of successor leads to acceptance, whereas from a state with a universal transition every choice of successor must lead to acceptance.
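A hedged sketch of acceptance for a simplified AFA in which every state is marked existential or universal and transitions map a state and an input symbol to a set of successors (general AFAs allow arbitrary positive Boolean combinations of states; this restriction is an assumption made here for brevity):

```python
def afa_accepts(kind, delta, accepting, state, word):
    """Acceptance for a simplified alternating finite automaton.

    kind[q] is 'exists' or 'forall'; delta[(q, a)] is a set of successors.
    From an existential state some successor must accept the rest of the
    word; from a universal state all successors must.
    """
    if not word:
        return state in accepting
    successors = delta.get((state, word[0]), set())
    results = (afa_accepts(kind, delta, accepting, q, word[1:])
               for q in successors)
    return any(results) if kind[state] == 'exists' else all(results)


# Universal start state: on 'a' the rest of the word must be accepted
# from both q1 and q2.
kind = {'s': 'forall', 'q1': 'exists', 'q2': 'exists'}
delta = {('s', 'a'): {'q1', 'q2'}, ('q1', 'b'): {'q1'}, ('q2', 'b'): {'q2'}}
print(afa_accepts(kind, delta, {'q1', 'q2'}, 's', 'ab'))  # True
print(afa_accepts(kind, delta, {'q1', 'q2'}, 's', 'aa'))  # False
```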
In computational complexity theory, NL is the complexity class containing decision problems that can be solved by a nondeterministic Turing machine using a logarithmic amount of memory space.
In computational complexity theory, L is the complexity class containing decision problems that can be solved by a deterministic Turing machine using a logarithmic amount of writable memory space. Formally, the Turing machine has two tapes, one of which encodes the input and can only be read, whereas the other tape has logarithmic size but can be read as well as written. Logarithmic space is sufficient to hold a constant number of pointers into the input and a logarithmic number of boolean flags, and many basic logspace algorithms use the memory in this way.
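As an illustration of that idea (not a formal Turing-machine construction), deciding whether a binary string contains equally many 0s and 1s needs only two counters of O(log n) bits while the input is read but never written:

```python
def equal_zeros_and_ones(tape):
    """Scan the read-only input once, keeping two O(log n)-bit counters.

    Apart from the counters (and an implicit read-head position) no other
    memory is used, mirroring how a logspace machine budgets its workspace.
    """
    zeros = ones = 0
    for symbol in tape:
        if symbol == '0':
            zeros += 1
        elif symbol == '1':
            ones += 1
    return zeros == ones


print(equal_zeros_and_ones('010011'))  # True  (three of each)
print(equal_zeros_and_ones('011'))     # False
```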
In computational complexity theory, NL-complete is a complexity class containing the languages that are complete for NL, the class of decision problems that can be solved by a nondeterministic Turing machine using a logarithmic amount of memory space. The NL-complete languages are the most "difficult" or "expressive" problems in NL. If a deterministic algorithm exists for solving any one of the NL-complete problems in logarithmic memory space, then NL = L.
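The canonical NL-complete problem is directed st-connectivity. The divide-and-conquer test below follows Savitch's midpoint idea, which also underlies the NL ⊆ DSPACE(log² n) bound; the Python rendering, and the assumption that every vertex appears as a key in the adjacency dictionary, are illustrative choices:

```python
def reachable(adj, u, v, k=None):
    """Savitch-style test: is there a path from u to v of length at most k?

    The recursion halves k each time, so the depth is O(log k); a careful
    Turing-machine implementation uses O(log^2 n) space overall.
    """
    if k is None:
        k = len(adj)                      # any simple path has < |V| edges
    if k <= 1:
        return u == v or v in adj.get(u, ())
    return any(reachable(adj, u, m, k // 2) and
               reachable(adj, m, v, k - k // 2)
               for m in adj)


g = {1: {2}, 2: {3}, 3: set(), 4: {1}}
print(reachable(g, 1, 3))   # True  (1 -> 2 -> 3)
print(reachable(g, 3, 1))   # False
```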
In algorithmic game theory, a succinct game or a succinctly representable game is a game which may be represented in a size much smaller than its normal form representation. Without placing constraints on player utilities, describing a game of n players, each facing s strategies, requires listing n·s^n utility values. Even trivial algorithms are capable of finding a Nash equilibrium in a time polynomial in the length of such a large input. A succinct game is of polynomial type if in a game represented by a string of length n the number of players, as well as the number of strategies of each player, is bounded by a polynomial in n.
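A quick sanity check of that count (purely illustrative arithmetic): there are s^n strategy profiles and one utility value per player per profile.

```python
def normal_form_size(n_players, n_strategies):
    """Number of utility values in the full normal-form table."""
    return n_players * n_strategies ** n_players


print(normal_form_size(2, 2))    # 8
print(normal_form_size(10, 3))   # 590490: already impractically large
```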
In theoretical computer science, nondeterministic constraint logic is a combinatorial system in which an orientation is given to the edges of a weighted undirected graph, subject to certain constraints. One can change this orientation by steps in which a single edge is reversed, subject to the same constraints. This is a form of reversible logic in that each sequence of edge orientation changes can be undone.
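In the standard formulation of Hearn and Demaine, edges carry weights and every vertex must keep a minimum total incoming weight (usually 2) at all times; since the paragraph above does not spell this out, the sketch below should be read as one assumed concrete variant rather than the definitive rule set:

```python
def reversal_allowed(endpoints, weights, heads, edge, min_inflow=2):
    """Check whether reversing `edge` keeps every vertex's in-weight legal.

    endpoints[e] = (u, v), weights[e] = edge weight, heads[e] = vertex the
    edge currently points to.  In the assumed formulation every vertex must
    keep total incoming weight >= min_inflow after the reversal.
    """
    u, v = endpoints[edge]
    new_heads = dict(heads)
    new_heads[edge] = u if heads[edge] == v else v
    inflow = {x: 0 for e in endpoints for x in endpoints[e]}
    for e, head in new_heads.items():
        inflow[head] += weights[e]
    return all(w >= min_inflow for w in inflow.values())


# Three weight-2 edges between a and b; two currently point to a.
endpoints = {'e1': ('a', 'b'), 'e2': ('a', 'b'), 'e3': ('a', 'b')}
weights = {'e1': 2, 'e2': 2, 'e3': 2}
heads = {'e1': 'a', 'e2': 'b', 'e3': 'a'}
print(reversal_allowed(endpoints, weights, heads, 'e3'))  # True
print(reversal_allowed(endpoints, weights, heads, 'e2'))  # False (b starved)
```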
In theoretical computer science and formal language theory, the equivalence problem is the question of determining, given two representations of formal languages, whether they denote the same formal language.
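As a concrete instance, equivalence of two complete DFAs over the same alphabet can be decided by searching their product for a reachable pair of states that disagree on acceptance; the sketch below reuses the dictionary encoding assumed earlier. (For nondeterministic automata or regular expressions, the equivalence problem becomes PSPACE-complete.)

```python
from collections import deque

def dfa_equivalent(d1, start1, acc1, d2, start2, acc2, alphabet):
    """Breadth-first search over reachable state pairs of two complete DFAs.

    The DFAs are equivalent iff no reachable pair disagrees on acceptance.
    """
    seen = {(start1, start2)}
    queue = deque(seen)
    while queue:
        p, q = queue.popleft()
        if (p in acc1) != (q in acc2):
            return False
        for a in alphabet:
            pair = (d1[(p, a)], d2[(q, a)])
            if pair not in seen:
                seen.add(pair)
                queue.append(pair)
    return True


# Two encodings of "even number of 1s" versus a mismatched acceptance set.
d1 = {('E', '0'): 'E', ('E', '1'): 'O', ('O', '0'): 'O', ('O', '1'): 'E'}
d2 = {('0s', '0'): '0s', ('0s', '1'): '1s',
      ('1s', '0'): '1s', ('1s', '1'): '0s'}
print(dfa_equivalent(d1, 'E', {'E'}, d2, '0s', {'0s'}, '01'))  # True
print(dfa_equivalent(d1, 'E', {'E'}, d2, '0s', {'1s'}, '01'))  # False
```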
In discrete mathematics and theoretical computer science, reconfiguration problems are computational problems involving reachability or connectivity of state spaces.
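A generic sketch: given a successor function describing the allowed moves, reachability between two configurations can be decided by explicit search, though this may take time and space exponential in the size of a succinctly described configuration; the toy bit-flipping state space below is an invented example:

```python
from collections import deque

def reconfigurable(start, goal, moves):
    """Is `goal` reachable from `start` via the allowed `moves` relation?"""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in moves(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False


# Toy example: configurations are bit tuples; a move flips one bit, but a
# configuration must always keep at least one bit set (the "constraint").
def flip_moves(config):
    for i in range(len(config)):
        nxt = config[:i] + (1 - config[i],) + config[i + 1:]
        if sum(nxt) >= 1:
            yield nxt


print(reconfigurable((1, 0, 0), (0, 0, 1), flip_moves))   # True
```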
Games, Puzzles, and Computation is a book on game complexity, written by Robert Hearn and Erik Demaine, and published in 2009 by A K Peters. It is revised from Hearn's doctoral dissertation, which was supervised by Demaine. The Basic Library List Committee of the Mathematical Association of America has recommended it for inclusion in undergraduate mathematics libraries.