| Richard Edwin Stearns | |
|---|---|
| Born | July 5, 1936 |
| Alma mater | Carleton College (B.A.), Princeton University (Ph.D.) |
| Awards | ACM Turing Award (1993), Frederick W. Lanchester Prize (1995) |
| Scientific career | |
| Institutions | University at Albany |
| Doctoral advisor | Harold W. Kuhn |
Richard Edwin Stearns (born July 5, 1936) is an American computer scientist who, with Juris Hartmanis, received the 1993 ACM Turing Award "in recognition of their seminal paper which established the foundations for the field of computational complexity theory".[1] In 1994 he was inducted as a Fellow of the Association for Computing Machinery.
Stearns graduated with a B.A. in mathematics from Carleton College in 1958.[2] He then received his Ph.D. in mathematics from Princeton University in 1961 after completing a doctoral dissertation, titled Three person cooperative games without side payments, under the supervision of Harold W. Kuhn.[3] Stearns is now Distinguished Professor Emeritus of Computer Science at the University at Albany, which is part of the State University of New York.[4]
In formal language theory, a context-free language (CFL) is a language generated by a context-free grammar (CFG).
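A standard textbook example (added here for illustration, not specific to Stearns's work): the grammar below generates the language of strings with n a's followed by n b's, which is context-free but not regular.

```latex
\[
% A context-free grammar for the language { a^n b^n : n >= 0 }
S \;\rightarrow\; a\,S\,b \;\mid\; \varepsilon
\]
```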
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; such a problem is solvable by the mechanical application of mathematical steps, such as an algorithm.
In theoretical computer science and formal language theory, a regular language is a formal language that can be defined by a regular expression in the strict sense used in theoretical computer science (as opposed to the extended pattern syntax of many modern regular-expression engines, which can also recognize some non-regular languages).
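As a small illustration (an example added here, not taken from the article), the pattern below uses only union, concatenation, and the Kleene star, so the language it describes is regular; Python's re module serves merely as a convenient matcher.

```python
import re

# Binary strings ending in "01": a regular language, defined by the
# regular expression (0|1)*01 using only union, concatenation, and star.
PATTERN = re.compile(r"(0|1)*01")

def in_language(word: str) -> bool:
    # fullmatch succeeds only if the entire string matches the expression
    return PATTERN.fullmatch(word) is not None

print(in_language("11101"))  # True
print(in_language("0110"))   # False
```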
The ACM A. M. Turing Award is an annual prize given by the Association for Computing Machinery (ACM) for contributions of lasting and major technical importance to computer science. It is generally recognized as the highest distinction in computer science and is often referred to as the "Nobel Prize of Computing".
In computational complexity theory, a decision problem is PSPACE-complete if it can be solved using an amount of memory that is polynomial in the input length and if every other problem that can be solved in polynomial space can be transformed to it in polynomial time. The problems that are PSPACE-complete can be thought of as the hardest problems in PSPACE, the class of decision problems solvable in polynomial space, because a solution to any one such problem could easily be used to solve any other problem in PSPACE.
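In symbols, using the standard notation ≤_p for polynomial-time many-one reducibility (added here for concreteness), a language L is PSPACE-complete when:

```latex
\[
L \in \mathsf{PSPACE}
\quad\text{and}\quad
\forall A \in \mathsf{PSPACE}:\; A \le_p L .
\]
```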
In computational complexity theory, the complexity class EXPTIME (sometimes called EXP or DEXPTIME) is the set of all decision problems that are solvable by a deterministic Turing machine in exponential time, i.e., in O(2^{p(n)}) time, where p(n) is a polynomial function of n.
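Equivalently, in the standard DTIME notation, the class is a union over all polynomial exponents:

```latex
\[
\mathsf{EXPTIME} \;=\; \bigcup_{k \in \mathbb{N}} \mathsf{DTIME}\!\left(2^{\,n^{k}}\right).
\]
```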
Automata theory is the study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a theory in theoretical computer science with close connections to mathematical logic. The word automata comes from the Greek word αὐτόματος, which means "self-acting, self-willed, self-moving". An automaton is an abstract self-propelled computing device which follows a predetermined sequence of operations automatically. An automaton with a finite number of states is called a finite automaton (FA) or finite-state machine (FSM). A finite-state machine, a well-known type of automaton, consists of states and transitions. As the automaton reads a symbol of input, it makes a transition to another state according to its transition function, which takes the previous state and the current input symbol as its arguments.
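As a sketch of this idea (an illustrative example added here, not from the article), the following deterministic finite automaton accepts binary strings containing an even number of 1s; the transition table plays the role of the transition function described above.

```python
# A deterministic finite automaton (DFA) over the alphabet {0, 1} that
# accepts exactly the strings containing an even number of 1s.
# States: "even" (start state, accepting) and "odd".
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(word: str) -> bool:
    state = "even"                          # start in the accepting state
    for symbol in word:
        # the transition function maps (current state, input symbol) to the next state
        state = TRANSITIONS[(state, symbol)]
    return state == "even"                  # accept iff the run ends in an accepting state

print(accepts("1001"))  # True: two 1s
print(accepts("1011"))  # False: three 1s
```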
In computational complexity theory, the time hierarchy theorems are important statements about time-bounded computation on Turing machines. Informally, these theorems say that given more time, a Turing machine can solve more problems. For example, there are problems that can be solved in n^2 time but not in n time.
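Stated more precisely (the usual modern form of the deterministic time hierarchy theorem, for a time-constructible bound g, added here for reference):

```latex
\[
f(n)\,\log f(n) = o\bigl(g(n)\bigr)
\;\Longrightarrow\;
\mathsf{DTIME}\bigl(f(n)\bigr) \subsetneq \mathsf{DTIME}\bigl(g(n)\bigr),
\qquad\text{e.g.}\quad
\mathsf{DTIME}(n) \subsetneq \mathsf{DTIME}\bigl(n^{2}\bigr).
\]
```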
Juris Hartmanis was a Latvian-born American computer scientist and computational theorist who, with Richard E. Stearns, received the 1993 ACM Turing Award "in recognition of their seminal paper which established the foundations for the field of computational complexity theory".
Manuel Blum is a Venezuelan-born American computer scientist who received the Turing Award in 1995 "In recognition of his contributions to the foundations of computational complexity theory and its application to cryptography and program checking".
In computational complexity theory, the Cook–Levin theorem, also known as Cook's theorem, states that the Boolean satisfiability problem is NP-complete. That is, it is in NP, and any problem in NP can be reduced in polynomial time by a deterministic Turing machine to the Boolean satisfiability problem.
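In the notation used above, writing SAT for the Boolean satisfiability problem, the theorem states:

```latex
\[
\mathrm{SAT} \in \mathsf{NP}
\quad\text{and}\quad
\forall L \in \mathsf{NP}:\; L \le_p \mathrm{SAT},
\]
```

that is, SAT is NP-complete.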
In computational complexity theory, the complexity class NEXPTIME is the set of decision problems that can be solved by a non-deterministic Turing machine using time 2^{p(n)} for some polynomial p(n).
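In NTIME notation (the nondeterministic analogue of the DTIME notation above):

```latex
\[
\mathsf{NEXPTIME} \;=\; \bigcup_{k \in \mathbb{N}} \mathsf{NTIME}\!\left(2^{\,n^{k}}\right).
\]
```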
Neil Immerman is an American theoretical computer scientist, a professor of computer science at the University of Massachusetts Amherst. He is one of the key developers of descriptive complexity, an approach he is currently applying to research in model checking, database theory, and computational complexity theory.
In computer science, a linear bounded automaton (LBA) is a restricted form of Turing machine whose tape head may only visit the portion of the tape containing the input, delimited by end markers.
Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" (except for a constant that only depends on the chosen universal programming language) the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."
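One standard instance of this parallel (stated here for prefix Kolmogorov complexity K and Shannon entropy H; an illustrative addition, not from the article): the subadditivity of entropy has an algorithmic counterpart that holds up to an additive constant depending only on the chosen universal machine.

```latex
\[
H(X, Y) \;\le\; H(X) + H(Y)
\qquad\longleftrightarrow\qquad
K(x, y) \;\le\; K(x) + K(y) + O(1).
\]
```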
In computer science, in particular in automata theory, a two-way finite automaton is a finite automaton that is allowed to re-read its input.
In formal language theory, deterministic context-free languages (DCFL) are a proper subset of context-free languages. They are the context-free languages that can be accepted by a deterministic pushdown automaton. DCFLs are always unambiguous, meaning that they admit an unambiguous grammar. There are unambiguous CFLs that are not deterministic, so DCFLs form a proper subset of unambiguous CFLs.
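As an illustration (a sketch added here, not from the article), the deterministic context-free language of n a's followed by n b's can be accepted in a single left-to-right pass with a stack and no guessing, mimicking a deterministic pushdown automaton.

```python
def accepts_anbn(word: str) -> bool:
    """Deterministic one-pass, stack-based recognizer for { a^n b^n : n >= 0 }."""
    stack = []
    seen_b = False
    for symbol in word:
        if symbol == "a":
            if seen_b:             # an 'a' after a 'b' can never lead to acceptance
                return False
            stack.append("A")      # push one marker per 'a'
        elif symbol == "b":
            seen_b = True
            if not stack:          # more b's than a's so far
                return False
            stack.pop()            # pop one marker per 'b'
        else:
            return False           # symbol outside the alphabet {a, b}
    return not stack               # accept iff every 'a' was matched by a 'b'

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
```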
In computational complexity theory, SC is the complexity class of problems solvable by a deterministic Turing machine in polynomial time and polylogarithmic space. It may also be called DTISP(poly, polylog), where DTISP stands for deterministic time and space. Note that the definition of SC differs from P ∩ PolyL, since for the former, it is required that a single algorithm runs in both polynomial time and polylogarithmic space; while for the latter, two separate algorithms suffice: one that runs in polynomial time, and another that runs in polylogarithmic space.
In mathematics, logic and computer science, a formal language is called recursive if it is a recursive subset of the set of all possible finite sequences over the alphabet of the language. Equivalently, a formal language is recursive if there exists a Turing machine that, when given a finite sequence of symbols as input, always halts and accepts it if it belongs to the language, and halts and rejects it otherwise. In theoretical computer science, such always-halting Turing machines are called total Turing machines or algorithms. Recursive languages are also called decidable.
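To illustrate the always-halting requirement (a sketch added here, not from the article), the following is a total decision procedure for the language of binary strings that encode prime numbers: on every input it halts with either accept or reject.

```python
def is_prime(n: int) -> bool:
    # trial division: always terminates after at most sqrt(n) iterations
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def decide_binary_primes(word: str) -> bool:
    """Decider for { w in {0,1}+ : w is the binary representation of a prime }."""
    if word == "" or any(c not in "01" for c in word):
        return False               # reject inputs outside the alphabet {0, 1}
    return is_prime(int(word, 2))  # halts on every input, so the language is decidable

print(decide_binary_primes("101"))   # True  (binary 101 = 5, a prime)
print(decide_binary_primes("1000"))  # False (binary 1000 = 8, not prime)
```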