In computer science, unbounded nondeterminism or unbounded indeterminacy is a property of concurrency by which the delay in servicing a request can become unbounded as a result of arbitration of contention for shared resources, while still guaranteeing that the request will eventually be serviced. Unbounded nondeterminism became an important issue in the development of the denotational semantics of concurrency, and later became part of research into the theoretical concept of hypercomputation.[1]
Discussion of unbounded nondeterminism tends to be bound up with discussions of fairness. The basic concept is that all computation paths must be "fair" in the sense that if the machine enters a state infinitely often, it must eventually take every possible transition from that state. This amounts to requiring that the machine be guaranteed to service a request whenever it can, since an infinite sequence of states is allowed only if no transition in it leads to the request being serviced. Equivalently, every possible transition must occur eventually in an infinite computation, although it may take an unbounded amount of time for it to occur. This concept is to be distinguished from the local fairness of flipping a "fair" coin, by which it is understood that the outcome may be heads for any finite number of steps, although as the number of steps increases this will almost surely not happen.
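The contrast can be made concrete with a small sketch (Python; the scheduler names and the two-transition example are illustrative, not taken from the literature): a round-robin scheduler satisfies the fairness condition by construction, since every transition is taken infinitely often in an infinite run, whereas a scheduler driven by a fair coin satisfies it only almost surely, because any finite run of one-sided choices remains possible.

```python
import itertools
import random

def fair_scheduler(transitions):
    """Round-robin choice: in an infinite run, every transition in the list
    is taken infinitely often, so the fairness condition holds by construction."""
    for t in itertools.cycle(transitions):
        yield t

def coin_scheduler(transitions):
    """Independent random choice: any finite run of identical choices is
    possible, so a given transition can be starved for arbitrarily long,
    although starving it forever has probability zero (it 'almost surely'
    does not happen)."""
    while True:
        yield random.choice(transitions)

# The first ten choices of each scheduler over two transitions.
print(list(itertools.islice(fair_scheduler(["serve", "increment"]), 10)))
print(list(itertools.islice(coin_scheduler(["serve", "increment"]), 10)))
```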
An example of the role of fair or unbounded nondeterminism in the merging of streams was given by William D. Clinger in his 1981 thesis. He defined a "fair merge" of two streams to be a third stream in which every element of each input stream must eventually occur. He then considered the set merge(S, T) of all fair merges of the streams S and T, assuming it to be a monotone function. Since the empty stream ⊥ is a prefix of the stream 0, monotonicity would give merge(⊥, 1^ω) ⊆ merge(0, 1^ω), where 1^ω denotes the infinite stream of 1s. Now merge(⊥, 1^ω) = {1^ω}, because ⊥ contributes no elements, so 1^ω would have to be an element of merge(0, 1^ω); but that is a contradiction, since every fair merge of 0 with 1^ω must contain the element 0. He concluded that fair merge cannot be a monotone function of its arguments.
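As an illustration of what the fairness condition demands, here is a minimal sketch (Python; the function fair_merge and the sentinel value are ours, not Clinger's) of one particular fair merge, strict alternation. Its output for the inputs 0 and 1^ω necessarily contains the element 0, which is exactly why 1^ω cannot belong to merge(0, 1^ω).

```python
import itertools

_SKIP = object()   # sentinel marking an exhausted input stream

def fair_merge(s, t):
    """One particular fair merge of two streams: strict alternation, falling
    back to the remaining stream once the other is exhausted.  Every element
    of each input eventually appears in the output, which is Clinger's
    fairness condition.  (Clinger's merge(S, T) is the *set* of all fair
    interleavings; this function just produces one of them.)"""
    for a, b in itertools.zip_longest(s, t, fillvalue=_SKIP):
        if a is not _SKIP:
            yield a
        if b is not _SKIP:
            yield b

# A fair merge of the one-element stream 0 with the infinite stream 1, 1, 1, ...
merged = fair_merge(iter([0]), itertools.repeat(1))
print(list(itertools.islice(merged, 6)))   # [0, 1, 1, 1, 1, 1] -- the 0 must appear
```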
Edsger Dijkstra argued that it is impossible to implement systems with unbounded nondeterminism.[3] For this reason, Tony Hoare suggested that "an efficient implementation should try to be reasonably fair."[4]
Nondeterministic Turing machines have only bounded nondeterminism. Likewise, sequential programs containing guarded commands as the only sources of nondeterminism have only bounded nondeterminism.[3] Briefly, choice nondeterminism is bounded. Gordon Plotkin gave a proof in his original paper on powerdomains: in essence, because there are only finitely many alternatives at each choice point, the computation tree of such a program is finitely branching, and if every execution path terminates then, by König's lemma, the whole tree is finite, so only a bounded set of outcomes is possible.
William Clinger analyzed Plotkin's proof in his 1981 thesis.
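The bounded case can be illustrated with a small sketch (Python; the toy program is ours and stands in for any program whose only nondeterminism is a finite choice at each step): enumerating a finitely branching, everywhere-terminating computation tree yields only a bounded set of results.

```python
def outcomes(value=0):
    """Enumerate every halting value of a toy program: repeatedly choose,
    nondeterministically, to add 1 or to add 2, and halt once value >= 5.
    The choice at each step is finite (two alternatives) and every branch
    terminates, so by König's lemma the computation tree is finite and the
    set of possible outcomes is bounded."""
    if value >= 5:          # halting condition
        return {value}
    return outcomes(value + 1) | outcomes(value + 2)

print(sorted(outcomes()))   # [5, 6] -- only a bounded set of results
```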
Spaan et al. have argued that it is possible for an unboundedly nondeterministic program to solve the halting problem; their algorithm consists of two parts defined as follows:[6]
The first part of the program requests a natural number from the second part; after receiving it, it simulates the given Turing machine for that many steps, and accepts or rejects according to whether the machine has halted by then.
The second part of the program nondeterministically chooses a natural number on request. The number is stored in a variable which is initialized to 0; then the program repeatedly chooses whether to increment the variable, or service the request. The fairness constraint requires that the request eventually be serviced, for otherwise there is an infinite loop in which only the "increment the variable" branch is ever taken.
Clearly, if the machine does halt, say after k steps, then this algorithm has a path which accepts: any execution in which the second part returns a number of at least k. If the machine does not halt, the algorithm always rejects, no matter what number the second part returns.
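A minimal sketch of the two parts (Python; the function names are ours, the random coin merely approximates the fairness assumption, and the step-bounded simulator runs_within is taken as given rather than implemented) may make the structure clearer:

```python
import random

def second_part():
    """Nondeterministically choose a natural number.  The loop either
    increments the counter or services the request; under the fairness
    assumption the 'service' branch must eventually be taken, so every
    natural number is a possible result and no run loops forever.
    A random coin models this only with probability 1, not with certainty."""
    n = 0
    while True:
        if random.choice(("increment", "service")) == "service":
            return n
        n += 1

def first_part(runs_within, n):
    """Simulate the machine for n steps; accept iff it has halted by then."""
    return runs_within(n)

def halting_test(runs_within):
    """The two parts combined: request a bound, then test halting within it."""
    return first_part(runs_within, second_part())

# Example: a 'machine' that halts after exactly 7 steps.
halts_in_7 = lambda steps: steps >= 7
print(halting_test(halts_in_7))   # accepts exactly on runs where the chosen n >= 7
```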
Clinger and Carl Hewitt[citation needed] have developed a model of concurrent computation (known as the Actor model) with the property of unbounded nondeterminism built in [Clinger 1981];[7][8][9] this allows computations that cannot be implemented by Turing machines, as seen above. However, these researchers emphasize that their model of concurrent computation cannot implement any functions outside the class of recursive functions[citation needed] defined by Church, Kleene, Turing, etc. (See Indeterminacy in concurrent computation.)
Hewitt justified his use of unbounded nondeterminism by arguing that there is no bound that can be placed on how long it takes a computational circuit called an arbiter to settle (see metastability in electronics). Arbiters are used in computers to deal with the circumstance that computer clocks operate asynchronously with respect to input from outside, e.g., keyboard input, disk access, network input, etc. So it could take an unbounded time for a message sent to a computer to be received, and in the meantime the computer could traverse an unbounded number of states.
He further argued that electronic mail enables unbounded nondeterminism, since mail can be stored on servers indefinitely before being delivered, and that data links to servers on the Internet can likewise be out of service indefinitely. This gave rise to the unbounded nondeterminism controversy.[10]
Hewitt argued that issues in fairness derive in part from the global state point of view. The oldest models of computation (e.g., Turing machines, Post productions, the lambda calculus, etc.) are based on mathematics that makes use of a global state to represent a computational step: each step takes the computation from one global state to the next. The global state approach was continued in automata theory for finite-state machines and pushdown stack machines, including their nondeterministic versions. All of these models have the property of bounded nondeterminism: if a machine always halts when started in its initial state, then there is a bound on the number of states in which it can halt.
Hewitt argued that there is a fundamental difference between choices in global state nondeterminism and the arrival order indeterminacy (nondeterminism) of his Actor model. In global state nondeterminism, a "choice" is made for the "next" global state. In arrival order indeterminacy, arbitration locally decides each arrival order in an unbounded amount of time. While a local arbitration is proceeding, unbounded activity can take place elsewhere. There is no global state and consequently no "choice" to be made as to the "next" global state.