In computer science and recursion theory, the McCarthy Formalism (1963) of computer scientist John McCarthy clarifies the notion of recursive functions by use of the IF-THEN-ELSE construction common to computer science, together with four of the operators of primitive recursive functions: zero, successor, equality of numbers, and composition. The conditional operator replaces both primitive recursion and the mu-operator.
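Purely as an illustration (not McCarthy's own notation), the following Python sketch writes a recursive definition in this conditional-expression style, using only zero, successor, equality and composition; the helper pred is assumed here, and its derivation inside the formalism is sketched further below.

```python
def zero():            # the constant 0
    return 0

def succ(n):           # successor
    return n + 1

def eq(a, b):          # equality of numbers
    return a == b

def pred(n):           # predecessor; assumed here, derived in a later sketch
    return n - 1 if n > 0 else 0

# Addition written as a single conditional expression:
#   add(x, y) = IF eq(y, 0) THEN x ELSE succ(add(x, pred(y)))
def add(x, y):
    return x if eq(y, zero()) else succ(add(x, pred(y)))

assert add(3, 4) == 7
```

Note that no primitive-recursion scheme and no mu-operator appears: the recursion is carried entirely by the conditional expression and the function's reference to itself.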
McCarthy (1960) described his formalism this way: [1]
In his 1967 book Computation: Finite and Infinite Machines, Marvin Minsky describes the "formalism" in § 10.6 Conditional Expressions: The McCarthy Formalism as follows:
Minsky uses the following operators in his demonstrations: [2]
From these he shows how to derive the predecessor function (i.e. DECREMENT); with this tool he derives the minimization operator necessary for "general" recursion, as well as primitive-recursive definitions.
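The following is a hedged Python sketch of those two derivations (the helper names are illustrative, not Minsky's): predecessor obtained by counting upward with a conditional expression, and unbounded minimization obtained in exactly the same style.

```python
def succ(n):
    return n + 1

def eq(a, b):
    return a == b

# Predecessor by "counting up": find the m with succ(m) = n.
#   pred(n)  = IF eq(n, 0) THEN 0 ELSE count(0)
#   count(m) = IF eq(succ(m), n) THEN m ELSE count(succ(m))
def pred(n):
    def count(m):
        return m if eq(succ(m), n) else count(succ(m))
    return 0 if eq(n, 0) else count(0)

# Unbounded minimization (the mu-operator) in the same conditional style:
#   mu(p) = search(0),  search(k) = IF p(k) THEN k ELSE search(succ(k))
def mu(p):
    def search(k):
        return k if p(k) else search(succ(k))
    return search(0)

assert pred(5) == 4 and pred(0) == 0
assert mu(lambda k: k * k >= 10) == 4   # least k with k*k >= 10
```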
In his 1952 Introduction to Metamathematics, Stephen Kleene provides a definition of what it means to be a primitive recursive function:
In other words, given a "basis" function (which can be a constant such as 0), primitive recursion uses either the base or the previous value of the function to produce the value of the function; definition by primitive recursion is sometimes compared to mathematical induction.
Minsky (above) is describing a two-CASE operator. A demonstration that the nested IF-THEN-ELSE (the "case statement" or "switch statement") is primitive recursive can be found in Kleene 1952:229 [4] at "#F ('mutually-exclusive predicates')". The CASE operator behaves like a logical multiplexer and is simply an extension of the simpler two-case logical operator sometimes called AND-OR-SELECT (see more at Propositional formula). The CASE operator for three cases would be described verbally as: "If X is CASE 1 then do "p", else if X is CASE 2 then do "q", else if X is CASE 3 then do "r", else do "default"."
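A small Python sketch of that three-case description, written as nested IF-THEN-ELSE; the case predicates and the branch results are placeholders, not taken from Kleene or Minsky.

```python
def case3(x, is_case1, is_case2, is_case3, p, q, r, default):
    # "If X is CASE 1 then do p, else if X is CASE 2 then do q,
    #  else if X is CASE 3 then do r, else do default."
    if is_case1(x):
        return p(x)
    elif is_case2(x):
        return q(x)
    elif is_case3(x):
        return r(x)
    else:
        return default(x)

# Example with placeholder predicates (remainder mod 3) and results:
result = case3(
    7,
    lambda x: x % 3 == 0, lambda x: x % 3 == 1, lambda x: x % 3 == 2,
    lambda x: "p", lambda x: "q", lambda x: "r", lambda x: "default",
)
assert result == "q"
```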
Boolos-Burgess-Jeffrey 2002 observe that the cases handled by a CASE operator, or by a sequence of nested IF-THEN-ELSE statements, must be both mutually exclusive, meaning that only one "case" holds (is true), and collectively exhaustive, meaning that every possible situation or "case" is "covered". These requirements are a consequence of the determinacy of propositional logic; a correct implementation can be specified and simplified with truth tables and Karnaugh maps (see more at Propositional formula). The authors point out the power of "definition by cases":
They prove, in particular, that the processes of substitution, graph relation (similar to the identity relation that plucks out (the value of) a particular variable from a list of variables), negation (logical NOT), conjunction (logical AND), disjunction (logical OR), bounded universal quantification, or bounded existential quantification can be used together with definition by cases to create new primitive recursive functions (cf Boolos-Burgess-Jeffrey 2002:74-77).
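A hedged Python sketch of the usual construction behind definition by cases (it does not reproduce the authors' proofs): when the case predicates are mutually exclusive and collectively exhaustive, the new function can be written as a sum of each branch value multiplied by the characteristic function of its case, and every ingredient used is primitive recursive.

```python
# Characteristic functions of two mutually exclusive, collectively
# exhaustive cases: x >= y and x < y.
def chi_ge(x, y):
    return 1 if x >= y else 0

def chi_lt(x, y):
    return 1 if x < y else 0

# Definition by cases: max(x, y) = x if x >= y, else y.
# Exactly one characteristic function is 1, so the sum selects one branch.
def maximum(x, y):
    return x * chi_ge(x, y) + y * chi_lt(x, y)

assert maximum(3, 5) == 5 and maximum(7, 2) == 7
```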
In mathematics and computer science, an algorithm is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes and deduce valid inferences, eventually achieving automation. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".
First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.
The Principia Mathematica is a three-volume work on the foundations of mathematics written by mathematician–philosophers Alfred North Whitehead and Bertrand Russell and published in 1910, 1912, and 1913. In 1925–1927, it appeared in a second edition with an important Introduction to the Second Edition, an Appendix A that replaced ✸9 and all-new Appendix B and Appendix C. PM is not to be confused with Russell's 1903 The Principles of Mathematics. PM was originally conceived as a sequel volume to Russell's 1903 Principles, but as PM states, this became an unworkable suggestion for practical and philosophical reasons: "The present work was originally intended by us to be comprised in a second volume of Principles of Mathematics... But as we advanced, it became increasingly evident that the subject is a very much larger one than we had supposed; moreover on many fundamental questions which had been left obscure and doubtful in the former work, we have now arrived at what we believe to be satisfactory solutions."
In computability theory, a primitive recursive function is, roughly speaking, a function that can be computed by a computer program whose loops are all "for" loops. Primitive recursive functions form a strict subset of those general recursive functions that are also total functions.
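The "for loops only" intuition can be illustrated with a short Python sketch (illustrative only): the factorial function is computed by a loop whose number of iterations is fixed before the loop begins, with no unbounded search.

```python
def factorial(n):
    # A bounded loop: the iteration count is known in advance from n.
    result = 1
    for i in range(1, n + 1):
        result *= i
    return result

assert factorial(5) == 120
```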
In mathematical logic and computer science, a general recursive function, partial recursive function, or μ-recursive function is a partial function from natural numbers to natural numbers that is "computable" in an intuitive sense – as well as in a formal one. If the function is total, it is also called a total recursive function. In computability theory, it is shown that the μ-recursive functions are precisely the functions that can be computed by Turing machines. The μ-recursive functions are closely related to primitive recursive functions, and their inductive definition (below) builds upon that of the primitive recursive functions. However, not every total recursive function is a primitive recursive function—the most famous example is the Ackermann function.
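For concreteness, here is a short Python rendering of the standard Ackermann–Péter function: it is total and computable, but it grows too fast to be captured by any fixed nesting of bounded loops, which is why it is not primitive recursive.

```python
def ackermann(m, n):
    # Total recursive but not primitive recursive.
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

assert ackermann(2, 3) == 9
```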
In computability theory, Kleene's recursion theorems are a pair of fundamental results about the application of computable functions to their own descriptions. The theorems were first proved by Stephen Kleene in 1938 and appear in his 1952 book Introduction to Metamathematics. A related theorem, which constructs fixed points of a computable function, is known as Rogers's theorem and is due to Hartley Rogers, Jr.
In mathematical logic, propositional logic and predicate logic, a well-formed formula, abbreviated WFF or wff, often simply formula, is a finite sequence of symbols from a given alphabet that is part of a formal language. A formal language can be identified with the set of formulas in the language.
In mathematical logic and theoretical computer science a register machine is a generic class of abstract machines used in a manner similar to a Turing machine. All the models are Turing equivalent.
In computer science, a random-access machine (RAM) is an abstract machine in the general class of register machines. The RAM is very similar to the counter machine but with the added capability of 'indirect addressing' of its registers. Like the counter machine, the RAM has its instructions in the finite-state portion of the machine.
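A toy Python sketch of the distinction (the operation names are illustrative, not a standard RAM instruction set): with indirect addressing, a register holds the address of the register to be changed.

```python
registers = [0] * 8

def store_direct(i, value):
    registers[i] = value                 # register i itself is changed

def store_indirect(i, value):
    registers[registers[i]] = value      # register i names the register to change

store_direct(1, 5)       # R1 := 5
store_indirect(1, 42)    # R[R1], i.e. R5 := 42
assert registers[5] == 42
```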
In computability theory, the μ-operator, minimization operator, or unbounded search operator searches for the least natural number with a given property. Adding the μ-operator to the primitive recursive functions makes it possible to define all computable functions.
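A hedged Python sketch of the μ-operator as unbounded search; the integer-square-root example is chosen only to show how minimization defines a function (if no number has the property, the search never terminates, which is why μ-recursive functions are in general partial).

```python
def mu(predicate):
    # Least natural number satisfying predicate; loops forever if none exists.
    k = 0
    while not predicate(k):
        k += 1
    return k

# Integer square root by minimization: the least k with (k + 1)^2 > n.
def isqrt(n):
    return mu(lambda k: (k + 1) * (k + 1) > n)

assert isqrt(10) == 3
```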
In computer programming languages, a switch statement is a type of selection control mechanism used to allow the value of a variable or expression to change the control flow of program execution via search and map.
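Purely as an illustration, a Python (3.10+) match statement playing the role of a switch: the value being matched selects which branch of the control flow runs.

```python
def http_class(status: int) -> str:
    match status // 100:        # switch on the leading digit of the status code
        case 2:
            return "success"
        case 3:
            return "redirection"
        case 4:
            return "client error"
        case 5:
            return "server error"
        case _:
            return "other"      # default branch

assert http_class(404) == "client error"
```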
In computability theory the smn theorem (also written s-m-n theorem) is a basic result about programming languages. It was first proved by Stephen Cole Kleene (1943). The name comes from the occurrence of an S with subscript n and superscript m in the original formulation of the theorem.
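A rough Python analogy (not a proof of the theorem): the smn theorem guarantees that fixing some arguments of a program effectively and uniformly yields a program of the remaining arguments, which is what partial application does below.

```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# "Specialize" the two-argument program by fixing its first argument,
# obtaining a one-argument program: the informal content of S-1-1.
power_of_two = partial(power, 2)

assert power_of_two(10) == 1024
```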
A Post–Turing machine is a "program formulation" of a type of Turing machine, comprising a variant of Emil Post's Turing-equivalent model of computation. Post's model and Turing's model, though very similar to one another, were developed independently. Turing's paper was received for publication in May 1936, followed by Post's in October. A Post–Turing machine uses a binary alphabet, an infinite sequence of binary storage locations, and a primitive programming language with instructions for bi-directional movement among the storage locations and alteration of their contents one at a time. The names "Post–Turing program" and "Post–Turing machine" were used by Martin Davis in 1973–1974. Later in 1980, Davis used the name "Turing–Post program".
In computer science, recursion is a method of solving a computational problem where the solution depends on solutions to smaller instances of the same problem. Recursion solves such recursive problems by using functions that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science.
The power of recursion evidently lies in the possibility of defining an infinite set of objects by a finite statement. In the same manner, an infinite number of computations can be described by a finite recursive program, even if this program contains no explicit repetitions.
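A small Python example of that point: Euclid's algorithm is a finite recursive definition containing no explicit repetition, yet it describes a computation for every pair of natural numbers.

```python
def gcd(a, b):
    # Calls itself on a smaller instance of the same problem.
    return a if b == 0 else gcd(b, a % b)

assert gcd(48, 36) == 12
```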
Algorithm characterizations are attempts to formalize the word algorithm. Algorithm does not have a generally accepted formal definition. Researchers are actively working on this problem. This article will present some of the "characterizations" of the notion of "algorithm" in more detail.
A counter machine is an abstract machine used in formal logic and theoretical computer science to model computation. It is the most primitive of the four types of register machines. A counter machine comprises a set of one or more unbounded registers, each of which can hold a single non-negative integer, and a list of arithmetic and control instructions for the machine to follow. The counter machine is typically used in the process of designing parallel algorithms in relation to the mutual exclusion principle. When used in this manner, the counter machine models the discrete time-steps of a computational system in relation to memory accesses. By modeling computations in relation to the memory accesses for each respective computational step, parallel algorithms may be designed in such a manner as to avoid interlocking, the simultaneous writing operation by two threads to the same memory address.
Although some authors use the name "register machine" synonymously with "counter machine", this article will give details and examples of only the most primitive species – the "counter machine" – of the genus "register machine."
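A toy Python sketch of a counter machine with an illustrative three-instruction set (INC, DEC, JZ); the instruction names and the sample program are placeholders chosen only to show the flavor of the model.

```python
def run(program, registers):
    """Execute a counter-machine program given as a list of tuples."""
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "INC":                      # increment a register
            registers[args[0]] += 1
            pc += 1
        elif op == "DEC":                    # decrement a register (never below 0)
            registers[args[0]] = max(0, registers[args[0]] - 1)
            pc += 1
        elif op == "JZ":                     # jump if a register is zero
            reg, target = args
            pc = target if registers[reg] == 0 else pc + 1
        else:
            raise ValueError(f"unknown instruction {op}")
    return registers

# Add register 1 into register 0:  while R1 != 0: R1 -= 1; R0 += 1
program = [
    ("JZ", 1, 4),   # 0: if R1 == 0, jump past the end (halt)
    ("DEC", 1),     # 1: R1 -= 1
    ("INC", 0),     # 2: R0 += 1
    ("JZ", 2, 0),   # 3: unconditional jump to 0 (R2 is always 0)
]
assert run(program, {0: 3, 1: 2, 2: 0}) == {0: 5, 1: 0, 2: 0}
```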
The history of the Church–Turing thesis ("thesis") involves the history of the development of the study of the nature of functions whose values are effectively calculable; or, in more modern terms, functions whose values are algorithmically computable. It is an important topic in modern mathematical theory and computer science, particularly associated with the work of Alonzo Church and Alan Turing.
In computability theory, the T predicate, first studied by mathematician Stephen Cole Kleene, is a particular set of triples of natural numbers that is used to represent computable functions within formal theories of arithmetic. Informally, the T predicate tells whether a particular computer program will halt when run with a particular input, and the corresponding U function is used to obtain the results of the computation if the program does halt. As with the smn theorem, the original notation used by Kleene has become standard terminology for the concept.
In mathematical logic, a witness is a specific value t to be substituted for the variable x of an existential statement of the form ∃xφ(x) such that φ(t) is true. For example, 3 is a witness for the statement ∃x (x + 2 = 5).