Epsilon calculus

In logic, Hilbert's epsilon calculus is an extension of a formal language by the epsilon operator, in which the epsilon operator substitutes for quantifiers in that language, as a method leading to a proof of consistency for the extended formal language. The epsilon operator and the epsilon substitution method are typically applied to a first-order predicate calculus, followed by a demonstration of consistency. The epsilon-extended calculus is then further extended and generalized to cover those mathematical objects, classes, and categories for which one wishes to show consistency, building on the consistency already shown at earlier levels. [1]

Epsilon operator

Hilbert notation

For any formal language L, extend L by adding the epsilon operator to redefine quantification:

  (∃x) A(x)  ≡  A(ϵxA)
  (∀x) A(x)  ≡  A(ϵx(¬A))

The intended interpretation of ϵxA is some x that satisfies A, if it exists. In other words, ϵxA returns some term t such that A(t) is true, otherwise it returns some default or arbitrary term. If more than one term can satisfy A, then any one of these terms (which make A true) can be chosen, non-deterministically. Equality is required to be defined under L, and the only rules required for L extended by the epsilon operator are modus ponens and the substitution of A(t) to replace A(x) for any term t. [2]
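
On a finite, nonempty domain the epsilon operator can be imitated by a search over the elements that satisfy A, which makes the two definitions above easy to experiment with. The following Python sketch is purely illustrative; the function names and the "first element found, otherwise an arbitrary default" convention are choices made here, not part of the calculus:

    # Illustrative finite-domain stand-in for the epsilon operator.
    # epsilon(domain, A) returns some element satisfying A if one exists,
    # otherwise an arbitrary default element of the (nonempty) domain.
    def epsilon(domain, A):
        for x in domain:
            if A(x):
                return x
        return next(iter(domain))  # arbitrary default when nothing satisfies A

    def exists(domain, A):
        # (∃x) A(x) ≡ A(ϵxA)
        return A(epsilon(domain, A))

    def forall(domain, A):
        # (∀x) A(x) ≡ A(ϵx(¬A)): ask the epsilon operator for a counterexample
        return A(epsilon(domain, lambda x: not A(x)))

    D = [1, 2, 3, 4]
    print(exists(D, lambda x: x > 3))   # True: ϵx(x > 3) picks 4
    print(forall(D, lambda x: x > 0))   # True: no counterexample can be found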

Bourbaki notation

In tau-square notation from N. Bourbaki's Theory of Sets, the quantifiers are defined as follows:

  ∃x A  ≡  (τx(A)|x)A
  ∀x A  ≡  ¬(τx(¬A)|x)¬A  ≡  ¬∃x ¬A

where A is a relation in L, x is a variable, and τx(A) juxtaposes a τ at the front of A, replaces all instances of x with □, and links them back to the τ. Then, letting Y be an assembly, (Y|x)A denotes the replacement of all occurrences of x in A with Y.

This notation is equivalent to the Hilbert notation and is read the same. It is used by Bourbaki to define cardinal assignment since they do not use the axiom of replacement.
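
As a small worked instance of this notation (an illustration given here, not an example taken from Bourbaki), take A to be the relation x = x and Y to be the term ∅. Then:

  (Y|x)A  is the relation  ∅ = ∅
  ∃x (x = x)  ≡  (τx(x = x)|x)(x = x)  ≡  τx(x = x) = τx(x = x)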

Defining quantifiers in this way leads to great inefficiencies. For instance, the expansion of Bourbaki's original definition of the number one, using this notation, has length approximately 4.5 × 10¹², and for a later edition of Bourbaki that combined this notation with the Kuratowski definition of ordered pairs, this number grows to approximately 2.4 × 10⁵⁴. [3]

Modern approaches

Hilbert's program for mathematics was to justify formal systems as consistent relative to constructive or semi-constructive systems. While Gödel's incompleteness results largely undercut Hilbert's program, modern researchers find that the epsilon calculus offers alternatives for approaching proofs of consistency, as described under the epsilon substitution method below.

Epsilon substitution method

A theory to be checked for consistency is first embedded in an appropriate epsilon calculus. Second, a process is developed for re-writing quantified theorems so that they are expressed in terms of epsilon operations, via the epsilon substitution method. Finally, the re-writing process must be shown to normalize, so that the re-written theorems satisfy the axioms of the theory. [4]
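
The idea behind the second and third steps can be conveyed by a deliberately simplified, purely finitary sketch (a toy version handling a single epsilon term, not Hilbert's full method): the epsilon term starts at a default value, and whenever a critical formula A(t) → A(ϵxA) comes out false under the current assignment, the value of ϵxA is updated to the witness t. The substance of the consistency proofs lies in showing that such updating terminates.

    # Toy illustration of the epsilon substitution idea (single epsilon term).
    # critical: list of (A, t) pairs standing for critical formulas A(t) -> A(eps).
    def substitution_process(critical, default=0):
        eps = default                    # current value assigned to the epsilon term
        changed = True
        while changed:
            changed = False
            for A, t in critical:
                if A(t) and not A(eps):  # critical formula false under current assignment
                    eps = t              # update the epsilon term to the witness t
                    changed = True
        return eps

    # Example with A(x): "x > 5" and witnesses 3 and 8.
    gt5 = lambda n: n > 5
    print(substitution_process([(gt5, 3), (gt5, 8)]))  # 8: all critical formulas now hold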

Notes

  1. Stanford, overview section
  2. Stanford, the epsilon calculus section
  3. Mathias, A. R. D. (2002), "A term of length 4 523 659 424 929" (PDF), Synthese, 133 (1–2): 75–86, doi:10.1023/A:1020827725055, MR 1950044.
  4. Stanford, more recent developments section

Related Research Articles

First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.
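
Written as a formula (one standard way to formalize the example just given), this reads:

  ∃x (Socrates(x) ∧ Man(x))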

Original proof of Gödel's completeness theorem

The proof of Gödel's completeness theorem given by Kurt Gödel in his doctoral dissertation of 1929 is not easy to read today; it uses concepts and formalisms that are no longer used and terminology that is often obscure. The version given below attempts to represent all the steps in the proof and all the important ideas faithfully, while restating the proof in the modern language of mathematical logic. This outline should not be considered a rigorous proof of the theorem.

Logical connective

In logic, a logical connective is a logical constant. Connectives can be used to connect logical formulas. For instance, in the syntax of propositional logic, the binary connective ∨ can be used to join the two atomic formulas p and q, rendering the complex formula p ∨ q.

In mathematics and theoretical computer science, a type theory is the formal presentation of a specific type system. Type theory is the academic study of type systems.

In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning. This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.

De Morgan's laws

In propositional logic and Boolean algebra, De Morgan's laws, also known as De Morgan's theorem, are a pair of transformation rules that are both valid rules of inference. They are named after Augustus De Morgan, a 19th-century British mathematician. The rules allow the expression of conjunctions and disjunctions purely in terms of each other via negation.
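
Stated symbolically for propositions p and q, the two transformation rules are:

  ¬(p ∧ q) ≡ (¬p) ∨ (¬q)
  ¬(p ∨ q) ≡ (¬p) ∧ (¬q)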

Negation

In logic, negation, also called the logical not or logical complement, is an operation that takes a proposition P to another proposition "not P", standing for "P is not true", written, for example, ¬P or ∼P. It is interpreted intuitively as being true when P is false, and false when P is true. Negation is thus a unary logical connective. It may be applied as an operation on notions, propositions, truth values, or semantic values more generally. In classical logic, negation is normally identified with the truth function that takes truth to falsity. In intuitionistic logic, according to the Brouwer–Heyting–Kolmogorov interpretation, the negation of a proposition P is the proposition whose proofs are the refutations of P.

Intuitionistic logic, sometimes more generally called constructive logic, refers to systems of symbolic logic that differ from the systems used for classical logic by more closely mirroring the notion of constructive proof. In particular, systems of intuitionistic logic do not assume the law of the excluded middle and double negation elimination, which are fundamental inference rules in classical logic.

In mathematical logic, sequent calculus is a style of formal logical argumentation in which every line of a proof is a conditional tautology instead of an unconditional tautology. Each conditional tautology is inferred from other conditional tautologies on earlier lines in a formal argument according to rules and procedures of inference, giving a better approximation to the natural style of deduction used by mathematicians than to David Hilbert's earlier style of formal logic, in which every line was an unconditional tautology. More subtle distinctions may exist; for example, propositions may implicitly depend upon non-logical axioms. In that case, sequents signify conditional theorems in a first-order language rather than conditional tautologies.
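
For example (a standard illustration, not tied to any particular presentation of the calculus), the sequent A ∧ B ⊢ A says that A follows from the assumption A ∧ B. In many presentations it is derived from the sequent A, B ⊢ A, an instance of an initial sequent, by the rule that introduces a conjunction on the left:

   A, B ⊢ A
  ───────────
  A ∧ B ⊢ A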

In logic, temporal logic is any system of rules and symbolism for representing, and reasoning about, propositions qualified in terms of time. It is sometimes also used to refer to tense logic, a modal logic-based system of temporal logic introduced by Arthur Prior in the late 1950s, with important contributions by Hans Kamp. It has been further developed by computer scientists, notably Amir Pnueli, and logicians.

In mathematical logic, propositional logic and predicate logic, a well-formed formula, abbreviated WFF or wff, often simply formula, is a finite sequence of symbols from a given alphabet that is part of a formal language. A formal language can be identified with the set of formulas in the language.

In mathematical logic and computer science, the calculus of constructions (CoC) is a type theory created by Thierry Coquand. It can serve as both a typed programming language and as constructive foundation for mathematics. For this second reason, the CoC and its variants have been the basis for Coq and other proof assistants.

An infinitary logic is a logic that allows infinitely long statements and/or infinitely long proofs. The concept was introduced by Zermelo in the 1930s.

Material conditional

The material conditional is an operation commonly used in logic. When the conditional symbol → is interpreted as material implication, a formula P → Q is true unless P is true and Q is false. Material implication can also be characterized inferentially by modus ponens, modus tollens, conditional proof, and classical reductio ad absurdum.
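
Under this truth-functional reading, the conditional can be expressed through the other connectives by a standard equivalence of classical propositional logic:

  P → Q  ≡  ¬(P ∧ ¬Q)  ≡  ¬P ∨ Q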

A choice function (selector, selection) is a mathematical function f that is defined on some collection X of nonempty sets and assigns to each set S in that collection some element f(S) of S. In other words, f is a choice function for X if and only if it belongs to the direct product of X.
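
For a finite collection the definition is easy to instantiate. The following Python snippet is only an illustration, and the particular rule of always picking the minimum is one arbitrary choice among many:

    # A choice function on a collection of nonempty finite sets of numbers:
    # assign to each set S an element f(S) that belongs to S, here its minimum.
    def f(S):
        return min(S)

    X = [{3, 1, 4}, {1, 5}, {9}]
    print([f(S) for S in X])  # [1, 1, 9] -- each chosen value belongs to its set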

The simply typed lambda calculus, a form of type theory, is a typed interpretation of the lambda calculus with only one type constructor that builds function types. It is the canonical and simplest example of a typed lambda calculus. The simply typed lambda calculus was originally introduced by Alonzo Church in 1940 as an attempt to avoid paradoxical use of the untyped lambda calculus.

In mathematical logic, Heyting arithmetic is an axiomatization of arithmetic in accordance with the philosophy of intuitionism. It is named after Arend Heyting, who first proposed it.

In logic, especially mathematical logic, a Hilbert system, sometimes called Hilbert calculus, Hilbert-style deductive system or Hilbert–Ackermann system, is a type of system of formal deduction attributed to Gottlob Frege and David Hilbert. These deductive systems are most often studied for first-order logic, but are of interest for other logics as well.

A Hindley–Milner (HM) type system is a classical type system for the lambda calculus with parametric polymorphism. It is also known as Damas–Milner or Damas–Hindley–Milner. It was first described by J. Roger Hindley and later rediscovered by Robin Milner. Luis Damas contributed a close formal analysis and proof of the method in his PhD thesis.

In logic, a quantifier is an operator that specifies how many individuals in the domain of discourse satisfy an open formula. For instance, the universal quantifier ∀ in the first-order formula ∀x P(x) expresses that everything in the domain satisfies the property denoted by P. On the other hand, the existential quantifier ∃ in the formula ∃x P(x) expresses that there exists something in the domain which satisfies that property. A formula where a quantifier takes widest scope is called a quantified formula. A quantified formula must contain a bound variable and a subformula specifying a property of the referent of that variable.

References

  Avigad, Jeremy; Zach, Richard. "The Epsilon Calculus". Stanford Encyclopedia of Philosophy.