In mathematical logic, minimal axioms for Boolean algebra are assumptions which are equivalent to the axioms of Boolean algebra (or propositional calculus), chosen to be as short as possible. For example, an axiom with six NAND operations and three variables is equivalent to Boolean algebra: [1]

((a | b) | c) | (a | ((a | c) | a)) = c
where the vertical bar represents the NAND logical operation (also known as the Sheffer stroke).
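Since NAND is truth-functional, one can check by brute force that this identity holds in the two-element Boolean algebra {0, 1}; this is a necessary (though by itself not sufficient) condition for it to serve as a single axiom. A minimal sketch in Python:

```python
from itertools import product

def nand(x, y):
    """Sheffer stroke: 1 unless both arguments are 1."""
    return 1 - (x & y)

# Check the candidate identity ((a|b)|c) | (a|((a|c)|a)) = c
# over all assignments in the two-element Boolean algebra.
for a, b, c in product((0, 1), repeat=3):
    lhs = nand(nand(nand(a, b), c), nand(a, nand(nand(a, c), a)))
    assert lhs == c

print("The identity holds for all 8 assignments over {0, 1}.")
```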
It is one of 25 candidate axioms for this property identified by Stephen Wolfram, by enumerating the Sheffer identities of length less than or equal to 15 elements (excluding mirror images) that have no noncommutative models with four or fewer variables, and was first proven equivalent by William McCune, Branden Fitelson, and Larry Wos. [2] [3] MathWorld, a site associated with Wolfram, has named the axiom the "Wolfram axiom". [4] McCune et al. also found a longer single axiom for Boolean algebra based on disjunction and negation. [3]
In 1933, Edward Vermilye Huntington identified the axiom

¬(¬x ∨ y) ∨ ¬(¬x ∨ ¬y) = x
as being equivalent to Boolean algebra, when combined with the commutativity of the OR operation, x ∨ y = y ∨ x, and the assumption of associativity, (x ∨ y) ∨ z = x ∨ (y ∨ z). [5] Herbert Robbins conjectured that Huntington's axiom could be replaced by

¬(¬(x ∨ y) ∨ ¬(x ∨ ¬y)) = x
which requires one fewer use of the logical negation operator ¬. Neither Robbins nor Huntington could prove this conjecture; nor could Alfred Tarski, who took considerable interest in it later. The conjecture was eventually proved in 1996 with the aid of theorem-proving software. [6] [7] [8] This proof established that the Robbins axiom, together with associativity and commutativity, forms a 3-basis for Boolean algebra.
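Both equations are, of course, theorems of two-valued logic; the following sketch (assuming the ∨-and-¬ forms given above) brute-force checks them over {0, 1}:

```python
from itertools import product

NOT = lambda x: 1 - x
OR = lambda x, y: x | y

def huntington(x, y):
    # ¬(¬x ∨ y) ∨ ¬(¬x ∨ ¬y) = x
    return OR(NOT(OR(NOT(x), y)), NOT(OR(NOT(x), NOT(y)))) == x

def robbins(x, y):
    # ¬(¬(x ∨ y) ∨ ¬(x ∨ ¬y)) = x
    return NOT(OR(NOT(OR(x, y)), NOT(OR(x, NOT(y))))) == x

assert all(huntington(x, y) and robbins(x, y)
           for x, y in product((0, 1), repeat=2))
print("Huntington's and Robbins's equations hold in the two-element algebra.")
```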
The existence of a 2-basis was established in 1967 by Carew Arthur Meredith: [9]

The following year, Meredith found a 2-basis in terms of the Sheffer stroke: [10]
In 1973, Padmanabhan and Quackenbush demonstrated a method that, in principle, would yield a 1-basis for Boolean algebra. [11] Applying this method in a straightforward manner yielded "axioms of enormous length", [3] thereby prompting the question of how shorter axioms might be found. This search yielded the 1-basis in terms of the Sheffer stroke given above, as well as the 1-basis
In abstract algebra, a Boolean algebra or Boolean lattice is a complemented distributive lattice. This type of algebraic structure captures essential properties of both set operations and logic operations. A Boolean algebra can be seen as a generalization of a power set algebra or a field of sets, or its elements can be viewed as generalized truth values. It is also a special case of a De Morgan algebra and a Kleene algebra.
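As an illustration of the power-set case, the subsets of any finite set form a complemented distributive lattice with union as join, intersection as meet, and set-theoretic complement; a minimal sketch, assuming a three-element universe chosen here purely for the example:

```python
from itertools import chain, combinations

U = frozenset({1, 2, 3})                         # illustrative universe
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(U, r) for r in range(len(U) + 1))]

comp = lambda a: U - a                           # complementation
for a in subsets:
    # complement laws: a ∨ a' is the universe, a ∧ a' is the empty set
    assert a | comp(a) == U and a & comp(a) == frozenset()
    for b in subsets:
        for c in subsets:
            # distributivity of meet over join
            assert a & (b | c) == (a & b) | (a & c)
print("All", len(subsets), "subsets satisfy the complement and distributive laws.")
```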
In logic, disjunction, also known as logical disjunction, logical or, logical addition, or inclusive disjunction, is a logical connective typically notated as ∨ and read aloud as "or". For instance, the English language sentence "it is sunny or it is warm" can be represented in logic using the disjunctive formula S ∨ W, assuming that S abbreviates "it is sunny" and W abbreviates "it is warm".
First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.
Propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes zeroth-order logic. It deals with propositions and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives. Propositions that contain no logical connectives are called atomic propositions.
In Boolean functions and propositional calculus, the Sheffer stroke denotes a logical operation that is equivalent to the negation of the conjunction operation, expressed in ordinary language as "not both". It is also called non-conjunction, or alternative denial since it says in effect that at least one of its operands is false, or NAND. In digital electronics, it corresponds to the NAND gate. It is named after Henry Maurice Sheffer and written as ↑, |, or /, or as Dpq in Polish notation by Łukasiewicz.
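For reference, a quick sketch printing the NAND truth table and confirming it is the negation of conjunction:

```python
for p in (False, True):
    for q in (False, True):
        nand = not (p and q)
        assert nand == (not p or not q)   # "not both" = at least one operand false
        print(p, q, "->", nand)
```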
In propositional logic and Boolean algebra, De Morgan's laws, also known as De Morgan's theorem, are a pair of transformation rules that are both valid rules of inference. They are named after Augustus De Morgan, a 19th-century British mathematician. The rules allow the expression of conjunctions and disjunctions purely in terms of each other via negation.
In boolean logic, a disjunctive normal form (DNF) is a canonical normal form of a logical formula consisting of a disjunction of conjunctions; it can also be described as an OR of ANDs, a sum of products, or a cluster concept. As a normal form, it is useful in automated theorem proving.
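One standard way to obtain a DNF is to read it off the truth table: take one conjunction (product term) per satisfying row. A minimal sketch for exclusive-or, chosen here purely as an example formula:

```python
from itertools import product

def xor(p, q):            # example formula to convert
    return p != q

vars_ = ("p", "q")
terms = []
for row in product((False, True), repeat=2):
    if xor(*row):         # one conjunction per row where the formula is true
        lits = [(v if val else "~" + v) for v, val in zip(vars_, row)]
        terms.append("(" + " & ".join(lits) + ")")

print(" | ".join(terms))  # (~p & q) | (p & ~q)
```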
In mathematics, the distributive property of binary operations is a generalization of the distributive law, which asserts that the equality x · (y + z) = x · y + x · z is always true in elementary algebra. For example, 2 · (1 + 3) = (2 · 1) + (2 · 3) = 8.
Intuitionistic logic, sometimes more generally called constructive logic, refers to systems of symbolic logic that differ from the systems used for classical logic by more closely mirroring the notion of constructive proof. In particular, systems of intuitionistic logic do not assume the law of the excluded middle and double negation elimination, which are fundamental inference rules in classical logic.
In Boolean logic, logical NOR or non-disjunction or joint denial is a truth-functional operator which produces a result that is the negation of logical or. That is, a sentence of the form (p NOR q) is true precisely when neither p nor q is true, i.e. when both of p and q are false. It is logically equivalent to ¬(p ∨ q) and ¬p ∧ ¬q, where the symbol ¬ signifies logical negation, ∨ signifies OR, and ∧ signifies AND.
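A brute-force check of both stated equivalences (and, implicitly, of the De Morgan law relating them):

```python
for p in (False, True):
    for q in (False, True):
        nor = not (p or q)
        assert nor == ((not p) and (not q))   # joint denial: both operands false
```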
In mathematics, constructive analysis is mathematical analysis done according to some principles of constructive mathematics.
In mathematics, a Heyting algebra (also known as pseudo-Boolean algebra) is a bounded lattice (with join and meet operations written ∨ and ∧ and with least element 0 and greatest element 1) equipped with a binary operation a → b of implication such that (c ∧ a) ≤ b is equivalent to c ≤ (a → b). From a logical standpoint, A → B is by this definition the weakest proposition for which modus ponens, the inference rule A → B, A ⊢ B, is sound. Like Boolean algebras, Heyting algebras form a variety axiomatizable with finitely many equations. Heyting algebras were introduced by Arend Heyting (1930) to formalize intuitionistic logic.
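As a small concrete model (an assumption made purely for illustration), any finite chain such as {0, ½, 1} is a Heyting algebra with min as meet, max as join, and a → b defined as 1 when a ≤ b and as b otherwise; the defining adjunction can be checked exhaustively:

```python
from itertools import product
from fractions import Fraction

H = [Fraction(0), Fraction(1, 2), Fraction(1)]   # a three-element chain

meet = min
imp = lambda a, b: Fraction(1) if a <= b else b  # implication on a chain

for a, b, c in product(H, repeat=3):
    # (c ∧ a) ≤ b  iff  c ≤ (a → b)
    assert (meet(c, a) <= b) == (c <= imp(a, b))
print("The adjunction holds for all triples in the chain.")
```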
In abstract algebra, a branch of pure mathematics, an MV-algebra is an algebraic structure with a binary operation ⊕, a unary operation ¬, and the constant 0, satisfying certain axioms. MV-algebras are the algebraic semantics of Łukasiewicz logic; the letters MV refer to the many-valued logic of Łukasiewicz. MV-algebras coincide with the class of bounded commutative BCK algebras.
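The prototypical example is the standard MV-algebra on the real interval [0, 1], with x ⊕ y = min(1, x + y), ¬x = 1 − x, and constant 0; the following sketch spot-checks a few of the usual identities on a finite grid of rational points:

```python
from fractions import Fraction
from itertools import product

pts = [Fraction(i, 4) for i in range(5)]          # sample points in [0, 1]
oplus = lambda x, y: min(Fraction(1), x + y)
neg = lambda x: Fraction(1) - x

for x, y in product(pts, repeat=2):
    assert neg(neg(x)) == x                       # negation is an involution
    assert oplus(x, Fraction(0)) == x             # 0 is the identity for ⊕
    assert oplus(x, y) == oplus(y, x)             # commutativity
    # the characteristic MV identity: ¬(¬x ⊕ y) ⊕ y = ¬(¬y ⊕ x) ⊕ x
    assert oplus(neg(oplus(neg(x), y)), y) == oplus(neg(oplus(neg(y), x)), x)
```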
Carew Arthur Meredith, usually cited as C. A. Meredith, was an influential Irish logician, who worked in Trinity College, Dublin from 1943 to 1964. His work on condensed detachment is influential in modern research.
In logic, a functionally complete set of logical connectives or Boolean operators is one which can be used to express all possible truth tables by combining members of the set into a Boolean expression. A well-known complete set of connectives is { AND, NOT }. Each of the singleton sets { NAND } and { NOR } is functionally complete. However, the set { AND, OR } is incomplete, due to its inability to express NOT.
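The completeness of { NAND }, for instance, can be made explicit by expressing NOT, AND, and OR through it alone; a minimal sketch:

```python
nand = lambda p, q: not (p and q)

NOT = lambda p: nand(p, p)                 # ¬p    = p NAND p
AND = lambda p, q: NOT(nand(p, q))         # p ∧ q = ¬(p NAND q)
OR  = lambda p, q: nand(NOT(p), NOT(q))    # p ∨ q = ¬p NAND ¬q

for p in (False, True):
    for q in (False, True):
        assert NOT(p) == (not p)
        assert AND(p, q) == (p and q)
        assert OR(p, q) == (p or q)
```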
In abstract algebra, a Robbins algebra is an algebra containing a single binary operation, usually denoted by ∨, and a single unary operation, usually denoted by ¬, satisfying the following axioms:

Associativity: (x ∨ y) ∨ z = x ∨ (y ∨ z)
Commutativity: x ∨ y = y ∨ x
Robbins equation: ¬(¬(x ∨ y) ∨ ¬(x ∨ ¬y)) = x
Boolean algebra is a mathematically rich branch of abstract algebra. The Stanford Encyclopaedia of Philosophy defines Boolean algebra as 'the algebra of two-valued logic with only sentential connectives, or equivalently of algebras of sets under union and complementation.' Just as group theory deals with groups, and linear algebra with vector spaces, Boolean algebras are models of the equational theory of the two values 0 and 1. Common to Boolean algebras, groups, and vector spaces is the notion of an algebraic structure, a set closed under some operations satisfying certain equations.
Łukasiewicz–Moisil algebras were introduced in the 1940s by Grigore Moisil in the hope of giving algebraic semantics for the n-valued Łukasiewicz logic. However, in 1956 Alan Rose discovered that for n ≥ 5, the Łukasiewicz–Moisil algebra does not model the Łukasiewicz logic. A faithful model for the ℵ0-valued (infinitely-many-valued) Łukasiewicz–Tarski logic was provided by C. C. Chang's MV-algebra, introduced in 1958. For the axiomatically more complicated (finite) n-valued Łukasiewicz logics, suitable algebras were published in 1977 by Revaz Grigolia and called MVn-algebras. MVn-algebras are a subclass of LMn-algebras, and the inclusion is strict for n ≥ 5. In 1982 Roberto Cignoli published some additional constraints that added to LMn-algebras produce proper models for n-valued Łukasiewicz logic; Cignoli called his discovery proper Łukasiewicz algebras.
In mathematics and mathematical logic, Boolean algebra is a branch of algebra. It differs from elementary algebra in two ways. First, the values of the variables are the truth values true and false, usually denoted 1 and 0, whereas in elementary algebra the values of the variables are numbers. Second, Boolean algebra uses logical operators such as conjunction (and) denoted as ∧, disjunction (or) denoted as ∨, and the negation (not) denoted as ¬. Elementary algebra, on the other hand, uses arithmetic operators such as addition, multiplication, subtraction, and division. Boolean algebra is therefore a formal way of describing logical operations, in the same way that elementary algebra describes numerical operations.