Laws of Form (hereinafter LoF) is a book by G. Spencer-Brown, published in 1969, that straddles the boundary between mathematics and philosophy. LoF describes three distinct logical systems: the primary arithmetic, the primary algebra, and equations of the second degree.
"Boundary algebra" is a Meguire (2011) term for the union of the primary algebra and the primary arithmetic. Laws of Form sometimes loosely refers to the "primary algebra" as well as to LoF.
The preface states that the work was first explored in 1959, and Spencer-Brown cites Bertrand Russell as being supportive of his endeavour. He also thanks J. C. P. Miller of University College London for helping with the proofreading and offering other guidance. In 1963 Spencer-Brown was invited by Harry Frost, staff lecturer in the physical sciences at the Department of Extra-Mural Studies of the University of London, to deliver a course on the mathematics of logic.
LoF emerged from work in electronic engineering that its author did around 1960. Key ideas of LoF were first outlined in his 1961 manuscript Design with the Nor, which remained unpublished until 2021,[1] and were further refined during subsequent lectures on mathematical logic he gave under the auspices of the University of London's extension program. LoF has appeared in several editions. The second series of editions appeared in 1972 with the "Preface to the First American Edition", which emphasised the use of self-referential paradoxes;[2] the most recent edition is a 1997 German translation. LoF has never gone out of print.
LoF's mystical and declamatory prose and its love of paradox make it a challenging read for all. Spencer-Brown was influenced by Wittgenstein and R. D. Laing. LoF also echoes a number of themes from the writings of Charles Sanders Peirce, Bertrand Russell, and Alfred North Whitehead.
The work has had curious effects on some of its readership. For example, it has been claimed, on obscure grounds, that the entire book is written in an operational way, giving instructions to the reader instead of telling them what "is", and that, in accordance with G. Spencer-Brown's interest in paradoxes, the only sentence that makes a statement that something is, is the statement saying that no such statements are used in the book.[3] The claim further asserts that, except for this one sentence, the book can be seen as an example of E-Prime. What prompted such a claim is obscure, whether in terms of incentive, logical merit, or as a matter of fact, because the book routinely and naturally uses the verb "to be" throughout, in all its grammatical forms, as may be seen both in the original and in the quotes shown below.[4]
Ostensibly a work of formal mathematics and philosophy, LoF became something of a cult classic: it was praised by Heinz von Foerster when he reviewed it for the Whole Earth Catalog.[5] Admirers point to LoF as embodying an enigmatic "mathematics of consciousness", its algebraic symbolism capturing an (perhaps even "the") implicit root of cognition: the ability to "distinguish". LoF argues that the primary algebra reveals striking connections among logic, Boolean algebra, and arithmetic, and the philosophy of language and mind.
Stafford Beer wrote in a review for Nature, "When one thinks of all that Russell went through sixty years ago, to write the Principia, and all we his readers underwent in wrestling with those three vast volumes, it is almost sad".[6]
Banaschewski (1977)[7] argues that the primary algebra is nothing but new notation for Boolean algebra. Indeed, the two-element Boolean algebra 2 can be seen as the intended interpretation of the primary algebra. Yet the notation of the primary algebra:
Moreover, the syntax of the primary algebra can be extended to formal systems other than 2 and sentential logic, resulting in boundary mathematics (see § Related work below).
LoF has influenced, among others, Heinz von Foerster, Louis Kauffman, Niklas Luhmann, Humberto Maturana, Francisco Varela and William Bricken. Some of these authors have modified the primary algebra in a variety of interesting ways.
LoF claimed that certain well-known mathematical conjectures of very long standing, such as the four color theorem, Fermat's Last Theorem, and the Goldbach conjecture, are provable using extensions of the primary algebra. Spencer-Brown eventually circulated a purported proof of the four color theorem, but it met with skepticism.[8]
The symbol, also called the "mark" or "cross", is the essential feature of the Laws of Form. In Spencer-Brown's inimitable and enigmatic fashion, the Mark symbolizes the root of cognition, i.e., the dualistic Mark indicates the capability of differentiating a "this" from "everything else but this".
In LoF, a Cross denotes the drawing of a "distinction", and can be thought of as signifying the following, all at once:
All three ways imply an action on the part of the cognitive entity (e.g., person) making the distinction. As LoF puts it:
"The first command:
- Draw a distinction
can well be expressed in such ways as:
- Let there be a distinction,
- Find a distinction,
- See a distinction,
- Describe a distinction,
- Define a distinction,
Or:
- Let a distinction be drawn". (LoF, Notes to chapter 2)
The counterpoint to the Marked state is the Unmarked state, which is simply nothing: the void, or the inexpressible infinite, represented by a blank space. It is simply the absence of a Cross. No distinction has been made and nothing has been crossed. The Marked state and the void are the two primitive values of the Laws of Form.
The Cross can be seen as denoting the distinction between two states, one "considered as a symbol" and another not so considered. From this fact arises a curious resonance with some theories of consciousness and language. Paradoxically, the Form is at once Observer and Observed, and is also the creative act of making an observation. LoF (excluding back matter) closes with the words:
...the first distinction, the Mark and the observer are not only interchangeable, but, in the form, identical.
C. S. Peirce came to a related insight in the 1890s; see § Related work.
The syntax of the primary arithmetic goes as follows. There are just two atomic expressions: the empty Cross, and the blank (empty) page, which denotes the void.
There are two inductive rules: any expression may be Crossed (enclosed by a Cross), and any two expressions may be concatenated (written side by side).
The semantics of the primary arithmetic are perhaps nothing more than the sole explicit definition in LoF: "Distinction is perfect continence".
Let the "unmarked state" be a synonym for the void. Let an empty Cross denote the "marked state". To cross is to move from one value, the unmarked or marked state, to the other. We can now state the "arithmetical" axioms A1 and A2, which ground the primary arithmetic (and hence all of the Laws of Form). In what follows, the Cross is written as a pair of enclosing parentheses, so that the empty Cross appears as '()' and the void as a blank.
"A1. The law of Calling". Calling twice from a state is indistinguishable from calling once. To make a distinction twice has the same effect as making it once. For example, saying "Let there be light" and then saying "Let there be light" again, is the same as saying it once. Formally:
"A2. The law of Crossing". After crossing from the unmarked to the marked state, crossing again ("recrossing") starting from the marked state returns one to the unmarked state. Hence recrossing annuls crossing. Formally:
In both A1 and A2, the expression to the right of '=' has fewer symbols than the expression to the left of '='. This suggests that every primary arithmetic expression can, by repeated application of A1 and A2, be simplified to one of two states: the marked or the unmarked state. This is indeed the case, and the result is the expression's "simplification". The two fundamental metatheorems of the primary arithmetic state that every finite expression has a unique simplification, and that the marked and unmarked states are not interchangeable: no application of A1 and A2 turns one into the other.
Thus the relation of logical equivalence partitions all primary arithmetic expressions into two equivalence classes: those that simplify to the Cross, and those that simplify to the void.
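This simplification can be mechanized. The following Python sketch is an illustration, not anything from LoF: it encodes a primary arithmetic expression as a string, writing the Cross as a pair of parentheses as above, and repeatedly applies A1 ('()()' becomes '()') and A2 ('(())' becomes nothing) until neither rule applies.

```python
def simplify(expr: str) -> str:
    """Simplify a primary arithmetic expression written with '(' and ')'
    as the two halves of a Cross, e.g. '((())())'.  The result is either
    '()' (the marked state) or '' (the unmarked state, the void)."""
    changed = True
    while changed:
        changed = False
        if "(())" in expr:               # A2, Crossing: a Cross inside a Cross cancels
            expr = expr.replace("(())", "")
            changed = True
        if "()()" in expr:               # A1, Calling: adjacent empty Crosses condense
            expr = expr.replace("()()", "()")
            changed = True
    return expr

assert simplify("()()") == "()"     # A1
assert simplify("(())") == ""       # A2
assert simplify("((())())") == ""   # simplifies to the unmarked state
```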
A1 and A2 have loose analogs in the properties of series and parallel electrical circuits, and in other ways of diagramming processes, including flowcharting. A1 corresponds to a parallel connection and A2 to a series connection, with the understanding that making a distinction corresponds to changing how two points in a circuit are connected, and not simply to adding wiring.
The primary arithmetic is analogous to the following formal languages from mathematics and computer science:
The phrase "calculus of indications" in LoF is a synonym for "primary arithmetic".
A concept peculiar to LoF is that of "canon". While LoF does not formally define canon, the following two excerpts from the Notes to chapter 2 are apt:
The more important structures of command are sometimes called canons. They are the ways in which the guiding injunctions appear to group themselves in constellations, and are thus by no means independent of each other. A canon bears the distinction of being outside (i.e., describing) the system under construction, but a command to construct (e.g., 'draw a distinction'), even though it may be of central importance, is not a canon. A canon is an order, or set of orders, to permit or allow, but not to construct or create.
...the primary form of mathematical communication is not description but injunction... Music is a similar art form, the composer does not even attempt to describe the set of sounds he has in mind, much less the set of feelings occasioned through them, but writes down a set of commands which, if they are obeyed by the performer, can result in a reproduction, to the listener, of the composer's original experience.
These excerpts relate to the distinction in metalogic between the object language, the formal language of the logical system under discussion, and the metalanguage, a language (often a natural language) distinct from the object language, employed to exposit and discuss the object language. The first quote seems to assert that the canons are part of the metalanguage. The second quote seems to assert that statements in the object language are essentially commands addressed to the reader by the author. Neither assertion holds in standard metalogic.
Given any valid primary arithmetic expression, insert into one or more locations any number of Latin letters bearing optional numerical subscripts; the result is a primary algebra formula. Letters so employed in mathematics and logic are called variables. A primary algebra variable indicates a location where one can write the primitive value or its complement. Multiple instances of the same variable denote multiple locations of the same primitive value.
The sign '=' may link two logically equivalent expressions; the result is an equation. By "logically equivalent" is meant that the two expressions have the same simplification. Logical equivalence is an equivalence relation over the set of primary algebra formulas, governed by the rules R1 and R2. Let "C" and "D" be formulae each containing at least one instance of the subformula A:
R2 is employed very frequently in primary algebra demonstrations (see below), almost always silently. These rules are routinely invoked in logic and most of mathematics, nearly always unconsciously.
The primary algebra consists of equations, i.e., pairs of formulae linked by an infix operator '='. R1 and R2 enable transforming one equation into another. Hence the primary algebra is an equational formal system, like the many algebraic structures, including Boolean algebra, that are varieties. Equational logic was common before Principia Mathematica (e.g. Johnson (1892)), and has present-day advocates (Gries & Schneider (1993)).
Conventional mathematical logic consists of tautological formulae, signalled by a prefixed turnstile. To denote that the primary algebra formula A is a tautology, simply write "A = ()". If one replaces '=' in R1 and R2 with the biconditional, the resulting rules hold in conventional logic. However, conventional logic relies mainly on the rule modus ponens; thus conventional logic is ponential. The equational-ponential dichotomy distills much of what distinguishes mathematical logic from the rest of mathematics.
An initial is a primary algebra equation verifiable by a decision procedure and as such is not an axiom. LoF lays down the initials:

J1: ((A)A) =

The absence of anything to the right of the "=" above is deliberate.

J2: ((AC)(BC)) = ((A)(B))C.
J2 is the familiar distributive law of sentential logic and Boolean algebra.
Another set of initials, friendlier to calculations, is:
J0: (())A = A.

J1a: (A)A = ().

C2: A(AB) = A(B).
It is thanks to C2 that the primary algebra is a lattice. By virtue of J1a, it is a complemented lattice whose upper bound is (). By J0, (()) is the corresponding lower bound and identity element. J0 is also an algebraic version of A2 and makes clear the sense in which (()) aliases with the blank page.
T13 in LoF generalizes C2 as follows. Any primary algebra (or sentential logic) formula B can be viewed as an ordered tree with branches. Then:
T13: A subformula A can be copied at will into any depth of B greater than that of A, as long as A and its copy are in the same branch of B. Also, given multiple instances of A in the same branch of B, all instances but the shallowest are redundant.
While a proof of T13 would require induction, the intuition underlying it should be clear.
C2, or an equivalent rule, goes by various names in the literature. Perhaps the first instance of an axiom or rule with the power of C2 was the "Rule of (De)Iteration", combining T13 and AA=A, of C. S. Peirce's existential graphs.
LoF asserts that concatenation can be read as commuting and associating by default and hence need not be explicitly assumed or demonstrated. (Peirce made a similar assertion about his existential graphs.) Let a period be a temporary notation to establish grouping. That concatenation commutes and associates may then be demonstrated from the initials.
Having demonstrated associativity, the period can be discarded.
The initials in Meguire (2011) are AC.D=CD.A, called B1; B2, J0 above; B3, J1a above; and B4, C2. By design, these initials are very similar to the axioms for an abelian group, G1-G3 below.
The primary algebra contains three kinds of proved assertions:
The distinction between consequence and theorem holds for all formal systems, including mathematics and logic, but is usually not made explicit. A demonstration or decision procedure can be carried out and verified by computer. The proof of a theorem cannot be.
Let A and B be primary algebra formulas. A demonstration of A=B may proceed in either of two ways:
Once A=B has been demonstrated, A=B can be invoked to justify steps in subsequent demonstrations. Demonstrations and calculations in the primary algebra often require no more than J1a, J2, C2, and the consequences ()A = () (C3 in LoF), ((A)) = A (C1), and AA = A (C5).
The consequence C7' in LoF enables an algorithm, sketched in LoF's proof of T14, that transforms an arbitrary primary algebra formula into an equivalent formula whose depth does not exceed two. The result is a normal form, the primary algebra analog of the conjunctive normal form. LoF (T14–15) proves the primary algebra analog of the well-known Boolean algebra theorem that every formula has a normal form.
Let A be a subformula of some formula B. When paired with C3, J1a can be viewed as the closure condition for calculations: B is a tautology if and only if A and (A) both appear in depth 0 of B. A related condition appears in some versions of natural deduction. A demonstration by calculation is often little more than:
The last step of a calculation always invokes J1a.
LoF includes elegant new proofs of the following standard metatheory: that the primary algebra is complete (all its consequences can be demonstrated from the initials) and that its initials are independent.
That sentential logic is complete is taught in every first university course in mathematical logic. But university courses in Boolean algebra seldom mention the completeness of 2.
If the Marked and Unmarked states are read as the Boolean values 1 and 0 (or True and False), the primary algebra interprets 2 (or sentential logic). LoF shows how the primary algebra can interpret the syllogism. Each of these interpretations is discussed in a subsection below. Extending the primary algebra so that it could interpret standard first-order logic has yet to be done, but Peirce's beta existential graphs suggest that this extension is feasible.
The primary algebra is an elegant minimalist notation for the two-element Boolean algebra 2. Let one primitive value be read as 0 and the other as 1, the Cross be read as complementation, and concatenation be read as one of the two lattice operations. If join (meet) interprets AC, then meet (join) interprets ((A)(C)). Hence the primary algebra and 2 are isomorphic but for one detail: primary algebra complementation can be nullary, in which case it denotes a primitive value. Modulo this detail, 2 is a model of the primary algebra. The primary arithmetic suggests the following arithmetic axiomatization of 2: 1+1 = 1+0 = 0+1 = 1 = ~0, and 0+0 = 0 = ~1.
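This axiomatization can be checked directly. A minimal Python sketch, with the hypothetical names plus and comp standing in for '+' and '~', and 1 and 0 read as True and False:

```python
# '+' read as Boolean join (OR) and '~' as complement (NOT), with 1 = True, 0 = False.
def plus(a, b):
    return a or b

def comp(a):
    return int(not a)

assert plus(1, 1) == plus(1, 0) == plus(0, 1) == 1 == comp(0)
assert plus(0, 0) == 0 == comp(1)
```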
The set consisting of the two primitive values is the Boolean domain or carrier. In the language of universal algebra, the primary algebra is an algebraic structure of type 〈2,1,0〉: one binary operation (concatenation), one unary operation (the Cross), and one distinguished element (the empty Cross). The expressive adequacy of the Sheffer stroke points to the primary algebra also being an algebra of type 〈2〉, built up from a single binary operation. In both cases, the identities are J1a, J0, C2, and ACD=CDA. Since the primary algebra and 2 are isomorphic, 2 can be seen as an algebra of type 〈2〉. This description of 2 is simpler than the conventional one, namely an algebra of type 〈2,2,1,0,0〉.
The two possible interpretations are dual to each other in the Boolean sense. (In Boolean algebra, exchanging AND ↔ OR and 1 ↔ 0 throughout an equation yields an equally valid equation.) The identities remain invariant regardless of which interpretation is chosen, so the transformations or modes of calculation remain the same; only the interpretation of each form differs. Example: J1a is (A)A = (). Interpreting juxtaposition as OR and the Cross as 1, this translates to ¬A ∨ A = 1, which is true. Interpreting juxtaposition as AND and the Cross as 0, this translates to ¬A ∧ A = 0, which is true as well (and the dual of ¬A ∨ A = 1).
The marked state, (), is both an operator (e.g., the complement) and an operand (e.g., the value 1). This can be summarized neatly by defining two functions, m for the marked and u for the unmarked state: for a (possibly empty) set X of Boolean values, let u(X) be the disjunction (OR) of the members of X and let m(X) be the negation of that disjunction.

This reveals that u is either the value 0 or the OR operator, while m is either the value 1 or the NOR operator, depending on whether X is the empty set or not. As noted above, there is a dual form of these functions exchanging AND ↔ OR and 1 ↔ 0.
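A minimal Python sketch of the two functions just described, under the hypothetical names unmarked (u) and marked (m):

```python
from typing import Iterable

def unmarked(xs: Iterable[bool] = ()) -> bool:
    """u: the blank read as an operator, i.e. OR over a possibly empty
    collection of Boolean values; with no arguments it is the value 0 (False)."""
    return any(xs)

def marked(xs: Iterable[bool] = ()) -> bool:
    """m: the Cross read as an operator, i.e. NOR over a possibly empty
    collection of Boolean values; with no arguments it is the value 1 (True)."""
    return not any(xs)

assert marked() is True and unmarked() is False   # operands: the two values
assert marked([True, False]) is False             # operator: NOR
assert marked([marked([True])]) is True           # ((A)) = A with A = True
```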
Let the blank page denote False, and let a Cross be read as Not. Then the primary arithmetic has the following sentential reading: the void reads as False, the empty Cross () reads as Not False, that is, True, and (()) reads as Not True, that is, False.
The primary algebra interprets sentential logic as follows. A letter represents any given sentential expression. Thus:

(A) interprets Not A;
AB interprets A Or B;
(A)B interprets If A Then B;
((A)(B)) interprets A And B;
(((A)B)((B)A)) and ((A)(B))(AB) both interpret A if and only if B, or A is equivalent to B.
Thus any expression in sentential logic has a primary algebra translation. Equivalently, the primary algebra interprets sentential logic. Given an assignment of every variable to the Marked or Unmarked state, this primary algebra translation reduces to a primary arithmetic expression, which can be simplified. Repeating this exercise for all possible assignments of the two primitive values to each variable reveals whether the original expression is tautological or satisfiable. This is an example of a decision procedure, one more or less in the spirit of conventional truth tables. Given some primary algebra formula containing N variables, this decision procedure requires simplifying 2^N primary arithmetic formulae. For a less tedious decision procedure more in the spirit of Quine's "truth value analysis", see Meguire (2003).
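The decision procedure can be sketched in a few lines of Python under the reading blank = False, Cross = Not, concatenation = Or. The constructors cross and juxt, and the helpers value, variables and is_tautology, are illustrative conveniences, not notation from LoF.

```python
from itertools import product

def cross(*parts):
    """A Cross enclosing zero or more subformulas."""
    return ("cross", parts)

def juxt(*parts):
    """Juxtaposition (concatenation) of subformulas."""
    return ("juxt", parts)

def value(formula, env):
    """Evaluate a formula under the reading: variable -> env lookup,
    juxtaposition -> Or, Cross -> Not-Or (NOR)."""
    if isinstance(formula, str):
        return env[formula]
    tag, parts = formula
    vals = [value(p, env) for p in parts]
    return not any(vals) if tag == "cross" else any(vals)

def variables(formula):
    """Collect the variable names occurring in a formula."""
    if isinstance(formula, str):
        return {formula}
    vs = set()
    for p in formula[1]:
        vs |= variables(p)
    return vs

def is_tautology(formula):
    """Brute-force decision procedure: check all 2**N assignments."""
    vs = sorted(variables(formula))
    return all(value(formula, dict(zip(vs, bits)))
               for bits in product([False, True], repeat=len(vs)))

# (A)A, i.e. J1a, reads as "not-A or A" and is a tautology.
assert is_tautology(juxt(cross("A"), "A"))
# (A)B reads as "if A then B"; satisfiable, but not a tautology.
assert not is_tautology(juxt(cross("A"), "B"))
```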
Schwartz (1981) proved that the primary algebra is equivalent — syntactically, semantically, and proof theoretically — with the classical propositional calculus. Likewise, it can be shown that the primary algebra is syntactically equivalent with expressions built up in the usual way from the classical truth values true and false, the logical connectives NOT, OR, and AND, and parentheses.
Interpreting the Unmarked State as False is wholly arbitrary; that state can equally well be read as True. All that is required is that the interpretation of concatenation change from OR to AND. IF A THEN B now translates as (A(B)) instead of (A)B. More generally, the primary algebra is "self-dual", meaning that any primary algebra formula has two sentential or Boolean readings, each the dual of the other. Another consequence of self-duality is the irrelevance of De Morgan's laws; those laws are built into the syntax of the primary algebra from the outset.
The true nature of the distinction between the primary algebra on the one hand, and 2 and sentential logic on the other, now emerges. In the latter formalisms, complementation/negation operating on "nothing" is not well-formed. But an empty Cross is a well-formed primary algebra expression, denoting the Marked state, a primitive value. Hence a nonempty Cross is an operator, while an empty Cross is an operand because it denotes a primitive value. Thus the primary algebra reveals that the heretofore distinct mathematical concepts of operator and operand are in fact merely different facets of a single fundamental action, the making of a distinction.
Appendix 2 of LoF shows how to translate traditional syllogisms and sorites into the primary algebra. A valid syllogism is simply one whose primary algebra translation simplifies to an empty Cross. Let A* denote a literal, i.e., either A or (A), indifferently. Then every syllogism that does not require that one or more terms be assumed nonempty is one of 24 possible permutations of a generalization of Barbara. These 24 possible permutations include the 19 syllogistic forms deemed valid in Aristotelian and medieval logic. This primary algebra translation of syllogistic logic also suggests that the primary algebra can interpret monadic and term logic, and that the primary algebra has affinities to the Boolean term schemata of Quine (1982), Part II.
The following calculation of Leibniz's nontrivial Praeclarum Theorema exemplifies the demonstrative power of the primary algebra. Let C1 be ((A)) = A, C2 be A(AB) = A(B), C3 be ()A = (), J1a be (A)A = (), and let OI mean that variables and subformulae have been reordered in a way that commutativity and associativity permit.
Praeclarum Theorema: [(P→R)∧(Q→S)]→[(P∧Q)→(R∧S)].

The demonstration first translates the theorem into the primary algebra and then simplifies the translation step by step: C1 is applied twice, then C2 three times (with OI reorderings in between), then C1, J1a, and finally C3, leaving the empty Cross and so establishing the theorem.
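As an independent check, rather than a rendering of LoF's demonstration, the Praeclarum Theorema can also be confirmed by brute force over all sixteen truth-value assignments, in the spirit of the decision procedure described earlier:

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication."""
    return (not a) or b

# Praeclarum Theorema: [(P -> R) and (Q -> S)] -> [(P and Q) -> (R and S)]
assert all(
    implies(implies(p, r) and implies(q, s),
            implies(p and q, r and s))
    for p, q, r, s in product([False, True], repeat=4)
)
```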
The primary algebra embodies a point noted by Huntington in 1933: Boolean algebra requires, in addition to one unary operation, one, and not two, binary operations. Hence the seldom-noted fact that Boolean algebras are magmas. (Magmas were called groupoids until the latter term was appropriated by category theory.) To see this, note that the primary algebra is a commutative semigroup under concatenation, and indeed a monoid, since by J0 the form (()) (equivalently, the void) acts as an identity element.
Groups also require a unary operation, called inverse, the group counterpart of Boolean complementation. Let (a) denote the inverse of a, and let () denote the group identity element. Then groups and the primary algebra have the same signatures, namely they are both algebras of type 〈2,1,0〉. Hence the primary algebra is a boundary algebra. The axioms for an abelian group, in boundary notation, are:
From G1 and G2, the commutativity and associativity of concatenation may be derived, as above. Note that G3 and J1a are identical. G2 and J0 would be identical if (()) = () replaced A2; this is the defining arithmetical identity of group theory, in boundary notation.
The primary algebra differs from an abelian group in two ways:
Both A2 and C2 follow from B's being an ordered set.
Chapter 11 of LoF introduces equations of the second degree, composed of recursive formulae that can be seen as having "infinite" depth. Some recursive formulae simplify to the marked or unmarked state. Others "oscillate" indefinitely between the two states depending on whether a given depth is even or odd. Specifically, certain recursive formulae can be interpreted as oscillating between true and false over successive intervals of time, in which case a formula is deemed to have an "imaginary" truth value. Thus the flow of time may be introduced into the primary algebra.
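Such re-entrant formulae behave like feedback circuits. The sketch below is an illustration under that reading, not LoF's own construction: it iterates the simplest re-entrant equation f = (f), taking the Cross as negation at each discrete time step, and the value never settles but oscillates with period two, the behaviour LoF describes as an "imaginary" truth value.

```python
def iterate(f0: bool, steps: int = 6) -> list:
    """Iterate f(t+1) = not f(t), a reading of the re-entrant equation
    f = (f) with the Cross taken as negation over discrete time steps."""
    values, f = [], f0
    for _ in range(steps):
        values.append(f)
        f = not f   # "crossing" at each time step
    return values

print(iterate(False))  # prints [False, True, False, True, False, True]
```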
Turney (1986) shows how these recursive formulae can be interpreted via Alonzo Church's Restricted Recursive Arithmetic (RRA). Church introduced RRA in 1955 as an axiomatic formalization of finite automata. Turney presents a general method for translating equations of the second degree into Church's RRA, illustrating his method using the formulae E1, E2, and E4 in chapter 11 of LoF. This translation into RRA sheds light on the names Spencer-Brown gave to E1 and E4, namely "memory" and "counter". RRA thus formalizes and clarifies LoF's notion of an imaginary truth value.
Gottfried Leibniz, in memoranda not published before the late 19th and early 20th centuries, invented Boolean logic. His notation was isomorphic to that of LoF: concatenation read as conjunction, and "non-(X)" read as the complement of X. Recognition of Leibniz's pioneering role in algebraic logic was foreshadowed by Lewis (1918) and Rescher (1954). But a full appreciation of Leibniz's accomplishments had to await the work of Wolfgang Lenzen, published in the 1980s and reviewed in Lenzen (2004).
Charles Sanders Peirce (1839–1914) anticipated the primary algebra in three veins of work:
LoF cites vol. 4 of Peirce's Collected Papers, the source for the formalisms in (2) and (3) above. (1)–(3) were virtually unknown at the time (the 1960s) and in the place (the UK) where LoF was written. Peirce's semiotics, about which LoF is silent, may yet shed light on the philosophical aspects of LoF.
Kauffman (2001) discusses another notation similar to that of LoF, that of a 1917 article by Jean Nicod, who was a disciple of Bertrand Russell's.
The above formalisms are, like the primary algebra, all instances of boundary mathematics, i.e., mathematics whose syntax is limited to letters and brackets (enclosing devices). A minimalist syntax of this nature is a "boundary notation". Boundary notation is free of infix, prefix, or postfix operator symbols. The very well known curly braces ('{', '}') of set theory can be seen as a boundary notation.
The work of Leibniz, Peirce, and Nicod is innocent of metatheory, as they wrote before Emil Post's landmark 1920 paper (which LoF cites) proving that sentential logic is complete, and before Hilbert and Łukasiewicz showed how to prove axiom independence using models.
Craig (1979) argued that the world, and how humans perceive and interact with that world, has a rich Boolean structure. Craig was an orthodox logician and an authority on algebraic logic.
Second-generation cognitive science emerged in the 1970s, after LoF was written. On cognitive science and its relevance to Boolean algebra, logic, and set theory, see Lakoff (1987) (see index entries under "Image schema examples: container") and Lakoff & Núñez (2000). Neither book cites LoF.
The biologists and cognitive scientists Humberto Maturana and his student Francisco Varela both discuss LoF in their writings, which identify "distinction" as the fundamental cognitive act. The Berkeley psychologist and cognitive scientist Eleanor Rosch has written extensively on the closely related notion of categorization.
Other formal systems with possible affinities to the primary algebra include:
The primary arithmetic and algebra are a minimalist formalism for sentential logic and Boolean algebra. Other minimalist formalisms having the power of set theory include: