Relevance logic

Relevance logic, also called relevant logic, is a kind of non-classical logic requiring the antecedent and consequent of implications to be relevantly related. Relevance logics may be viewed as a family of substructural or modal logics. The family is generally, but not universally, called relevant logic by British and, especially, Australian logicians, and relevance logic by American logicians.

Relevance logic aims to capture aspects of implication that are ignored by the "material implication" operator in classical truth-functional logic, namely the notion of relevance between the antecedent and consequent of a true implication. This idea is not new: C. I. Lewis was led to invent modal logic, and specifically strict implication, on the grounds that classical logic validates the paradoxes of material implication, such as the principle that a falsehood implies any proposition. [1] [2] Hence "if I'm a donkey, then two and two is four" is true when translated as a material implication, yet it seems intuitively false, since a true implication must tie the antecedent and consequent together by some notion of relevance, and whether or not the speaker is a donkey seems in no way relevant to whether two and two is four.

In terms of a syntactical constraint for a propositional calculus, it is necessary, but not sufficient, that premises and conclusion share atomic formulae (formulae that do not contain any logical connectives). In a predicate calculus, relevance requires sharing of variables and constants between premises and conclusion. This can be ensured (along with stronger conditions) by, e.g., placing certain restrictions on the rules of a natural deduction system. In particular, a Fitch-style natural deduction system can be adapted to accommodate relevance by introducing tags at the end of each line of an inference, indicating the premises relevant to its conclusion. Gentzen-style sequent calculi can be modified by removing the weakening rules that allow for the introduction of arbitrary formulae on the right or left side of the sequents.
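The sharing requirement on atomic formulae can be illustrated with a small check. The following Python sketch is purely illustrative (the string representation of formulae and the function names are arbitrary choices, not part of any standard presentation): it tests whether premises and conclusion share at least one propositional letter, which is a necessary but not a sufficient condition for relevance.

    import re

    def atoms(formula: str) -> set[str]:
        """Collect the propositional letters (p, q, r, ...) occurring in a formula string."""
        return set(re.findall(r"[a-z]\w*", formula))

    def shares_atom(premises: list[str], conclusion: str) -> bool:
        """True if some atomic formula occurs both in a premise and in the conclusion."""
        premise_atoms = set().union(*[atoms(p) for p in premises])
        return bool(premise_atoms & atoms(conclusion))

    # Explosion fails the test: the premises share no letter with the conclusion.
    print(shares_atom(["p", "~p"], "q"))      # False
    # Modus ponens passes the (merely necessary) test.
    print(shares_atom(["p", "p -> q"], "q"))  # True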

A notable feature of relevance logics is that they are paraconsistent logics: the existence of a contradiction will not necessarily cause an "explosion." This follows from the fact that a conditional with a contradictory antecedent that does not share any propositional or predicate letters with the consequent cannot be true (or derivable). For example, (p ∧ ¬p) → q is not a theorem of relevance logics, since the antecedent and consequent share no propositional letter.

History

Relevance logic was proposed in 1928 by Soviet philosopher Ivan E. Orlov (1886 – circa 1936) in his strictly mathematical paper "The Logic of Compatibility of Propositions" published in Matematicheskii Sbornik. The basic idea of relevant implication appears in medieval logic, and some pioneering work was done by Ackermann, [3] Moh, [4] and Church [5] in the 1950s. Drawing on them, Nuel Belnap and Alan Ross Anderson (with others) wrote the magnum opus of the subject, Entailment: The Logic of Relevance and Necessity in the 1970s (the second volume being published in the nineties). They focused on both systems of entailment and systems of relevance, where implications of the former kinds are supposed to be both relevant and necessary.

Axioms

The early developments in relevance logic focused on the stronger systems. The development of the Routley–Meyer semantics brought out a range of weaker logics. The weakest of these logics is the relevance logic B. It is axiomatized with the following axioms and rules.
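In a standard presentation of B (exact choices of axiom schemes vary slightly between sources), the axioms are:

  • A → A
  • (A ∧ B) → A
  • (A ∧ B) → B
  • ((A → B) ∧ (A → C)) → (A → (B ∧ C))
  • A → (A ∨ B)
  • B → (A ∨ B)
  • ((A → C) ∧ (B → C)) → ((A ∨ B) → C)
  • (A ∧ (B ∨ C)) → ((A ∧ B) ∨ (A ∧ C))
  • ¬¬A → A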

The rules are the following.
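In the same standard presentation, the rules of B are:

  • Modus ponens: from A and A → B, infer B
  • Adjunction: from A and B, infer A ∧ B
  • Prefixing: from A → B, infer (C → A) → (C → B)
  • Suffixing: from A → B, infer (B → C) → (A → C)
  • Contraposition: from A → B, infer ¬B → ¬A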

Stronger logics can be obtained by adding any of the following axioms.
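Typical candidate axioms are those named in the Routley–Meyer frame-condition table below; in their usual formulations they include:

  • Pseudo-modus ponens: (A ∧ (A → B)) → B
  • Prefixing: (A → B) → ((C → A) → (C → B))
  • Suffixing: (A → B) → ((B → C) → (A → C))
  • Contraction: (A → (A → B)) → (A → B)
  • Conjunctive syllogism: ((A → B) ∧ (B → C)) → (A → C)
  • Assertion: A → ((A → B) → B)
  • E axiom: ((A → A) → B) → B
  • Mingle: A → (A → A)
  • Reductio: (A → ¬A) → ¬A
  • Contraposition: (A → ¬B) → (B → ¬A)
  • Excluded middle: A ∨ ¬A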

Notable logics stronger than B that can be obtained by adding axioms to B in this way include TW, EW, RW, T, E, R, and RM.

Models

Routley–Meyer models

The standard model theory for relevance logics is the Routley–Meyer ternary-relational semantics developed by Richard Routley and Robert Meyer. A Routley–Meyer frame F for a propositional language is a quadruple (W, R, *, 0), where W is a non-empty set, R is a ternary relation on W, * is a function from W to W, and 0 ∈ W. A Routley–Meyer model M is a Routley–Meyer frame F together with a valuation, ⊨, that assigns a truth value to each atomic proposition relative to each point a ∈ W. There are some conditions placed on Routley–Meyer frames. Define a ≤ b as R0ab.
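In the standard presentation, the conditions on frames for the basic logic B are the following:

  • a ≤ a
  • If a ≤ b and b ≤ c, then a ≤ c
  • If d ≤ a and Rabc, then Rdbc
  • a** = a
  • If a ≤ b, then b* ≤ a*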

Write M, a ⊨ A and M, a ⊭ A to indicate that the formula A is true, or not true, respectively, at point a in M. One final condition on Routley–Meyer models is the hereditariness condition: if M, a ⊨ p and a ≤ b, then M, b ⊨ p, for every atomic proposition p.

By an inductive argument, hereditariness can be shown to extend to complex formulas, using the truth conditions below.

The truth conditions for complex formulas are as follows.
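  • M, a ⊨ A ∧ B iff M, a ⊨ A and M, a ⊨ B
  • M, a ⊨ A ∨ B iff M, a ⊨ A or M, a ⊨ B
  • M, a ⊨ A → B iff, for all b, c in W such that Rabc, if M, b ⊨ A then M, c ⊨ B
  • M, a ⊨ ¬A iff M, a* ⊭ A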

A formula A holds in a model M just in case M, 0 ⊨ A. A formula A holds on a frame F iff A holds in every model based on F. A formula is valid in a class of frames iff it holds on every frame in that class. The class of all Routley–Meyer frames satisfying the above conditions validates the relevance logic B. One can obtain Routley–Meyer frames for other relevance logics by placing appropriate restrictions on R and on *. These conditions are easier to state using some standard definitions. Let Rabcd be defined as ∃x(Rabx ∧ Rxcd), and let Ra(bc)d be defined as ∃x(Rbcx ∧ Raxd). Some of the frame conditions and the axioms they validate are the following.

Name | Frame condition | Axiom
Pseudo-modus ponens
Prefixing
Suffixing
Contraction
Conjunctive syllogism
Assertion
E axiom
Mingle axiom or
Reductio
Contraposition
Excluded middle
Strict implication weakening
Weakening

The last two conditions validate forms of weakening that relevance logics were originally developed to avoid. They are included to show the flexibility of the Routley–Meyer models.
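The ternary-relation truth conditions can be made concrete with a small evaluator. The following Python sketch is purely illustrative (the representation of models and formulas is an arbitrary choice, not a standard one); it checks formulas at points of a one-point Routley–Meyer model.

    # A model: points W, ternary relation R (a set of triples), the Routley star
    # (a dict from points to points), the base point 0, and a valuation giving,
    # for each atomic letter, the set of points at which it is true.
    W = {0}
    R = {(0, 0, 0)}
    star = {0: 0}
    valuation = {"p": {0}}

    # Formulas are nested tuples: ("atom", "p"), ("and", A, B), ("or", A, B),
    # ("not", A), ("imp", A, B).
    def holds(a, formula):
        """True iff the formula is true at point a under the truth conditions above."""
        tag = formula[0]
        if tag == "atom":
            return a in valuation[formula[1]]
        if tag == "and":
            return holds(a, formula[1]) and holds(a, formula[2])
        if tag == "or":
            return holds(a, formula[1]) or holds(a, formula[2])
        if tag == "not":
            return not holds(star[a], formula[1])
        if tag == "imp":
            # a satisfies A -> B iff for all b, c with Rabc: if b satisfies A, then c satisfies B.
            return all(not holds(b, formula[1]) or holds(c, formula[2])
                       for (x, b, c) in R if x == a)
        raise ValueError("unknown connective: " + tag)

    # A formula holds in the model iff it is true at the base point 0.
    p = ("atom", "p")
    print(holds(0, ("imp", p, p)))  # True: p -> p holds in this model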

Operational models

Urquhart models

Operational models for negation-free fragments of relevance logics were developed by Alasdair Urquhart in his PhD thesis and in subsequent work. The intuitive idea behind the operational models is that points in a model are pieces of information, and combining information supporting a conditional with the information supporting its antecedent yields some information that supports the consequent. Since the operational models do not generally interpret negation, this section will consider only languages with a conditional, conjunction, and disjunction.

An operational frame F is a triple (K, ·, 0), where K is a non-empty set, 0 ∈ K, and · is a binary operation on K. Frames have conditions, some of which may be dropped to model different logics. The conditions Urquhart proposed to model the conditional of the relevance logic R are the following.
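In the usual presentation, these are the conditions making · an idempotent, commutative, and associative operation with 0 as identity:

  • x · x = x
  • (x · y) · z = x · (y · z)
  • x · y = y · x
  • 0 · x = x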

Under these conditions, the operational frame is a join-semilattice.

An operational model M is a frame F together with a valuation V that maps pairs of points and atomic propositions to truth values, T or F. V can be extended to a valuation on complex formulas as follows.

  • V(a, p) ∈ {T, F}, for atomic propositions p
  • V(a, A ∧ B) = T iff V(a, A) = T and V(a, B) = T
  • V(a, A ∨ B) = T iff V(a, A) = T or V(a, B) = T
  • V(a, A → B) = T iff, for all b in K, V(a · b, B) = T whenever V(b, A) = T

A formula A holds in a model M iff V(0, A) = T. A formula A is valid in a class of models C iff it holds in each model in C.

The conditional fragment of R is sound and complete with respect to the class of semilattice models. The logic with conjunction and disjunction, however, is properly stronger than the conditional, conjunction, and disjunction fragment of R: some formulas valid in the operational models are invalid in R. The logic generated by the operational models for R has a complete axiomatic proof system, due to Kit Fine and to Gerald Charlwood. Charlwood also provided a natural deduction system for the logic, which he proved equivalent to the axiomatic system. Charlwood showed that his natural deduction system is equivalent to a system provided by Dag Prawitz.

The operational semantics can be adapted to model the conditional of E by adding a non-empty set of worlds W and an accessibility relation on W to the frames. The accessibility relation is required to be reflexive and transitive, to capture the idea that E's conditional has an S4 necessity. The valuations then map triples of atomic propositions, points, and worlds to truth values. The truth condition for the conditional is changed to the following.

The operational semantics can be adapted to model the conditional of T by adding an ordering relation ≤ on the set of points K. The relation is required to obey the following conditions.

  • If and , then
  • If , then

The truth condition for the conditional is changed to the following.

There are two ways to model the contraction-less relevance logics TW and RW with the operational models. The first way is to drop the condition that x · x = x. The second way is to keep the semilattice conditions on frames and add a binary relation of disjointness to the frame. For these models, the truth condition for the conditional is changed to the following, with the addition of the ordering in the case of TW.

Humberstone models

Urquhart showed that the semilattice logic for R is properly stronger than the positive fragment of R. Lloyd Humberstone provided an enrichment of the operational models that permitted a different truth condition for disjunction. The resulting class of models generates exactly the positive fragment of R.

An operational frame F is a quadruple (K, ·, +, 0), where K is a non-empty set, 0 ∈ K, and · and + are binary operations on K. An ordering ≤ on K is defined in terms of these operations. The frame conditions are the following.

  1. , and

An operational model M is a frame F together with a valuation V that maps pairs of points and atomic propositions to truth values, T or F. V can be extended to a valuation on complex formulas as follows.

  • , for atomic propositions
  • and
  • and
  • or or ; and

A formula A holds in a model M iff V(0, A) = T. A formula A is valid in a class of models C iff it holds in each model in C.

The positive fragment of R is sound and complete with respect to the class of these models. Humberstone's semantics can be adapted to model different logics by dropping or adding frame conditions as follows.

System | Frame conditions
B | 1, 5-9, 14
TW | 1, 11, 12, 5-9, 14
EW | 1, 10, 11, 5-9, 14
RW | 1-3, 5-9
T | 1, 11, 12, 13, 5-9, 14
E | 1, 10, 11, 13, 5-9, 14
R | 1-9
RM | 1-3, 5-9, 15

Algebraic models

Some relevance logics can be given algebraic models, such as the logic R. The algebraic structures for R are de Morgan monoids, which are sextuples (D, ∧, ∨, ¬, ∘, e) where, in one standard presentation:
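  • (D, ∧, ∨, ¬) is a de Morgan algebra, i.e. a distributive lattice with an involution ¬ obeying the De Morgan laws;
  • (D, ∘, e) is a commutative monoid with identity e, so that x ∘ e = x;
  • the monoid operation is monotone with respect to the lattice order (where x ≤ y abbreviates x ∧ y = x), so that x ≤ y implies x ∘ z ≤ y ∘ z;
  • every element is square-increasing: x ≤ x ∘ x.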

The operation interpreting the conditional of R is defined as a → b = ¬(a ∘ ¬b). A de Morgan monoid is a residuated lattice, obeying the following residuation condition: a ∘ b ≤ c if and only if a ≤ b → c.

An interpretation v is a homomorphism from the propositional language to a de Morgan monoid such that
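  • v(p) ∈ D for all atomic propositions p
  • v(¬A) = ¬v(A)
  • v(A ∨ B) = v(A) ∨ v(B)
  • v(A ∧ B) = v(A) ∧ v(B)
  • v(A → B) = v(A) → v(B)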

Given a de Morgan monoid M and an interpretation v, one can say that a formula A holds on v just in case e ≤ v(A). A formula A is valid just in case it holds on all interpretations on all de Morgan monoids. The logic R is sound and complete with respect to de Morgan monoids.

See also

References

  1. Lewis, C. I. (1912). "Implication and the Algebra of Logic." Mind, 21(84): 522–531.
  2. Lewis, C. I. (1917). "The issues concerning material implication." Journal of Philosophy, Psychology, and Scientific Methods, 14: 350–356.
  3. Ackermann, W. (1956), "Begründung einer strengen Implikation", Journal of Symbolic Logic, 21 (2): 113–128, JSTOR 2268750.
  4. Moh, Shaw-kwei (1950), "The Deduction Theorems and Two New Logical Systems", Methodos, 2: 56–75.
  5. Church, A. (1951), "The Weak Theory of Implication", in Kontroliertes Denken: Untersuchungen zum Logikkalkül und zur Logik der Einzelwissenschaften (A. Menne, A. Wilhelmy and H. Angsil, eds.), Kommissions-Verlag Karl Alber, pp. 22–37.

Bibliography