Logical consequence

Logical consequence (also entailment) is a fundamental concept in logic which describes the relationship that holds between statements when one statement logically follows from one or more other statements. A valid logical argument is one in which the conclusion is entailed by the premises, because the conclusion is the consequence of the premises. The philosophical analysis of logical consequence involves the questions: In what sense does a conclusion follow from its premises? and What does it mean for a conclusion to be a consequence of premises? [1] All of philosophical logic is meant to provide accounts of the nature of logical consequence and the nature of logical truth. [2]

Logical consequence is necessary and formal; this is explicated by way of formal proof and models of interpretation. [1] A sentence is said to be a logical consequence of a set of sentences, for a given language, if and only if, using only logic (i.e., without regard to any personal interpretations of the sentences), the sentence must be true if every sentence in the set is true. [3]

Logicians make precise accounts of logical consequence regarding a given language L, either by constructing a deductive system for L or by giving a formal intended semantics for language L. The Polish logician Alfred Tarski identified three features of an adequate characterization of entailment: (1) the logical consequence relation relies on the logical form of the sentences; (2) the relation is a priori, i.e., it can be determined with or without regard to empirical evidence (sense experience); and (3) the logical consequence relation has a modal component. [3]

Formal accounts

The most widely prevailing view on how best to account for logical consequence is to appeal to formality. This is to say that whether statements follow from one another logically depends on the structure or logical form of the statements without regard to the contents of that form.

Syntactic accounts of logical consequence rely on schemes using inference rules. For instance, we can express the logical form of a valid argument as:

All X are Y
All Y are Z
Therefore, all X are Z.

This argument is formally valid, because every instance of arguments constructed using this scheme is valid.

This is in contrast to an argument like "Fred is Mike's brother's son. Therefore Fred is Mike's nephew." Since this argument depends on the meanings of the words "brother", "son", and "nephew", the statement "Fred is Mike's nephew" is a so-called material consequence of "Fred is Mike's brother's son", not a formal consequence. A formal consequence must be true in all cases; however, this is an incomplete definition of formal consequence, since even the argument "P is Q's brother's son, therefore P is Q's nephew" is valid in all cases, but is not a formal argument. [1]
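The point that validity turns on structure rather than content can be made concrete with a small sketch in Python (an illustrative reading, not drawn from the sources cited here): if "All X are Y" is taken to mean the set inclusion X ⊆ Y, then every substitution of contents for X, Y and Z yields a valid instance of the scheme above.

    # A minimal sketch, assuming "All X are Y" is read as set inclusion X <= Y.
    # The scheme "All X are Y; All Y are Z; therefore All X are Z" holds for
    # every choice of contents, which is what its formal validity amounts to.

    def scheme_holds(x: set, y: set, z: set) -> bool:
        """If the premises hold under this reading, so does the conclusion."""
        premises = x <= y and y <= z
        conclusion = x <= z
        return (not premises) or conclusion

    # Arbitrary contents -- the outcome does not depend on what the sets are about.
    frogs = {"kermit"}
    amphibians = {"kermit", "newt"}
    animals = {"kermit", "newt", "rex"}
    assert scheme_holds(frogs, amphibians, animals)
    assert all(scheme_holds(a, b, c)
               for a in (set(), {1}, {1, 2})
               for b in (set(), {1}, {1, 2})
               for c in (set(), {1}, {1, 2}))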

A priori property of logical consequence

If it is known that Q follows logically from P, then no information about the possible interpretations of P or Q will affect that knowledge. Our knowledge that Q is a logical consequence of P cannot be influenced by empirical knowledge. [1] Deductively valid arguments can be known to be so without recourse to experience, so they must be knowable a priori. [1] However, formality alone does not guarantee that logical consequence is not influenced by empirical knowledge. So the a priori property of logical consequence is considered to be independent of formality. [1]

Proofs and models

The two prevailing techniques for providing accounts of logical consequence involve expressing the concept in terms of proofs and via models. The study of the syntactic consequence (of a logic) is called (its) proof theory whereas the study of (its) semantic consequence is called (its) model theory. [4]

Syntactic consequence

A formula A is a syntactic consequence [5] [6] [7] [8] [9] within some formal system FS of a set Γ of formulas if there is a formal proof in FS of A from the set Γ. This is denoted Γ ⊢_FS A. The turnstile symbol ⊢ was originally introduced by Frege in 1879, but its current use only dates back to Rosser and Kleene (1934–1935). [9]

Syntactic consequence does not depend on any interpretation of the formal system. [10]
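As an illustration, the following Python sketch describes a toy formal system (an assumption of this example, not a standard proof calculus) whose only inference rule is modus ponens; Γ ⊢ A holds exactly when A can be generated from Γ by manipulating the strings themselves, without interpreting them.

    # A minimal sketch of syntactic consequence in a toy system whose single
    # inference rule is modus ponens: from "X" and "X -> Y", derive "Y".
    # Derivation is pure symbol manipulation; no meanings are consulted.

    def derivable(gamma: set, goal: str) -> bool:
        """Return True iff `goal` is derivable from the formulas in `gamma`."""
        derived = set(gamma)
        changed = True
        while changed:
            changed = False
            for formula in list(derived):
                if " -> " in formula:
                    antecedent, consequent = formula.split(" -> ", 1)
                    if antecedent in derived and consequent not in derived:
                        derived.add(consequent)
                        changed = True
        return goal in derived

    assert derivable({"P", "P -> Q"}, "Q")       # {P, P -> Q} |- Q
    assert not derivable({"P -> Q"}, "Q")        # the premise P is missing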

Semantic consequence

A formula A is a semantic consequence within some formal system FS of a set of statements Γ if and only if there is no model in which all members of Γ are true and A is false. [11] This is denoted Γ ⊨_FS A. Or, in other words, the set of the interpretations that make all members of Γ true is a subset of the set of the interpretations that make A true.
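For propositional logic this definition can be checked by brute force, since an interpretation is just an assignment of truth values to the atoms. The following Python sketch (illustrative only; the formula encoding is an assumption of the example) searches every valuation for a counter-model.

    # A minimal sketch: Gamma |= A iff no valuation makes every member of
    # Gamma true while making A false. Formulas are encoded as functions
    # from a valuation (dict of atom -> bool) to a truth value.
    from itertools import product

    def entails(gamma, conclusion, atoms):
        """Return True iff no valuation is a counter-model."""
        for values in product([True, False], repeat=len(atoms)):
            valuation = dict(zip(atoms, values))
            if all(premise(valuation) for premise in gamma) and not conclusion(valuation):
                return False  # counter-model found
        return True

    p = lambda v: v["P"]
    q = lambda v: v["Q"]
    p_implies_q = lambda v: (not v["P"]) or v["Q"]

    assert entails([p, p_implies_q], q, ["P", "Q"])     # {P, P -> Q} |= Q
    assert not entails([p_implies_q], q, ["P", "Q"])    # P -> Q alone does not entail Q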

Modal accounts of logical consequence are variations on the following basic idea:

Γ ⊨ A is true if and only if it is necessary that if all of the elements of Γ are true, then A is true.

Alternatively (and, most would say, equivalently):

Γ ⊨ A is true if and only if it is impossible for all of the elements of Γ to be true and A false.

Such accounts are called "modal" because they appeal to the modal notions of logical necessity and logical possibility. 'It is necessary that' is often expressed as a universal quantifier over possible worlds, so that the accounts above translate as:

Γ ⊨ A is true if and only if there is no possible world at which all of the elements of Γ are true and A is false (untrue).

Consider the modal account in terms of the following example argument:

All frogs are green.
Kermit is a frog.
Therefore, Kermit is green.

The conclusion is a logical consequence of the premises because we cannot imagine a possible world where (a) all frogs are green; (b) Kermit is a frog; and (c) Kermit is not green.
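The impossibility claim can be spelled out with a small enumeration in Python (a toy model restricted to the single individual Kermit, which is a simplifying assumption of the illustration): no combination of truth values satisfies (a) and (b) while making the conclusion false.

    # Each "world" settles whether Kermit is a frog and whether Kermit is green.
    # Over this one-individual domain, "all frogs are green" is read as
    # "if Kermit is a frog, then Kermit is green" (a simplifying assumption).
    from itertools import product

    counter_worlds = [
        (is_frog, is_green)
        for is_frog, is_green in product([True, False], repeat=2)
        if ((not is_frog) or is_green)   # (a) all frogs are green
        and is_frog                      # (b) Kermit is a frog
        and not is_green                 # (c) Kermit is not green
    ]

    assert counter_worlds == []  # no possible world realizes (a), (b) and (c) together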

Modal-formal accounts of logical consequence combine the modal and formal accounts above, yielding variations on the following basic idea:

Γ ⊨ A if and only if it is impossible for an argument with the same logical form as Γ / A to have true premises and a false conclusion.

Warrant-based accounts

The accounts considered above are all "truth-preservational", in that they all assume that the characteristic feature of a good inference is that it never allows one to move from true premises to an untrue conclusion. As an alternative, some have proposed "warrant-preservational" accounts, according to which the characteristic feature of a good inference is that it never allows one to move from justifiably assertible premises to a conclusion that is not justifiably assertible. This is (roughly) the account favored by intuitionists such as Michael Dummett.

Non-monotonic logical consequence

The accounts discussed above all yield monotonic consequence relations, i.e. ones such that if A is a consequence of Γ, then A is a consequence of any superset of Γ. It is also possible to specify non-monotonic consequence relations to capture the idea that, e.g., 'Tweety can fly' is a logical consequence of

{Birds can typically fly, Tweety is a bird}

but not of

{Birds can typically fly, Tweety is a bird, Tweety is a penguin}.
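A toy Python sketch of such a default rule (an illustration only, not a full default logic) shows how adding a premise can withdraw a conclusion:

    # A rough sketch of non-monotonic (default) reasoning: the default
    # "birds typically fly" is applied unless an exception is known.

    def tweety_can_fly(facts: set) -> bool:
        """Apply the default unless the penguin exception is present."""
        return "Tweety is a bird" in facts and "Tweety is a penguin" not in facts

    assert tweety_can_fly({"Birds can typically fly", "Tweety is a bird"})
    # Adding a premise withdraws the conclusion, so the relation is non-monotonic.
    assert not tweety_can_fly({"Birds can typically fly",
                               "Tweety is a bird",
                               "Tweety is a penguin"})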

Notes

  1. Beall, JC and Restall, Greg, "Logical Consequence", The Stanford Encyclopedia of Philosophy (Fall 2009 Edition), Edward N. Zalta (ed.).
  2. Quine, Willard Van Orman, Philosophy of Logic.
  3. McKeon, Matthew, "Logical Consequence", Internet Encyclopedia of Philosophy.
  4. Kosta Dosen (1996). "Logical consequence: a turn in style". In Maria Luisa Dalla Chiara; Kees Doets; Daniele Mundici; Johan van Benthem (eds.). Logic and Scientific Methods: Volume One of the Tenth International Congress of Logic, Methodology and Philosophy of Science, Florence, August 1995. Springer. p. 292. ISBN 978-0-7923-4383-7.
  5. Dummett, Michael (1993) Frege: Philosophy of Language, Harvard University Press, p. 82ff.
  6. Lear, Jonathan (1986) Aristotle and Logical Theory, Cambridge University Press, 136 pp.
  7. Creath, Richard, and Friedman, Michael (2007) The Cambridge Companion to Carnap, Cambridge University Press, 371 pp.
  8. FOLDOC: "syntactic consequence" Archived 2013-04-03 at the Wayback Machine
  9. S. C. Kleene, Introduction to Metamathematics (1952), Van Nostrand Publishing, p. 88.
  10. Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971, p. 75.
  11. Etchemendy, John, Logical consequence, The Cambridge Dictionary of Philosophy
