Drinker paradox

The drinker paradox (also known as the drinker's theorem, the drinker's principle, or the drinking principle) is a theorem of classical predicate logic that can be stated as "There is someone in the pub such that, if he or she is drinking, then everyone in the pub is drinking." It was popularised by the mathematical logician Raymond Smullyan, who called it the "drinking principle" in his 1978 book What Is the Name of this Book? [1]

The apparently paradoxical nature of the statement comes from the way it is usually stated in natural language. It seems counterintuitive that there could be a person who is causing the others to drink, or that there could be a person who, all through the night, is always the last to drink. The first objection comes from confusing formal "if ... then" statements with causation (see Correlation does not imply causation, or Relevance logic for logics that demand relevant relationships between premise and consequent, unlike the classical logic assumed here). The formal statement of the theorem is timeless, which eliminates the second objection: the person for whom the statement holds true at one instant is not necessarily the same person for whom it holds true at any other instant.[citation needed]

The formal statement of the theorem is

$$\exists x \in P.\ [D(x) \rightarrow \forall y \in P.\ D(y)],$$

where D is an arbitrary predicate and P is an arbitrary nonempty set.
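
A direct rendering in the syntax of a proof assistant makes the quantifier structure explicit. The following is a minimal sketch in Lean 4 (the names DrinkerStatement, P, and D are chosen for this example and are not taken from any library); the theorem then asserts that this proposition holds whenever P is nonempty.

    -- The formal statement as a Lean 4 proposition: P stands for the people
    -- in the pub and D x for "x is drinking".
    def DrinkerStatement (P : Type) (D : P → Prop) : Prop :=
      ∃ x : P, D x → ∀ y : P, D y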

Proofs

The proof begins by recognizing that either everyone in the pub is drinking, or at least one person in the pub is not drinking. Consequently, there are two cases to consider: [1] [2]

  1. Suppose everyone is drinking. Then for any particular person, it cannot be wrong to say that if that particular person is drinking, then everyone in the pub is drinking, because everyone is in fact drinking. Note that "everyone" includes that particular person, so both the antecedent and the consequent of the conditional are true. [1] [2]
  2. Otherwise, at least one person is not drinking. For any such nondrinking person, the statement "if that particular person is drinking, then everyone in the pub is drinking" is formally true: its antecedent ("that particular person is drinking") is false, and therefore the statement is true due to the nature of material implication in formal logic, which states that "If P, then Q" is always true if P is false. [1] [2] (These kinds of statements are said to be vacuously true.)

A slightly more formal way of expressing the above is to say that, if everybody drinks, then anyone can be the witness for the validity of the theorem. And if someone does not drink, then that particular non-drinking individual can be the witness to the theorem's validity. [3]
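
The case analysis above can also be checked mechanically. Below is a minimal proof sketch in Lean 4 using Mathlib's tactics (the theorem name drinker and the variable names P and D are chosen for this example and are not from any particular library); the two branches of the by_cases step correspond exactly to the two cases above.

    import Mathlib.Tactic

    -- Sketch of the classical proof: split on whether everyone drinks.
    theorem drinker {P : Type} (hP : Nonempty P) (D : P → Prop) :
        ∃ x : P, D x → ∀ y : P, D y := by
      by_cases h : ∀ y : P, D y
      · -- Case 1: everyone is drinking, so any inhabitant of P is a witness.
        obtain ⟨x⟩ := hP
        exact ⟨x, fun _ => h⟩
      · -- Case 2: not everyone is drinking, so some x is not drinking; for
        -- that x the implication "D x → everyone drinks" is vacuously true.
        push_neg at h
        obtain ⟨x, hx⟩ := h
        exact ⟨x, fun hdx => absurd hdx hx⟩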

Explanation of paradoxicality

The paradox is ultimately based on the principle of formal logic that the statement $A \rightarrow B$ is true whenever A is false, i.e., any statement follows from a false statement [1] (ex falso quodlibet).

What is important to the paradox is that the conditional in classical (and intuitionistic) logic is the material conditional. It has the property that $A \rightarrow B$ is true only if B is true or if A is false (in classical logic, but not intuitionistic logic, this is also a sufficient condition).

So as it was applied here, the statement "if they are drinking, everyone is drinking" was taken to be correct in one case, if everyone was drinking, and in the other case, if they were not drinking—even though their drinking may not have had anything to do with anyone else's drinking.
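
The vacuous-truth step used in case 2 is itself a one-line fact about the material conditional. A minimal illustration in Lean 4, using only the core language (A and B are arbitrary propositions named for this example):

    -- If A is false, then "A → B" holds for any B: a hypothetical proof of A
    -- contradicts the assumption ¬A, and anything follows from a contradiction.
    example (A B : Prop) (h : ¬ A) : A → B :=
      fun ha => absurd ha h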

History and variations

Smullyan in his 1978 book attributes the naming of "The Drinking Principle" to his graduate students. [1] He also discusses variants, obtained by replacing D with other, more dramatic predicates.

As "Smullyan's ‘Drinkers’ principle" or just "Drinkers' principle" it appears in H.P. Barendregt's "The quest for correctness" (1996), accompanied by some machine proofs. [2] Since then it has made regular appearance as an example in publications about automated reasoning; it is sometimes used to contrast the expressiveness of proof assistants. [4]

Non-empty domain

In a setting where empty domains are allowed, the drinker paradox must be formulated as follows: [5]

A set P satisfies

$$\exists x \in P.\ [D(x) \rightarrow \forall y \in P.\ D(y)]$$

if and only if it is non-empty.

Or in words:

If and only if there is someone in the pub, there is someone in the pub such that, if they are drinking, then everyone in the pub is drinking.
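
Under the same assumptions as the sketches above (Lean 4 with Mathlib's tactics; the theorem name drinker_iff_nonempty is chosen for this example), the biconditional can be rendered and proved by combining the two-case argument with the observation that any witness of the existential already shows the domain is inhabited.

    import Mathlib.Tactic

    -- The empty-domain-aware version: the existential holds iff P is inhabited.
    theorem drinker_iff_nonempty {P : Type} (D : P → Prop) :
        (∃ x : P, D x → ∀ y : P, D y) ↔ Nonempty P := by
      constructor
      · -- A witness of the existential is, in particular, an element of P.
        rintro ⟨x, _⟩
        exact ⟨x⟩
      · -- Conversely, if P is inhabited the two-case drinker argument applies.
        rintro ⟨x₀⟩
        by_cases h : ∀ y : P, D y
        · exact ⟨x₀, fun _ => h⟩
        · push_neg at h
          obtain ⟨x, hx⟩ := h
          exact ⟨x, fun hdx => absurd hdx hx⟩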


References

  1. Raymond Smullyan (1978). What Is the Name of this Book? The Riddle of Dracula and Other Logical Puzzles. Prentice Hall. Chapter 14, "How to Prove Anything", (topic) 250: "The Drinking Principle", pp. 209–211. ISBN 0-13-955088-7.
  2. H.P. Barendregt (1996). "The quest for correctness". Images of SMC Research 1996 (PDF). Stichting Mathematisch Centrum. pp. 54–55. ISBN 978-90-6196-462-9. Archived from the original (PDF) on 2015-07-11. Retrieved 2012-10-27.
  3. Peter J. Cameron (1999). Sets, Logic and Categories. Springer. p. 91. ISBN 978-1-85233-056-9.
  4. Freek Wiedijk (2001). "Mizar Light for HOL Light". In Proceedings of the 14th International Conference on Theorem Proving in Higher Order Logics (TPHOLs '01), Richard J. Boulton and Paul B. Jackson (eds.). Springer-Verlag, London, UK, pp. 378–394.
  5. Martín Escardó; Paulo Oliva. "Searchable Sets, Dubuc-Penon Compactness, Omniscience Principles, and the Drinker Paradox" (PDF). Computability in Europe 2010: 2.