Three prisoners problem

The three prisoners problem appeared in Martin Gardner's "Mathematical Games" column in Scientific American in 1959. [1] [2] It is mathematically equivalent to the Monty Hall problem with car and goat replaced respectively with freedom and execution. [3]

Problem

Three prisoners, A, B, and C, are in separate cells and sentenced to death. The governor has selected one of them at random to be pardoned. The warden knows which one is pardoned, but is not allowed to tell. Prisoner A begs the warden to let him know the identity of one of the two who are going to be executed. "If B is to be pardoned, give me C's name. If C is to be pardoned, give me B's name. And if I'm to be pardoned, secretly flip a coin to decide whether to name B or C."

The warden tells A that B is to be executed. Prisoner A is pleased because he believes that his probability of surviving has gone up from 1/3 to 1/2, as it is now between him and C. Prisoner A secretly tells the news to C, who reasons that A's chance of being pardoned is unchanged at 1/3 but is pleased because his own chance has gone up to 2/3. Which prisoner is correct?

Solution

The answer is that prisoner A did not gain any information about his own fate, since he already knew that the warden would give him the name of someone else. Prisoner A, prior to hearing from the warden, estimates his chance of being pardoned as 1/3, the same as both B and C. When the warden says B will be executed, it is either because C will be pardoned (1/3 chance), or because A will be pardoned (1/3 chance) and the coin the warden flipped to decide whether to name B or C came up B (1/2 chance; for an overall 1/2 × 1/3 = 1/6 chance that B was named because A will be pardoned). Hence, after hearing that B will be executed, the estimate of A's chance of being pardoned is half that of C. This means that A's chance of being pardoned, now knowing that B will not be, is still 1/3, while C's chance of being pardoned is 2/3.
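This reasoning can be checked by simulating the warden's protocol many times and conditioning on the runs in which he names B. The short Python sketch below is illustrative only (the function and variable names are not from any source); it estimates both conditional probabilities.

```python
import random

def simulate(trials=100_000, seed=0):
    """Estimate P(A pardoned | warden names B) and P(C pardoned | warden names B)."""
    rng = random.Random(seed)
    says_b = a_pardoned = c_pardoned = 0
    for _ in range(trials):
        pardoned = rng.choice("ABC")      # the governor pardons one prisoner at random
        if pardoned == "A":
            named = rng.choice("BC")      # the warden flips a coin between B and C
        elif pardoned == "B":
            named = "C"                   # the warden may not name the pardoned prisoner
        else:                             # C pardoned
            named = "B"
        if named == "B":                  # keep only the runs where the warden answers "B"
            says_b += 1
            a_pardoned += (pardoned == "A")
            c_pardoned += (pardoned == "C")
    return a_pardoned / says_b, c_pardoned / says_b

print(simulate())  # approximately (0.333, 0.667)
```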

Table

The explanation above may be summarised in the following table. Since the warden was asked by A, he can only answer that B or C is to be executed (that is, "not pardoned").

Being pardoned    Warden: "not B"    Warden: "not C"    Sum
A                 1/6                1/6                1/3
B                 0                  1/3                1/3
C                 1/3                0                  1/3

As the warden has answered that B will not be pardoned, the solution comes from the second column "not B". It appears that the odds for A vs. C to be pardoned are 1:2.

Mathematical formulation

Call $A_P$, $B_P$ and $C_P$ the events that the corresponding prisoner will be pardoned, and $b$ the event that the warden tells A that prisoner B is to be executed. Then, using Bayes' theorem, the posterior probability of A being pardoned is: [4]

$$P(A_P \mid b) = \frac{P(b \mid A_P)\,P(A_P)}{P(b \mid A_P)\,P(A_P) + P(b \mid B_P)\,P(B_P) + P(b \mid C_P)\,P(C_P)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{3}}{\tfrac{1}{2} \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3}} = \frac{1}{3}$$

The probability of C being pardoned, on the other hand, is:

$$P(C_P \mid b) = \frac{P(b \mid C_P)\,P(C_P)}{P(b \mid A_P)\,P(A_P) + P(b \mid B_P)\,P(B_P) + P(b \mid C_P)\,P(C_P)} = \frac{1 \cdot \tfrac{1}{3}}{\tfrac{1}{2} \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3}} = \frac{2}{3}$$

The crucial difference making A and C unequal is that $P(b \mid A_P) = \tfrac{1}{2}$ but $P(b \mid C_P) = 1$. If A will be pardoned, the warden can tell A that either B or C is to be executed, and hence $P(b \mid A_P) = \tfrac{1}{2}$; whereas if C will be pardoned, the warden can only tell A that B is to be executed, so $P(b \mid C_P) = 1$.
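The same posterior probabilities can be computed exactly by coding the likelihoods above directly. The following sketch, an illustration using Python's fractions module rather than anything from the cited sources, reproduces the values 1/3 and 2/3.

```python
from fractions import Fraction

prior = {p: Fraction(1, 3) for p in "ABC"}        # each prisoner equally likely to be pardoned
likelihood_b = {"A": Fraction(1, 2),              # P(warden names B | A pardoned): coin flip
                "B": Fraction(0),                 # the warden may not name the pardoned prisoner
                "C": Fraction(1)}                 # B is the only name the warden can give

evidence = sum(likelihood_b[p] * prior[p] for p in "ABC")              # P(b) = 1/2
posterior = {p: likelihood_b[p] * prior[p] / evidence for p in "ABC"}
print(posterior)  # {'A': Fraction(1, 3), 'B': Fraction(0, 1), 'C': Fraction(2, 3)}
```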

An intuitive explanation

Prisoner A has only a 1/3 chance of being pardoned. Since he already knows that at least one of B and C must be executed, learning which of them it is does not change his chance. After he hears that B will be executed, prisoner A realizes that if he does not get the pardon himself, it must go to C. That means there is a 2/3 chance for C to be pardoned. This is analogous to the Monty Hall problem.

Enumeration of possible cases

The following scenarios may arise:

  1. A is pardoned and the warden mentions B to be executed: 1/3 × 1/2 = 1/6 of the cases
  2. A is pardoned and the warden mentions C to be executed: 1/3 × 1/2 = 1/6 of the cases
  3. B is pardoned and the warden mentions C to be executed: 1/3 of the cases
  4. C is pardoned and the warden mentions B to be executed: 1/3 of the cases

With the stipulation that the warden will choose randomly, in the 1/3 of the time that A is to be pardoned there is a 1/2 chance he will say B and a 1/2 chance he will say C. This means that, taken overall, 1/6 of the time (1/3 [that A is pardoned] × 1/2 [that the warden says B]) the warden will say B because A will be pardoned, and 1/6 of the time (1/3 [that A is pardoned] × 1/2 [that the warden says C]) he will say C because A will be pardoned. These two cases add up to the total of 1/3 of the time (1/6 + 1/6) that A is pardoned, as required.

It is now clear that if the warden answers B to A (1/2 of all cases), then 1/3 of the time C is pardoned and A will still be executed (case 4), and only 1/6 of the time A is pardoned (case 1). Hence C's chances are (1/3)/(1/2) = 2/3 and A's are (1/6)/(1/2) = 1/3.
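The four scenarios and their weights can also be tabulated programmatically and conditioned on the warden naming B. The sketch below is illustrative; the case list simply mirrors the enumeration above, and it confirms the 1/3 versus 2/3 split.

```python
from fractions import Fraction

# (pardoned prisoner, name the warden gives to A, probability of the scenario)
cases = [
    ("A", "B", Fraction(1, 3) * Fraction(1, 2)),  # case 1
    ("A", "C", Fraction(1, 3) * Fraction(1, 2)),  # case 2
    ("B", "C", Fraction(1, 3)),                   # case 3
    ("C", "B", Fraction(1, 3)),                   # case 4
]

# Condition on the warden answering "B".
says_b = [(pardoned, prob) for pardoned, named, prob in cases if named == "B"]
total = sum(prob for _, prob in says_b)           # 1/2 of all cases
print({pardoned: prob / total for pardoned, prob in says_b})
# {'A': Fraction(1, 3), 'C': Fraction(2, 3)}
```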

The key to this problem is that the warden may not reveal the name of a prisoner who will be pardoned. If we eliminate this requirement, the original problem can be demonstrated in another way. The only change in this example is that prisoner A asks the warden to reveal the fate of one of the other prisoners (without specifying one that will be executed). In this case, the warden flips a coin to choose between B and C, and reveals the fate of the chosen prisoner. The cases are as follows:

  1. A pardoned, warden says: B executed (1/6)
  2. A pardoned, warden says: C executed (1/6)
  3. B pardoned, warden says: B pardoned (1/6)
  4. B pardoned, warden says: C executed (1/6)
  5. C pardoned, warden says: B executed (1/6)
  6. C pardoned, warden says: C pardoned (1/6)

Each scenario has a 1/6 probability. The original three prisoners problem can be seen in this light: the warden in that problem still has these six cases, each with a 1/6 probability of occurring. However, the warden in the original problem cannot reveal the fate of a pardoned prisoner. Therefore, in case 3, for example, since saying "B is pardoned" is not an option, the warden says "C is executed" instead (making it the same as case 4). That leaves cases 4 and 5 each with a 1/3 probability of occurring, and gives the same probabilities as before.
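A short sketch can make this collapsing of cases explicit: starting from the six equally likely cases of the unrestricted warden, the rule that the warden may not reveal the pardoned prisoner's fate is applied, and the resulting probabilities match the original problem. The helper below is purely illustrative.

```python
from fractions import Fraction
from collections import Counter

# Unrestricted warden: he flips a coin between B and C and reveals that prisoner's fate.
unrestricted = [("A", "B executed"), ("A", "C executed"),
                ("B", "B pardoned"), ("B", "C executed"),
                ("C", "B executed"), ("C", "C pardoned")]  # each case has probability 1/6

def restrict(pardoned, statement):
    """Apply the original rule: the warden may not reveal the pardoned prisoner's fate."""
    if statement == f"{pardoned} pardoned":
        other = "C" if pardoned == "B" else "B"   # the other prisoner, who is executed
        return f"{other} executed"
    return statement

collapsed = Counter()
for pardoned, statement in unrestricted:
    collapsed[(pardoned, restrict(pardoned, statement))] += Fraction(1, 6)

print(collapsed)
# ('B', 'C executed') and ('C', 'B executed') each end up with probability 1/3
```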

Why the paradox?

The tendency of people to provide the answer 1/2 is likely due to a tendency to ignore context that may seem unimportant. For example, how the question is posed to the warden can affect the answer. This can be shown by considering a modified case in which the prior probabilities of a pardon are unequal, say $P(A_P) = \tfrac{1}{4}$, $P(B_P) = \tfrac{1}{4}$ and $P(C_P) = \tfrac{1}{2}$, and everything else about the problem remains the same. [4] Using Bayes' theorem once again:

$$P(A_P \mid b) = \frac{P(b \mid A_P)\,P(A_P)}{P(b \mid A_P)\,P(A_P) + P(b \mid B_P)\,P(B_P) + P(b \mid C_P)\,P(C_P)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{4}}{\tfrac{1}{2} \cdot \tfrac{1}{4} + 0 \cdot \tfrac{1}{4} + 1 \cdot \tfrac{1}{2}} = \frac{1}{5}$$

However, if A simply asks whether B will be executed and the warden answers "yes", the probability that A is pardoned becomes:

$$P(A_P \mid \text{B executed}) = \frac{P(A_P)}{P(A_P) + P(C_P)} = \frac{\tfrac{1}{4}}{\tfrac{1}{4} + \tfrac{1}{2}} = \frac{1}{3}$$

[4]
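The two answers can be checked with a short calculation. The sketch below assumes the modified priors stated above and compares the two queries; it is an illustration, not code from the cited study.

```python
from fractions import Fraction

prior = {"A": Fraction(1, 4), "B": Fraction(1, 4), "C": Fraction(1, 2)}   # modified priors
names_b = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}       # P(warden names B | ...)

# Query 1: "name one of B or C who will be executed"; the warden names B.
evidence = sum(names_b[p] * prior[p] for p in prior)
print(names_b["A"] * prior["A"] / evidence)        # 1/5

# Query 2: "will B be executed?"; the warden answers "yes", so B is not pardoned.
print(prior["A"] / (prior["A"] + prior["C"]))      # 1/3
```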

Another implicit assumption is that A plans beforehand to ask the warden for this information. A case similar to the above arises if A does not plan to ask the warden anything and the warden simply informs him that B will be executed. [5]

Another likely overlooked assumption is that the warden has a probabilistic choice. Let us define $p$ as the conditional probability that the warden will name B given that both B and C will be executed (that is, given that A will be pardoned, the only case in which the warden has a choice of whom to name). The conditional probability of A being pardoned can then be expressed as: [6]

$$P(A_P \mid b) = \frac{P(b \mid A_P)\,P(A_P)}{P(b \mid A_P)\,P(A_P) + P(b \mid B_P)\,P(B_P) + P(b \mid C_P)\,P(C_P)} = \frac{p \cdot \tfrac{1}{3}}{p \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3}} = \frac{p}{p+1}$$

If we assume that $p = 1$, that is, if we do not take into account that the warden is making a probabilistic choice, then $P(A_P \mid b) = \tfrac{1}{2}$. However, the reality of the problem is that the warden is flipping a coin ($p = \tfrac{1}{2}$), so $P(A_P \mid b) = \tfrac{1}{3}$. [5]
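The effect of the warden's bias can be seen by treating $p$ as a parameter. The following sketch is illustrative (the function name is not from the cited sources); it evaluates the posterior for $p = 1$ and $p = \tfrac{1}{2}$.

```python
from fractions import Fraction

def posterior_a(p):
    """P(A pardoned | warden names B), where p is the probability that the warden
    names B when both B and C are to be executed (i.e. when A is pardoned)."""
    third = Fraction(1, 3)
    return (p * third) / (p * third + 0 * third + 1 * third)   # simplifies to p / (p + 1)

print(posterior_a(Fraction(1)))      # 1/2: warden assumed to always name B
print(posterior_a(Fraction(1, 2)))   # 1/3: warden flips a fair coin
```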

Judea Pearl (1988) used a variant of this example to demonstrate that belief updates must depend not merely on the facts observed but also on the experiment (i.e., query) that led to those facts. [7]

References

  1. Gardner, Martin (October 1959). "Mathematical Games: Problems involving questions of probability and ambiguity". Scientific American. 201 (4): 174–182. doi:10.1038/scientificamerican1059-174.
  2. Gardner, Martin (1959). "Mathematical Games: How three modern mathematicians disproved a celebrated conjecture of Leonhard Euler". Scientific American. 201 (5): 188. doi:10.1038/scientificamerican1159-181.
  3. Bailey, Herb (2000). "Monty Hall Uses a Mixed Strategy". Mathematics Magazine. 73 (2): 135–141. JSTOR 2691085.
  4. Shimojo, Shinsuke; Ichikawa, Shin'Ichi (August 1990). "Intuitive reasoning about probability: Theoretical and experimental analyses of the "problem of three prisoners"". Cognition. 36 (2): 205. doi:10.1016/0010-0277(89)90012-7. PMID 2752704. S2CID 45658299.
  5. Wechsler, Sergio; Esteves, L. G.; Simonis, A.; Peixoto, C. (February 2005). "Indifference, Neutrality and Informativeness: Generalizing the Three Prisoners Paradox". Synthese. 143 (3): 255–272. doi:10.1007/s11229-005-7016-1. JSTOR 20118537. S2CID 16773272. Retrieved 15 December 2021.
  6. Billingsley, Patrick (1995). Probability and Measure. Wiley Series in Probability and Mathematical Statistics (third edition of 1979 original). New York: John Wiley & Sons, Inc. Exercise 33.3, pp. 441 and 576. ISBN 0-471-00710-2. MR 1324786.
  7. Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (First ed.). San Mateo, CA: Morgan Kaufmann.

Further reading