
The **Three Prisoners problem** appeared in Martin Gardner's "Mathematical Games" column in *Scientific American* in 1959.[1][2] It is mathematically equivalent to the Monty Hall problem with car and goat replaced respectively with execution and freedom.

Three prisoners, A, B, and C, are in separate cells and sentenced to death. The governor has selected one of them at random to be pardoned. The warden knows which one is pardoned, but is not allowed to tell. Prisoner A begs the warden to let him know the identity of one of the two who are going to be executed. "If B is to be pardoned, give me C's name. If C is to be pardoned, give me B's name. And if I'm to be pardoned, secretly flip a coin to decide whether to name B or C."

The warden tells A that B is to be executed. Prisoner A is pleased because he believes that his probability of surviving has gone up from 1/3 to 1/2, as it is now between him and C. Prisoner A secretly tells C the news, who reasons that A's chance of being pardoned is unchanged at 1/3, but he is pleased because his own chance has gone up to 2/3. Which prisoner is correct?

The answer is that prisoner A did not gain any information about his own fate, since he already knew that the warden would give him the name of someone else. Prior to hearing from the warden, prisoner A estimates his chance of being pardoned as 1/3, the same as both B and C. When the warden says B will be executed, it is either because C will be pardoned (1/3 chance), or because A will be pardoned (1/3 chance) *and* the warden's coin flip came up B (1/2 chance), for an overall 1/2 × 1/3 = 1/6 chance that B was named because A will be pardoned. Hence, after hearing that B will be executed, A's chance of being pardoned is half that of C: knowing that B will not be pardoned, A's chance remains 1/3, while C's rises to 2/3.

The explanation above may be summarised in the following table. Since the warden was asked by A, he can only answer that B or C is to be executed (i.e., "not pardoned").

| Being pardoned | Warden: "not B" | Warden: "not C" | Sum |
|---|---|---|---|
| A | 1/6 | 1/6 | 1/3 |
| B | 0 | 1/3 | 1/3 |
| C | 1/3 | 0 | 1/3 |

As the warden has answered that B will not be pardoned, the solution comes from the column labeled "not B". The odds for A vs. C to be pardoned are therefore 1:2.
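The table above can be checked empirically. The following is a minimal Monte Carlo sketch (function name and trial count are illustrative) that simulates the governor's random pardon and the warden's constrained answer, then conditions on the trials where the warden names B:

```python
import random

def simulate(trials=100_000, rng=random):
    """Estimate P(A pardoned | warden names B) and P(C pardoned | warden names B)."""
    says_b = a_wins = c_wins = 0
    for _ in range(trials):
        pardoned = rng.choice("ABC")      # governor picks one prisoner at random
        if pardoned == "A":
            named = rng.choice("BC")      # warden's secret coin flip
        elif pardoned == "B":
            named = "C"                   # warden may not name the pardoned prisoner
        else:
            named = "B"
        if named == "B":
            says_b += 1
            a_wins += (pardoned == "A")
            c_wins += (pardoned == "C")
    return a_wins / says_b, c_wins / says_b

p_a, p_c = simulate()
print(p_a, p_c)   # roughly 0.333 and 0.667
```

Running this repeatedly, the estimates cluster around 1/3 for A and 2/3 for C, matching the "not B" column of the table.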

Call $A$, $B$, and $C$ the events that the corresponding prisoner will be pardoned, and $b$ the event that the warden tells A that prisoner B is to be executed. Then, using Bayes' theorem, the posterior probability of A being pardoned is:

$$P(A \mid b) = \frac{P(b \mid A)\,P(A)}{P(b \mid A)\,P(A) + P(b \mid B)\,P(B) + P(b \mid C)\,P(C)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{3}}{\tfrac{1}{2} \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3}} = \frac{1}{3}.$$

The probability of C being pardoned, on the other hand, is:

$$P(C \mid b) = \frac{P(b \mid C)\,P(C)}{P(b)} = \frac{1 \cdot \tfrac{1}{3}}{\tfrac{1}{2}} = \frac{2}{3}.$$

The crucial difference making A and C unequal is that $P(b \mid A) = \tfrac{1}{2}$ but $P(b \mid C) = 1$. If A will be pardoned, the warden can tell A that either B or C is to be executed, and hence $P(b \mid A) = \tfrac{1}{2}$; whereas if C will be pardoned, the warden can only tell A that B is executed, so $P(b \mid C) = 1$.
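The Bayes computation above can be carried out exactly with rational arithmetic. This is a small sketch using Python's `fractions` module; the dictionary names are illustrative:

```python
from fractions import Fraction

# Prior: the governor pardons each prisoner with probability 1/3.
prior = {p: Fraction(1, 3) for p in "ABC"}

# Likelihood of the warden naming B, given who is pardoned.
likelihood_b = {
    "A": Fraction(1, 2),  # coin flip between B and C
    "B": Fraction(0),     # the warden may not name the pardoned prisoner
    "C": Fraction(1),     # B is the only name he can give
}

# Bayes' theorem: P(pardoned = p | warden names B).
evidence = sum(likelihood_b[p] * prior[p] for p in "ABC")   # P(b) = 1/2
posterior = {p: likelihood_b[p] * prior[p] / evidence for p in "ABC"}

print(posterior)  # A: 1/3, B: 0, C: 2/3
```

The asymmetry in `likelihood_b` between A (1/2) and C (1) is exactly what drives the 1:2 posterior odds.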

Prisoner A only has a 1/3 chance of pardon. Knowing whether "B" or "C" will be executed does not change his chance. After he hears B will be executed, Prisoner A realizes that if he will not get the pardon himself it must only be going to C. That means there is a 2/3 chance for C to get a pardon. This is comparable to the Monty Hall problem.

The following scenarios may arise:

- A is pardoned and the warden mentions B to be executed: 1/3 × 1/2 = 1/6 of the cases
- A is pardoned and the warden mentions C to be executed: 1/3 × 1/2 = 1/6 of the cases
- B is pardoned and the warden mentions C to be executed: 1/3 of the cases
- C is pardoned and the warden mentions B to be executed: 1/3 of the cases

With the stipulation that the warden will choose randomly, in the 1/3 of the time that A is to be pardoned, there is a 1/2 chance he will say B and 1/2 chance he will say C. This means that taken overall, 1/6 of the time (1/3 [that A is pardoned] × 1/2 [that warden says B]), the warden will say B because A will be pardoned, and 1/6 of the time (1/3 [that A is pardoned] × 1/2 [that warden says C]) he will say C because A is being pardoned. This adds up to the total of 1/3 of the time (1/6 + 1/6) A is being pardoned, which is accurate.

It is now clear that if the warden answers B to A (cases 1 and 4, which together occur 1/2 of the time), then 1/3 of the time C is pardoned and A will still be executed (case 4), and only 1/6 of the time A is pardoned (case 1). Hence C's chances are (1/3)/(1/2) = 2/3 and A's are (1/6)/(1/2) = 1/3.
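The case enumeration above can be written out directly. The following sketch encodes the four scenarios as tuples and conditions on the warden naming B (variable names are illustrative):

```python
from fractions import Fraction

# The four scenarios listed above: (pardoned, name given by warden, probability).
scenarios = [
    ("A", "B", Fraction(1, 6)),
    ("A", "C", Fraction(1, 6)),
    ("B", "C", Fraction(1, 3)),
    ("C", "B", Fraction(1, 3)),
]

# Condition on the warden actually naming B (cases 1 and 4).
says_b = [(pardoned, pr) for pardoned, named, pr in scenarios if named == "B"]
total = sum(pr for _, pr in says_b)                          # P(warden says B) = 1/2
conditional = {pardoned: pr / total for pardoned, pr in says_b}

print(conditional)  # {'A': 1/3, 'C': 2/3}
```

Dividing each matching case's probability by the total 1/2 reproduces the (1/6)/(1/2) = 1/3 and (1/3)/(1/2) = 2/3 arithmetic in the text.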

The key to this problem is that the warden *may not* reveal the name of a prisoner who *will* be pardoned. Eliminating this requirement demonstrates the original problem in another way. The only change in this example is that prisoner A asks the warden to *reveal the fate* of one of the other prisoners (not specifying one that will be executed). In this case, the warden flips a coin and chooses one of B and C to reveal the fate of. The cases are as follows:

- A pardoned, warden says: B executed (1/6)
- A pardoned, warden says: C executed (1/6)
- B pardoned, warden says: B pardoned (1/6)
- B pardoned, warden says: C executed (1/6)
- C pardoned, warden says: B executed (1/6)
- C pardoned, warden says: C pardoned (1/6)

Each scenario has a 1/6 probability. The original Three Prisoners problem can be seen in this light: The warden in that problem still has these six cases, each with a 1/6 probability of occurring. However, the warden in that case *may not* reveal the fate of a pardoned prisoner. Therefore, in the 1/6 of the time that case 3 occurs, since saying B is not an option, the warden says C instead (making it the same as case 4). Similarly, in case 6, the warden must say B instead of C (the same as case 5). That leaves cases 4 and 5 with a 1/3 probability of occurring and leaves us with the same probability as above.
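For contrast, the unrestricted variant can be enumerated the same way. A minimal sketch (case encoding is illustrative) conditioning on the warden announcing "B executed":

```python
from fractions import Fraction

# Variant: the warden flips a coin and reveals the FATE of B or C,
# even when the chosen prisoner is the pardoned one. Six cases, 1/6 each.
cases = [
    ("A", "B executed"), ("A", "C executed"),
    ("B", "B pardoned"), ("B", "C executed"),
    ("C", "B executed"), ("C", "C pardoned"),
]

# Condition on hearing "B executed" (cases 1 and 5, equally likely).
matching = [pardoned for pardoned, statement in cases if statement == "B executed"]
p_a = Fraction(matching.count("A"), len(matching))

print(p_a)  # 1/2
```

In this variant, hearing "B executed" genuinely raises A's chance to 1/2, because the announcement is no longer biased by the rule forbidding the warden from naming the pardoned prisoner.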

The common answer of 1/2 neglects to take into account that the warden may have tossed a coin before he gave his answer. The warden may have answered "B will be executed" either because C is to be pardoned (and B is the only name he can give), or because A is to be pardoned and his coin toss came up B. These two events are not equally probable.

Judea Pearl (1988) used a variant of this example to demonstrate that belief updates must depend not merely on the facts observed but also on the experiment (i.e., query) that led to those facts.^{ [3] }

- Monty Hall problem
- Boy or Girl paradox
- Principle of restricted choice, an application in the card game bridge
- Prisoner's dilemma, a game theory problem
- Sleeping Beauty problem
- Two envelopes problem

- ↑ Gardner, Martin (October 1959). "Mathematical Games: Problems involving questions of probability and ambiguity". *Scientific American*. **201** (4): 174–182. doi:10.1038/scientificamerican1059-174.
- ↑ Gardner, Martin (1959). "Mathematical Games: How three modern mathematicians disproved a celebrated conjecture of Leonhard Euler". *Scientific American*. **201** (5): 188. doi:10.1038/scientificamerican1159-181.
- ↑ Pearl, J. (1988). *Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference* (First ed.). San Mateo, CA: Morgan Kaufmann.



This page is based on this Wikipedia article

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
