The Doctrine of Chances

Front page of the 1st edition of the “Doctrine of Chances”.

The Doctrine of Chances was the first textbook on probability theory, written by 18th-century French mathematician Abraham de Moivre and first published in 1718. [1] De Moivre wrote in English because he resided in England at the time, having fled France to escape the persecution of Huguenots. The book's title came to be synonymous with probability theory, and accordingly the phrase was used in Thomas Bayes' famous posthumous paper An Essay towards solving a Problem in the Doctrine of Chances, wherein a version of Bayes' theorem was first introduced.

Editions

The full title of the first edition was The doctrine of chances: or, a method for calculating the probabilities of events in play; it was published in 1718, by W. Pearson, and ran for 175 pages. Published in 1738 by Woodfall and running for 258 pages, the second edition of de Moivre's book introduced the concept of normal distributions as approximations to binomial distributions. In effect de Moivre proved a special case of the central limit theorem. Sometimes his result is called the theorem of de Moivre–Laplace. A third edition was published posthumously in 1756 by A. Millar, and ran for 348 pages; additional material in this edition included an application of probability theory to actuarial science in the calculation of annuities. [1]

Related Research Articles

Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.

Frequentist probability

Frequentist probability or frequentism is an interpretation of probability; it defines an event's probability as the limit of its relative frequency in many trials. Probabilities can be found by a repeatable objective process. The continued use of frequentist methods in scientific inference, however, has been called into question.
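The "limit of relative frequency" definition can be illustrated with a short simulation (a minimal sketch; the fair coin, trial counts, and seed are illustrative choices, not part of the original text):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def relative_frequency(trials):
    """Fraction of 'heads' in the given number of fair-coin tosses."""
    heads = sum(1 for _ in range(trials) if random.random() < 0.5)
    return heads / trials

# The relative frequency drifts toward the probability 1/2
# as the number of trials grows.
for trials in (100, 10_000, 1_000_000):
    print(trials, relative_frequency(trials))
```

In frequentist terms, the probability 1/2 is precisely the limit this fraction approaches as the number of repetitions increases without bound.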

Probability

Probability is the branch of mathematics concerning events and numerical descriptions of how likely they are to occur. The probability of an event is a number between 0 and 1; the higher the probability, the more likely the event is to occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes are equally probable: the probability of 'heads' equals the probability of 'tails', and since no other outcomes are possible, the probability of either 'heads' or 'tails' is 1/2.

Statistical inference

Statistical inference is the process of using data analysis to infer properties of an underlying distribution of probability. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.

In probability theory and statistics, Bayes' theorem, named after Thomas Bayes, describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately by conditioning it relative to their age, rather than simply assuming that the individual is typical of the population as a whole.
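As a numerical illustration of the theorem (the rates below are invented for the example and do not come from the text), consider a diagnostic test: expanding P(B) by the law of total probability gives the posterior P(A|B).

```python
def bayes(p_b_given_a, p_a, p_b_given_not_a):
    """P(A|B) via Bayes' theorem, with P(B) expanded by total probability."""
    p_not_a = 1 - p_a
    p_b = p_b_given_a * p_a + p_b_given_not_a * p_not_a
    return p_b_given_a * p_a / p_b

# Illustrative numbers: 1% of a population has a condition; a test
# detects it 90% of the time and false-alarms 5% of the time.
posterior = bayes(p_b_given_a=0.90, p_a=0.01, p_b_given_not_a=0.05)
print(round(posterior, 3))  # probability of the condition given a positive test
```

Conditioning on the evidence raises the probability from the 1% base rate to about 15%, far less than the test's 90% sensitivity might suggest.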

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Fundamentally, Bayesian inference uses prior knowledge, in the form of a prior distribution in order to estimate posterior probabilities. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
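Sequential Bayesian updating can be sketched with the conjugate Beta–Bernoulli model (a standard textbook choice for illustration, not something the passage specifies): a Beta(a, b) prior on a coin's heads-probability becomes Beta(a+1, b) after a head and Beta(a, b+1) after a tail.

```python
def update_beta(a, b, observations):
    """Update a Beta(a, b) prior on a coin's heads-probability
    with a sequence of observations (1 = heads, 0 = tails)."""
    for x in observations:
        if x == 1:
            a += 1
        else:
            b += 1
    return a, b

a, b = 1, 1                    # uniform prior, Beta(1, 1)
a, b = update_beta(a, b, [1, 1, 0, 1])
posterior_mean = a / (a + b)   # point estimate after seeing the data
print(a, b, posterior_mean)
```

Each observation shifts the posterior; after three heads and one tail the posterior mean moves from 1/2 to 4/6, illustrating how evidence accumulates in a dynamic sequence of data.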

Abraham de Moivre (1667–1754)

Abraham de Moivre FRS was a French mathematician known for de Moivre's formula, a formula that links complex numbers and trigonometry, and for his work on the normal distribution and probability theory.

Thomas Bayes (c. 1701 – 1761)

Thomas Bayes was an English statistician, philosopher and Presbyterian minister who is known for formulating a specific case of the theorem that bears his name: Bayes' theorem. Bayes never published what would become his most famous accomplishment; his notes were edited and published posthumously by Richard Price.

This is a list of significant events that occurred in the year 1718 in science.

Thomas Simpson

Thomas Simpson FRS was a British mathematician and inventor known for the eponymous Simpson's rule to approximate definite integrals. The attribution, as often in mathematics, can be debated: this rule had been found 100 years earlier by Johannes Kepler, and in German it is called Keplersche Fassregel.
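The composite form of Simpson's rule can be sketched as follows (the test function and interval are illustrative choices):

```python
def simpson(f, a, b, n=100):
    """Approximate the integral of f over [a, b] with the composite
    Simpson's rule on n subintervals (n is forced to be even)."""
    if n % 2:
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        # interior points alternate weights 4, 2, 4, 2, ...
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

# The rule is exact for polynomials of degree <= 3:
# the integral of x^2 over [0, 1] is exactly 1/3.
print(simpson(lambda x: x * x, 0.0, 1.0))
```

The rule fits a parabola through each triple of points, which is why cubic and lower-degree integrands are reproduced exactly up to floating-point error.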

Statistics, in the modern sense of the word, began evolving in the 18th century in response to the novel needs of industrializing sovereign states.

Ars Conjectandi (1713)

Ars Conjectandi is a book on combinatorics and mathematical probability written by Jacob Bernoulli and published in 1713, eight years after his death, by his nephew Nicolaus Bernoulli. The seminal work consolidated, apart from many combinatorial topics, many central ideas in probability theory, such as the very first version of the law of large numbers; indeed, it is widely regarded as the founding work of that subject. It also addressed problems that today are classified in the twelvefold way; consequently, many historians of mathematics regard it as an important landmark not only in probability but in all of combinatorics. The work had a large impact on both contemporary and later mathematicians, Abraham de Moivre among them.

The following is a timeline of probability and statistics.

De Moivre–Laplace theorem

In probability theory, the de Moivre–Laplace theorem, which is a special case of the central limit theorem, states that the normal distribution may be used as an approximation to the binomial distribution under certain conditions. In particular, the theorem shows that the probability mass function of the random number of "successes" observed in a series of n independent Bernoulli trials, each having probability p of success, converges to the probability density function of the normal distribution with mean np and standard deviation √(np(1 − p)) as n grows large, assuming p is not 0 or 1.
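The quality of the approximation can be checked numerically by comparing the binomial mass function with the normal density at the same point (the values of n, p, and k below are illustrative):

```python
import math

def binom_pmf(n, k, p):
    """Probability of exactly k successes in n Bernoulli(p) trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution with mean mu and std dev sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 1000, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # mean np, std dev sqrt(np(1-p))

print(binom_pmf(n, 500, p))        # exact binomial probability at k = 500
print(normal_pdf(500, mu, sigma))  # de Moivre-Laplace approximation
```

For n = 1000 the two values already agree to several decimal places, which is the practical content of the theorem.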

Probability has a dual aspect: on the one hand the likelihood of hypotheses given the evidence for them, and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older, appearing for example in the law of evidence, while the mathematical treatment of dice began with the work of Cardano, Pascal, Fermat and Christiaan Huygens between the 16th and 17th centuries.

An Essay towards solving a Problem in the Doctrine of Chances is a work on the mathematical theory of probability by Thomas Bayes, published in 1763, two years after its author's death, and containing multiple amendments and additions due to his friend Richard Price. The title comes from the contemporary use of the phrase "doctrine of chances" to mean the theory of probability, which had been introduced via the title of a book by Abraham de Moivre. Contemporary reprints of the Essay carry a more specific and significant title: A Method of Calculating the Exact Probability of All Conclusions founded on Induction.

Ivo Schneider

Ivo Hans Schneider is a German mathematician and historian of mathematics and the natural sciences.

References

  1. Schneider, Ivo (2005), "Abraham De Moivre, The Doctrine of Chances (1718, 1738, 1756)", in Grattan-Guinness, I. (ed.), Landmark Writings in Western Mathematics 1640–1940, Amsterdam: Elsevier, pp. 105–120, ISBN 0-444-50871-6.
