History of probability

Probability has a dual aspect: on the one hand the likelihood of hypotheses given the evidence for them, and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older in, for example, the law of evidence, while the mathematical treatment of dice began with the work of Cardano, Pascal, Fermat and Christiaan Huygens in the 16th and 17th centuries.

Probability deals with random experiments with a known distribution; statistics deals with inference from data about an unknown distribution.

Etymology

Probable and probability and their cognates in other modern languages derive from medieval learned Latin probabilis, a term found in Cicero and generally applied to an opinion to mean plausible or generally approved. [1] The form probability is from Old French probabilite (14th century) and directly from Latin probabilitatem (nominative probabilitas) "credibility, probability," from probabilis (see probable). The mathematical sense of the term is from 1718. In the 18th century, the term chance was also used in the mathematical sense of "probability" (and probability theory was called the Doctrine of Chances). This word is ultimately from Latin cadentia, i.e. "a fall, case". The English adjective likely is of Germanic origin, most likely from Old Norse likligr (Old English had geliclic with the same sense), originally meaning "having the appearance of being strong or able", "having the similar appearance or qualities", with the meaning "probably" recorded from the mid-15th century. The derived noun likelihood had a meaning of "similarity, resemblance" but took on a meaning of "probability" from the mid-15th century. The meaning "something likely to be true" is from the 1570s.

Origins

Ancient and medieval law of evidence developed a grading of degrees of proof, credibility, presumptions and half-proof to deal with the uncertainties of evidence in court. [2]

In Renaissance times, betting was discussed in terms of odds such as "ten to one" and maritime insurance premiums were estimated based on intuitive risks, but there was no theory on how to calculate such odds or premiums. [3]

The mathematical methods of probability arose first in the investigations of Gerolamo Cardano in the 1560s (not published until 100 years later), and then in the correspondence between Pierre de Fermat and Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of chance. Christiaan Huygens (1657) gave a comprehensive treatment of the subject. [4] [5]

In ancient times there were games played using astragali, or talus bones. [6] The pottery of ancient Greece provides evidence that the astragali were tossed into a circle drawn on the floor, much like playing marbles. In Egypt, excavators of tombs found a game they called "Hounds and Jackals", which closely resembles the modern game snakes and ladders. According to Pausanias, [7] Palamedes invented dice during the Trojan War, although their true origin is uncertain. The first dice game mentioned in literature of the Christian era was called hazard. Played with two or three dice, it was probably brought to Europe by knights returning from the Crusades. Dante Alighieri (1265–1321) mentions this game. A commentator on Dante considers the game further: with three dice the lowest total that can be thrown is three, an ace on every die, while a total of four can be made with a two on one die and aces on the other two. [8]

Cardano also thought about the sum of three dice. At face value there are the same number of combinations that sum to 9 as those that sum to 10: for 9, (6,2,1), (5,3,1), (5,2,2), (4,4,1), (4,3,2) and (3,3,3); for 10, (6,3,1), (6,2,2), (5,4,1), (5,3,2), (4,4,2) and (4,3,3). However, there are more ways of obtaining some of these combinations than others. For example, if we consider the order of results there are six ways to obtain (6,2,1): (1,2,6), (1,6,2), (2,1,6), (2,6,1), (6,1,2), (6,2,1), but there is only one way to obtain (3,3,3), where the first, second and third dice all roll 3. There are a total of 27 permutations that sum to 10 but only 25 that sum to 9. From this, Cardano found that the probability of throwing a 9 is less than that of throwing a 10. He also demonstrated the efficacy of defining odds as the ratio of favourable to unfavourable outcomes (which implies that the probability of an event is given by the ratio of favourable outcomes to the total number of possible outcomes). [9] [10]
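Cardano's counting argument is easy to verify by direct enumeration. The following Python sketch (a modern illustration added here, not part of the historical sources) lists all 216 ordered outcomes of three dice, counts those summing to 9 and to 10, and converts the counts to probabilities using the favourable-over-total ratio described above.

    from itertools import product

    # All 6 * 6 * 6 = 216 ordered outcomes of three dice.
    outcomes = list(product(range(1, 7), repeat=3))

    nines = sum(1 for roll in outcomes if sum(roll) == 9)   # 25 permutations
    tens = sum(1 for roll in outcomes if sum(roll) == 10)   # 27 permutations

    # Probability as the ratio of favourable outcomes to all possible outcomes.
    print(nines, tens)                    # 25 27
    print(nines / 216, tens / 216)        # about 0.116 versus 0.125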

Seventeenth century

Galileo also wrote about dice-throwing sometime between 1613 and 1623. Considering what is essentially the same problem as Cardano's, apparently without knowledge of Cardano's work, Galileo argued that certain totals are thrown more often because there are more ways of making them. [11]

The date which historians cite as the beginning of the development of modern probability theory is 1654, when two of the most well-known mathematicians of the time, Blaise Pascal and Pierre de Fermat, began a correspondence discussing the subject. The two initiated the communication because earlier that year a gambler from Paris named Antoine Gombaud had sent Pascal and other mathematicians several questions on the practical applications of some of these theories; in particular he posed the problem of points, concerning a theoretical two-player game in which a prize must be divided between the players because external circumstances have halted the game. The fruits of Pascal and Fermat's correspondence interested other mathematicians, including Christiaan Huygens, whose De ratiociniis in ludo aleae (Calculations in Games of Chance) appeared in 1657 as the final chapter of Van Schooten's Exercitationes Mathematicae. Pascal's results on the eponymous Pascal's triangle, an important combinatorial concept, were published posthumously in 1665; in his Traité du triangle arithmétique (Treatise on the Arithmetical Triangle) he referred to the triangle as the "arithmetic triangle". [12]

In 1662, the book La Logique ou l'Art de Penser was published anonymously in Paris. [13] The authors were presumably Antoine Arnauld and Pierre Nicole, two leading Jansenists who worked together with Blaise Pascal. Known by its Latin title Ars cogitandi, it was a successful logic text of its time. The Ars cogitandi consists of four books, the fourth of which deals with decision-making under uncertainty by drawing an analogy to gambling and explicitly introducing the concept of a quantified probability. [14] [15]

In the field of statistics and applied probability, John Graunt published Natural and Political Observations Made upon the Bills of Mortality, also in 1662, initiating the discipline of demography. This work, among other things, gave a statistical estimate of the population of London, produced the first life table, gave probabilities of survival for different age groups, examined the different causes of death, noted that the annual rates of suicide and accident are constant, and commented on the level and stability of the sex ratio. [16] The usefulness and interpretation of Graunt's tables were discussed in a series of correspondences between the brothers Lodewijk and Christiaan Huygens in 1667, in which they realized the difference between mean and median estimates, and Christiaan even interpolated Graunt's life table with a smooth curve, creating the first continuous probability distribution; but their correspondence was not published. Later, Johan de Witt, the then prime minister of the Dutch Republic, published similar material in his 1671 work Waerdye van Lyf-Renten (A Treatise on Life Annuities), which used statistical concepts to determine life expectancy for practical political purposes, a demonstration of the fact that this sampling branch of mathematics had significant pragmatic applications. [17] De Witt's work was not widely distributed beyond the Dutch Republic, perhaps owing to his fall from power and execution by a mob in 1672. Apart from their practical contributions, these two works also exposed a fundamental idea: that probability can be assigned to events that have no inherent physical symmetry, such as the chances of dying at a certain age, unlike, say, the rolling of a die or the flipping of a coin, simply by counting the frequency of occurrence. Thus, probability could be more than mere combinatorics. [15]

Eighteenth century

Jacob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's The Doctrine of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large numbers, which states that in a large number of trials the average of the outcomes is likely to be very close to the expected value; for example, in 1000 throws of a fair coin it is likely that there will be close to 500 heads (and the larger the number of throws, the closer to half-and-half the proportion is likely to be).
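Bernoulli's point can be illustrated with a short simulation (a modern sketch added for illustration, not anything from the period): as the number of simulated coin throws grows, the observed proportion of heads concentrates ever more tightly around one half.

    import random

    random.seed(0)  # fixed seed so the illustration is reproducible

    for n in (10, 100, 1000, 100000):
        heads = sum(random.randint(0, 1) for _ in range(n))
        # The proportion of heads tends to settle near 0.5 as n grows.
        print(n, heads, heads / n)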

Nineteenth century

The power of probabilistic methods in dealing with uncertainty was shown by Gauss's determination of the orbit of Ceres from a few observations. The theory of errors used the method of least squares to correct error-prone observations, especially in astronomy, based on the assumption of a normal distribution of errors to determine the most likely true value. In 1812, Laplace issued his Théorie analytique des probabilités in which he consolidated and laid down many fundamental results in probability and statistics such as the moment-generating function, method of least squares, inductive probability, and hypothesis testing.
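As a minimal modern sketch of the idea behind the theory of errors (an illustration added here, not Gauss's or Laplace's own computation): for repeated measurements of a single quantity with normally distributed errors, the least-squares estimate, i.e. the value minimizing the sum of squared residuals, is simply the arithmetic mean of the observations, which under the normal-error assumption is also the most likely true value.

    import random

    random.seed(1)
    true_value = 10.0

    # Simulated observations: the true value plus normally distributed error.
    observations = [true_value + random.gauss(0, 0.5) for _ in range(20)]

    # For a single unknown quantity, the least-squares estimate (the m that
    # minimizes sum((x - m)**2)) is the arithmetic mean of the observations.
    estimate = sum(observations) / len(observations)
    print(estimate)  # close to 10.0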

Towards the end of the nineteenth century, a major success of probabilistic explanation was the statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs, which explained properties of gases such as temperature in terms of the random motions of large numbers of particles.

The field of the history of probability itself was established by Isaac Todhunter's monumental A History of the Mathematical Theory of Probability from the Time of Pascal to that of Laplace (1865).

Twentieth century

Probability and statistics became closely connected through the work on hypothesis testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and psychological experiments and in clinical trials of drugs, as well as in economics and elsewhere. A hypothesis, for example that a drug is usually effective, gives rise to a probability distribution of what would be observed if the hypothesis is true. If observations approximately agree with the hypothesis, it is confirmed; if not, the hypothesis is rejected. [18]

The theory of stochastic processes broadened into such areas as Markov processes and Brownian motion, the random movement of tiny particles suspended in a fluid. That provided a model for the study of random fluctuations in stock markets, leading to the use of sophisticated probability models in mathematical finance, including such successes as the widely used Black–Scholes formula for the valuation of options. [19]

The twentieth century also saw long-running disputes on the interpretations of probability. In mid-century, frequentism was dominant, holding that probability means long-run relative frequency in a large number of trials. At the end of the century there was some revival of the Bayesian view, according to which the fundamental notion of probability is how well a proposition is supported by the evidence for it.

The mathematical treatment of probabilities, especially when there are infinitely many possible outcomes, was facilitated by Kolmogorov's axioms (1933).

References

  1. Franklin (2001), pp. 113, 126.
  2. Franklin (2001).
  3. Franklin (2001), pp. 278–288.
  4. Hacking (2006). For Cardano, see p. 54; for Fermat and Pascal, see pp. 59–61; for Huygens, see pp. 92–94
  5. Franklin (2001), pp. 296–316.
  6. David, F. N. (1969). Games, gods and gambling: the origins and history of probability and statistical ideas from the earliest times to the Newtonian era. London: Griffin. ISBN 978-0-85264-171-2.
  7. Pausanias (2007). Description of Greece. 1: Books I and II. The Loeb classical library (Repr. ed.). Cambridge, Mass.: Harvard Univ. Press. ISBN 978-0-674-99104-0.
  8. Franklin (2001), pp. 293–294.
  9. Gorrochum, P. (2012), "Some laws and problems in classical probability and how Cardano anticipated them", Chance magazine.
  10. Franklin (2001), pp. 296–300.
  11. Franklin (2001), p. 302.
  12. "Blaise Pascal", Encyclopædia Britannica Online, Encyclopædia Britannica Inc., 2008, retrieved 2008-05-23
  13. Shafer (1996).
  14. Collani (2006).
  15. Hacking (1971).
  16. Ian Sutherland (1963), "John Graunt: A Tercentenary Tribute", Journal of the Royal Statistical Society, Series A, 126 (4): 537–556, doi:10.2307/2982578, JSTOR 2982578.
  17. Brakel (1976), p. 123.
  18. Salsburg (2001).
  19. Bernstein (1996), Chapter 18.

Sources