
The following is a timeline of **probability** and **statistics**.

- 8th century – Al-Khalil, an Arab mathematician studying cryptology, wrote the *Book of Cryptographic Messages*. The work has been lost, but based on the reports of later authors, it contained the first use of permutations and combinations to list all possible Arabic words with and without vowels.[1]
- 9th century – Al-Kindi was the first to use frequency analysis to decipher encrypted messages and developed the first code-breaking algorithm. He wrote a book entitled *Manuscript on Deciphering Cryptographic Messages*, containing detailed discussions on statistics and cryptanalysis.[2][3][4] Al-Kindi also made the earliest known use of statistical inference.[1]
- 13th century – An important contribution of Ibn Adlan was on sample size for use of frequency analysis.[1]
- 13th century – The first known calculation of the probability of throwing three dice is published in the Latin poem *De vetula*.
- 1560s (published 1663) – Cardano's *Liber de ludo aleae* attempts to calculate probabilities of dice throws. He demonstrates the efficacy of defining odds as the ratio of favourable to unfavourable outcomes (which implies that the probability of an event is given by the ratio of favourable outcomes to the total number of possible outcomes[5]).
- 1577 – Bartolomé de Medina defends probabilism, the view that in ethics one may follow a probable opinion even if the opposite is more probable.

- 1654 – Blaise Pascal and Pierre de Fermat create the mathematical theory of probability.
- 1657 – Christiaan Huygens's *De ratiociniis in ludo aleae* is the first book on mathematical probability.
- 1662 – John Graunt's *Natural and Political Observations Made upon the Bills of Mortality* makes inferences from statistical data on deaths in London.
- 1666 – In *Le Journal des Sçavans* xxxi, 2 August 1666 (359–370 (=364)) appears a review of the third edition (1665) of John Graunt's *Observations on the Bills of Mortality*. The review summarizes 'plusieurs reflexions curieuses' (several curious reflections), the second of which concerns Graunt's data on life expectancy. This review is used by Nicolaus Bernoulli in his *De Usu Artis Conjectandi in Jure* (1709).
- 1669 – Christiaan Huygens and his brother Lodewijk discuss Graunt's mortality table (Graunt 1662, p. 62) between August and December of that year in letters #1755
- 1693 – Edmond Halley prepares the first mortality tables statistically relating death rate to age.

- 1710 – John Arbuthnot argues that the constancy of the ratio of male to female births is a sign of divine providence.
- 1713 – Posthumous publication of Jacob Bernoulli's *Ars Conjectandi*, containing the first derivation of a law of large numbers.
- 1724 – Abraham de Moivre studies mortality statistics and the foundation of the theory of annuities in *Annuities upon Lives*.
- 1733 – De Moivre introduces the normal distribution to approximate the binomial distribution in probability.
- 1739 – David Hume's *Treatise of Human Nature* argues that inductive reasoning is unjustified.
- 1761 – Thomas Bayes proves Bayes' theorem.
- 1786 – William Playfair's *Commercial and Political Atlas* introduces graphs and bar charts of data.

- 1801 – Carl Friedrich Gauss predicts the orbit of Ceres using a line of best fit.
- 1805 – Adrien-Marie Legendre introduces the method of least squares for fitting a curve to a given set of observations.
- 1814 – Pierre-Simon Laplace's *Essai philosophique sur les probabilités* defends a definition of probabilities in terms of equally possible cases, introduces generating functions and Laplace transforms, uses conjugate priors for exponential families, and proves an early version of the Bernstein–von Mises theorem on the asymptotic irrelevance of prior distributions to the limiting posterior distribution and the role of the Fisher information in asymptotically normal posterior modes.
- 1835 – Adolphe Quetelet's *Treatise on Man* introduces social science statistics and the concept of the "average man".
- 1866 – John Venn's *Logic of Chance* defends the frequency interpretation of probability.
- 1877–1883 – Charles Sanders Peirce outlines frequentist statistics, emphasizing the use of objective randomization in experiments and in sampling. Peirce also invents an optimally designed experiment for regression.
- 1880 – Thorvald N. Thiele gives a mathematical analysis of Brownian motion, introduces the likelihood function, and invents cumulants.
- 1888 – Francis Galton introduces the concept of correlation.
- 1900 – Louis Bachelier analyzes stock price movements as a stochastic process.

- 1908 – Student's t-distribution for the mean of small samples is published in English (following earlier derivations in German).
- 1913 – Michel Plancherel states fundamental results in ergodic theory.
- 1920 – The central limit theorem is formally stated in its modern form.
- 1921 – John Maynard Keynes' *Treatise on Probability* defends a logical interpretation of probability. Sewall Wright develops path analysis.[6]
- 1928 – L. H. C. Tippett and Ronald Fisher introduce extreme value theory.
- 1933 – Andrey Nikolaevich Kolmogorov publishes his book *Basic Notions of the Calculus of Probability* (*Grundbegriffe der Wahrscheinlichkeitsrechnung*), which contains an axiomatization of probability based on measure theory.
- 1935 – Fisher's *Design of Experiments* (1st ed.).
- 1937 – Jerzy Neyman introduces the concept of the confidence interval in statistical testing.
- 1941 – Due to World War II, research on detection theory begins, leading to the receiver operating characteristic.
- 1946 – Cox's theorem derives the axioms of probability from simple logical assumptions.
- 1948 – Claude Shannon's *Mathematical Theory of Communication* defines the capacity of communication channels in terms of probabilities.
- 1953 – Nicholas Metropolis introduces the idea of thermodynamic simulated annealing methods.

**Bayesian probability** is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.

**Frequentist probability** or **frequentism** is an interpretation of probability; it defines an event's probability as the limit of its relative frequency in many trials. Probabilities can be found by a repeatable objective process. The continued use of frequentist methods in scientific inference, however, has been called into question.
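The limit-of-relative-frequency idea can be illustrated with a short simulation: as the number of trials grows, the observed proportion of an event settles near its probability. A minimal Python sketch, using a fair coin as the illustrative event (the function name and seed are choices made here, not from the article):

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def relative_frequency(trials: int) -> float:
    """Proportion of 'heads' in a given number of simulated fair-coin tosses."""
    heads = sum(random.random() < 0.5 for _ in range(trials))
    return heads / trials

# The relative frequency drifts toward the probability 0.5 as trials grow.
for n in (10, 1_000, 100_000):
    print(n, relative_frequency(n))
```

With only 10 tosses the proportion can be far from 0.5; with 100,000 it is typically within about one percentage point, which is the repeatable objective process the frequentist definition appeals to.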

**Probability** is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty. The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes are both equally probable; the probability of 'heads' equals the probability of 'tails'; and since no other outcomes are possible, the probability of either 'heads' or 'tails' is 1/2.

**Statistics** is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data. In applying statistics to a scientific, industrial, or social problem, it is conventional to begin with a statistical population or a statistical model to be studied. Populations can be diverse groups of people or objects such as "all people living in a country" or "every atom composing a crystal". Statistics deals with every aspect of data, including the planning of data collection in terms of the design of surveys and experiments.

**Statistical inference** is the process of using data analysis to infer properties of an underlying distribution of probability. **Inferential statistical analysis** infers properties of a population, for example by **testing hypotheses** and deriving estimates. It is assumed that the observed data set is sampled from a larger population.
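A toy illustration of inferring a population property from a sample: estimate a mean and attach a rough interval to it. This stdlib-only sketch uses made-up data, and the normal-approximation interval is a simplification (a t-interval would be more appropriate for so small a sample):

```python
import math
import statistics

# Assumed sample, treated as drawn from a larger population.
sample = [4.9, 5.1, 5.0, 4.8, 5.3, 5.2, 4.7, 5.0]

mean = statistics.mean(sample)
# Standard error of the mean: sample standard deviation over sqrt(n).
se = statistics.stdev(sample) / math.sqrt(len(sample))

# Rough 95% interval via the normal approximation (z = 1.96).
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"estimated mean {mean:.2f}, approximate 95% CI ({low:.2f}, {high:.2f})")
```

The point estimate and interval are statements about the underlying population, not just the eight observed numbers, which is what distinguishes inference from description.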

In cryptography, a **Caesar cipher**, also known as **Caesar's cipher**, the **shift cipher**, **Caesar's code**, or **Caesar shift**, is one of the simplest and most widely known encryption techniques. It is a type of substitution cipher in which each letter in the plaintext is replaced by a letter some fixed number of positions down the alphabet. For example, with a left shift of 3, D would be replaced by A, E would become B, and so on. The method is named after Julius Caesar, who used it in his private correspondence.
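The shift described above is simple enough to express directly in code. A minimal Python sketch (the function name is illustrative; a negative `shift` gives the left shift of the example):

```python
def caesar_shift(text: str, shift: int) -> str:
    """Replace each letter by the one `shift` places along the alphabet, wrapping."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces, digits, and punctuation unchanged
    return ''.join(result)

# With a left shift of 3, D is replaced by A and E becomes B:
print(caesar_shift("DE", -3))  # prints "AB"
# Decryption is simply the opposite shift:
print(caesar_shift("AB", 3))   # prints "DE"
```

Because there are only 25 distinct non-trivial shifts, the cipher falls immediately to the frequency analysis al-Kindi describes, or even to exhaustive trial.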

**Bayesian inference** is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
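A single Bayesian update can be shown numerically. The sketch below applies Bayes' theorem to a hypothetical diagnostic test; the prevalence and error rates are illustrative assumptions, not figures from the article:

```python
def bayes_update(prior: float, likelihood: float, likelihood_not: float) -> float:
    """Posterior P(H|E) by Bayes' theorem:
    P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H))."""
    evidence = likelihood * prior + likelihood_not * (1 - prior)
    return likelihood * prior / evidence

# Assumed numbers: 1% prevalence, the test detects 99% of true cases,
# and it gives a false positive 5% of the time.
posterior = bayes_update(prior=0.01, likelihood=0.99, likelihood_not=0.05)
print(round(posterior, 3))  # about 0.167 after one positive test

# Updating is sequential: today's posterior is tomorrow's prior.
posterior2 = bayes_update(prior=posterior, likelihood=0.99, likelihood_not=0.05)
print(round(posterior2, 3))
```

The second call shows the "dynamic analysis of a sequence of data" mentioned above: each new piece of evidence is folded in by reusing the previous posterior as the prior.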

**Abraham de Moivre** FRS was a French mathematician known for de Moivre's formula, a formula that links complex numbers and trigonometry, and for his work on the normal distribution and probability theory.

*The Doctrine of Chances* was the first textbook on probability theory, written by 18th-century French mathematician Abraham de Moivre and first published in 1718. De Moivre wrote in English because he resided in England at the time, having fled France to escape the persecution of Huguenots. The book's title came to be synonymous with probability theory itself.

**Abū Yūsuf Yaʻqūb ibn ʼIsḥāq aṣ-Ṣabbāḥ al-Kindī** was an Arab Muslim polymath active as a philosopher, mathematician, physician, and music theorist. Al-Kindi was the first of the Islamic peripatetic philosophers, and is hailed as the "father of Arab philosophy".

**Bayesian statistics** is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a *degree of belief* in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation that views probability as the limit of the relative frequency of an event after many trials.

The **classical definition** or **interpretation of probability** is identified with the works of Jacob Bernoulli and Pierre-Simon Laplace. As stated in Laplace's *Théorie analytique des probabilités*, the probability of an event is the ratio of the number of cases favourable to it to the number of all cases possible, when all cases are equally possible.

Statistics, in the modern sense of the word, began evolving in the 18th century in response to the novel needs of industrializing sovereign states.

**Ars Conjectandi** is a book on combinatorics and mathematical probability written by Jacob Bernoulli and published in 1713, eight years after his death, by his nephew Nicolaus Bernoulli. The seminal work consolidated, apart from many combinatorial topics, many central ideas in probability theory, such as the very first version of the law of large numbers; indeed, it is widely regarded as the founding work of that subject. It also addressed problems that today are classified in the twelvefold way, and it has consequently been dubbed an important historical landmark not only in probability but in all of combinatorics by many historians of mathematics. The work had a large impact on both contemporary and later mathematicians, such as Abraham de Moivre.

This is a timeline of pure and applied mathematics history. It is divided here into three stages, corresponding to stages in the development of mathematical notation: a "rhetorical" stage in which calculations are described purely by words, a "syncopated" stage in which quantities and common algebraic operations are beginning to be represented by symbolic abbreviations, and finally a "symbolic" stage, in which comprehensive notational systems for formulas are the norm.

Probability has a dual aspect: on the one hand the likelihood of hypotheses given the evidence for them, and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older, appearing for example in the law of evidence, while the mathematical treatment of dice began with the work of Cardano, Pascal, Fermat and Christiaan Huygens in the 16th and 17th centuries.

*Essay d'analyse sur les jeux de hazard* is a book on combinatorics and mathematical probability written by Pierre Remond de Montmort and published in 1708. The work applied ideas from combinatorics and probability to analyse various games of chance popular at the time. It was mainly influenced by Christiaan Huygens' treatise *De ratiociniis in ludo aleae*.

1. Broemeling, Lyle D. (1 November 2011). "An Account of Early Statistical Inference in Arab Cryptology". *The American Statistician*. **65** (4): 255–257. doi:10.1198/tas.2011.10191.
2. Singh, Simon (2000). *The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography* (1st Anchor Books ed.). New York: Anchor Books. ISBN 0-385-49532-3.
3. Singh, Simon (2000). *The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography* (1st Anchor Books ed.). New York: Anchor Books. ISBN 978-0-385-49532-5.
4. Al-Kadi, Ibrahim A. (April 1992). "The origins of cryptology: The Arab contributions". *Cryptologia*. **16** (2): 97–126.
5. Gorroochurn, P. (2012). "Some laws and problems in classical probability and how Cardano anticipated them". *Chance* magazine.
6. Wright, Sewall (1921). "Correlation and causation". *Journal of Agricultural Research*. **20** (7): 557–585.

- Kees Verduin (2007), *A Short History of Probability and Statistics*
- John Aldrich (2008), *Figures from the History of Probability and Statistics*
- John Aldrich (2008), *Probability and Statistics on the Earliest Uses Pages*
- Michael Friendly and Daniel J. Denis (2008). "Milestones in the History of Thematic Cartography, Statistical Graphics, and Data Visualization: An illustrated chronology of innovations".

This page is based on this Wikipedia article

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
