This page lists events in order of increasing probability, grouped by orders of magnitude. These probabilities were calculated given assumptions detailed in the relevant articles and references. For example, the probabilities of obtaining the different poker hands assume that the cards are dealt fairly.
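As an illustration of how such figures are derived, the poker probabilities in the table follow from counting hands under the fair-dealing assumption. A minimal sketch in Python (using the standard 5-card hand counts; `math.comb` does the binomial counting) reproduces several of the listed values:

```python
from math import comb

total_hands = comb(52, 5)  # 2,598,960 equally likely 5-card hands

# Standard counts of distinct 5-card hands of each type.
hand_counts = {
    "royal flush": 4,
    "straight flush (non-royal)": 36,
    "four of a kind": 13 * 48,
    "full house": 13 * comb(4, 3) * 12 * comb(4, 2),
    "flush (excluding straight flushes)": 4 * comb(13, 5) - 40,
    "straight (excluding flushes)": 10 * 4**5 - 40,
}

for name, count in hand_counts.items():
    print(f"{name}: {count / total_hands:.2e}")
# royal flush 1.54e-06, straight flush 1.39e-05, four of a kind 2.40e-04,
# full house 1.44e-03, flush 1.97e-03, straight 3.92e-03
```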
| Factor | SI prefix | Value | Item |
|---|---|---|---|
| 0 | | 0 | Almost never |
| 10−4.5×10^29 | | 10−4.5×10^29 | Probability of a human spontaneously teleporting 50 kilometres (31 miles) due to quantum effects [1] |
| 10−360,783 | | 1.0×10−360,783 | Probability of a monkey in front of a typewriter typing Hamlet on the first try, taking punctuation, capitalization and spacing into account [2] |
| 10−183,800 | | 1.0×10−183,800 | Rough first estimate of the probability of a monkey in front of a typewriter typing all the letters of Hamlet on the first try [3] |
| 10−68 | | 1.24×10−68 | Probability of shuffling a standard 52-card deck into one specific order (1 in 52! ≈ 8.07×10^67) [4] |
10−30 | Quecto- (q) | 1×10−30 | One in 1,000,000,000,000,000,000,000,000,000,000 |
| 10−28 | | 4.47×10−28 | Approximate probability of all four players in a game of bridge getting a complete suit [5] |
10−27 | Ronto- (r) | 1×10−27 | One in 1,000,000,000,000,000,000,000,000,000 |
10−24 | Yocto- (y) | 1×10−24 | One in 1,000,000,000,000,000,000,000,000 |
10−21 | Zepto- (z) | 1×10−21 | One in 1,000,000,000,000,000,000,000 |
| 10−19 | | 2.83×10−19 | Approximate probability of matching 20 numbers for 20 in a game of keno |
10−18 | Atto- (a) | 1×10−18 | One in 1,000,000,000,000,000,000 |
| 10−16 | | 2.74×10−16 | Probability of rolling snake eyes 10 times in a row on a pair of fair dice |
10−15 | Femto- (f) | 1×10−15 | One in 1,000,000,000,000,000 |
10−12 | Pico- (p) | 1×10−12 | One in 1,000,000,000,000 |
| 10−11 | | 2.52×10−11 | Approximate probability of one player in a game of bridge getting a complete suit |
| 10−10 | | 2.0×10−10 | Probability per second of a SATA hard disk failure during an MTBF test [6] |
| | | 5.25×10−10 | Probability of a given caesium-137 atom decaying in any given second [7] |
| | | 9.9×10−10 | Gaussian distribution: probability of a value being more than 6 standard deviations from the mean on a specific side [8] |
10−9 | Nano- (n) | 1×10−9 | One in 1,000,000,000 |
| | | 3.9×10−9 | Probability of an entry winning the jackpot in the Mega Millions multi-state lottery in the United States [9] |
| | | 5.707×10−9 | Probability of winning the Grand Prize (matching all 6 numbers) in the US Powerball lottery with one ticket in January 2014 |
| 10−8 | | 1.303×10−8 | Probability of winning the Grand Prize (matching all 6 numbers) in the Australian Powerball lottery with one ticket in March 2013 |
| | | 2.219×10−8 | Probability of winning the jackpot (matching the 6 main numbers from 59) in the UK National Lottery with one ticket since 10 October 2015 |
| | | 7.151×10−8 | Probability of winning the jackpot (matching the 6 main numbers from 49) in the UK National Lottery with one ticket until 10 October 2015 |
| 10−7 | | 1.17×10−7 | Death per aircraft journey [10] |
| | | 2.9×10−7 | Gaussian distribution: probability of a value being more than 5 standard deviations from the mean on a specific side [11] |
| | | 8.0×10−7 | Death per person per year by lightning strike in Germany (Europe) [12] |
10−6 | Micro- (μ) | 1×10−6 | Life-threatening adverse reaction from a measles vaccine [13] |
| | | 1.43×10−6 | Probability of the Yellowstone supervolcano erupting in a given year |
| | | 1.5×10−6 | Probability of being dealt a royal flush in poker |
| 10−5 | | 1.4×10−5 | Probability of being dealt a straight flush (other than a royal flush) in poker |
| | | 1.6×10−5 | Risk that the asteroid 2013 TV135, which is 450 meters wide, [14] will impact Earth in 2032 [15] |
| | | 3.2×10−5 | Gaussian distribution: probability of a value being more than 4 standard deviations from the mean on a specific side [16] |
| | | 8.43×10−5 | Probability per person per year of a deadly vehicle accident in Europe (not including the former Yugoslavia)[citation needed][when?] |
| 10−4 | | 2.4×10−4 | Probability of being dealt a four of a kind in poker |
10−3 | Milli- (m) | 1.3×10−3 | Gaussian distribution: probability of a value being more than 3 standard deviations from the mean on a specific side [17] |
| | | 1.4×10−3 | Probability of a human birth producing triplets or higher-order multiples [18] |
| | | 1.4×10−3 | Probability of being dealt a full house in poker |
| | | 1.9×10−3 | Probability of being dealt a flush in poker |
| | | 2.7×10−3 | Probability of a random day of the year being your birthday (for all birthdays besides Feb. 29) |
| | | 4×10−3 | Probability of being dealt a straight in poker |
10−2 | Centi- (c) | 1.8×10−2 | Probability of winning any prize in the UK National Lottery with one ticket in 2003 |
| | | 2.1×10−2 | Probability of being dealt a three of a kind in poker |
| | | 2.3×10−2 | Gaussian distribution: probability of a value being more than 2 standard deviations from the mean on a specific side [17] |
| | | 2.7×10−2 | Probability of winning any prize in the Powerball with one ticket in 2006 |
| | | 3.3×10−2 | Probability of a human giving birth to twins [19] |
| | | 4.8×10−2 | Probability of being dealt two pair in poker |
10−1 | Deci- (d) | 1.6×10−1 | Gaussian distribution: probability of a value being more than 1 standard deviation from the mean on a specific side [20] |
| | | 1.7×10−1 | Chance of rolling a '6' on a six-sided die |
| | | 4.2×10−1 | Probability of being dealt only one pair in poker |
| | | 5.0×10−1 | Chance of getting a 'head' in a coin toss. Physically slightly less than 0.5: approximately 4.9983×10−1 for a US nickel, accounting for a 1.67×10−4 (about 1 in 6,000) chance of the coin landing on its edge [21] |
| | | 5.0×10−1 | Probability of being dealt no pair in poker |
| 10^0 | | 1×10^0 | Certain |
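The lottery, keno and dice entries above are likewise obtained by counting equally likely outcomes. A short sketch in Python: the UK draw sizes (6 from 49, then 6 from 59) come from the table itself, while the Powerball and Mega Millions formats (5 of 59 plus 1 of 35, and 5 of 75 plus 1 of 15) are assumptions chosen to match the figures listed.

```python
from math import comb

# Probability of one ticket matching the winning combination, under the
# draw formats assumed for the dates given in the table.
print(1 / comb(49, 6))         # UK National Lottery until Oct 2015:  ~7.151e-08
print(1 / comb(59, 6))         # UK National Lottery from Oct 2015:   ~2.219e-08
print(1 / (comb(59, 5) * 35))  # US Powerball, Jan 2014 format:       ~5.707e-09
print(1 / (comb(75, 5) * 15))  # US Mega Millions (1 in 258,890,850): ~3.863e-09
print(1 / comb(80, 20))        # Keno, matching all 20 of 20 numbers: ~2.829e-19
print((1 / 36) ** 10)          # Snake eyes 10 times in a row:        ~2.736e-16
```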
Footnotes:
[6] Assuming an MTBF of 1.4 million hours.
[8] One-sided tail beyond z = 6: 9.866×10−10.
[9] Equivalent to 1 in 258,890,850.
[11] One-sided tail beyond z = 5: 2.867×10−7.
[16] One-sided tail beyond z = 4: 3.167×10−5.
[17] One-sided tail beyond z = 2: 0.02275; beyond z = 3: 0.00135.
[18] Triplet or higher-order birth rate: 137.0 per 100,000 live births.
[19] Twin birth rate: 33.2 per 1,000 live births.
[20] 15.87% of all instances fall more than 1 standard deviation below the mean.
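The Gaussian one-sided tail probabilities quoted in the table and footnotes can be checked with the complementary error function, since for a standard normal variable Z, P(Z > z) = ½·erfc(z/√2). A minimal check in Python:

```python
from math import erfc, sqrt

# One-sided tail probability of a standard normal distribution.
for z in range(1, 7):
    print(z, 0.5 * erfc(z / sqrt(2)))
# 1: 1.587e-01, 2: 2.275e-02, 3: 1.350e-03,
# 4: 3.167e-05, 5: 2.867e-07, 6: 9.866e-10
```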