Orders of magnitude (probability)

This page lists events in order of increasing probability, grouped by orders of magnitude. These probabilities were calculated given assumptions detailed in the relevant articles and references. For example, the probabilities of obtaining the different poker hands assume that the cards are dealt fairly.

List of orders of magnitude for probability
Factor | SI prefix | Value | Item
0 | | 0 | Almost never
10^(−4.5×10^29) | | 10^(−4.5×10^29) | Probability of a human spontaneously teleporting 50 kilometres (31 miles) due to quantum effects [1]
10^−360,783 | | 1.0×10^−360,783 | Probability of a monkey in front of a typewriter typing Hamlet on the first try, taking punctuation, capitalization and spacing into account [2]
10^−183,800 | | 1.0×10^−183,800 | Rough first estimate of the probability of a monkey in front of a typewriter typing all the letters of Hamlet on the first try [3]
10^−68 | | 1.24×10^−68 | Probability of shuffling a standard 52-card deck into any one specific order (1 in 52! ≈ 8.07×10^67) [4]
10^−30 | Quecto- (q) | 1×10^−30 | One in 1,000,000,000,000,000,000,000,000,000,000
10^−28 | | 4.47×10^−28 | Approximate probability of all four players in a game of bridge getting a complete suit [5]
10^−27 | Ronto- (r) | 1×10^−27 | One in 1,000,000,000,000,000,000,000,000,000
10^−24 | Yocto- (y) | 1×10^−24 | One in 1,000,000,000,000,000,000,000,000
10^−21 | Zepto- (z) | 1×10^−21 | One in 1,000,000,000,000,000,000,000
10^−19 | | 2.83×10^−19 | Approximate probability of matching 20 numbers for 20 in a game of keno
10^−18 | Atto- (a) | 1×10^−18 | One in 1,000,000,000,000,000,000
10^−16 | | 2.74×10^−16 | Probability of rolling snake eyes 10 times in a row on a pair of fair dice
10^−15 | Femto- (f) | 1×10^−15 | One in 1,000,000,000,000,000
10^−12 | Pico- (p) | 1×10^−12 | One in 1,000,000,000,000
10^−11 | | 2.52×10^−11 | Approximate probability of one player in a game of bridge getting a complete suit
10^−10 | | 2.0×10^−10 | Probability per second of a SATA hard disk failure during an MTBF test [6]
10^−10 | | 5.25×10^−10 | Probability of a caesium-137 atom decaying in a given second [7]
10^−10 | | 9.9×10^−10 | Gaussian distribution: probability of a value being more than 6 standard deviations from the mean on a specific side [8]
10^−9 | Nano- (n) | 1×10^−9 | One in 1,000,000,000
10^−9 | | 3.9×10^−9 | Probability of an entry winning the jackpot in the Mega Millions multi-state lottery in the United States [9]
10^−9 | | 5.707×10^−9 | Probability of winning the Grand Prize (matching all 6 numbers) in the US Powerball lottery with one ticket in January 2014
10^−8 | | 1.303×10^−8 | Probability of winning the Grand Prize (matching all 6 numbers) in the Australian Powerball lottery with one ticket in March 2013
10^−8 | | 2.219×10^−8 | Probability of winning the Jackpot (matching the 6 main numbers from 59) in the UK National Lottery with one ticket since 10 October 2015
10^−8 | | 7.151×10^−8 | Probability of winning the Jackpot (matching the 6 main numbers from 49) in the UK National Lottery with one ticket until 10 October 2015
10^−7 | | 1.17×10^−7 | Probability of death per aircraft journey [10]
10^−7 | | 2.9×10^−7 | Gaussian distribution: probability of a value being more than 5 standard deviations from the mean on a specific side [11]
10^−7 | | 8.0×10^−7 | Probability of death per person per year by lightning strike in Germany [12]
10^−6 | Micro- (µ) | 1×10^−6 | Probability of a life-threatening adverse reaction from a measles vaccine [13]
10^−6 | | 1.43×10^−6 | Probability of the Yellowstone supervolcano erupting in a given year
10^−6 | | 1.5×10^−6 | Probability of being dealt a royal flush in poker
10^−5 | | 1.4×10^−5 | Probability of being dealt a straight flush (other than a royal flush) in poker
10^−5 | | 1.6×10^−5 | Risk that the asteroid 2013 TV135, which is 450 meters wide, [14] will impact Earth in 2032 [15]
10^−5 | | 3.2×10^−5 | Gaussian distribution: probability of a value being more than 4 standard deviations from the mean on a specific side [16]
10^−5 | | 8.43×10^−5 | Probability of a deadly vehicle accident per person in Europe each year (not including the former Yugoslavia)
10^−4 | | 2.4×10^−4 | Probability of being dealt a four of a kind in poker
10^−3 | Milli- (m) | 1.3×10^−3 | Gaussian distribution: probability of a value being more than 3 standard deviations from the mean on a specific side [17]
10^−3 | | 1.4×10^−3 | Probability of a human birth giving triplets or higher-order multiples [18]
10^−3 | | 1.4×10^−3 | Probability of being dealt a full house in poker
10^−3 | | 1.9×10^−3 | Probability of being dealt a flush in poker
10^−3 | | 2.7×10^−3 | Probability of a random day of the year being your birthday (for all birthdays besides Feb. 29)
10^−3 | | 4×10^−3 | Probability of being dealt a straight in poker
10^−2 | Centi- (c) | 1.8×10^−2 | Probability of winning any prize in the UK National Lottery with one ticket in 2003
10^−2 | | 2.1×10^−2 | Probability of being dealt a three of a kind in poker
10^−2 | | 2.3×10^−2 | Gaussian distribution: probability of a value being more than 2 standard deviations from the mean on a specific side [17]
10^−2 | | 2.7×10^−2 | Probability of winning any prize in the Powerball with one ticket in 2006
10^−2 | | 3.3×10^−2 | Probability of a human giving birth to twins [19]
10^−2 | | 4.8×10^−2 | Probability of being dealt a two pair in poker
10^−1 | Deci- (d) | 1.6×10^−1 | Gaussian distribution: probability of a value being more than 1 standard deviation from the mean on a specific side [20]
10^−1 | | 1.7×10^−1 | Chance of rolling a '6' on a six-sided die
10^−1 | | 4.2×10^−1 | Probability of being dealt only one pair in poker
10^−1 | | 5.0×10^−1 | Chance of getting a 'head' in a coin toss. Physically slightly less than 0.5: approximately 4.9983×10^−1 for a US nickel, accounting for a 1.67×10^−4 (1-in-6000) chance of the coin landing on its edge [21]
10^−1 | | 5.0×10^−1 | Probability of being dealt no pair in poker
10^0 | | 1×10^0 | Certain
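Several of the entries above can be re-derived in a few lines. The following sketch, using only the Python standard library, recomputes the card, dice and keno combinatorics and the one-sided Gaussian tail probabilities cited in the table:

```python
from math import comb, erfc, factorial, log10, sqrt

# 1/52!: probability of shuffling a deck into one specific order
deck = 1 / factorial(52)

# Exponent of the monkey-types-Hamlet estimate: 64 characters, 199,749 keystrokes
hamlet_exponent = 199749 * log10(64)

# Keno: matching all 20 numbers drawn from 80
keno = 1 / comb(80, 20)

# Snake eyes ten times in a row with a pair of fair dice
snake_eyes = (1 / 36) ** 10

# Royal flush: 4 of the comb(52, 5) possible five-card hands
royal_flush = 4 / comb(52, 5)

def gaussian_upper_tail(k: float) -> float:
    """One-sided probability of a value more than k standard deviations out."""
    return 0.5 * erfc(k / sqrt(2))

print(f"{deck:.3g} {hamlet_exponent:,.0f} {keno:.3g} "
      f"{snake_eyes:.3g} {royal_flush:.3g} {gaussian_upper_tail(6):.3g}")
```

These reproduce the table's 1.24×10^−68, ≈360,783, 2.83×10^−19, 2.74×10^−16, 1.5×10^−6 and 9.9×10^−10 respectively.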

Related Research Articles

Uncertainty principle: Foundational principle in quantum physics

The uncertainty principle, also known as Heisenberg's indeterminacy principle, is a fundamental concept in quantum mechanics. It states that there is a limit to the precision with which certain pairs of physical properties, such as position and momentum, can be simultaneously known. In other words, the more accurately one property is measured, the less accurately the other property can be known.

In probability theory, the central limit theorem (CLT) establishes that, in many situations, for independent and identically distributed random variables, the sampling distribution of the standardized sample mean tends towards the standard normal distribution even if the original variables themselves are not normally distributed.
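As an illustrative sketch (not part of the article), the tendency the CLT describes can be checked by simulation; the sample size and the exponential summand distribution below are arbitrary choices:

```python
import random
import statistics

random.seed(0)
n, trials = 50, 20_000   # arbitrary sample size and number of replications

# Exponential(1) is skewed, with mean 1 and variance 1, so the standardized
# sample mean is (m - 1) * sqrt(n); by the CLT it approaches N(0, 1).
means = [sum(random.expovariate(1.0) for _ in range(n)) / n for _ in range(trials)]
z = [(m - 1.0) * n ** 0.5 for m in means]

central = sum(abs(v) < 1.96 for v in z) / trials
print(statistics.fmean(z), statistics.stdev(z), central)
# mean near 0, standard deviation near 1, ~95% of mass within +/-1.96
```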

Multivariate normal distribution: Generalization of the one-dimensional normal distribution to higher dimensions

In probability theory and statistics, the multivariate normal distribution, multivariate Gaussian distribution, or joint normal distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions. One definition is that a random vector is said to be k-variate normally distributed if every linear combination of its k components has a univariate normal distribution. Its importance derives mainly from the multivariate central limit theorem. The multivariate normal distribution is often used to describe, at least approximately, any set of (possibly) correlated real-valued random variables each of which clusters around a mean value.

Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are putative properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."

Elo rating system: Method for calculating the relative skill levels of players in zero-sum games such as chess

The Elo rating system is a method for calculating the relative skill levels of players in zero-sum games such as chess. It is named after its creator Arpad Elo, a Hungarian-American physics professor.
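The core of the system fits in a few lines; this sketch uses the standard logistic expectation and an illustrative K-factor of 32:

```python
def expected_score(r_a: float, r_b: float) -> float:
    # Expected score (win probability plus half the draw probability) of A vs B
    return 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))

def update(r_a: float, r_b: float, score_a: float, k: float = 32.0):
    # score_a: 1 for an A win, 0.5 for a draw, 0 for a loss
    e_a = expected_score(r_a, r_b)
    return r_a + k * (score_a - e_a), r_b + k * ((1.0 - score_a) - (1.0 - e_a))

print(expected_score(1600, 1400))   # ~0.76: a 200-point gap favours A
print(update(1600, 1400, 0.0))      # an upset loss moves points from A to B
```

Note that the update is zero-sum: whatever one player gains, the other loses.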

Log-normal distribution: Probability distribution

In probability theory, a log-normal (or lognormal) distribution is a continuous probability distribution of a random variable whose logarithm is normally distributed. Thus, if the random variable X is log-normally distributed, then Y = ln(X) has a normal distribution. Equivalently, if Y has a normal distribution, then the exponential function of Y, X = exp(Y), has a log-normal distribution. A random variable which is log-normally distributed takes only positive real values. It is a convenient and useful model for measurements in exact and engineering sciences, as well as medicine, economics and other topics (e.g., energies, concentrations, lengths, prices of financial instruments, and other metrics).
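A quick simulation sketch (parameters arbitrary) of the defining relationship, X = exp(Y) with Y normal, and of the fact that the median of X is exp(μ):

```python
import math
import random

random.seed(1)
mu, sigma = 0.5, 0.8             # arbitrary parameters of the underlying normal
xs = sorted(math.exp(random.gauss(mu, sigma)) for _ in range(50_000))

assert xs[0] > 0                 # log-normal variates are strictly positive
sample_median = xs[len(xs) // 2]
print(sample_median, math.exp(mu))   # sample median ~ exp(mu) ~ 1.6487
```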

Random walk: Mathematical formalization of a path that consists of a succession of random steps

In mathematics, a random walk, sometimes known as a drunkard's walk, is a random process that describes a path that consists of a succession of random steps on some mathematical space.
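A minimal sketch on the integers: the walker's mean squared displacement after n steps is n, the classic diffusive scaling (step count and trial count below are arbitrary):

```python
import random

random.seed(2)

def walk(steps: int) -> int:
    # Simple symmetric random walk on the integers: +/-1 with equal probability
    pos = 0
    for _ in range(steps):
        pos += random.choice((-1, 1))
    return pos

n, trials = 400, 5_000
msd = sum(walk(n) ** 2 for _ in range(trials)) / trials
print(msd)   # close to n = 400: typical displacement grows like sqrt(n)
```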

The Ising model, named after the physicists Ernst Ising and Wilhelm Lenz, is a mathematical model of ferromagnetism in statistical mechanics. The model consists of discrete variables that represent magnetic dipole moments of atomic "spins" that can be in one of two states. The spins are arranged in a graph, usually a lattice, allowing each spin to interact with its neighbors. Neighboring spins that agree have a lower energy than those that disagree; the system tends to the lowest energy but heat disturbs this tendency, thus creating the possibility of different structural phases. The model allows the identification of phase transitions as a simplified model of reality. The two-dimensional square-lattice Ising model is one of the simplest statistical models to show a phase transition.
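A minimal Metropolis sketch of the two-dimensional square-lattice model (lattice size and sweep count are arbitrary; β = 1 is above the critical value βc ≈ 0.44, so the low-temperature ordered phase is visible):

```python
import math
import random

random.seed(6)
L, beta = 16, 1.0     # arbitrary lattice size; beta > beta_c (ordered phase)
spin = [[1] * L for _ in range(L)]

for _ in range(400 * L * L):                      # Metropolis single-spin flips
    i, j = random.randrange(L), random.randrange(L)
    nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
          + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
    dE = 2 * spin[i][j] * nb                      # energy cost of flipping (i, j)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        spin[i][j] = -spin[i][j]

m = abs(sum(map(sum, spin))) / (L * L)
print(m)   # magnetization near 1: neighbouring spins agree at low temperature
```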

In statistics, propagation of uncertainty is the effect of variables' uncertainties on the uncertainty of a function based on them. When the variables are the values of experimental measurements they have uncertainties due to measurement limitations which propagate due to the combination of variables in the function.

In probability theory, a compound Poisson distribution is the probability distribution of the sum of a number of independent identically-distributed random variables, where the number of terms to be added is itself a Poisson-distributed variable. The result can be either a continuous or a discrete distribution.
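A simulation sketch (the rate and the summand distribution are arbitrary choices): draw a Poisson count N, then sum N independent Uniform(0, 1) variables; the mean of the compound sum is λ·E[X]:

```python
import math
import random

random.seed(3)

def poisson(lam: float) -> int:
    # Knuth's multiplication method, adequate for small lambda
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam, trials = 4.0, 20_000
samples = [sum(random.random() for _ in range(poisson(lam))) for _ in range(trials)]
mean = sum(samples) / trials
print(mean)   # ~ lam * E[X] = 4 * 0.5 = 2.0
```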

Linear discriminant analysis (LDA), normal discriminant analysis (NDA), or discriminant function analysis is a generalization of Fisher's linear discriminant, a method used in statistics and other fields, to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier, or, more commonly, for dimensionality reduction before later classification.

Sample size determination is the act of choosing the number of observations or replicates to include in a statistical sample. The sample size is an important feature of any empirical study in which the goal is to make inferences about a population from a sample. In practice, the sample size used in a study is usually determined based on the cost, time, or convenience of collecting the data, and the need for it to offer sufficient statistical power. In complicated studies there may be several different sample sizes: for example, in a stratified survey there would be different sizes for each stratum. In a census, data is sought for an entire population, hence the intended sample size is equal to the population. In experimental design, where a study may be divided into different treatment groups, there may be different sample sizes for each group.
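For the common case of estimating a proportion, a standard back-of-the-envelope formula is n = z²p(1−p)/e²; the sketch below uses the conservative choice p = 0.5 by default:

```python
import math

def sample_size_proportion(margin: float, z: float = 1.96, p: float = 0.5) -> int:
    # n = z^2 * p * (1 - p) / margin^2, rounded up;
    # p = 0.5 maximizes p * (1 - p) and so gives a conservative sample size
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

print(sample_size_proportion(0.03))   # ~1068 respondents for +/-3% at 95% confidence
```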

Contact process (mathematics)

The contact process is a stochastic process used to model population growth on the set of sites of a graph, in which occupied sites become vacant at a constant rate, while vacant sites become occupied at a rate proportional to the number of occupied neighboring sites. Therefore, if we denote the proportionality constant by λ, each site remains occupied for a random time period that is exponentially distributed with parameter 1 and, during this period, places descendants at every vacant neighboring site at the event times of a Poisson process with parameter λ. All processes are independent of one another and of the random periods of time for which sites remain occupied. The contact process can also be interpreted as a model for the spread of an infection: thinking of particles as a bacterium spreading over individuals positioned at the sites of the graph, occupied sites correspond to infected individuals, whereas vacant sites correspond to healthy ones.

Lorenz system: System of ordinary differential equations with chaotic solutions

The Lorenz system is a system of ordinary differential equations first studied by the mathematician and meteorologist Edward Lorenz. It is notable for having chaotic solutions for certain parameter values and initial conditions. In particular, the Lorenz attractor is a set of chaotic solutions of the Lorenz system. The popular notion of the "butterfly effect" stems from the real-world implications of the Lorenz attractor: trajectories starting from nearly identical initial conditions diverge in phase space and never repeat, so the system's long-term behaviour cannot be predicted. This underscores that a chaotic system can be completely deterministic and yet inherently unpredictable over long periods of time; even the small flap of a butterfly's wings could, in principle, set the atmosphere on a vastly different trajectory, such as one leading to a hurricane. When plotted in phase space, the shape of the Lorenz attractor itself resembles a butterfly.
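This sensitivity can be seen with a crude explicit-Euler integration (the step size, duration and 1e-8 perturbation below are arbitrary illustrative choices; a serious integration would use a higher-order method):

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One explicit-Euler step of the Lorenz equations with the classic parameters
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)           # nearly identical initial conditions
for _ in range(8_000):               # integrate 40 time units
    a, b = lorenz_step(a), lorenz_step(b)

sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
print(sep)   # grew from 1e-8 to roughly the size of the attractor itself
```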

Generalized Pareto distribution: Family of probability distributions often used to model tails or extreme values

In statistics, the generalized Pareto distribution (GPD) is a family of continuous probability distributions. It is often used to model the tails of another distribution. It is specified by three parameters: location μ, scale σ, and shape ξ. Sometimes it is specified by only scale and shape, and sometimes only by its shape parameter. Some references give the shape parameter as κ = −ξ.

In probability theory and statistics, partial correlation measures the degree of association between two random variables, with the effect of a set of controlling random variables removed. When determining the numerical relationship between two variables of interest, using their correlation coefficient will give misleading results if there is another confounding variable that is numerically related to both variables of interest. This misleading information can be avoided by controlling for the confounding variable, which is done by computing the partial correlation coefficient. This is precisely the motivation for including other right-side variables in a multiple regression; but while multiple regression gives unbiased results for the effect size, it does not give a numerical value of a measure of the strength of the relationship between the two variables of interest.
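The residual-based computation can be sketched directly (the confounded data below is synthetic, generated purely for illustration): regress each variable of interest on the control, then correlate the residuals:

```python
import random
import statistics

random.seed(4)
n = 5_000
z = [random.gauss(0, 1) for _ in range(n)]      # confounder
x = [zi + random.gauss(0, 1) for zi in z]       # x and y share only z
y = [zi + random.gauss(0, 1) for zi in z]

def corr(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    da = sum((ai - ma) ** 2 for ai in a) ** 0.5
    db = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return num / (da * db)

def residuals(a, c):
    # Residuals of the simple OLS regression of a on c (with intercept)
    ma, mc = statistics.fmean(a), statistics.fmean(c)
    beta = (sum((ci - mc) * (ai - ma) for ai, ci in zip(a, c))
            / sum((ci - mc) ** 2 for ci in c))
    return [ai - ma - beta * (ci - mc) for ai, ci in zip(a, c)]

raw = corr(x, y)                                  # ~0.5, driven entirely by z
partial = corr(residuals(x, z), residuals(y, z))  # ~0 once z is controlled for
print(raw, partial)
```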

Truncated normal distribution: Type of probability distribution

In probability and statistics, the truncated normal distribution is the probability distribution derived from that of a normally distributed random variable by bounding the random variable from either below or above. The truncated normal distribution has wide applications in statistics and econometrics.
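A naive rejection-sampling sketch (the bounds below are arbitrary): draw from the parent normal and keep only values inside the truncation interval:

```python
import random
import statistics

random.seed(5)

def truncated_normal(lo: float, hi: float) -> float:
    # Naive rejection sampling from N(0, 1) restricted to [lo, hi];
    # fine for wide intervals, wasteful for narrow ones far out in the tail.
    while True:
        x = random.gauss(0.0, 1.0)
        if lo <= x <= hi:
            return x

xs = [truncated_normal(-1.0, 2.0) for _ in range(20_000)]
print(min(xs), max(xs), statistics.fmean(xs))
# all samples lie in [-1, 2]; the mean shifts right of 0 because more
# probability mass was cut from the left tail than from the right
```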

Marchenko–Pastur distribution: Distribution of singular values of large rectangular random matrices

In the mathematical theory of random matrices, the Marchenko–Pastur distribution, or Marchenko–Pastur law, describes the asymptotic behavior of singular values of large rectangular random matrices. The theorem is named after Soviet mathematicians Volodymyr Marchenko and Leonid Pastur who proved this result in 1967.

The KLM scheme or KLM protocol is an implementation of linear optical quantum computing (LOQC), developed in 2000 by Emanuel Knill, Raymond Laflamme, and Gerard J. Milburn. This protocol allows for the creation of universal quantum computers using solely linear optical tools. The KLM protocol uses linear optical elements, single-photon sources, and photon detectors as resources to construct a quantum computation scheme involving only ancilla resources, quantum teleportations, and error corrections.

This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.

References

  1. De Bianchi, Massimiliano Sassoli (2017-12-25), On the quantum "self-teleportation" probability of a human body, Brussels Free University, pp. 1–9, arXiv: 1712.08465 , Bibcode:2017arXiv171208465S
  2. There are around 130,000 letters and 199,749 total characters in Hamlet; 26 letters × 2 for capitalization plus 12 punctuation characters gives 64 possible keystrokes, and 64^199,749 ≈ 10^360,783.
  3. Kittel, Charles and Herbert Kroemer (1980). Thermal Physics (2nd ed.). W. H. Freeman Company. p. 53. ISBN   0-7167-1088-9.
  4. Robert Matthews. "What are the odds of shuffling a deck of cards into the right order?". Science Focus. Retrieved December 10, 2018.
  5. Bridge hands
  6. "WD VelociRaptor Drive Specification Sheet (PDF)" (PDF). 2012-04-13. Retrieved 2013-11-22. With 1.4 million hours MTBF
  7. "NIST Radionuclide Half-Life Measurements". Nist. 2011-12-09. Retrieved 2013-09-10.
  8. "6 sigma". Wolfram Alpha. Retrieved 2013-12-07. z>6 ... 9.866 x 10^-10
  9. "How to Play". Mega Millions. Retrieved 2013-11-20. 1 in 258,890,850
  10. "Informed Sources Archive". Alycidon Rail web site, archived 2014-05-22 at the Wayback Machine. Retrieved 29 April 2009. The site cites the source as an October 2000 article by Roger Ford in the magazine Modern Railways, based on a DETR survey.
  11. "5 sigma" . Retrieved 2013-12-07. z>5 ... 2.867 x 10^-7
  12. "Annual rates of lightning fatalities by country" (PDF). Ronald L. Holle, Holle Meteorology & Photography – 2008. 2010-08-12. p. 7. Retrieved 2013-09-10.
  13. Galindo, B. M.; Concepción, D.; Galindo, M. A.; Pérez, A.; Saiz, J. (2012). "Vaccine-related adverse events in Cuban children, 1999–2008". MEDICC Review. 14 (1): 38–43. doi: 10.37757/MR2012V14.N1.8 . PMID   22334111.
  14. "Earth Impact Risk Summary: 2013 TV135 (Nov 7 arc=25 days)". archive.is: JPL. 2013-11-07. Archived from the original on 2013-11-07. Retrieved 2013-11-08. (5.9e-09 = 1 in 169,492,000 chance)
  15. "No, the Earth (Almost Certainly) Won't Get Hit by an Asteroid in 2032". Slate.com. 2013-10-18. Retrieved 2013-10-18.
  16. "4 sigma" . Retrieved 2013-12-07. z>4 ... 3.167 x 10^-5
  17. "Extreme Normal Probabilities". Retrieved 2013-12-07. 2.0 0.02275 ... 3.0 0.00134
  18. "Multiple Births". CDC. Retrieved 2013-11-22. Triplet or higher order birth rate: 137.0 per 100,000 live births
  19. "Multiple Births". CDC. Retrieved 2013-11-22. Twin birth rate: 33.2 per 1,000 live births
  20. "Introduction to Procedures Involving Sample Means". Retrieved 2013-12-07. 15.87% of all instances fall to the left of −1 standard deviation
  21. Murray, Daniel B.; Teare, Scott W. (1993-10-01). "Probability of a tossed coin landing on edge". Physical Review E. 48 (4): 2547–2552. Bibcode:1993PhRvE..48.2547M. doi:10.1103/PhysRevE.48.2547. PMID   9960889.