Universal probability bound


A universal probability bound is a probabilistic threshold whose existence is asserted by William A. Dembski and is used by him in his works promoting intelligent design. It is defined as


A degree of improbability below which a specified event of that probability cannot reasonably be attributed to chance regardless of whatever probabilistic resources from the known universe are factored in. [1]

Dembski asserts that one can effectively estimate a positive value which is a universal probability bound. The existence of such a bound would imply that certain kinds of random events whose probability lies below this value can be assumed not to have occurred in the observable universe, given the resources available in its entire history. Contrapositively, Dembski uses the threshold to argue that the occurrence of certain events cannot be attributed to chance alone. The universal probability bound is then used to argue against evolution proceeding by chance. However, evolution is not driven by random events alone (such as genetic drift), but also by natural selection, which is not a random process.
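
The logic of "probabilistic resources" can be illustrated with a simple expected-value calculation (a minimal sketch, not Dembski's own formalism; the trial count N = 10^150 is taken from the estimate discussed below): if the universe affords at most N opportunities for an event of probability p to occur, the expected number of occurrences is N × p, which falls below 1 exactly when p < 1/N.

```python
from fractions import Fraction

# Minimal sketch of the "probabilistic resources" argument (illustrative
# only): with at most N independent trials available, an event with
# per-trial probability p is expected to occur about N * p times.
N = 10**150                # assumed cap on trials in the observable universe
p = Fraction(1, 10**151)   # a probability just below the proposed bound

expected = N * p
print(expected)            # 1/10 -> fewer than one expected occurrence
print(expected < 1)        # True

# An event sitting exactly at the bound would be expected about once.
print(N * Fraction(1, 10**150))   # 1
```

Exact rational arithmetic (fractions.Fraction) is used here so that the tiny probabilities are not rounded to zero in floating point.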

The idea that events with fantastically small but positive probabilities are effectively negligible [2] was discussed by the French mathematician Émile Borel, primarily in the context of cosmology and statistical mechanics. [3] However, there is no widely accepted scientific basis for claiming that certain positive values are universal cutoff points for effective negligibility of events. Borel, in particular, was careful to point out that negligibility was relative to a model of probability for a specific physical system. [4] [5]

Dembski appeals to cryptographic practice in support of the concept of the universal probability bound, noting that cryptographers have sometimes compared the security of encryption algorithms against brute-force attacks by the likelihood of success of an adversary utilizing computational resources bounded by very large physical constraints. Such a constraint might be obtained, for example, by assuming that every atom in the observable universe is a computer of a certain type and that these computers run through and test every possible key. Universal measures of security of this kind are used much less frequently than asymptotic ones, [6] and the fact that a keyspace is very large may be less relevant if the cryptographic algorithm used has vulnerabilities that make it susceptible to other kinds of attacks. [7] However, asymptotic approaches and directed attacks are, by definition, unavailable under chance-based scenarios such as those relevant to Dembski's universal probability bound, so his appeal to cryptography is best understood as referring to brute-force attacks rather than directed attacks.
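
The physically bounded style of estimate described above can be sketched as follows (a hypothetical illustration; the total guess budget of about 10^150 is borrowed from the figure discussed in the next section, and none of the numbers are claimed to model a real adversary):

```python
import math

# Hypothetical brute-force bound: an adversary limited to roughly 10^150
# guesses in total (the physical limit discussed in this article) can
# exhaust a keyspace of at most log2(10^150) ~ 498 bits.
max_guesses = 10**150
print(math.log2(max_guesses))          # ~498.3 bits of key material

# Expected fraction of a k-bit keyspace such an adversary could cover.
for k in (128, 256, 512):
    fraction_searched = min(1.0, max_guesses / 2**k)
    print(k, fraction_searched)
# 128- and 256-bit keyspaces are fully searchable under this (absurdly
# generous) model; a 512-bit keyspace is not, since 2^512 ~ 1.3e154.
```

As noted above, this measures only resistance to exhaustive search, not resistance to structural attacks on a particular algorithm.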

Dembski's estimate

Dembski's original value for the universal probability bound is 1 in 10^150, derived as the inverse of the product of the following approximate quantities: [8] [9]

  - 10^80, the number of elementary particles in the observable universe;
  - 10^45, the maximum rate per second at which transitions in physical states can occur (the inverse of the Planck time);
  - 10^25 seconds, about a billion times the estimated age of the universe.

Thus, 10^150 = 10^80 × 10^45 × 10^25. Hence, this value corresponds to an upper limit on the number of physical events that could possibly have occurred in the observable part of the universe since the Big Bang.
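
The arithmetic behind the original figure is just a product of powers of ten, which can be checked directly (a trivial sketch; the variable names are mine, not Dembski's):

```python
# Dembski's original bound as the inverse of the product of the three
# approximate quantities listed above.
particles = 10**80                 # elementary particles in the observable universe
transitions_per_second = 10**45    # roughly the inverse of the Planck time
seconds = 10**25                   # ~a billion times the estimated age of the universe

max_physical_events = particles * transitions_per_second * seconds
assert max_physical_events == 10**150

universal_probability_bound = 1 / 10**150   # i.e. 10^-150
print(f"{max_physical_events:.0e}")         # 1e+150
```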

Dembski has recently (as of 2005) refined his definition to be the inverse of the product of two different quantities: [10]

  - an upper bound on the computational resources of the universe in its entire history, estimated by Seth Lloyd as 10^120 elementary logic operations on a register of 10^90 bits; [11] [12]
  - the (variable) rank complexity of the event under consideration. [13]

If the product of these two quantities equals 10^150 (that is, if the rank complexity is about 10^30), then the overall universal probability bound corresponds to the original value.
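
The relationship described by the refined definition can be written out as follows (a sketch in the notation of Dembski's 2005 paper, where φ(T) denotes the rank complexity of a pattern T and P(T | H) its probability under a chance hypothesis H; the exact form of the inequality is an interpretation, not a quotation):

```latex
% Sketch of the refined, pattern-dependent bound: an event matching a
% pattern T is treated as beyond the reach of chance when its probability
% falls below the inverse of the product of the universe's computational
% capacity (10^120) and the rank complexity \varphi(T) of the pattern.
\[
  P(T \mid H) \;<\; \frac{1}{10^{120} \cdot \varphi(T)}
\]
% With \varphi(T) \approx 10^{30}, the right-hand side is 10^{-150},
% the original universal probability bound.
```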

References

  1. ISCID Encyclopedia of Science and Philosophy (1999)
  2. Negligible means having probability zero. Effectively negligible means, roughly, that in some operational sense or in some computational sense, the event is indistinguishable from a negligible one.
  3. Émile Borel, Elements of the Theory of Probability (translated by John Freund), Prentice Hall, 1965, Chapter 6. See also Citations from Borel's articles.
  4. Though Dembski credits Borel for the idea, there is clear evidence that Borel, following accepted scientific practice in the foundations of statistics, was not referring to a universal bound, independent of the statistical model used.
  5. Cobb, L. (2005), Borel's Law and Creationism, Aetheling Consultants.
  6. For a precise definition of effective negligibility in cryptography, see Michael Luby, Pseudorandomness and Cryptographic Applications, Princeton Computer Science Series, 1996.
  7. Though Dembski repeatedly appeals to cryptography in support of the concept of the universal probability bound, in practice cryptographers hardly use measures which are in any way related to it. A more useful concept is that of work factor. See p. 44, A. J. Menezes, P. C. van Oorschot, S. A. Vanstone, Handbook of Applied Cryptography, CRC Press, 1996.
  8. William A. Dembski (1998). The Design Inference, p. 213, section 6.5.
  9. William A. Dembski (2004). The Design Revolution: Answering the Toughest Questions About Intelligent Design, p. 85.
  10. William A. Dembski (2005). "Specification: The Pattern That Signifies Intelligence" (382k PDF).
  11. Lloyd, Seth (2002). "Computational Capacity of the Universe". Physical Review Letters. 88 (23): 237901. arXiv:quant-ph/0110141. doi:10.1103/PhysRevLett.88.237901. PMID 12059399. S2CID 6341263.
  12. The number 10^90 seems to play no role in Dembski's analysis. On page 23 of Specification: The Pattern That Signifies Intelligence, Dembski says:
    "Lloyd has shown that 10^120 constitutes the maximal number of bit operations that the known, observable universe could have performed throughout its entire multi-billion year history."
  13. The rank complexity is Dembski's φ function, which ranks patterns in order of their descriptive complexity. See specified complexity.