Universal probability bound

A universal probability bound is a probabilistic threshold whose existence is asserted by William A. Dembski and is used by him in his works promoting intelligent design. It is defined as

A degree of improbability below which a specified event of that probability cannot reasonably be attributed to chance regardless of whatever probabilistic resources from the known universe are factored in. [1]

Dembski asserts that one can effectively estimate a positive value that serves as a universal probability bound. The existence of such a bound would imply that certain kinds of random events whose probability lies below this value can be assumed not to have occurred in the observable universe, given the resources available over its entire history. Contrapositively, Dembski uses the threshold to argue that the occurrence of certain events cannot be attributed to chance alone. He then uses the universal probability bound to argue against evolution.
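
In outline, the bound is meant to work as a rejection threshold: if the probability of a suitably specified event falls below it, chance is ruled out. The sketch below illustrates only this thresholding step, using Dembski's original figure of 10^-150 (discussed in the next section); the example events and probabilities are hypothetical, and the separate requirement that the event match an independently given specification is ignored here.

```python
# Illustrative sketch only: applying a probability threshold as a decision rule.
# The bound 1e-150 is Dembski's original figure; the example events are hypothetical.

UNIVERSAL_PROBABILITY_BOUND = 1e-150

def attributable_to_chance(event_probability: float) -> bool:
    """Return True if the event is not ruled out by the bound."""
    return event_probability >= UNIVERSAL_PROBABILITY_BOUND

# A sequence of 600 fair coin flips matching one fixed, pre-specified pattern:
p_specified_600_flips = 0.5 ** 600                      # about 2.4e-181
print(attributable_to_chance(p_specified_600_flips))    # False: below the bound
print(attributable_to_chance(0.5 ** 100))               # True: well above the bound
```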

The idea that events with fantastically small but positive probabilities are effectively negligible [2] was discussed by the French mathematician Émile Borel, primarily in the context of cosmology and statistical mechanics. [3] However, there is no widely accepted scientific basis for claiming that certain positive values are universal cutoff points for the effective negligibility of events. Borel, in particular, was careful to point out that negligibility was relative to a model of probability for a specific physical system. [4] [5]

Dembski appeals to cryptographic practice in support of the concept of the universal probability bound, noting that cryptographers have sometimes compared the security of encryption algorithms against brute force attacks by estimating the likelihood of success of an adversary whose computational resources are bounded by very large physical constraints. One such constraint might be obtained, for example, by assuming that every atom in the observable universe is a computer of a certain type and that these computers run through and test every possible key. Universal measures of security are used much less frequently than asymptotic ones, [6] and the size of a keyspace may be less relevant if the cryptographic algorithm has vulnerabilities that make it susceptible to other kinds of attacks. [7] However, asymptotic approaches and directed attacks would, by definition, be unavailable under chance-based scenarios such as those relevant to Dembski's universal probability bound. As a result, Dembski's appeal to cryptography is best understood as referring to brute force attacks rather than directed attacks.
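
The flavor of such an estimate can be conveyed with rough, order-of-magnitude arithmetic. The figures below (number of atoms, tests per atom per second, available time) are illustrative assumptions for this sketch, not values taken from any cryptographic standard, and the closing comment echoes the caveat above that practical security rests on work-factor and cost arguments rather than raw counting.

```python
# Illustrative order-of-magnitude arithmetic for an "every atom is a computer"
# brute-force adversary. All figures are assumptions chosen for this sketch,
# not values from any cryptographic standard.

atoms            = 10**80      # rough count of atoms in the observable universe
tests_per_second = 10**12      # assumed key tests per atom per second
age_of_universe  = 4 * 10**17  # elapsed time in seconds, roughly

total_tests = atoms * tests_per_second * age_of_universe   # ~4 x 10^109 trials

for key_bits in (256, 512):
    keyspace = 2**key_bits
    print(f"{key_bits}-bit keyspace exhaustible by raw counting: {total_tests >= keyspace}")

# Prints True for 256 bits and False for 512 bits. In practice cryptographers
# rely on work-factor, cost and energy arguments (see [7]) rather than such
# counting bounds, which is one reason universal measures are rarely used.
```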

Dembski's estimate

Dembski's original value for the universal probability bound is 1 in 10^150, derived as the inverse of the product of the following approximate quantities: [8] [9]

  - 10^80, the number of elementary particles in the observable universe
  - 10^45, the maximum rate per second at which transitions in physical states can occur (the inverse of the Planck time)
  - 10^25, a billion times longer than the typical estimated age of the universe in seconds

Thus, 10^150 = 10^80 × 10^45 × 10^25. Hence, this value corresponds to an upper limit on the number of physical events that could possibly have occurred in the observable part of the universe since the Big Bang.
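
The product, and its expression in bits, can be checked with simple arithmetic (the result of just under 500 bits is why the bound is often described as a threshold of roughly 500 bits of information):

```python
# Check the product above and express the bound in bits.
import math

particles        = 10**80   # elementary particles in the observable universe
transitions_rate = 10**45   # maximum physical state transitions per second
seconds          = 10**25   # comfortably exceeds the age of the universe in seconds

total_events = particles * transitions_rate * seconds
assert total_events == 10**150

print(math.log2(total_events))   # ≈ 498.3, i.e. just under 500 bits
```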

Dembski has recently (as of 2005) refined his definition to be the inverse of the product of two different quantities: [10]

  - an upper bound on the computational resources of the universe in its entire history, estimated by Seth Lloyd as 10^120 elementary logic operations on a register of 10^90 bits [11] [12]
  - the (variable) rank complexity of the event under consideration [13]

If the latter quantity equals 10^150 / 10^120 = 10^30, then the overall universal probability bound corresponds to the original value.
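
Written out, and paraphrasing the refined definition rather than quoting it, the bound for an event exhibiting a pattern T becomes the following, with φ_S(T) denoting the rank complexity mentioned in note [13]:

```latex
% A paraphrase of the refined bound, not a direct quotation from [10];
% \varphi_S(T) is the rank complexity of the pattern T (see note [13]).
\[
  \text{bound}(T) \;=\; \frac{1}{10^{120} \cdot \varphi_S(T)} .
\]
% With \varphi_S(T) = 10^{150} / 10^{120} = 10^{30}, this reduces to the
% original value of 10^{-150}.
```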

Related Research Articles

In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time and memory storage requirements. The complexity of a problem is the complexity of the best algorithms that allow solving the problem.

Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of Von Neumann entropy and the general computational term.

A pseudorandom number generator (PRNG), also known as a deterministic random bit generator (DRBG), is an algorithm for generating a sequence of numbers whose properties approximate the properties of sequences of random numbers. The PRNG-generated sequence is not truly random, because it is completely determined by an initial value, called the PRNG's seed. Although sequences that are closer to truly random can be generated using hardware random number generators, pseudorandom number generators are important in practice for their speed in number generation and their reproducibility.
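
A minimal sketch of this determinism, using the Mersenne Twister generator in Python's standard library (the seed values are arbitrary):

```python
# Minimal demonstration that a PRNG's output is fully determined by its seed.
import random

def first_values(seed: int, n: int = 5) -> list[float]:
    rng = random.Random(seed)          # Mersenne Twister under the hood
    return [rng.random() for _ in range(n)]

# Same seed -> identical "random" sequence; different seed -> different sequence.
assert first_values(42) == first_values(42)
assert first_values(42) != first_values(43)
print(first_values(42))
```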

This list contains selected positive numbers in increasing order, including counts of things, dimensionless quantities and probabilities. Each number is given a name in the short scale, which is used in English-speaking countries, as well as a name in the long scale, which is used in some of the countries that do not have English as their national language.

In computer science, a one-way function is a function that is easy to compute on every input, but hard to invert given the image of a random input. Here, "easy" and "hard" are to be understood in the sense of computational complexity theory, specifically the theory of polynomial time problems. This has nothing to do with whether the function is one-to-one; finding any one input with the desired image is considered a successful inversion.
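
As an illustration of the asymmetry described here, modular exponentiation is a standard candidate one-way function: computing g^x mod p is fast even for very large exponents, while recovering x from the result (the discrete logarithm problem) is believed to be hard for large, well-chosen parameters. The toy parameters below are assumptions chosen for illustration only and provide no security.

```python
# Candidate one-way function: modular exponentiation. The forward direction is
# cheap; inversion is the discrete logarithm problem. Toy parameters, no security.

P = 2**127 - 1          # a Mersenne prime, used here only as a convenient modulus
G = 3                   # small base chosen for illustration

def forward(x: int) -> int:
    return pow(G, x, P)             # fast, even for enormous x

def invert_by_brute_force(y: int, limit: int) -> int | None:
    """Exhaustive search for x with G**x % P == y; cost grows with the size of x."""
    acc = 1                          # G**0
    for x in range(limit):
        if acc == y:
            return x
        acc = (acc * G) % P
    return None

secret = 1_000_003
y = forward(secret)                              # easy
print(invert_by_brute_force(y, 2_000_000))       # already slow at this toy scale -> 1000003
```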

Solomonoff's theory of inductive inference proves that, under its common sense assumptions (axioms), the best possible scientific model is the shortest algorithm that generates the empirical data under consideration. In addition to the choice of data, other assumptions are that, to avoid the post-hoc fallacy, the programming language must be chosen prior to the data and that the environment being observed is generated by an unknown algorithm. This is also called a theory of induction. Due to its basis in the dynamical character of Algorithmic Information Theory, it encompasses statistical as well as dynamical information criteria for model selection. It was introduced by Ray Solomonoff, based on probability theory and theoretical computer science. In essence, Solomonoff's induction derives the posterior probability of any computable theory, given a sequence of observed data. This posterior probability is derived from Bayes' rule and some universal prior, that is, a prior that assigns a positive probability to any computable theory.
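
A standard textbook rendering of the universal prior and the resulting posterior (a common formalization, not a quotation from Solomonoff) is:

```latex
% A common formalization of Solomonoff's universal prior and posterior.
\[
  M(x) \;=\; \sum_{p \,:\, U(p)=x*} 2^{-|p|}
  \qquad\text{(universal prior: sum over programs $p$ for a universal machine $U$ whose output begins with $x$)}
\]
\[
  P(T \mid D) \;\propto\; P(D \mid T)\, 2^{-K(T)}
  \qquad\text{(posterior of a computable theory $T$ given data $D$, with $K$ the Kolmogorov complexity)}
\]
```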

In cryptography, a preimage attack on cryptographic hash functions tries to find a message that has a specific hash value. A cryptographic hash function should resist attacks on its preimage.
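
A brute-force preimage search is easy to sketch; the target below is a deliberately truncated SHA-256 digest so the toy search finishes quickly, whereas against a full 256-bit digest the same loop would require on the order of 2^256 evaluations.

```python
# Toy brute-force preimage attack against a truncated hash. With a t-bit target,
# the expected work is about 2**t hash evaluations; full-length digests are infeasible.
import hashlib
from itertools import count

TRUNCATION_BITS = 20                       # tiny on purpose; real digests are 256 bits

def truncated_hash(message: bytes) -> int:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big")
    return digest >> (256 - TRUNCATION_BITS)

target = truncated_hash(b"some original message")

for i in count():
    candidate = str(i).encode()
    if truncated_hash(candidate) == target:    # found *a* preimage, not necessarily the original
        print(f"preimage found after {i + 1} attempts: {candidate!r}")
        break
```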

Specified complexity is a creationist argument introduced by William Dembski, used by advocates to promote the pseudoscience of intelligent design. According to Dembski, the concept can formalize a property that singles out patterns that are both specified and complex, where in Dembski's terminology, a specified pattern is one that admits short descriptions, whereas a complex pattern is one that is unlikely to occur by chance. An example cited by Dembski is a poker hand, where for example the repeated appearance of a royal flush will raise suspicion of cheating. Proponents of intelligent design use specified complexity as one of their two main arguments, along with irreducible complexity.
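
In Dembski's 2005 paper "Specification: The Pattern That Signifies Intelligence" [10], these ingredients are combined into a single quantity; a commonly quoted rendering of that formula, which should be checked against the source, is:

```latex
% Commonly quoted form of Dembski's specified-complexity measure from [10];
% \varphi_S(T) is the rank complexity of the pattern T and P(T \mid H) its
% probability under the chance hypothesis H. Check the source for the exact statement.
\[
  \chi \;=\; -\log_2\!\bigl[\, 10^{120} \cdot \varphi_S(T) \cdot P(T \mid H) \,\bigr],
\]
% with \chi > 1 the threshold Dembski proposes for inferring design.
```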

Intelligent Design: The Bridge Between Science and Theology is a 1999 book by the mathematician William A. Dembski, in which the author presents an argument in support of the pseudoscience of intelligent design. Dembski defines the term "specified complexity", and argues that instances of it in nature cannot be explained by Darwinian evolution, but instead are consistent with intelligent design. He also derives an instance of his self-declared law of conservation of information and uses it to argue against Darwinian evolution. The book is a summary treatment of the mathematical theory he presents in The Design Inference (1998), and is intended to be largely understandable by a nontechnical audience. Dembski also provides a Christian theological commentary on, and analysis of, what he perceives to be the historical and cultural significance of the ideas.

In theoretical computer science and cryptography, a pseudorandom generator (PRG) for a class of statistical tests is a deterministic procedure that maps a random seed to a longer pseudorandom string such that no statistical test in the class can distinguish between the output of the generator and the uniform distribution. The random seed itself is typically a short binary string drawn from the uniform distribution.
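
The requirement that no test in the class can tell the output from uniform is usually written as a bound on the distinguishing advantage:

```latex
% Standard indistinguishability condition for a PRG G against a class of tests.
\[
  \bigl|\Pr_{s}\!\left[\,T(G(s)) = 1\,\right] \;-\; \Pr_{r}\!\left[\,T(r) = 1\,\right]\bigr| \;\le\; \varepsilon
  \quad\text{for every test } T \text{ in the class,}
\]
% where s is a uniformly random seed, r a uniformly random string of the output
% length, and \varepsilon the allowed distinguishing advantage.
```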

Bremermann's limit, named after Hans-Joachim Bremermann, is a theoretical limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe. It is derived from Einstein's mass–energy equivalency and the Heisenberg uncertainty principle, and is c^2/h ≈ 1.3563925 × 10^50 bits per second per kilogram.
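
The quoted figure can be checked directly from the two physical constants involved (a numerical check only, not a derivation of the limit):

```python
# Numerical check of the quoted value of Bremermann's limit, c^2 / h.
c = 2.99792458e8        # speed of light, m/s
h = 6.62607015e-34      # Planck constant, J*s

print(c**2 / h)         # ≈ 1.356e50 (bits per second per kilogram)
```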

The Design Inference: Eliminating Chance through Small Probabilities is a 1998 book by American philosopher and mathematician William A. Dembski, a proponent of intelligent design, which sets out to establish approaches by which evidence of intelligent agency could be inferred in natural and social situations. In the book he distinguishes between three general modes of competing explanations in order of priority: regularity, chance, and design. The processes in which regularity, chance, and design are ruled out one by one until one remains as a reasonable and sufficient explanation for an event are what he calls an "explanatory filter". It is a method that tries to eliminate competing explanations in a systematic fashion, including when a highly improbable event conforms to a discernible pattern that is given independently of the event itself. This pattern is Dembski's concept of specified complexity. Throughout the book he uses diverse examples such as the detectability of spontaneous generation, the occurrence of natural phenomena, and cases of deceit like ballot rigging, plagiarism, falsification of data, etc.

In cryptography, concrete security or exact security is a practice-oriented approach that aims to give more precise estimates of the computational complexities of adversarial tasks than polynomial equivalence would allow. It quantifies the security of a cryptosystem by bounding the probability of success for an adversary running for a fixed amount of time. Security proofs with precise analyses are referred to as concrete.

In cryptography, a security parameter is a way of measuring how "hard" it is for an adversary to break a cryptographic scheme. There are two main types of security parameter: computational and statistical. Roughly speaking, the computational security parameter is a measure for the input size of the computational problem on which the cryptographic scheme is based, which determines its computational complexity, whereas the statistical security parameter is a measure of the probability with which an adversary can break the scheme.

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects, such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."

In cryptography, an adversary's advantage is a measure of how successfully it can attack a cryptographic algorithm, by distinguishing it from an idealized version of that type of algorithm. Note that in this context, the "adversary" is itself an algorithm and not a person. A cryptographic algorithm is considered secure if no adversary has a non-negligible advantage, subject to specified bounds on the adversary's computational resources. "Negligible" usually means "within O(2^−p)" where p is a security parameter associated with the algorithm. For example, p might be the number of bits in a block cipher's key.
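
Written out, the advantage of an adversary A in distinguishing the real algorithm from its idealized counterpart is typically defined as:

```latex
% Standard definition of an adversary's distinguishing advantage.
\[
  \mathrm{Adv}(A) \;=\; \bigl|\Pr[\,A^{\text{real}} = 1\,] \;-\; \Pr[\,A^{\text{ideal}} = 1\,]\bigr|,
\]
% and the scheme is considered secure if \mathrm{Adv}(A) is negligible
% (e.g. within O(2^{-p})) for every adversary A within the stated resource bounds.
```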

In mathematics, a negligible function is a function f such that for every positive integer c there exists an integer N_c such that |f(x)| < 1/x^c for all x > N_c.

A randomness extractor, often simply called an "extractor", is a function which, when applied to output from a weak entropy source together with a short, uniformly random seed, generates a highly random output that appears independent from the source and uniformly distributed. Examples of weakly random sources include radioactive decay or thermal noise; the only restriction on possible sources is that there is no way they can be fully controlled, calculated or predicted, and that a lower bound on their entropy rate can be established. For a given source, a randomness extractor can even be considered to be a true random number generator (TRNG); but there is no single extractor that has been proven to produce truly random output from any type of weakly random source.
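
A classic toy illustration of turning weak randomness into nearly uniform bits is von Neumann's pairing trick. It is seedless and far weaker than the seeded extractors described above (it needs independent bits with a fixed bias), but it conveys the basic idea:

```python
# Von Neumann's classic debiasing trick: from independent coin flips with unknown
# bias, emit 0 for a (0,1) pair and 1 for a (1,0) pair, discarding (0,0) and (1,1).
# This is a seedless toy illustration, not one of the seeded extractors described above.
import random

def biased_bits(n: int, p_one: float = 0.8) -> list[int]:
    return [1 if random.random() < p_one else 0 for _ in range(n)]

def von_neumann_extract(bits: list[int]) -> list[int]:
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)          # (1,0) -> 1, (0,1) -> 0; both occur with probability p(1-p)
    return out

raw = biased_bits(100_000)
extracted = von_neumann_extract(raw)
print(sum(raw) / len(raw), sum(extracted) / len(extracted))   # ~0.80 vs ~0.50
```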

Generic-case complexity is a subfield of computational complexity theory that studies the complexity of computational problems on "most inputs".

Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.

References

  1. ISCID Encyclopedia of Science and Philosophy (1999)
  2. Negligible means having probability zero. Effectively negligible means, roughly, that in some operational sense or in some computational sense, the event is indistinguishable from a negligible one.
  3. Émile Borel, Elements of the Theory of Probability (translated by John Freund), Prentice Hall, 1965, Chapter 6. See also Citations from Borel's articles.
  4. Though Dembski credits Borel for the idea, there is clear evidence that Borel, following accepted scientific practice in the foundations of statistics, was not referring to a universal bound, independent of the statistical model used.
  5. Cobb, L. (2005). Borel's Law and Creationism, Aetheling Consultants.
  6. For a precise definition of effective negligibility in cryptography, see Michael Luby, Pseudorandomness and Cryptographic Applications, Princeton Computer Science Series, 1996.
  7. Though Dembski repeatedly appeals to cryptography in support of the concept of the universal probability bound, in practice cryptographers hardly use measures which are in any way related to it. A more useful concept is that of work factor. See p. 44, A. J. Menezes, P. C. van Oorschot, S. A. Vanstone, Handbook of Applied Cryptography, CRC Press, 1996.
  8. William A. Dembski (1998). The Design Inference, p. 213, section 6.5.
  9. William A. Dembski (2004). The Design Revolution: Answering the Toughest Questions About Intelligent Design, p. 85.
  10. William A. Dembski (2005). "Specification: The Pattern That Signifies Intelligence" (PDF).
  11. Lloyd, Seth (2002). "Computational Capacity of the Universe". Physical Review Letters. 88 (23): 237901. arXiv:quant-ph/0110141. doi:10.1103/PhysRevLett.88.237901. PMID 12059399. S2CID 6341263.
  12. The number 10^90 seems to play no role in Dembski's analysis. On page 23 of Specification: The Pattern That Signifies Intelligence, Dembski says:
    "Lloyd has shown that 10^120 constitutes the maximal number of bit operations that the known, observable universe could have performed throughout its entire multi-billion year history."
  13. The rank complexity is Dembski's φ function, which ranks patterns in order of their descriptive complexity. See specified complexity.