Specified complexity

Specified complexity is a creationist argument introduced by William Dembski, used by advocates to promote the pseudoscience of intelligent design.[citation needed] According to Dembski, the concept formalizes a property that singles out patterns that are both specified and complex: in Dembski's terminology, a specified pattern is one that admits a short description, whereas a complex pattern is one that is unlikely to occur by chance. An example cited by Dembski is a poker hand, where, for instance, the repeated appearance of a royal flush raises suspicion of cheating. [1] Proponents of intelligent design use specified complexity as one of their two main arguments, along with irreducible complexity.

Dembski argues that it is impossible for specified complexity to exist in patterns displayed by configurations formed by unguided processes. Therefore, Dembski argues, the fact that specified complex patterns can be found in living things indicates some kind of guidance in their formation, which is indicative of intelligence. Dembski further argues that no-free-lunch theorems show that evolutionary algorithms cannot select or generate configurations of high specified complexity. Dembski states that specified complexity is a reliable marker of design by an intelligent agent—a central tenet of intelligent design, which Dembski argues for in opposition to modern evolutionary theory. Specified complexity is what Dembski terms an "explanatory filter": one can recognize design by detecting complex specified information (CSI). Dembski argues that the unguided emergence of CSI solely according to known physical laws and chance is highly improbable. [2]

The concept of specified complexity is widely regarded as mathematically unsound and has not been the basis for further independent work in information theory, in the theory of complex systems, or in biology. [3] [4] [5] A study by Wesley Elsberry and Jeffrey Shallit states: "Dembski's work is riddled with inconsistencies, equivocation, flawed use of mathematics, poor scholarship, and misrepresentation of others' results." [6] Another objection concerns Dembski's calculation of probabilities. According to Martin Nowak, a Harvard professor of mathematics and evolutionary biology, "We cannot calculate the probability that an eye came about. We don't have the information to make the calculation." [7]

Definition

Orgel's terminology

The term "specified complexity" was originally coined by origin of life researcher Leslie Orgel in his 1973 book The Origins of Life: Molecules and Natural Selection, [8] which proposed that RNA could have evolved through Darwinian natural selection. [9] Orgel used the phrase in discussing the differences between life and non-living structures:

In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity. [10]

The phrase was taken up by the creationists Charles Thaxton and Walter L. Bradley in a chapter they contributed to the 1994 book The Creation Hypothesis, where they discussed "design detection" and redefined "specified complexity" as a way of measuring information. Another contribution to the book was written by William A. Dembski, who took this up as the basis of his subsequent work. [8]

The term was later employed by physicist Paul Davies to qualify the complexity of living organisms:

Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity [11]

Dembski's definition

Whereas Orgel used the term for biological features which are considered in science to have arisen through a process of evolution, Dembski says that it describes features which cannot form through "undirected" evolution, and concludes that it allows one to infer intelligent design. While Orgel employed the concept in a qualitative way, Dembski's use is intended to be quantitative. Dembski's use of the concept dates to his 1998 monograph The Design Inference. Specified complexity is fundamental to his approach to intelligent design, and each of his subsequent books has also dealt significantly with the concept. He has stated that, in his opinion, "if there is a way to detect design, specified complexity is it". [12]

Dembski asserts that specified complexity is present in a configuration when it can be described by a pattern that displays a large amount of independently specified information and is also complex, which he defines as having a low probability of occurrence. He provides the following examples to demonstrate the concept: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified." [13]
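
Dembski's three examples can be made concrete with standard information-theoretic proxies. The sketch below is illustrative only: compression is a crude stand-in for description length, and the uniform-random-letters chance model is an assumption introduced here, not part of Dembski's framework.

```python
import math
import random
import string
import zlib

def description_bits(s: str) -> int:
    # Crude proxy for "specified" (admits a short description):
    # compressed size in bits. zlib overhead dominates for very short
    # strings, so treat the numbers as rough indications only.
    return len(zlib.compress(s.encode())) * 8

def chance_surprisal_bits(s: str) -> float:
    # Proxy for "complex" (improbable by chance): -log2 of the probability
    # of this exact string under uniformly random lowercase letters/spaces.
    return len(s) * math.log2(27)

sonnet = ("shall i compare thee to a summers day "
          "thou art more lovely and more temperate ") * 2

examples = {
    "single letter": "a",
    "random letters": "".join(random.choice(string.ascii_lowercase + " ")
                              for _ in range(120)),
    "sonnet excerpt": sonnet,
}

for label, s in examples.items():
    print(f"{label:15s} description ~{description_bits(s):5d} bits, "
          f"surprisal ~{chance_surprisal_bits(s):7.1f} bits")
```

The single letter is cheap to describe but also probable; the random string is improbable but essentially incompressible; the English text compresses noticeably while remaining improbable under the chance model.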

In his earlier papers Dembski defined complex specified information (CSI) as being present in a specified event whose probability did not exceed 1 in 10^150, which he calls the universal probability bound. In that context, "specified" meant what in later work he called "pre-specified", that is, specified by the unnamed designer before any information about the outcome is known. The value of the universal probability bound corresponds to the inverse of the upper limit of "the total number of [possible] specified events throughout cosmic history", as calculated by Dembski. [14] Anything below this bound has CSI. The terms "specified complexity" and "complex specified information" are used interchangeably. In more recent papers Dembski has redefined the universal probability bound with reference to another number, corresponding to the total number of bit operations that could possibly have been performed in the entire history of the universe.
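
Dembski's 10^150 figure is the product of three cosmological upper bounds; the following back-of-the-envelope line reproduces the numbers he cites:

10^80 elementary particles in the observable universe × 10^45 state transitions per second (the inverse of the Planck time) × 10^25 seconds of cosmic history = 10^150 possible specified events.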

Dembski asserts that CSI exists in numerous features of living things, such as in DNA and in other functional biological molecules, and argues that it cannot be generated by the only known natural mechanisms of physical law and chance, or by their combination. He argues that this is so because laws can only shift around or lose information, but do not produce it, and because chance can produce complex unspecified information, or simple specified information, but not CSI; he provides a mathematical analysis that he claims demonstrates that law and chance working together cannot generate CSI, either. Moreover, he claims that CSI is holistic, with the whole being greater than the sum of the parts, and that this decisively eliminates Darwinian evolution as a possible means of its "creation". Dembski maintains that by process of elimination, CSI is best explained as being due to intelligence, and is therefore a reliable indicator of design.

Law of conservation of information

Dembski formulates and proposes a law of conservation of information as follows:

This strong proscriptive claim, that natural causes can only transmit CSI but never originate it, I call the Law of Conservation of Information.

Immediate corollaries of the proposed law are the following:

  1. The specified complexity in a closed system of natural causes remains constant or decreases.
  2. The specified complexity cannot be generated spontaneously, originate endogenously or organize itself (as these terms are used in origins-of-life research).
  3. The specified complexity in a closed system of natural causes either has been in the system eternally or was at some point added exogenously (implying that the system, though now closed, was not always closed).
  4. In particular, any closed system of natural causes that is also of finite duration received whatever specified complexity it contains before it became a closed system. [15]

Dembski notes that the term "Law of Conservation of Information" was previously used by Peter Medawar in his book The Limits of Science (1984) "to describe the weaker claim that deterministic laws cannot produce novel information." [16] The actual validity and utility of Dembski's proposed law are uncertain; it is neither widely used by the scientific community nor cited in mainstream scientific literature. A 2002 essay by Erik Tellgren provided a mathematical rebuttal of Dembski's law and concluded that it is "mathematically unsubstantiated." [17]

Specificity

In a more recent paper, [18] Dembski provides an account which he claims is simpler and adheres more closely to the theory of statistical hypothesis testing as formulated by Ronald Fisher. In general terms, Dembski proposes to view design inference as a statistical test to reject a chance hypothesis P on a space of outcomes Ω.

Dembski's proposed test is based on the Kolmogorov complexity of a pattern T that is exhibited by an event E that has occurred. Mathematically, E is a subset of Ω, the pattern T specifies a set of outcomes in Ω, and E is a subset of T. Quoting Dembski: [19]

Thus, the event E might be a die toss that lands six and T might be the composite event consisting of all die tosses that land on an even face.

Kolmogorov complexity provides a measure of the computational resources needed to specify a pattern (such as a DNA sequence or a sequence of alphabetic characters). [20] Given a pattern T, the number of other patterns that have Kolmogorov complexity no larger than that of T is denoted by φ(T). The number φ(T) thus provides a ranking of patterns from the simplest to the most complex. For example, for a pattern T which describes the bacterial flagellum, Dembski claims to obtain the upper bound φ(T) ≤ 10^20.

Dembski defines the specified complexity of the pattern T under the chance hypothesis P as

σ = −log₂[R × φ(T) × P(T)],
where P(T) is the probability of observing the pattern T and R is the number of "replicational resources" available "to witnessing agents". R corresponds roughly to repeated attempts to create and discern a pattern. Dembski then asserts that R can be bounded by 10^120. This number is supposedly justified by a result of Seth Lloyd [21] in which he determines that the number of elementary logic operations that can have been performed in the universe over its entire history cannot exceed 10^120 operations on 10^90 bits.
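
Because the factors span hundreds of orders of magnitude, σ is most easily evaluated in log space. A minimal sketch follows; the bounds plugged in are the ones Dembski publishes, but the function itself is ordinary arithmetic written for this article, not any published code:

```python
import math

LOG10_R = 120    # Lloyd's bound on bit operations in cosmic history, 10^120
LOG10_PHI = 20   # Dembski's claimed bound on patterns as simple as T, 10^20

def sigma(log10_p: float) -> float:
    # sigma = -log2(R * phi(T) * P(T)), computed from base-10 logarithms
    return -(LOG10_R + LOG10_PHI + log10_p) * math.log2(10)

# Design is inferred when sigma > 1, i.e. when P(T) < (1/2) * 10^-140.
print(sigma(-150))  # P(T) = 10^-150 -> sigma ~ +33: design inferred
print(sigma(-130))  # P(T) = 10^-130 -> sigma ~ -33: no design inference
```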

Dembski's main claim is that the following test can be used to infer design for a configuration: There is a target pattern T that applies to the configuration and whose specified complexity exceeds 1. This condition can be restated as the inequality

10^120 × φ(T) × P(T) < 1/2.
Dembski's explanation of specified complexity

Dembski's expression σ is unrelated to any known concept in information theory, though he claims he can justify its relevance as follows: An intelligent agent S witnesses an event E and assigns it to some reference class of events Ω and within this reference class considers it as satisfying a specification T. Now consider the quantity φ(T) × P(T) (where P is the "chance" hypothesis):

[Figure] Possible targets with complexity ranking and probability not exceeding those of the attained target T. The probability of their set-theoretic union does not exceed φ(T) × P(T).

Think of S as trying to determine whether an archer, who has just shot an arrow at a large wall, happened to hit a tiny target on that wall by chance. The arrow, let us say, is indeed sticking squarely in this tiny target. The problem, however, is that there are lots of other tiny targets on the wall. Once all those other targets are factored in, is it still unlikely that the archer could have hit any of them by chance?

In addition, we need to factor in what I call the replicational resources associated with T, that is, all the opportunities to bring about an event of T's descriptive complexity and improbability by multiple agents witnessing multiple events.

According to Dembski, the number of such "replicational resources" can be bounded by "the maximal number of bit operations that the known, observable universe could have performed throughout its entire multi-billion year history", which according to Lloyd is 10^120.

However, according to Elsberry and Shallit, "[specified complexity] has not been defined formally in any reputable peer-reviewed mathematical journal, nor (to the best of our knowledge) adopted by any researcher in information theory." [22]

Calculation of specified complexity

Thus far, Dembski's only attempt at calculating the specified complexity of a naturally occurring biological structure is in his book No Free Lunch, for the bacterial flagellum of E. coli. This structure can be described by the pattern "bidirectional rotary motor-driven propeller". Dembski estimates that there are at most 10^20 patterns described by four basic concepts or fewer, and so his test for design will apply if

P(T) < 1/2 × 10^-140.
However, Dembski says that the precise calculation of the relevant probability "has yet to be done", although he also claims that some methods for calculating these probabilities "are now in place".

These methods assume that all of the constituent parts of the flagellum must have been generated completely at random, a scenario that biologists do not seriously consider. He justifies this approach by appealing to Michael Behe's concept of "irreducible complexity" (IC), which leads him to assume that the flagellum could not come about by any gradual or step-wise process. The validity of Dembski's particular calculation is thus wholly dependent on Behe's IC concept, and therefore susceptible to its criticisms, of which there are many.

To arrive at the ranking upper bound of 10^20 patterns, Dembski considers a specification pattern for the flagellum defined by the (natural language) predicate "bidirectional rotary motor-driven propeller", which he regards as being determined by four independently chosen basic concepts. He furthermore assumes that English has the capacity to express at most 10^5 basic concepts (an upper bound on the size of a dictionary). Dembski then claims that we can obtain the rough upper bound of

(10^5)^4 = 10^20

for the set of patterns described by four basic concepts or fewer.

From the standpoint of Kolmogorov complexity theory, this calculation is problematic. Quoting Elsberry and Shallit: [23] "Natural language specification without restriction, as Dembski tacitly permits, seems problematic. For one thing, it results in the Berry paradox". These authors add: "We have no objection to natural language specifications per se, provided there is some evident way to translate them to Dembski's formal framework. But what, precisely, is the space of events Ω here?"

Criticism

The soundness of Dembski's concept of specified complexity and the validity of arguments based on this concept are widely disputed. A frequent criticism (see Elsberry and Shallit) is that Dembski has used the terms "complexity", "information" and "improbability" interchangeably. These quantities measure properties of different types of things: complexity measures how hard it is to describe an object (such as a bitstring), information measures how much the uncertainty about the state of an object is reduced by knowing the state of another object or system, [24] and improbability measures how unlikely an event is given a probability distribution.
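
The distinction can be made concrete with toy calculations. The quantities below are standard textbook measures chosen for illustration; they are not drawn from Dembski's or his critics' texts:

```python
import math
import zlib

# Complexity: how hard an object is to describe (compression as a rough proxy).
patterned = b"ab" * 64
print("description bits:", len(zlib.compress(patterned)) * 8)

# Information: reduction of uncertainty about one variable given another.
# If Y is a noiseless copy of a fair coin X: H(X) - H(X|Y) = 1 - 0 = 1 bit.
print("mutual information:", 1.0 - 0.0, "bit")

# Improbability: -log2 of an event's probability under a chance model.
# Thirty-two heads in a row from a fair coin:
print("surprisal:", -math.log2(0.5 ** 32), "bits")
```

A highly patterned object can be improbable yet easy to describe, and two variables can share information regardless of how complex either is, so the three measures cannot be substituted for one another.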

On page 150 of No Free Lunch Dembski claims he can demonstrate his thesis mathematically: "In this section I will present an in-principle mathematical argument for why natural causes are incapable of generating complex specified information." When Tellgren investigated Dembski's "Law of Conservation of Information" using a more formal approach, he concluded that it is mathematically unsubstantiated. [25] Dembski responded in part that he is not "in the business of offering a strict mathematical proof for the inability of material mechanisms to generate specified complexity". [26] Jeffrey Shallit states that Dembski's mathematical argument has multiple problems; for example, a crucial calculation on page 297 of No Free Lunch is off by a factor of approximately 10^65. [27]

Dembski's calculations show how a simple smooth function cannot gain information. He therefore concludes that there must be a designer to obtain CSI. However, natural selection has a branching mapping from one to many (replication), followed by a pruning mapping of the many back down to a few (selection). When information is replicated, some copies can be differently modified while others remain the same, allowing information to increase. These increasing and reductional mappings were not modeled by Dembski; in other words, his calculations do not model birth and death. This basic flaw renders all of Dembski's subsequent calculations and reasoning in No Free Lunch irrelevant, because his basic model does not reflect reality. Since No Free Lunch relies on this flawed argument, the entire thesis of the book collapses. [28]
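
The branch-then-prune dynamic described above can be illustrated with a Dawkins-style "weasel" toy program. This is a sketch only: cumulative selection toward a fixed target is itself a simplification, but it exhibits exactly the replication and selection mappings said to be missing from Dembski's model:

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def mutate(s: str, rate: float = 0.02) -> str:
    # Replication with occasional copying errors (the one-to-many branching).
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

def matches(s: str) -> int:
    return sum(a == b for a, b in zip(s, TARGET))

best = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while matches(best) < len(TARGET):
    # Branch: 100 mutant offspring (parent kept so progress never regresses).
    offspring = [best] + [mutate(best) for _ in range(100)]
    # Prune: selection keeps only the best match.
    best = max(offspring, key=matches)
    generation += 1

print(f"reached the target in {generation} generations")
```

Each character position is "specified" by the target, yet no single step is improbable; the match to the target accumulates through repeated replication and selection.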

According to Martin Nowak, a Harvard professor of mathematics and evolutionary biology, "We cannot calculate the probability that an eye came about. We don't have the information to make the calculation". [7]

Dembski's critics note that specified complexity, as originally defined by Leslie Orgel, is precisely what Darwinian evolution is supposed to create. Critics maintain that Dembski uses "complex" as most people would use "absurdly improbable". They also claim that his argument is circular: CSI cannot occur naturally because Dembski has defined it thus. They argue that to successfully demonstrate the existence of CSI, it would be necessary to show that some biological feature undoubtedly has an extremely low probability of occurring by any natural means whatsoever, something which Dembski and others have almost never attempted to do. Such calculations depend on the accurate assessment of numerous contributing probabilities, the determination of which is often necessarily subjective. Hence, CSI can at most provide a "very high probability", but not absolute certainty.

Another criticism refers to the problem of "arbitrary but specific outcomes". For example, if a coin is tossed randomly 1000 times, the probability of any particular outcome occurring is roughly one in 10^300. For any particular specific outcome of the coin-tossing process, the a priori probability (the probability assessed before the event happens) that this pattern occurred is thus one in 10^300, which is astronomically smaller than Dembski's universal probability bound of one in 10^150. Yet we know that the post hoc probability (the probability as observed after the event occurs) of its happening is exactly one, since we observed it happening. This is similar to the observation that, while it is unlikely that any given person will win a lottery, eventually a lottery will have a winner; arguing that any one player is very unlikely to win is not the same as proving that no one will win. Similarly, it has been argued that "a space of possibilities is merely being explored, and we, as pattern-seeking animals, are merely imposing patterns, and therefore targets, after the fact." [15]
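
The one-in-10^300 figure follows from elementary arithmetic:

P(any one specific 1000-flip sequence) = 2^-1000 = 10^(−1000 × log₁₀ 2) ≈ 10^-301,

which is far below the universal probability bound of one in 10^150.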

Apart from such theoretical considerations, critics cite reports of evidence of the kind of evolutionary "spontaneous generation" that Dembski claims is too improbable to occur naturally. For example, in 1982, B.G. Hall published research demonstrating that after removing a gene that allows sugar digestion in certain bacteria, those bacteria, when grown in media rich in sugar, rapidly evolve new sugar-digesting enzymes to replace those removed. [29] Another widely cited example is the discovery of nylon-eating bacteria that produce enzymes only useful for digesting synthetic materials that did not exist prior to the invention of nylon in 1935.

Other commentators have noted that evolution through selection is frequently used to design certain electronic, aeronautic and automotive systems which are considered problems too complex for human "intelligent designers". [30] This contradicts the argument that an intelligent designer is required for the most complex systems. Such evolutionary techniques can lead to designs that are difficult to understand or evaluate since no human understands which trade-offs were made in the evolutionary process, something which mimics our poor understanding of biological systems.

Dembski's book No Free Lunch was criticised for not addressing the work of researchers who use computer simulations to investigate artificial life. According to Shallit:

The field of artificial life evidently poses a significant challenge to Dembski's claims about the failure of evolutionary algorithms to generate complexity. Indeed, artificial life researchers regularly find their simulations of evolution producing the sorts of novelties and increased complexity that Dembski claims are impossible. [27]

Notes and references

  1. "Specified Complexity Made Simple". 26 February 2024.
  2. Olofsson, P., "Intelligent design and mathematical statistics: a troubled alliance", Biology and Philosophy, (2008) 23: 545. doi : 10.1007/s10539-007-9078-6 (pdf, retrieved December 18, 2017)
  3. Rich Baldwin (2005). "Information Theory and Creationism: William Dembski". TalkOrigins Archive . Retrieved 2010-05-10.
  4. Mark Perakh, (2005). Dembski "displaces Darwinism" mathematically -- or does he?
  5. Jason Rosenhouse, (2001). How Anti-Evolutionists Abuse Mathematics The Mathematical Intelligencer, Vol. 23, No. 4, Fall 2001, pp. 3–8.
  6. Elsberry, Wesley; Shallit, Jeffrey (2003). "Information Theory, Evolutionary Computation, and Dembski's 'Complex Specified Information" (PDF). Retrieved 20 October 2017.
  7. 1 2 Wallis, Claudia (2005). Time Magazine, printed 15 August 2005, page 32
  8. 1 2 "Review: Origins of Life". NCSE. 2015-12-15. Retrieved 1 June 2016.
  9. "Salk Chemical Evolution Scientist Leslie Orgel Dies". Salk Institute for Biological Studies. 30 October 2007. Retrieved 1 June 2016.
  10. Leslie Orgel (1973). The Origins of Life, p. 189.
  11. Paul Davies (1999). The Fifth Miracle p. 112.
  12. William A. Dembski (2002). No Free Lunch , p. 19.
  13. William A. Dembski (1999). Intelligent Design, p. 47.
  14. William A. Dembski (2004). The Design Revolution: Answering the Toughest Questions About Intelligent Design , p. 85.
  15. 1 2 William A. Dembski (1998) Intelligent Design as a Theory of Information.
  16. "Searching Large Spaces: Displacement and the No Free Lunch Regress (356k PDF) Archived 2015-01-04 at the Wayback Machine ", pp. 15-16, describing an argument made by Michael Shermer in How We Believe: Science, Skepticism, and the Search for God, 2nd ed. (2003).
  17. On Dembski's law of conservation of information Erik Tellgren. talkreason.org, 2002. (PDF file)
  18. William A. Dembski (2005). Specification: The Pattern that Signifies intelligence Archived 2007-07-28 at the Wayback Machine
  19. (loc. cit. p. 16)
  20. Michael Sipser (1997). Introduction to the Theory of Computation, PWS Publishing Company.
  21. Lloyd, Seth (2002-05-24). "Computational Capacity of the Universe". Physical Review Letters. 88 (23): 237901. arXiv: quant-ph/0110141 . Bibcode:2002PhRvL..88w7901L. doi:10.1103/physrevlett.88.237901. ISSN   0031-9007. PMID   12059399. S2CID   6341263.
  22. Elsberry & Shallit 2003, p. 14.
  23. Elsberry & Shallit 2003, p. 12.
  24. Adami, Christoph; Ofria, Charles; Collier, Travis (2000). "Evolution of biological complexity". Proceedings of the National Academy of Sciences of the United States of America. 97 (9): 4463–8. arXiv: physics/0005074 . doi: 10.1073/pnas.97.9.4463 . PMC   18257 . PMID   10781045.
  25. Erik Tellgren (June 30, 2002). "On Dembski's Law Of Conservation Of Information" (PDF).
  26. William A. Dembski, (Aug 2002). If Only Darwinists Scrutinized Their Own Work as Closely: A Response to "Erik" Archived 2013-02-26 at the Wayback Machine .
  27. 1 2 Jeffrey Shallit (2002) A review of Dembski's No Free Lunch
  28. Thomas D. Schneider. (2002) Dissecting Dembski's "Complex Specified Information" Archived 2005-10-26 at the Wayback Machine
  29. B.G. Hall (1982). "Evolution of a regulated operon in the laboratory", Genetics , 101(3-4):335-44. In PubMed.
  30. Evolutionary algorithms now surpass human designers New Scientist, 28 July 2007
