In the history of science, Laplace's demon was a notable published articulation of causal determinism on a scientific basis, put forward by Pierre-Simon Laplace in 1814. [1] According to determinism, if someone (the demon) knows the precise position and momentum of every atom in the universe, their past and future values for any given time are entailed: they can be calculated from the laws of classical mechanics. [2]
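The premise can be illustrated in miniature: a classical system evolved with a time-reversible integrator can be run backward from its final state to recover its initial one. The sketch below uses a harmonic oscillator and the leapfrog scheme; both are illustrative choices, not drawn from the sources above.

```python
# Determinism in miniature: integrate a harmonic oscillator forward with the
# time-reversible leapfrog (velocity Verlet) scheme, then flip the momentum
# and run the same dynamics again; the initial state is recovered up to
# floating-point round-off. System and step size are illustrative choices.

def leapfrog(x: float, p: float, dt: float, steps: int) -> tuple[float, float]:
    """Leapfrog steps for H = p**2/2 + x**2/2 (unit mass and spring)."""
    for _ in range(steps):
        p -= 0.5 * dt * x  # half kick (force = -x)
        x += dt * p        # drift
        p -= 0.5 * dt * x  # half kick
    return x, p

x0, p0 = 1.0, 0.0
x1, p1 = leapfrog(x0, p0, dt=0.01, steps=1000)   # forward in time
xb, pb = leapfrog(x1, -p1, dt=0.01, steps=1000)  # momentum flipped: backward
print(f"recovered state: x = {xb:.12f}, p = {-pb:.12f}")  # ~ (1.0, 0.0)
```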
Discoveries and theories in the decades that followed suggest that some elements of Laplace's original formulation are wrong or incompatible with our universe. For example, irreversible processes in thermodynamics suggest that Laplace's "demon" could not reconstruct past positions and momenta from the current state.
We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past could be present before its eyes.
— Pierre Simon Laplace, A Philosophical Essay on Probabilities [3]
This intellect is often referred to as Laplace's demon (and sometimes Laplace's Superman, after Hans Reichenbach). Laplace himself did not use the word "demon", which was a later embellishment. As translated into English above, he simply referred to: "Une intelligence ... Rien ne serait incertain pour elle, et l'avenir, comme le passé, serait présent à ses yeux." This idea seems to have been widespread around the time that Laplace first expressed it in 1773, particularly in France. Variations can be found in Maupertuis (1756), Nicolas de Condorcet (1768), Baron D'Holbach (1770), and an undated fragment in the archives of Diderot. [4] Recent scholarship suggests that the image of a super-powerful calculating intelligence was also proposed by Roger Joseph Boscovich in his 1758 Theoria philosophiae naturalis. [5]
According to chemical engineer Robert Ulanowicz in his 1986 book Growth and Development, Laplace's demon met its end with early 19th century developments of the concepts of irreversibility, entropy, and the second law of thermodynamics. In other words, Laplace's demon was premised on reversibility and classical mechanics. Ulanowicz points out, however, that many thermodynamic processes are irreversible, so if thermodynamic quantities are taken to be purely physical, no such demon is possible, since past positions and momenta could not be reconstructed from the current state.
Maximum entropy thermodynamics takes a very different view, considering thermodynamic variables to have a statistical basis which is separate from the deterministic microscopic physics. [6] However, this theory has met criticism regarding its ability to make predictions about physics; a number of physicists and mathematicians, including Yvan Velenik of the Department of Mathematics at the University of Geneva, have pointed out that maximum entropy thermodynamics essentially describes our knowledge about a system but does not describe the system itself. [7]
Due to its canonical assumption of determinism, Laplace's demon is incompatible with the Copenhagen interpretation, which stipulates indeterminacy. The interpretation of quantum mechanics is still very much open for debate and there are many who take opposing views (such as the many worlds interpretation and the de Broglie–Bohm interpretation). [8]
Chaos theory is sometimes cited as a contradiction of Laplace's demon: it describes how a deterministic system can nonetheless exhibit behavior that is impossible to predict in practice: as in the butterfly effect, minor variations between the starting conditions of two systems can result in major differences. [9] While this explains unpredictability in practical cases, applying it to Laplace's case is questionable: under the strict demon hypothesis all details are known, to infinite precision, so there are no variations in starting conditions at all. Put another way: chaos theory applies when knowledge of a system is imperfect, whereas Laplace's demon assumes perfect knowledge, so the two scenarios are not comparable. A minimal illustration of the sensitivity itself follows.
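The sketch below demonstrates sensitive dependence on initial conditions using the logistic map at a chaotic parameter value; the map, parameter, and starting points are illustrative choices, not drawn from the cited sources.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1 - x).
# Two trajectories that start 1e-12 apart diverge to order-one differences
# within a few dozen iterations when r is in the chaotic regime (r = 4.0).

def logistic_map(x: float, r: float = 4.0) -> float:
    """One step of the logistic map; r = 4.0 is fully chaotic."""
    return r * x * (1.0 - x)

x_a, x_b = 0.3, 0.3 + 1e-12  # nearly identical starting conditions
for step in range(1, 61):
    x_a, x_b = logistic_map(x_a), logistic_map(x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: |difference| = {abs(x_a - x_b):.3e}")
```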
In 2008, David Wolpert used Cantor diagonalization to challenge the idea of Laplace's demon. He did this by treating the demon as a computational device and showing that no two such devices can completely predict each other. [10] [11] Wolpert's paper was cited in 2014 in a paper by Josef Rukavicka, which presents a significantly simpler argument disproving Laplace's demon using Turing machines, under the assumption of free will. [12]
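The flavor of such arguments can be conveyed with a toy example (a sketch of the self-defeating-prediction idea only, not Wolpert's formal construction): any system whose behavior may depend on the demon's announced prediction can simply negate it.

```python
# Toy version of the self-defeating prediction behind diagonalization
# arguments (a sketch, not Wolpert's formal construction): if a system can
# read the demon's announced prediction and output its opposite, then no
# announced prediction can be correct.

def demon_prediction() -> bool:
    """Whatever the demon announces the system will output."""
    return True  # any fixed announcement fails the same way

def contrarian_system(prediction: bool) -> bool:
    """A system that reads the announced prediction and does the opposite."""
    return not prediction

announced = demon_prediction()
actual = contrarian_system(announced)
print(f"announced {announced}, actual {actual}, correct? {announced == actual}")
```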
In full context, Laplace's demon, as conceived, is infinitely removed from the human mind and thus could never assist humanity's efforts at prediction:
All these efforts in the search for truth tend to lead [the human mind] back continually to the vast intelligence which we have just mentioned, but from which it will always remain infinitely removed.
— Pierre Simon Laplace, A Philosophical Essay on Probabilities [3]
Despite this, the English physicist Stephen Hawking said in his book A Brief History of Time that "Laplace suggested that there should be a set of scientific laws that would allow us to predict everything that would happen in the universe." [13]
Similarly, in James Gleick's book Chaos, the author appears to conflate Laplace's demon with a "dream" for human deterministic predictability, and even states that "Laplace seems almost buffoon-like in his optimism, but much of modern science has pursued his dream" (p. 14).
Recently, Laplace's demon has been invoked to resolve a famous paradox of statistical physics, Loschmidt's paradox. [14] The argument is that, in order to reverse all velocities in a gas system, measurements must be performed by what effectively becomes a Laplace's demon. This, in conjunction with Landauer's principle, allows a way out of the paradox.
A limit on the computational power of the universe, i.e. on the ability of Laplace's demon to process an infinite amount of information, has recently been proposed. The limit is based on the maximum entropy of the universe, the speed of light, and the minimum amount of time taken to move information across the Planck length, and the figure was shown to be about 10^120 bits. [15] Accordingly, anything that requires more than this amount of data cannot be computed in the amount of time that has elapsed so far in the universe. A simple logical proof of the impossibility of Laplace's idea was advanced in 2012 by Iegor Reznikoff, who posits that the demon cannot predict his own future memory. [16]
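A rough order-of-magnitude sketch of where a figure of this size comes from (an illustration via the holographic bound, not a reproduction of the cited derivation; R_H ~ 10^26 m is taken as the horizon radius and ℓ_P ≈ 1.6 × 10^-35 m is the Planck length):

```latex
% Illustrative estimate, not the cited derivation: the holographic bound
% caps the information inside the cosmological horizon by its area in
% Planck units, landing within a couple of orders of magnitude of 10^120.
\[
  N \;\lesssim\; \frac{A}{4\,\ell_P^{2}\ln 2}
    \;\sim\; \left(\frac{R_H}{\ell_P}\right)^{2}
    \;\approx\; \left(\frac{10^{26}\,\mathrm{m}}{1.6\times 10^{-35}\,\mathrm{m}}\right)^{2}
    \;\approx\; 10^{122}\ \text{bits}.
\]
```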
Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.
The holographic principle is a property of string theories and a supposed property of quantum gravity that states that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary to the region – such as a light-like boundary like a gravitational horizon. First proposed by Gerard 't Hooft, it was given a precise string theoretic interpretation by Leonard Susskind, who combined his ideas with previous ones of 't Hooft and Charles Thorn. Susskind said, "The three-dimensional world of ordinary experience—the universe filled with galaxies, stars, planets, houses, boulders, and people—is a hologram, an image of reality coded on a distant two-dimensional surface." As pointed out by Raphael Bousso, Thorn observed in 1978 that string theory admits a lower-dimensional description from which gravity emerges in what would now be called a holographic way. The prime example of holography is the AdS/CFT correspondence.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.
Determinism is the philosophical view that all events in the universe, including human decisions and actions, are causally inevitable. Deterministic theories throughout the history of philosophy have developed from diverse and sometimes overlapping motives and considerations. Like eternalism, determinism focuses on particular events rather than the future as a concept. The opposite of determinism is indeterminism, or the view that events are not deterministically caused but rather occur due to chance. Determinism is often contrasted with free will, although some philosophers claim that the two are compatible.
Maxwell's demon is a thought experiment that appears to violate the second law of thermodynamics. It was proposed by the physicist James Clerk Maxwell in 1867. In his first letter, Maxwell referred to the entity as a "finite being" or a "being who can play a game of skill with the molecules". Lord Kelvin would later call it a "demon".
In philosophy, the philosophy of physics deals with conceptual and interpretational issues in physics, many of which overlap with research done by certain kinds of theoretical physicists. Historically, philosophers of physics have engaged with questions such as the nature of space, time, matter and the laws that govern their interactions, as well as the epistemological and ontological basis of the theories used by practicing physicists. The discipline draws upon insights from various areas of philosophy, including metaphysics, epistemology, and philosophy of science, while also engaging with the latest developments in theoretical and experimental physics.
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
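In compact form (a standard textbook statement, added here for reference; δQ is the heat received by the system and T the temperature at which it is received):

```latex
% Clausius form of the second law: entropy change bounds heat over
% temperature, and the entropy of an isolated system never decreases;
% equality holds only for reversible processes.
\[
  \mathrm{d}S \;\ge\; \frac{\delta Q}{T},
  \qquad
  \Delta S_{\text{isolated}} \;\ge\; 0 .
\]
```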
Viscount Ilya Romanovich Prigogine was a Belgian physical chemist of Russian-Jewish origin, noted for his work on dissipative structures, complex systems, and irreversibility.
Predictability is the degree to which a correct prediction or forecast of a system's state can be made, either qualitatively or quantitatively.
The arrow of time, also called time's arrow, is the concept positing the "one-way direction" or "asymmetry" of time. The concept was developed in 1927 by the British astrophysicist Arthur Eddington and remains an unsolved question in general physics. This direction, according to Eddington, could be determined by studying the organization of atoms, molecules, and bodies, and might be drawn upon a four-dimensional relativistic map of the world.
In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black hole event horizons. As the study of the statistical mechanics of black-body radiation led to the development of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle.
The heat death of the universe is a hypothesis on the ultimate fate of the universe, which suggests the universe will evolve to a state of no thermodynamic free energy, and will therefore be unable to sustain processes that increase entropy. Heat death does not imply any particular absolute temperature; it only requires that temperature differences or other processes may no longer be exploited to perform work. In the language of physics, this is when the universe reaches thermodynamic equilibrium.
Indeterminism is the idea that events are not caused, or are not caused deterministically.
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
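For a spatially uniform gas with velocity distribution f(v, t), the quantity in question can be written in its standard form:

```latex
% Boltzmann's H for a spatially uniform gas; the H-theorem asserts that H
% is non-increasing under the Boltzmann equation, with -H (times k_B and
% the volume) playing the role of the entropy.
\[
  H(t) \;=\; \int f(\mathbf{v}, t)\,\ln f(\mathbf{v}, t)\,\mathrm{d}^{3}v,
  \qquad
  \frac{\mathrm{d}H}{\mathrm{d}t} \;\le\; 0 .
\]
```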
In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.
In physics, Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox, or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.
A physical paradox is an apparent contradiction in physical descriptions of the universe. While many physical paradoxes have accepted resolutions, others defy resolution and may indicate flaws in theory. In physics as in all of science, contradictions and paradoxes are generally assumed to be artifacts of error and incompleteness because reality is assumed to be completely consistent, although this is itself a philosophical assumption. When, as in fields such as quantum physics and relativity theory, existing assumptions about reality have been shown to break down, this has usually been dealt with by changing our understanding of reality to a new one which remains self-consistent in the presence of the new evidence.
In physics, maximum entropy thermodynamics views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data. MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in Physical Review in 1957.
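The core recipe is standard and can be stated compactly: choose the distribution that maximizes the Shannon entropy subject to whatever is actually known, e.g. a fixed mean energy, which yields the familiar Gibbs form.

```latex
% The maximum entropy recipe with a mean-energy constraint; the Lagrange
% multiplier beta is fixed by the constraint and plays the role of inverse
% temperature.
\[
  \max_{p}\; S = -\sum_i p_i \ln p_i
  \quad\text{s.t.}\quad
  \sum_i p_i = 1,\;\; \sum_i p_i E_i = \langle E \rangle
  \;\;\Longrightarrow\;\;
  p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}.
\]
```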
Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings.
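The bound itself can be stated compactly (k_B is Boltzmann's constant and T the temperature of the surroundings):

```latex
% Landauer's bound: erasing one bit dissipates at least k_B T ln 2 of heat;
% at room temperature this is about 2.9e-21 joules per bit.
\[
  E_{\text{dissipated}} \;\ge\; k_B T \ln 2
  \;\approx\; 2.9 \times 10^{-21}\,\mathrm{J}
  \quad \text{per bit at } T = 300\,\mathrm{K}.
\]
```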
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy by Claude Shannon and Ralph Hartley, developed in the 1940s.
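The two expressions, in their standard forms, differ only by the constant k_B and the choice of logarithm base:

```latex
% Gibbs entropy (statistical thermodynamics) and Shannon entropy
% (information theory): the same functional form up to units.
\[
  S \;=\; -k_B \sum_i p_i \ln p_i,
  \qquad
  H \;=\; -\sum_i p_i \log_2 p_i .
\]
```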