Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings. [1]
The principle was first proposed by Rolf Landauer in 1961.
Landauer's principle states that the minimum energy needed to erase one bit of information is proportional to the temperature at which the system is operating. Specifically, the energy needed for this computational task is given by

$E = k_\text{B} T \ln 2,$

where $k_\text{B}$ is the Boltzmann constant and $T$ is the temperature in kelvins. [2] At room temperature, the Landauer limit represents an energy of approximately 0.018 eV (2.9×10⁻²¹ J). As of 2012, modern computers use about a billion times as much energy per operation. [3] [4]
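As a quick numerical check (a minimal sketch, not taken from the cited sources; 300 K is assumed as "room temperature"), the limit $k_\text{B} T \ln 2$ can be evaluated directly:

```python
# Minimal sketch: evaluate the Landauer limit k_B * T * ln(2) at an assumed
# room temperature of 300 K, to reproduce the figures quoted above.
import math

K_B_J_PER_K = 1.380649e-23     # Boltzmann constant in J/K (exact SI value)
K_B_EV_PER_K = 8.617333262e-5  # Boltzmann constant in eV/K
T = 300.0                      # assumed room temperature in kelvins

limit_joules = K_B_J_PER_K * T * math.log(2)
limit_ev = K_B_EV_PER_K * T * math.log(2)

print(f"Landauer limit at {T:.0f} K: {limit_joules:.2e} J = {limit_ev:.4f} eV")
# prints roughly 2.87e-21 J = 0.0179 eV, matching ~0.018 eV / 2.9e-21 J above
```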
Rolf Landauer first proposed the principle in 1961 while working at IBM. [5] He justified and stated important limits to an earlier conjecture by John von Neumann. For this reason, it is sometimes referred to as being simply the Landauer bound or Landauer limit.
In 2008 and 2009, researchers showed that Landauer's principle can be derived from the second law of thermodynamics and the entropy change associated with information gain, developing the thermodynamics of quantum and classical feedback-controlled systems. [6] [7]
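A compressed sketch of the standard argument (not the specific derivations of the papers cited above) runs as follows: erasing one bit merges two equally likely memory states into one, lowering the memory's entropy by $k_\text{B}\ln 2$; the second law then requires at least that much entropy, and hence at least $k_\text{B}T\ln 2$ of heat, to be delivered to the environment at temperature $T$.

```latex
% Sketch of the textbook argument (not the cited papers' derivations).
\Delta S_{\text{mem}} = -k_{\text{B}}\ln 2, \qquad
\Delta S_{\text{mem}} + \Delta S_{\text{env}} \ge 0
\;\Rightarrow\;
Q = T\,\Delta S_{\text{env}} \ge k_{\text{B}}\,T \ln 2 .
```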
In 2011, the principle was generalized to show that while information erasure requires an increase in entropy, this increase could theoretically occur at no energy cost. [8] Instead, the cost can be taken in another conserved quantity, such as angular momentum.
In a 2012 article published in Nature, a team of physicists from the École normale supérieure de Lyon, the University of Augsburg and the University of Kaiserslautern reported the first measurement of the tiny amount of heat released when an individual bit of data is erased. [9]
In 2014, physical experiments tested Landauer's principle and confirmed its predictions. [10]
In 2016, researchers used a laser probe to measure the amount of energy dissipation that resulted when a nanomagnetic bit flipped from off to on. Flipping the bit required about 0.026 eV (4.2×10⁻²¹ J) at 300 K, which is just 44% above the Landauer minimum. [11]
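Repeating the earlier arithmetic with the rounded figures quoted here (a sketch, not the experimenters' own analysis) gives an excess of roughly 45%, consistent with the stated ~44% once the rounding of 0.026 eV is taken into account:

```python
# Minimal sketch: compare the quoted ~0.026 eV dissipation at 300 K
# with the Landauer limit k_B * T * ln(2).
import math

K_B_EV_PER_K = 8.617333262e-5                   # Boltzmann constant in eV/K
limit_ev = K_B_EV_PER_K * 300.0 * math.log(2)   # ~0.0179 eV

excess = 0.026 / limit_ev - 1.0
print(f"Excess over the Landauer limit: {excess:.0%}")  # ~45% with rounded inputs
```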
A 2018 article published in Nature Physics features a Landauer erasure performed at cryogenic temperatures (T = 1 K) on an array of high-spin (S = 10) quantum molecular magnets. The array is made to act as a spin register where each nanomagnet encodes a single bit of information. [12] The experiment has laid the foundations for the extension of the validity of the Landauer principle to the quantum realm. Owing to the fast dynamics and low "inertia" of the single spins used in the experiment, the researchers also showed how an erasure operation can be carried out at the lowest possible thermodynamic cost—that imposed by the Landauer principle—and at a high speed. [12] [1]
The principle is widely accepted as physical law, but it has been challenged for using circular reasoning and faulty assumptions. [13] [14] [15] [16] Others [1] [17] [18] have defended the principle, and Sagawa and Ueda (2008) [6] and Cao and Feito (2009) [7] have shown that Landauer's principle is a consequence of the second law of thermodynamics and the entropy reduction associated with information gain.
On the other hand, recent advances in non-equilibrium statistical physics have established that there is no a priori relationship between logical and thermodynamic reversibility. [19] It is possible that a physical process is logically reversible but thermodynamically irreversible. It is also possible that a physical process is logically irreversible but thermodynamically reversible. At best, the benefits of implementing a computation with a logically reversible system are nuanced. [20]
In 2016, researchers at the University of Perugia claimed to have demonstrated a violation of Landauer’s principle, [21] though their conclusions were disputed. [22]
Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change and information systems including the transmission of information in telecommunication.
The holographic principle is a property of string theories and a supposed property of quantum gravity that states that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary to the region – such as a light-like boundary like a gravitational horizon. First proposed by Gerard 't Hooft, it was given a precise string-theoretic interpretation by Leonard Susskind, who combined his ideas with previous ones of 't Hooft and Charles Thorn. Susskind said, "The three-dimensional world of ordinary experience—the universe filled with galaxies, stars, planets, houses, boulders, and people—is a hologram, an image of reality coded on a distant two-dimensional surface." As pointed out by Raphael Bousso, Thorn observed in 1978 that string theory admits a lower-dimensional description from which gravity emerges in what would now be called a holographic way. The prime example of holography is the AdS/CFT correspondence.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.
Maxwell's demon is a thought experiment that appears to disprove the second law of thermodynamics. It was proposed by the physicist James Clerk Maxwell in 1867. In his first letter, Maxwell referred to the entity as a "finite being" or a "being who can play a game of skill with the molecules". Lord Kelvin would later call it a "demon".
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
The fluctuation theorem (FT), which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium will increase or decrease over a given amount of time. While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
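One common statement of the theorem is the Evans–Searles form shown below; the symbol $\bar{\Sigma}_t$ for the entropy production averaged over a trajectory of duration $t$ (in units of $k_\text{B}$) is notation assumed here rather than taken from this article:

```latex
% Evans--Searles fluctuation theorem (one common form):
% \bar{\Sigma}_t is the trajectory-averaged entropy production over time t,
% measured in units of k_B.
\frac{\Pr\!\left(\bar{\Sigma}_t = A\right)}{\Pr\!\left(\bar{\Sigma}_t = -A\right)} = e^{A t}
```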
In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black hole event horizons. As the study of the statistical mechanics of black-body radiation led to the development of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle.
In the history of science, Laplace's demon was a notable published articulation of causal determinism on a scientific basis by Pierre-Simon Laplace in 1814. According to determinism, if someone knows the precise location and momentum of every atom in the universe, their past and future values for any given time are entailed; they can be calculated from the laws of classical mechanics.
In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.
In physics, the Bekenstein bound is an upper limit on the thermodynamic entropy S, or Shannon entropy H, that can be contained within a given finite region of space which has a finite amount of energy—or conversely, the maximum amount of information required to perfectly describe a given physical system down to the quantum level. It implies that the information of a physical system, or the information necessary to perfectly describe that system, must be finite if the region of space and the energy are finite.
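For reference, the bound is conventionally written as below, with $R$ the radius of a sphere enclosing the system and $E$ its total energy (conventions assumed here rather than stated above):

```latex
% Bekenstein bound on entropy S (or Shannon information H, in bits) for a
% system of energy E enclosed in a sphere of radius R.
S \le \frac{2\pi k_{\text{B}} R E}{\hbar c},
\qquad
H \le \frac{2\pi R E}{\hbar c \ln 2} \ \text{bits}.
```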
Reversible computing is any model of computation where the computational process, to some extent, is time-reversible. In a model of computation that uses deterministic transitions from one state of the abstract machine to another, a necessary condition for reversibility is that the relation of the mapping from states to their successors must be one-to-one. Reversible computing is a form of unconventional computing.
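As an illustrative sketch (the toy gates and helper function below are hypothetical examples, not drawn from any cited work), the one-to-one condition can be checked directly by comparing an irreversible gate (AND) with a reversible one (CNOT):

```python
# Minimal sketch: a reversible state-transition map must be one-to-one.
# AND loses information (two input bits -> one output bit), CNOT does not.
from itertools import product

def and_gate(a, b):
    # Irreversible: e.g. (0, 1) and (1, 0) both map to (0,)
    return (a & b,)

def cnot_gate(control, target):
    # Reversible: (control, target) -> (control, target XOR control)
    return (control, control ^ target)

def is_injective(gate, n_bits):
    inputs = list(product([0, 1], repeat=n_bits))
    outputs = [gate(*bits) for bits in inputs]
    return len(set(outputs)) == len(outputs)

print("AND  injective:", is_injective(and_gate, 2))   # False -> not reversible
print("CNOT injective:", is_injective(cnot_gate, 2))  # True  -> reversible
```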
The limits of computation are governed by a number of different factors. In particular, there are several physical and practical limits to the amount of computation or data storage that can be performed with a given amount of mass, volume, or energy.
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.
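For comparison, the two expressions for a probability distribution $p_i$ are shown below (a standard textbook presentation, not a formula appearing elsewhere in this article):

```latex
% Gibbs entropy (statistical thermodynamics) and Shannon entropy
% (information theory) for a probability distribution p_i.
S = -k_{\text{B}} \sum_i p_i \ln p_i ,
\qquad
H = -\sum_i p_i \log_2 p_i .
```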
Joan Vaccaro is a physicist at Griffith University and a former student of David Pegg. Her work in quantum physics includes quantum phase, nonclassical states of light, coherent laser excitation of atomic gases, cold atomic gases, stochastic Schrödinger equations, quantum information theory, quantum references, wave–particle duality, quantum thermodynamics, and the physical nature of time.
Entropic gravity, also known as emergent gravity, is a theory in modern physics that describes gravity as an entropic force—a force with macro-scale homogeneity but which is subject to quantum-level disorder—and not a fundamental interaction. The theory, based on string theory, black hole physics, and quantum information theory, describes gravity as an emergent phenomenon that springs from the quantum entanglement of small bits of spacetime information. As such, entropic gravity is said to abide by the second law of thermodynamics under which the entropy of a physical system tends to increase over time.
The theoretical study of time travel generally follows the laws of general relativity. Quantum mechanics requires physicists to solve equations describing how probabilities behave along closed timelike curves (CTCs), which are theoretical loops in spacetime that might make it possible to travel through time.
The Bousso bound captures a fundamental relation between quantum information and the geometry of space and time. It appears to be an imprint of a unified theory that combines quantum mechanics with Einstein's general relativity. The study of black hole thermodynamics and the information paradox led to the idea of the holographic principle: the entropy of matter and radiation in a spatial region cannot exceed the Bekenstein–Hawking entropy of the boundary of the region, which is proportional to the boundary area. However, this "spacelike" entropy bound fails in cosmology; for example, it does not hold true in our universe.
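The Bekenstein–Hawking entropy referred to here is conventionally written as follows (a standard formula quoted for reference; $A$ is the horizon area and $\ell_P$ the Planck length):

```latex
% Bekenstein--Hawking entropy of a horizon of area A,
% with Planck length \ell_P = \sqrt{G\hbar / c^3}.
S_{\text{BH}} = \frac{k_{\text{B}} c^{3} A}{4 G \hbar}
             = \frac{k_{\text{B}} A}{4\,\ell_P^{2}} .
```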
Quantum thermodynamics is the study of the relations between two independent physical theories: thermodynamics and quantum mechanics. The two independent theories address the physical phenomena of light and matter. In 1905, Albert Einstein argued that the requirement of consistency between thermodynamics and electromagnetism leads to the conclusion that light is quantized, obtaining the relation $E = h\nu$. This paper marked the dawn of quantum theory. In a few decades quantum theory became established with an independent set of rules. Currently quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. It differs from quantum statistical mechanics in the emphasis on dynamical processes out of equilibrium. In addition, there is a quest for the theory to be relevant for a single individual quantum system.
Stochastic thermodynamics is an emergent field of research in statistical mechanics that uses stochastic variables to better understand the non-equilibrium dynamics present in many microscopic systems such as colloidal particles, biopolymers, enzymes, and molecular motors.
A molecular demon or biological molecular machine is a biological macromolecule that resembles and seems to have the same properties as Maxwell's demon. These macromolecules gather information in order to recognize their substrate or ligand within a myriad of other molecules floating in the intracellular or extracellular plasm. This molecular recognition represents an information gain, which is equivalent to an energy gain or a decrease in entropy. When the demon is reset, i.e. when the ligand is released, the information is erased, energy is dissipated, and entropy increases, obeying the second law of thermodynamics. The difference between biological molecular demons and the thought experiment of Maxwell's demon is the latter's apparent violation of the second law.