The fluctuation theorem (FT), which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium (i.e., maximum entropy) will increase or decrease over a given amount of time. While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
Roughly, the fluctuation theorem relates to the probability distribution of the time-averaged irreversible entropy production, denoted $\overline{\Sigma}_t$. The theorem states that, in systems away from equilibrium over a finite time t, the ratio between the probability that $\overline{\Sigma}_t$ takes on a value A and the probability that it takes the opposite value, −A, will be exponential in At. In other words, for a finite non-equilibrium system in a finite time, the FT gives a precise mathematical expression for the probability that entropy will flow in a direction opposite to that dictated by the second law of thermodynamics.
Mathematically, the FT is expressed as:

$$\frac{P(\overline{\Sigma}_t = A)}{P(\overline{\Sigma}_t = -A)} = e^{A t}.$$
This means that as the time or system size increases (since $\overline{\Sigma}_t$ is extensive), the probability of observing an entropy production opposite to that dictated by the second law of thermodynamics decreases exponentially. The FT is one of the few expressions in non-equilibrium statistical mechanics that is valid far from equilibrium.
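The exponential ratio can be checked numerically with a toy model. The sketch below is an illustration, not a result from the article: it assumes the time-averaged entropy production is Gaussian with mean mu and variance 2·mu/t, which is the unique Gaussian form consistent with the FT, and then compares the empirical probability ratio against exp(At).

```python
import math
import random

# Toy illustration of the FT ratio P(A)/P(-A) = exp(A t).  The Gaussian
# model is an assumption made for illustration: mean mu and variance
# 2*mu/t make the Gaussian distribution satisfy the FT exactly.
random.seed(0)
t = 4.0                          # averaging time
mu = 0.5                         # mean entropy production rate
sigma = math.sqrt(2 * mu / t)    # FT-consistent standard deviation

samples = [random.gauss(mu, sigma) for _ in range(1_000_000)]

def prob_near(value, width=0.05):
    """Empirical probability that the sampled Sigma_t lies within +/- width of value."""
    return sum(1 for s in samples if abs(s - value) < width) / len(samples)

A = 0.4
ratio = prob_near(A) / prob_near(-A)
print(math.log(ratio), A * t)    # the two numbers agree to within sampling noise
```

Negative values of the entropy production occur in this model with small but readily measurable frequency, and their probability relative to the corresponding positive values reproduces the exponential factor.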
Note that the FT does not state that the second law of thermodynamics is wrong or invalid. The second law of thermodynamics is a statement about macroscopic systems, whereas the FT is more general: it can be applied to both microscopic and macroscopic systems, and when applied to macroscopic systems it is equivalent to the second law of thermodynamics.
The FT was first proposed and tested using computer simulations by Denis Evans, E. G. D. Cohen and Gary Morriss in 1993.[1] The first derivation was given by Evans and Debra Searles in 1994. Since then, much mathematical and computational work has been done to show that the FT applies to a variety of statistical ensembles. The first laboratory experiment to verify the validity of the FT was carried out in 2002: a plastic bead was pulled through a solution by a laser, and fluctuations in its velocity were recorded that were opposite to what the second law of thermodynamics would dictate for macroscopic systems.[2][3][4][5] In 2020, observations at high spatial and spectral resolution of the solar photosphere showed that solar turbulent convection satisfies the symmetries predicted by the fluctuation relation at a local level.[6]
A simple and exact consequence of the fluctuation theorem given above is that if we carry out an arbitrarily large ensemble of experiments from some initial time t = 0, and perform an ensemble average of the time-averaged entropy production, then that ensemble average cannot be negative for any value of the averaging time t:

$$\left\langle \overline{\Sigma}_t \right\rangle \ge 0 \quad \forall\, t.$$
This inequality is called the second law inequality. [7] This inequality can be proved for systems with time dependent fields of arbitrary magnitude and arbitrary time dependence.
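The second law inequality follows in one line from the fluctuation theorem: splitting the ensemble average of the time-averaged entropy production $\overline{\Sigma}_t$ into contributions from each pair of values $A$ and $-A$, and using the FT in the form $P(\overline{\Sigma}_t = -A) = e^{-At}\,P(\overline{\Sigma}_t = A)$, gives

$$\left\langle \overline{\Sigma}_t \right\rangle = \int_0^\infty dA\; A\left[P(\overline{\Sigma}_t = A) - P(\overline{\Sigma}_t = -A)\right] = \int_0^\infty dA\; A\,P(\overline{\Sigma}_t = A)\left(1 - e^{-At}\right) \ge 0,$$

since every factor in the final integrand is nonnegative.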
It is important to understand what the second law inequality does not imply. It does not imply that the ensemble-averaged entropy production is non-negative at all times. This is untrue, as consideration of the entropy production in a viscoelastic fluid subject to a sinusoidal time-dependent shear rate shows: during part of each cycle, elastic energy stored in the fluid is returned and the instantaneous ensemble-averaged entropy production can be negative. In this example, however, the ensemble average of the time integral of the entropy production over one full cycle is nonnegative, as expected from the second law inequality.
Another remarkably simple and elegant consequence of the fluctuation theorem is the so-called "nonequilibrium partition identity" (NPI):[8]

$$\left\langle e^{-\overline{\Sigma}_t\, t} \right\rangle = 1 \quad \forall\, t.$$
Thus, in spite of the second law inequality, which might lead one to expect that this average would decay exponentially with time, the exponential probability ratio given by the FT exactly cancels the decaying exponential inside the average, so that the average is unity for all times.
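This exact cancellation can be seen numerically. The sketch below is an illustration, not a result from the article: it again assumes a Gaussian time-averaged entropy production with variance 2·mu/t (the Gaussian form consistent with the FT) and checks that the average of exp(−Σ̄t·t) stays at unity even though the mean entropy production is strictly positive.

```python
import math
import random

# Toy check of the nonequilibrium partition identity <exp(-Sigma_t * t)> = 1.
# The Gaussian model below is an assumption for illustration: a Gaussian
# time-averaged entropy production with variance 2*mu/t is the unique
# Gaussian consistent with the fluctuation theorem.
random.seed(1)
t = 2.0                          # averaging time
mu = 0.8                         # mean entropy production rate, <Sigma_t> > 0
sigma = math.sqrt(2 * mu / t)    # FT-consistent standard deviation

n = 1_000_000
avg = sum(math.exp(-random.gauss(mu, sigma) * t) for _ in range(n)) / n
print(avg)   # stays close to 1 even though <Sigma_t> = mu > 0
```

The rare negative-entropy-production trajectories, exponentially suppressed in probability, carry exponentially large weights exp(−Σ̄t·t) and exactly compensate the typical positive ones.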
There are many important implications from the fluctuation theorem. One is that small machines (such as nanomachines or even mitochondria in a cell) will spend part of their time actually running in "reverse". By "reverse" we mean that these small molecular machines can be observed to generate work by taking heat from the environment. This is possible because there exists a symmetry relation in the work fluctuations associated with the forward and reverse changes a system undergoes as it is driven away from thermal equilibrium by the action of an external perturbation, a result predicted by the Crooks fluctuation theorem. The environment itself continuously drives these molecular machines away from equilibrium, and the fluctuations it generates in the system are very relevant because the probability of observing an apparent violation of the second law of thermodynamics becomes significant at this scale.
This is counterintuitive because, from a macroscopic point of view, it would describe complex processes running in reverse: for example, a jet engine running in reverse, taking in ambient heat and exhaust fumes to generate kerosene and oxygen. Nevertheless, the size of such a system makes such an observation effectively impossible. "Reverse" trajectories can be observed at the microscopic scale because, as stated above, their probability depends on system size and is significant for molecular machines, provided an appropriate measurement instrument is available. This has become possible with the development of new biophysical instruments such as optical tweezers and the atomic force microscope. The Crooks fluctuation theorem has been verified through RNA folding experiments.[9]
Strictly speaking, the fluctuation theorem refers to a quantity known as the dissipation function. In thermostatted nonequilibrium states that are close to equilibrium, the long-time average of the dissipation function is equal to the average entropy production. However, the FT refers to fluctuations rather than averages. The dissipation function is defined as

$$\overline{\Omega}_t\, t \equiv \int_0^t \Omega(\Gamma(s))\, ds = \ln\!\left[\frac{f(\Gamma(0),0)}{f(\Gamma(t),0)}\right] + \frac{\Delta Q(\Gamma(0),t)}{kT},$$
where k is the Boltzmann constant, f(Γ, 0) is the initial (t = 0) distribution of molecular states Γ, and Γ(t) is the molecular state arrived at after time t under the exact time-reversible equations of motion; f(Γ(t), 0) is the initial distribution of those time-evolved states.
Note: in order for the FT to be valid we require that f(Γ(t), 0) ≠ 0 whenever f(Γ(0), 0) ≠ 0, i.e., that every time-evolved state appears with nonzero probability in the initial distribution. This condition is known as the condition of ergodic consistency. It is widely satisfied in common statistical ensembles, e.g. the canonical ensemble.
The system may be in contact with a large heat reservoir in order to thermostat the system of interest. If this is the case, ΔQ(Γ(0), t) is the heat lost to the reservoir over the time interval (0, t) and T is the absolute equilibrium temperature of the reservoir.[10] With this definition of the dissipation function, the precise statement of the FT simply replaces entropy production with the dissipation function in each of the FT equations above.
Example: If one considers electrical conduction across an electrical resistor in contact with a large heat reservoir at temperature T, then the dissipation function is

$$\Omega = \frac{J F_e V}{kT},$$

the total electric current density J multiplied by the voltage drop across the circuit, F_e, and the system volume V, divided by the Boltzmann constant k times the absolute temperature T of the heat reservoir. The dissipation function is thus easily recognised as the Ohmic work done on the system divided by the temperature of the reservoir. Close to equilibrium, the long-time average of this quantity is equal (to leading order in the voltage drop) to the average spontaneous entropy production per unit time.[11] However, the fluctuation theorem applies to systems arbitrarily far from equilibrium, where the definition of the spontaneous entropy production is problematic.
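An order-of-magnitude sketch shows why anti-second-law fluctuations matter only at small scales. All numbers below are assumptions chosen for illustration, not values from the article; the FT ratio P(−A)/P(+A) = exp(−At) is evaluated with A taken as the mean Ohmic dissipation rate divided by kT.

```python
import math

# Order-of-magnitude illustration (all numbers are assumptions, not from the
# article): the FT ratio P(-A)/P(+A) = exp(-A t) for observing "anti-second-law"
# dissipation -A instead of +A over a time t, with A = Ohmic power / (k T).
k = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                   # reservoir temperature, K

def anti_ratio(power_watts, t_seconds):
    """FT ratio P(-A)/P(+A), where A is the mean dissipation rate."""
    A = power_watts / (k * T)        # dissipation function per unit time, 1/s
    return math.exp(-A * t_seconds)

# A nanoscale conductor dissipating 10 kT per second, observed for 0.1 s:
print(anti_ratio(10 * k * T, 0.1))   # about exp(-1): readily observable
# A macroscopic resistor dissipating 1 microwatt, observed for 1 s:
print(anti_ratio(1e-6, 1.0))         # underflows to 0: never observed
```

For the nanoscale case the reversed fluctuation is suppressed only by a factor of order e, while for the macroscopic resistor the exponent is of order 10^14, which is why the second law appears absolute at everyday scales.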
The second law of thermodynamics, which predicts that the entropy of an isolated system out of equilibrium should tend to increase rather than decrease or stay constant, stands in apparent contradiction with the time-reversible equations of motion for classical and quantum systems. The time reversal symmetry of the equations of motion shows that if one films a given time-dependent physical process, then playing the movie of that process backwards does not violate the laws of mechanics. It is often argued that for every forward trajectory in which entropy increases, there exists a time-reversed antitrajectory where entropy decreases; thus, if one picks an initial state randomly from the system's phase space and evolves it forward according to the laws governing the system, decreasing entropy should be just as likely as increasing entropy. It might seem that this is incompatible with the second law of thermodynamics, which predicts that entropy tends to increase. The problem of deriving irreversible thermodynamics from time-symmetric fundamental laws is referred to as Loschmidt's paradox.
The mathematical derivation of the fluctuation theorem, and in particular the second law inequality, shows that, for a nonequilibrium process, the ensemble-averaged value of the dissipation function will be greater than zero.[12] This result requires causality, i.e. that cause (the initial conditions) precede effect (the value taken on by the dissipation function). This is clearly demonstrated in section 6 of that paper, where it is shown how one could use the same laws of mechanics to extrapolate backwards from a later state to an earlier state; in this case the fluctuation theorem would lead us to predict the ensemble-averaged dissipation function to be negative, an anti-second law. This second prediction, which is inconsistent with the real world, is obtained using an anti-causal assumption: effect (the value taken on by the dissipation function) precedes the cause (here the later state has been incorrectly used for the initial conditions). The fluctuation theorem thus shows how the second law is a consequence of the assumption of causality. When we solve a problem, we set the initial conditions and then let the laws of mechanics evolve the system forward in time; we do not solve problems by setting the final conditions and letting the laws of mechanics run backwards in time.
The fluctuation theorem is of fundamental importance to non-equilibrium statistical mechanics. The FT (together with the universal causation proposition) gives a generalisation of the second law of thermodynamics which includes, as a special case, the conventional second law. From it, the second law inequality and the nonequilibrium partition identity are easily proved. When combined with the central limit theorem, the FT also implies the Green–Kubo relations for linear transport coefficients close to equilibrium. The FT is, however, more general than the Green–Kubo relations because, unlike them, it applies to fluctuations far from equilibrium. Despite this, scientists have not yet been able to derive the equations for nonlinear response theory from the FT.
The FT does not imply or require that the distribution of time averaged dissipation be Gaussian. There are many examples known where the distribution of time averaged dissipation is non-Gaussian and yet the FT (of course) still correctly describes the probability ratios.
Lastly, the theoretical constructs used to prove the FT can be applied to nonequilibrium transitions between two different equilibrium states. When this is done, the so-called Jarzynski equality, or nonequilibrium work relation, can be derived. This equality shows how equilibrium free energy differences can be computed or measured (in the laboratory[13]) from nonequilibrium path integrals. Previously, quasi-static (equilibrium) paths were required.
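The Jarzynski equality ⟨exp(−βW)⟩ = exp(−βΔF) can be illustrated with a minimal numerical sketch. The Gaussian work distribution and all numbers below are assumptions chosen for illustration; for a Gaussian, the equality holds exactly when ΔF = ⟨W⟩ − β·var(W)/2, i.e., when the average dissipated work matches the work fluctuations.

```python
import math
import random

# Numerical illustration of the Jarzynski equality <exp(-beta*W)> = exp(-beta*dF).
# The Gaussian work distribution is an assumption for illustration; a Gaussian
# satisfies the equality exactly when dF = <W> - beta*var(W)/2.
random.seed(2)
beta = 1.0
w_mean, w_std = 2.0, 1.0
dF = w_mean - beta * w_std**2 / 2     # = 1.5 for these numbers

n = 1_000_000
est = sum(math.exp(-beta * random.gauss(w_mean, w_std)) for _ in range(n)) / n
dF_est = -math.log(est) / beta
print(dF_est)   # close to dF = 1.5, even though the average work <W> = 2.0
```

The estimator recovers the equilibrium free energy difference from nonequilibrium work samples alone, with the rare low-work trajectories dominating the exponential average.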
The reason why the fluctuation theorem is so fundamental is that its proof requires so little. It requires:
- knowledge of the mathematical form of the initial distribution of molecular states;
- that all time-evolved final states at time t be present with nonzero probability in the distribution of initial states (the condition of ergodic consistency); and
- the assumption of time-reversal symmetry of the equations of motion.
In regard to the latter "assumption", while the equations of motion of quantum dynamics may be time-reversible, quantum processes are nondeterministic by nature. What state a wave function collapses into cannot be predicted mathematically; furthermore, the unpredictability of a quantum system comes not from the myopia of an observer's perception, but from the intrinsically nondeterministic nature of the system itself.