The Crooks fluctuation theorem (CFT), sometimes known as the Crooks equation,[1] is an equation in statistical mechanics that relates the work done on a system during a non-equilibrium transformation to the free energy difference between the final and the initial state of the transformation. During the non-equilibrium transformation the system is at constant volume and in contact with a heat reservoir. The CFT is named after the chemist Gavin E. Crooks (then at the University of California, Berkeley), who discovered it in 1998.
The most general statement of the CFT relates the probability of a space-time trajectory x(t) to that of its time reversal x̃(t). The theorem says that if the dynamics of the system satisfies microscopic reversibility, then the forward trajectory is exponentially more likely than the reverse, given that it produces entropy:

P[x(t)] / P̃[x̃(t)] = e^{σ[x(t)]},

where σ[x(t)] is the entropy production along the trajectory.
If one defines a generic reaction coordinate of the system as a function of the Cartesian coordinates of the constituent particles (e.g., a distance between two particles), one can characterize every point along the reaction coordinate path by a parameter λ, such that λ = 0 and λ = 1 correspond to two ensembles of microstates for which the reaction coordinate is constrained to different values. A dynamical process where λ is externally driven from zero to one, according to an arbitrary time scheduling, will be referred to as the forward transformation, while the time-reversal path will be indicated as the backward transformation. Given these definitions, the CFT sets a relation between the following five quantities:

- P(A → B), the joint probability of taking a microstate A from the canonical ensemble corresponding to λ = 0 and of performing the forward transformation to the microstate B corresponding to λ = 1;
- P(A ← B), the joint probability of taking the microstate B from the canonical ensemble corresponding to λ = 1 and of performing the backward transformation to the microstate A corresponding to λ = 0;
- β = (k_B T)^{−1}, where k_B is the Boltzmann constant and T the temperature of the reservoir;
- W_{A→B}, the work done on the system during the forward transformation;
- ΔF = F(B) − F(A), the Helmholtz free energy difference between the states A and B, represented by the canonical distributions of microstates with λ = 0 and λ = 1, respectively.
The CFT equation reads as follows:

P(A → B) / P(A ← B) = exp[β(W_{A→B} − ΔF)].
In the previous equation the difference W_{A→B} − ΔF corresponds to the work dissipated in the forward transformation, W_d. The probabilities P(A → B) and P(A ← B) become identical when the transformation is performed at infinitely slow speed, i.e., for equilibrium transformations. In such cases, W_{A→B} = ΔF and W_d = 0.
Using the time-reversal relation W_{A→B} = −W_{A←B}, and grouping together all the trajectories yielding the same work (in the forward and backward transformations), i.e., determining the probability distribution (or density) P_{A→B}(W) of an amount of work W being exerted by a random system trajectory from A to B, we can write the above equation in terms of the work distribution functions as follows:

P_{A→B}(W) = P_{A←B}(−W) exp[β(W − ΔF)].
Note that for the backward transformation, the work distribution function must be evaluated by taking the work with the opposite sign. The two work distributions for the forward and backward processes cross at W = ΔF. This phenomenon has been experimentally verified using optical tweezers for the process of unfolding and refolding of a small RNA hairpin and an RNA three-helix junction.[2]
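The crossing of the two work distributions can be checked numerically. The sketch below uses hypothetical parameters (β = 1, ΔF = 2, Gaussian work distributions of variance σ² = 1, which satisfy the CFT when their means are ΔF ± βσ²/2); it locates the intersection of the forward and negated-backward work densities and verifies that their log-ratio equals β(W − ΔF):

```python
import numpy as np

# Hypothetical parameters: Gaussian work distributions of variance sigma2
# satisfy the Crooks relation when their means are dF +/- beta*sigma2/2.
beta, dF, sigma2 = 1.0, 2.0, 1.0
mu_f = dF + beta * sigma2 / 2        # mean of forward work distribution
mu_b = dF - beta * sigma2 / 2        # mean of negated backward work distribution

def gauss(w, mu):
    return np.exp(-(w - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

W = np.linspace(0.0, 4.0, 4001)
pf, pb = gauss(W, mu_f), gauss(W, mu_b)
crossing = W[np.argmin(np.abs(pf - pb))]         # densities intersect at W = dF
ratio_check = np.log(pf / pb) - beta * (W - dF)  # should vanish for all W
```

With these parameters the crossing lands at W = 2 = ΔF, and the log-ratio identity holds over the whole grid.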
The CFT implies the Jarzynski equality.
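The implication can be seen by averaging e^{−βW} over the forward work distribution and applying the CFT in its work-distribution form:

```latex
\left\langle e^{-\beta W}\right\rangle_{A\to B}
  = \int P_{A\to B}(W)\, e^{-\beta W}\, dW
  = \int P_{A\leftarrow B}(-W)\, e^{\beta(W-\Delta F)}\, e^{-\beta W}\, dW
  = e^{-\beta\Delta F}\int P_{A\leftarrow B}(-W)\, dW
  = e^{-\beta\Delta F},
```

which is the Jarzynski equality; the last step uses the normalization of the backward work distribution.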
Noether's theorem states that every continuous symmetry of the action of a physical system with conservative forces has a corresponding conservation law. This is the first of two theorems proven by mathematician Emmy Noether in 1915 and published in 1918. The action of a physical system is the integral over time of a Lagrangian function, from which the system's behavior can be determined by the principle of least action. This theorem only applies to continuous and smooth symmetries of physical space.
In physics, a Langevin equation is a stochastic differential equation describing how a system evolves when subjected to a combination of deterministic and fluctuating ("random") forces. The dependent variables in a Langevin equation typically are collective (macroscopic) variables changing only slowly in comparison to the other (microscopic) variables of the system. The fast (microscopic) variables are responsible for the stochastic nature of the Langevin equation. One application is to Brownian motion, which models the fluctuating motion of a small particle in a fluid.
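As a concrete illustration (not from the article), the overdamped Langevin equation for a Brownian particle in a harmonic trap, γ dx/dt = −kx + √(2γk_BT) ξ(t), can be integrated with the Euler–Maruyama scheme; all parameter values below are hypothetical:

```python
import numpy as np

# Hypothetical parameters: trap stiffness k, friction gamma, temperature kT
k, gamma, kT = 1.0, 1.0, 1.0
dt, n_steps, n_particles = 0.01, 2000, 10000

rng = np.random.default_rng(0)
x = np.zeros(n_particles)            # all particles start at the trap center
for _ in range(n_steps):
    noise = rng.normal(size=n_particles)
    # Euler-Maruyama step: deterministic drift + fluctuating force
    x += -(k / gamma) * x * dt + np.sqrt(2 * kT * dt / gamma) * noise

# Equipartition: the equilibrium variance of x should approach kT / k
var = x.var()
```

After many relaxation times the sample variance settles near k_BT/k, the equipartition value.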
In thermodynamics, the Helmholtz free energy is a thermodynamic potential that measures the useful work obtainable from a closed thermodynamic system at a constant temperature (isothermal). The change in the Helmholtz energy during a process is equal to the maximum amount of work that the system can perform in a thermodynamic process in which temperature is held constant. At constant temperature, the Helmholtz free energy is minimized at equilibrium.
In theoretical physics, the term renormalization group (RG) refers to a formal apparatus that allows systematic investigation of the changes of a physical system as viewed at different scales. In particle physics, it reflects the changes in the underlying force laws as the energy scale at which physical processes occur varies, energy/momentum and resolution distance scales being effectively conjugate under the uncertainty principle.
In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
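For instance, for a hypothetical two-level system with energies 0 and ε, the canonical partition function is Z = 1 + e^{−βε}, and the aggregate thermodynamic quantities follow from it (a minimal sketch, not from the article):

```python
import numpy as np

eps, kT = 1.0, 1.0                   # hypothetical level spacing and temperature
beta = 1.0 / kT
Z = 1.0 + np.exp(-beta * eps)        # dimensionless partition function
F = -kT * np.log(Z)                  # Helmholtz free energy: F = -kT ln Z
U = eps * np.exp(-beta * eps) / Z    # mean energy: U = -d(ln Z)/d(beta)
S = (U - F) / kT                     # entropy in units of k_B: S = (U - F)/T
```

Each quantity is a function of Z or its derivatives, as the paragraph above describes.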
In physics, mathematics and statistics, scale invariance is a feature of objects or laws that do not change if scales of length, energy, or other variables, are multiplied by a common factor, and thus represent a universality.
In probability theory and related fields, Malliavin calculus is a set of mathematical techniques and ideas that extend the mathematical field of calculus of variations from deterministic functions to stochastic processes. In particular, it allows the computation of derivatives of random variables. Malliavin calculus is also called the stochastic calculus of variations. Paul Malliavin initiated the calculus on infinite-dimensional spaces; subsequent contributors such as S. Kusuoka, D. Stroock, J.-M. Bismut, Shinzo Watanabe, and I. Shigekawa completed its foundations.
A continuous-time Markov chain (CTMC) is a continuous-time stochastic process in which, for each state, the process remains for an exponentially distributed holding time and then moves to a different state as specified by the probabilities of a stochastic matrix. An equivalent formulation describes the process as changing state according to the least value of a set of exponential random variables, one for each possible state it can move to, with the parameters determined by the current state.
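The first formulation can be sketched directly: draw an exponential holding time at the current state's total exit rate, then jump according to the embedded transition probabilities. The two-state generator matrix below is a hypothetical example:

```python
import numpy as np

def simulate_ctmc(Q, t_max, rng):
    """Simulate a CTMC with generator matrix Q up to time t_max.
    Returns the visited states and their holding times."""
    n = Q.shape[0]
    state, t = 0, 0.0
    states, times = [], []
    while t < t_max:
        rate = -Q[state, state]            # total exit rate of current state
        hold = rng.exponential(1.0 / rate) # exponential holding time
        jump_p = Q[state].copy()
        jump_p[state] = 0.0
        jump_p /= rate                     # embedded jump probabilities
        states.append(state)
        times.append(hold)
        state = rng.choice(n, p=jump_p)
        t += hold
    return states, times

Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])              # rates: 0 -> 1 at 1.0, 1 -> 0 at 2.0
rng = np.random.default_rng(0)
states, times = simulate_ctmc(Q, 5000.0, rng)
# long-run fraction of time in state 0 approaches the stationary value 2/3
frac0 = sum(h for s, h in zip(states, times) if s == 0) / sum(times)
```

The empirical occupancy converges to the stationary distribution of the chain, here π₀ = 2/3 for the hypothetical rates chosen.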
In statistics and information theory, a maximum entropy probability distribution has entropy that is at least as great as that of all other members of a specified class of probability distributions. According to the principle of maximum entropy, if nothing is known about a distribution except that it belongs to a certain class, then the distribution with the largest entropy should be chosen as the least-informative default. The motivation is twofold: first, maximizing entropy minimizes the amount of prior information built into the distribution; second, many physical systems tend to move towards maximal entropy configurations over time.
The isothermal–isobaric ensemble is a statistical mechanical ensemble that maintains constant temperature and constant applied pressure. It is also called the NPT ensemble, where the number of particles N is also kept constant. This ensemble plays an important role in chemistry, as chemical reactions are usually carried out under constant-pressure conditions. The NPT ensemble is also useful for measuring the equation of state of model systems whose virial expansion for pressure cannot be evaluated, or of systems near first-order phase transitions.
In theoretical physics, a source field is a background field J coupled to the original field φ through a linear term, such as ∫ J(x) φ(x) dx added to the action.
In mathematics, an eigenvalue perturbation problem is that of finding the eigenvectors and eigenvalues of a system that is perturbed from one with known eigenvectors and eigenvalues. This is useful for studying how sensitive the original system's eigenvectors and eigenvalues are to changes in the system. This type of analysis was popularized by Lord Rayleigh, in his investigation of harmonic vibrations of a string perturbed by small inhomogeneities.
In mathematics, Reidemeister torsion is a topological invariant of manifolds introduced by Kurt Reidemeister for 3-manifolds and generalized to higher dimensions by Wolfgang Franz and Georges de Rham. Analytic torsion is an invariant of Riemannian manifolds defined by Daniel B. Ray and Isadore M. Singer as an analytic analogue of Reidemeister torsion. Jeff Cheeger and Werner Müller proved Ray and Singer's conjecture that Reidemeister torsion and analytic torsion are the same for compact Riemannian manifolds.
In mathematics, the Plancherel theorem for spherical functions is an important result in the representation theory of semisimple Lie groups, due in its final form to Harish-Chandra. It is a natural generalisation in non-commutative harmonic analysis of the Plancherel formula and Fourier inversion formula in the representation theory of the group of real numbers in classical harmonic analysis and has a similarly close interconnection with the theory of differential equations. It is the special case for zonal spherical functions of the general Plancherel theorem for semisimple Lie groups, also proved by Harish-Chandra. The Plancherel theorem gives the eigenfunction expansion of radial functions for the Laplacian operator on the associated symmetric space X; it also gives the direct integral decomposition into irreducible representations of the regular representation on L2(X). In the case of hyperbolic space, these expansions were known from prior results of Mehler, Weyl and Fock.
The Bennett acceptance ratio method (BAR) is an algorithm for estimating the difference in free energy between two systems. It was suggested by Charles H. Bennett in 1976.
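A minimal sketch of the BAR estimator, assuming equal numbers of forward and reverse work samples (so the log sample-ratio offset vanishes): ΔF is found as the root of the implicit equation Σ_F f(β(W_F − ΔF)) = Σ_R f(β(W_R + ΔF)), with f the Fermi function. The synthetic Gaussian work data below are hypothetical:

```python
import numpy as np

def bar_delta_f(w_f, w_r, beta=1.0, lo=-10.0, hi=10.0, tol=1e-8):
    """Bennett acceptance ratio estimate of the free energy difference dF
    from forward work samples w_f and reverse-process work samples w_r
    (equal sample sizes assumed, so the log-ratio offset vanishes)."""
    fermi = lambda x: 1.0 / (1.0 + np.exp(x))
    def g(dF):  # monotonically increasing in dF; its root is the BAR estimate
        return fermi(beta * (w_f - dF)).sum() - fermi(beta * (w_r + dF)).sum()
    while hi - lo > tol:  # bisection on the bracketing interval [lo, hi]
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic Gaussian work data consistent with the Crooks relation
# (hypothetical: true dF = 2, beta = 1, work variance 1)
rng = np.random.default_rng(0)
w_f = rng.normal(2.5, 1.0, 20000)    # forward work samples
w_r = rng.normal(-1.5, 1.0, 20000)   # reverse-process work samples
dF_est = bar_delta_f(w_f, w_r)
```

With good overlap between the forward and reverse work distributions, the estimate recovers the true ΔF to within statistical error.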
Lie point symmetry is a concept in advanced mathematics. Towards the end of the nineteenth century, Sophus Lie introduced the notion of Lie group in order to study the solutions of ordinary differential equations (ODEs). He showed the following main property: the order of an ordinary differential equation can be reduced by one if it is invariant under a one-parameter Lie group of point transformations. This observation unified and extended the available integration techniques. Lie devoted the remainder of his mathematical career to developing these continuous groups, which now have an impact on many areas of mathematically based sciences. The applications of Lie groups to differential systems were mainly established by Lie and Emmy Noether, and then advocated by Élie Cartan.
Quantum characteristics are phase-space trajectories that arise in the phase space formulation of quantum mechanics through the Wigner transform of Heisenberg operators of canonical coordinates and momenta. These trajectories obey the Hamilton equations in quantum form and play the role of characteristics in terms of which time-dependent Weyl's symbols of quantum operators can be expressed. In the classical limit, quantum characteristics reduce to classical trajectories. The knowledge of quantum characteristics is equivalent to the knowledge of quantum dynamics.
In probability theory and statistics, Campbell's theorem or the Campbell–Hardy theorem is either a particular equation or set of results relating to the expectation of a function summed over a point process to an integral involving the mean measure of the point process, which allows for the calculation of expected value and variance of the random sum. One version of the theorem, also known as Campbell's formula, entails an integral equation for the aforementioned sum over a general point process, and not necessarily a Poisson point process. There also exist equations involving moment measures and factorial moment measures that are considered versions of Campbell's formula. All these results are employed in probability and statistics with a particular importance in the theory of point processes and queueing theory as well as the related fields stochastic geometry, continuum percolation theory, and spatial statistics.
In queueing theory, a discipline within the mathematical theory of probability, a heavy traffic approximation is the matching of a queueing model with a diffusion process under some limiting conditions on the model's parameters. The first such result was published by John Kingman who showed that when the utilisation parameter of an M/M/1 queue is near 1 a scaled version of the queue length process can be accurately approximated by a reflected Brownian motion.
Higher-spin theory or higher-spin gravity is a common name for field theories that contain massless fields of spin greater than two. Usually, the spectrum of such theories contains the graviton as a massless spin-two field, which explains the second name. Massless fields are gauge fields, and the theories should be (almost) completely fixed by these higher-spin symmetries. Higher-spin theories are supposed to be consistent quantum theories and, for this reason, to give examples of quantum gravity. Most of the interest in the topic is due to the AdS/CFT correspondence, where there are a number of conjectures relating higher-spin theories to weakly coupled conformal field theories. It is important to note that only certain parts of these theories are known at present, and not many examples have been worked out in detail except some specific toy models.