Parallel tempering, in physics and statistics, is a computer simulation method typically used to find the lowest energy state of a system of many interacting particles. It addresses the problem that at high temperatures one may have a stable state different from the stable state at low temperature, whereas simulations at low temperatures may become "stuck" in a metastable state. It does this by using the fact that a high-temperature simulation may visit states typical of both the stable and the metastable low-temperature states.
More specifically, parallel tempering, also known as replica exchange MCMC sampling, is a simulation method aimed at improving the dynamic properties of Monte Carlo method simulations of physical systems, and of Markov chain Monte Carlo (MCMC) sampling methods more generally. The replica exchange method was originally devised by Robert Swendsen and J. S. Wang,[1] then extended by Charles J. Geyer,[2] and later developed further by Giorgio Parisi,[3] Koji Hukushima and Koji Nemoto,[4] and others.[5][6] Y. Sugita and Y. Okamoto also formulated a molecular dynamics version of parallel tempering; this is usually known as replica-exchange molecular dynamics or REMD.[7]
Essentially, one runs N copies of the system, randomly initialized, at different temperatures. Then, based on the Metropolis criterion one exchanges configurations at different temperatures. The idea of this method is to make configurations at high temperatures available to the simulations at low temperatures and vice versa. This results in a very robust ensemble which is able to sample both low and high energy configurations. In this way, thermodynamical properties such as the specific heat, which is in general not well computed in the canonical ensemble, can be computed with great precision.
Typically a Monte Carlo simulation using a Metropolis–Hastings update consists of a single stochastic process that evaluates the energy of the system and accepts/rejects updates based on the temperature T. At high temperatures, updates that change the energy of the system are comparatively more probable. When the system is highly correlated, most proposed updates are rejected, and the simulation is said to suffer from critical slowing down.
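As an illustration, here is a minimal sketch of a single Metropolis–Hastings update in Python (with kB = 1); the one-dimensional double-well energy is a toy example chosen for this sketch, not something prescribed by the method:

```python
import math
import random

def metropolis_step(x, energy, T, step_size=0.5):
    """One Metropolis-Hastings update at temperature T (k_B = 1):
    propose a local move and accept it with probability
    min(1, exp(-(E_new - E_old) / T))."""
    x_new = x + random.uniform(-step_size, step_size)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        return x_new  # accepted
    return x          # rejected

# Toy double-well energy with metastable states at x = -1 and x = +1.
double_well = lambda x: (x**2 - 1.0)**2

x = 1.0
for _ in range(10_000):
    x = metropolis_step(x, double_well, T=0.1)
```

At low T the walker in this example almost never crosses the barrier at x = 0, which is precisely the kind of trapping that parallel tempering is designed to overcome.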
If we were to run two simulations at temperatures separated by ΔT, we would find that if ΔT is small enough, the energy histograms obtained by collecting the energy values over a set of N Monte Carlo steps will overlap to some extent. The overlap can be defined as the area of the two histograms that falls over the same interval of energy values, normalized by the total number of samples. For ΔT = 0 the overlap approaches 1.
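Such an overlap can be estimated directly from the sampled energies. The following sketch (the binning choices are arbitrary) computes the overlapping area of two normalized energy histograms:

```python
import numpy as np

def histogram_overlap(energies_1, energies_2, bins=50):
    """Overlap of two energy histograms as a fraction in [0, 1]:
    both samples are binned on a common energy grid and the summed
    minimum of the two normalized histograms is returned, so the
    result approaches 1 as the two distributions coincide."""
    lo = min(energies_1.min(), energies_2.min())
    hi = max(energies_1.max(), energies_2.max())
    h1, _ = np.histogram(energies_1, bins=bins, range=(lo, hi), density=True)
    h2, edges = np.histogram(energies_2, bins=bins, range=(lo, hi), density=True)
    bin_width = edges[1] - edges[0]
    return np.sum(np.minimum(h1, h2)) * bin_width
```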
Another way to interpret this overlap is to say that system configurations sampled at temperature T1 are likely to appear during a simulation at T2. Because the Markov chain should have no memory of its past, we can construct a new update for the composite system made of the two systems at T1 and T2. At a given Monte Carlo step we can update the global system by swapping the configurations of the two systems, or alternatively by trading the two temperatures. The update is accepted according to the Metropolis–Hastings criterion with probability

p = min{1, exp[(1/kT1 − 1/kT2)(E1 − E2)]},

where E1 and E2 are the current energies of the configurations at temperatures T1 and T2 respectively, and k is the Boltzmann constant,
and otherwise the update is rejected. The detailed balance condition is satisfied by ensuring that the reverse update is equally likely, all else being equal. This can be ensured by choosing regular Monte Carlo updates or parallel tempering updates with probabilities that are independent of the configurations of the two systems and of the Monte Carlo step.[8]
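Putting the pieces together, the following is a minimal sketch of a complete parallel tempering loop in Python, reusing the same toy double-well energy as above; the swap is accepted with the probability given above, written in log form to avoid overflow:

```python
import math
import random

def attempt_swap(configs, energy, betas, i, j):
    """Replica-exchange update between replicas i and j: accepted with
    probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]), the
    Metropolis-Hastings criterion for the joint (extended) ensemble."""
    log_acc = (betas[i] - betas[j]) * (energy(configs[i]) - energy(configs[j]))
    if log_acc >= 0 or random.random() < math.exp(log_acc):
        configs[i], configs[j] = configs[j], configs[i]

energy = lambda x: (x**2 - 1.0)**2              # toy double-well energy
betas = [1.0 / T for T in (0.1, 0.3, 0.9, 2.7)]  # inverse temperatures
configs = [random.uniform(-2.0, 2.0) for _ in betas]

for step in range(10_000):
    # Ordinary Metropolis updates within each replica.
    for k, beta in enumerate(betas):
        x_new = configs[k] + random.uniform(-0.5, 0.5)
        dE = energy(x_new) - energy(configs[k])
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            configs[k] = x_new
    # Periodically attempt a swap between a random pair of neighbours.
    if step % 10 == 0:
        i = random.randrange(len(betas) - 1)
        attempt_swap(configs, energy, betas, i, i + 1)
```

Swaps are attempted only between neighbouring temperatures in this sketch because, as discussed above, a reasonable acceptance rate requires the energy histograms to overlap.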
This update can be generalized to more than two systems.
By a careful choice of temperatures and number of systems one can achieve an improvement in the mixing properties of a set of Monte Carlo simulations that more than offsets the extra computational cost of running the simulations in parallel.
There are other considerations: increasing the number of different temperatures can have a detrimental effect, because the 'lateral' movement of a given system across temperatures behaves as a diffusion process. The setup is important: there must be practical histogram overlap between neighbouring temperatures to achieve a reasonable probability of lateral moves.
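A common heuristic, though not the only possibility, is to space the temperatures geometrically, which for many systems gives roughly uniform overlap between neighbouring histograms:

```python
def geometric_ladder(t_min, t_max, n):
    """n temperatures spaced geometrically between t_min and t_max:
    a constant ratio T_{k+1}/T_k tends to give roughly uniform
    histogram overlap between neighbours, so replicas diffuse across
    the ladder at a reasonable rate."""
    ratio = (t_max / t_min) ** (1.0 / (n - 1))
    return [t_min * ratio**k for k in range(n)]

print(geometric_ladder(0.1, 2.7, 4))  # [0.1, 0.3, 0.9, 2.7] (up to rounding)
```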
The parallel tempering method can be used as a kind of 'super' simulated annealing that does not need to be restarted, since a system at high temperature can feed new local optima to a system at low temperature, allowing tunneling between metastable states and improving convergence to a global optimum.
In condensed matter physics, a spin glass is a magnetic state characterized by randomness and by cooperative behavior in the freezing of spins at a temperature called the 'freezing temperature' Tf. In ferromagnetic solids, the magnetic spins of the component atoms all align in the same direction. A spin glass, in contrast with a ferromagnet, is a "disordered" magnetic state in which the spins are aligned randomly, without a regular pattern, and the couplings too are random.
Molecular dynamics (MD) is a computer simulation method for analyzing the physical movements of atoms and molecules. The atoms and molecules are allowed to interact for a fixed period of time, giving a view of the dynamic "evolution" of the system. In the most common version, the trajectories of atoms and molecules are determined by numerically solving Newton's equations of motion for a system of interacting particles, where forces between the particles and their potential energies are often calculated using interatomic potentials or molecular mechanical force fields. The method is applied mostly in chemical physics, materials science, and biophysics.
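As a sketch of what "numerically solving Newton's equations of motion" means in practice, the following implements the velocity Verlet scheme, a standard symplectic integrator in MD, for a toy one-dimensional harmonic force (the force and parameters are illustrative):

```python
import numpy as np

def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Integrate Newton's equations m * d2x/dt2 = F(x) with the
    velocity Verlet scheme: positions are advanced with the current
    acceleration, then velocities with the average of the old and
    new accelerations."""
    a = force(x) / mass
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt**2
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

# Toy example: harmonic oscillator, F = -k x with k = 1.
x, v = velocity_verlet(x=np.array([1.0]), v=np.array([0.0]),
                       force=lambda x: -1.0 * x, mass=1.0,
                       dt=0.01, n_steps=1000)
```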
Global optimization is a branch of applied mathematics and numerical analysis that attempts to find the global minima or maxima of a function or a set of functions on a given set. It is usually described as a minimization problem, because the maximization of a real-valued function g(x) is equivalent to the minimization of the function f(x) = −g(x).
The classical XY model is a lattice model of statistical mechanics. In general, the XY model can be seen as a specialization of Stanley's n-vector model for n = 2.
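Its standard nearest-neighbour Hamiltonian is H = −J Σ cos(θi − θj), with one planar angle per site; a sketch of the corresponding energy evaluation on a periodic square lattice (the lattice size and coupling are illustrative choices):

```python
import numpy as np

def xy_energy(theta, J=1.0):
    """Energy of the classical XY model on a periodic 2D square
    lattice: H = -J * sum over nearest-neighbour bonds of
    cos(theta_i - theta_j), where theta holds one angle per site."""
    right = np.cos(theta - np.roll(theta, -1, axis=1))
    down = np.cos(theta - np.roll(theta, -1, axis=0))
    return -J * (right.sum() + down.sum())

theta = np.random.uniform(0.0, 2.0 * np.pi, size=(16, 16))
print(xy_energy(theta))
```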
Monte Carlo in statistical physics refers to the application of the Monte Carlo method to problems in statistical physics, or statistical mechanics.
In computational physics, variational Monte Carlo (VMC) is a quantum Monte Carlo method that applies the variational method to approximate the ground state of a quantum system.
Free energy perturbation (FEP) is a method based on statistical mechanics that is used in computational chemistry for computing free energy differences from molecular dynamics or Metropolis Monte Carlo simulations.
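The core FEP estimator is the Zwanzig exponential-averaging formula, ΔF = −kT ln⟨exp(−ΔU/kT)⟩, with the average taken over configurations sampled in state A. A minimal sketch, assuming the per-sample energy differences ΔU = UB − UA have already been collected from a simulation of state A:

```python
import numpy as np

def fep_delta_f(delta_u, kT=1.0):
    """Free energy difference F_B - F_A from the Zwanzig formula,
    Delta F = -kT * ln < exp(-Delta U / kT) >_A, computed with a
    log-sum-exp reduction for numerical stability."""
    w = -np.asarray(delta_u) / kT
    return -kT * (np.logaddexp.reduce(w) - np.log(len(w)))
```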
Umbrella sampling is a technique in computational physics and chemistry, used to improve sampling of a system where ergodicity is hindered by the form of the system's energy landscape. It was first suggested by Torrie and Valleau in 1977. It is a particular physical application of the more general importance sampling in statistics.
Thermodynamic integration is a method used to compare the difference in free energy between two given states whose potential energies UA and UB have different dependences on the spatial coordinates. Because the free energy of a system is not simply a function of the phase space coordinates of the system, but is instead a function of the Boltzmann-weighted integral over phase space, the free energy difference between two states cannot be calculated directly from the potential energy of just two coordinate sets. In thermodynamic integration, the free energy difference is calculated by defining a thermodynamic path between the states and integrating over ensemble-averaged enthalpy changes along the path. Such paths can be either real chemical processes or alchemical processes. An example of an alchemical process is Kirkwood's coupling-parameter method.
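In the coupling-parameter form, ΔF is the integral of the ensemble average ⟨dU/dλ⟩ over the coupling parameter λ from 0 to 1. A sketch of the final quadrature step, assuming the per-window ensemble averages have already been computed from separate equilibrium runs (the numbers below are made up for illustration):

```python
import numpy as np

def thermodynamic_integration(lambdas, du_dlambda_means):
    """Delta F from thermodynamic integration: trapezoidal quadrature
    of the ensemble-averaged dU/dlambda over the coupling parameter."""
    dl = np.diff(lambdas)
    mid = 0.5 * (du_dlambda_means[1:] + du_dlambda_means[:-1])
    return float(np.sum(mid * dl))

lambdas = np.linspace(0.0, 1.0, 11)
means = np.array([2.1, 1.8, 1.6, 1.3, 1.1, 0.9, 0.8, 0.6, 0.5, 0.4, 0.3])
print(thermodynamic_integration(lambdas, means))  # approx. Delta F
```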
The Wang and Landau algorithm, proposed by Fugao Wang and David P. Landau, is a Monte Carlo method designed to estimate the density of states of a system. The method performs a non-Markovian random walk that builds up the density of states by quickly visiting the entire available energy spectrum. The Wang and Landau algorithm is an important method for obtaining the density of states required to perform a multicanonical simulation.
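A minimal sketch of the algorithm for a periodic 1D Ising chain (the system, sweep counts and flatness criterion are illustrative choices):

```python
import math
import random

def wang_landau(n_spins=16, flatness=0.8, ln_f_min=1e-6):
    """Wang-Landau estimate of ln g(E) for a periodic 1D Ising chain.
    The walk accepts moves with probability min(1, g(E)/g(E_new)),
    raises ln g(E) by ln_f at every visit (making the walk
    non-Markovian), and halves ln_f whenever the visit histogram is
    sufficiently flat."""
    spins = [random.choice((-1, 1)) for _ in range(n_spins)]
    E = -sum(spins[i] * spins[(i + 1) % n_spins] for i in range(n_spins))
    ln_g, hist, ln_f = {}, {}, 1.0
    while ln_f > ln_f_min:
        for _ in range(1000 * n_spins):
            i = random.randrange(n_spins)
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
            log_acc = ln_g.get(E, 0.0) - ln_g.get(E + dE, 0.0)
            if log_acc >= 0 or random.random() < math.exp(log_acc):
                spins[i] = -spins[i]
                E += dE
            ln_g[E] = ln_g.get(E, 0.0) + ln_f
            hist[E] = hist.get(E, 0) + 1
        if min(hist.values()) > flatness * sum(hist.values()) / len(hist):
            ln_f /= 2.0    # refine the modification factor (f -> sqrt f)
            hist = {}      # and reset the visit histogram
    return ln_g
```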
In applied mathematics, the numerical sign problem is the problem of numerically evaluating the integral of a highly oscillatory function of a large number of variables. Numerical methods fail because of the near-cancellation of the positive and negative contributions to the integral. Each contribution has to be integrated to very high precision in order for their difference to be obtained with useful accuracy.
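A toy numerical illustration of this cancellation (the integrand is chosen arbitrarily): a plain Monte Carlo estimate of the integral of cos(100x) over [0, π], which is exactly 0, produces positive and negative parts that are each of order 1, so their difference is dominated by statistical noise:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, np.pi, size=1_000_000)
f = np.pi * np.cos(100.0 * x)   # unbiased per-sample estimates of the integral
n = len(f)
pos = f[f > 0].sum() / n        # positive contribution, approx. +1
neg = f[f < 0].sum() / n        # negative contribution, approx. -1
print(pos, neg, pos + neg)      # the sum is pure noise around the true value 0
```

Both parts are estimated to a relative precision of a few tenths of a percent, yet the resulting difference has no significant digits at all, which is the sign problem in miniature.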
"Equation of State Calculations by Fast Computing Machines" is a scholarly article published by Nicholas Metropolis, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller, and Edward Teller in the Journal of Chemical Physics in 1953. This paper proposed what became known as the Metropolis Monte Carlo algorithm, which forms the basis for Monte Carlo statistical mechanics simulations of atomic and molecular systems.
The demon algorithm is a Monte Carlo method for efficiently sampling members of a microcanonical ensemble with a given energy. An additional degree of freedom, called 'the demon', is added to the system and is able to store and provide energy. If a sampled microscopic state has lower energy than the original state, the excess energy is transferred to the demon. If a sampled state has higher energy than desired, the demon provides the missing energy if it is available. The demon cannot have negative energy, and it does not interact with the particles beyond exchanging energy. Note that the additional degree of freedom of the demon does not significantly alter a system with many particles on a macroscopic level.
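A minimal sketch for a periodic 1D Ising chain (the system and parameters are illustrative):

```python
import random

def demon_ising(n_spins=64, total_energy=-32, n_sweeps=1000):
    """Creutz demon sampling of a periodic 1D Ising chain at fixed
    total (system + demon) energy: a flip is accepted whenever the
    demon can absorb or supply the energy change, and the demon's
    energy is never allowed to go negative."""
    spins = [1] * n_spins                    # ground state, E = -n_spins
    demon = total_energy + n_spins           # demon holds the remainder
    assert demon >= 0
    demon_trace = []
    for _ in range(n_sweeps * n_spins):
        i = random.randrange(n_spins)
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
        if dE <= demon:                      # demon can pay (or absorb)
            spins[i] = -spins[i]
            demon -= dE
        demon_trace.append(demon)
    # The demon's energy distribution is approximately Boltzmann, so its
    # mean gives a rough temperature estimate for the microcanonical run.
    return spins, sum(demon_trace) / len(demon_trace)
```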
The Widom insertion method is a statistical thermodynamic approach to the calculation of material and mixture properties. It is named for Benjamin Widom, who derived it in 1963. In general, there are two theoretical approaches to determining the statistical mechanical properties of materials. The first is the direct calculation of the overall partition function of the system, which directly yields the system free energy. The second approach, known as the Widom insertion method, instead derives from calculations centering on one molecule. The Widom insertion method directly yields the chemical potential of one component rather than the system free energy. This approach is most widely applied in molecular computer simulations but has also been applied in the development of analytical statistical mechanical models. The Widom insertion method can be understood as an application of the Jarzynski equality, since it measures the excess free energy difference via the average work needed to change the system from a state with N molecules to a state with N+1 molecules. It therefore measures the excess chemical potential, since μex = Fex(N+1) − Fex(N), where Fex is the excess (non-ideal) part of the free energy.
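A sketch of the estimator, assuming a periodic cubic box and a user-supplied pair potential (the function and argument names are illustrative):

```python
import numpy as np

def widom_mu_excess(positions, box, pair_energy, kT, n_trials=10_000, seed=0):
    """Excess chemical potential from Widom test-particle insertion:
    mu_ex = -kT * ln < exp(-dU / kT) >, where dU is the energy of
    inserting a ghost particle at a uniformly random position into the
    existing N-particle configuration (minimum-image convention)."""
    rng = np.random.default_rng(seed)
    boltzmann_sum = 0.0
    for _ in range(n_trials):
        trial = rng.uniform(0.0, box, size=3)
        d = positions - trial
        d -= box * np.round(d / box)          # minimum-image displacements
        r = np.sqrt((d**2).sum(axis=1))
        dU = pair_energy(r).sum()
        boltzmann_sum += np.exp(-dU / kT)
    return -kT * np.log(boltzmann_sum / n_trials)
```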
The Swendsen–Wang algorithm is the first non-local or cluster algorithm for Monte Carlo simulation of large systems near criticality. It was introduced by Robert Swendsen and Jian-Sheng Wang in 1987 at Carnegie Mellon University.
The Bennett acceptance ratio method (BAR) is an algorithm for estimating the difference in free energy between two systems. It was suggested by Charles H. Bennett in 1976.
Metadynamics (MTD) is a computer simulation method in computational physics, chemistry and biology. It is used to estimate the free energy and other state functions of a system in which ergodicity is hindered by the form of the system's energy landscape. It was first suggested by Alessandro Laio and Michele Parrinello in 2002 and is usually applied within molecular dynamics simulations. MTD closely resembles a number of newer methods such as adaptively biased molecular dynamics, adaptive reaction coordinate forces and local elevation umbrella sampling. More recently, both the original and well-tempered metadynamics were derived in the context of importance sampling and shown to be a special case of the adaptive biasing potential setting. MTD is related to Wang–Landau sampling.
In statistics and physics, multicanonical ensemble is a Markov chain Monte Carlo sampling technique that uses the Metropolis–Hastings algorithm to compute integrals where the integrand has a rough landscape with multiple local minima. It samples states according to the inverse of the density of states, which has to be known a priori or be computed using other techniques like the Wang and Landau algorithm. Multicanonical sampling is an important technique for spin systems like the Ising model or spin glasses.
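A sketch of a single multicanonical update, assuming ln g(E) has already been estimated (for example with the Wang and Landau sketch above) and stored in a dictionary keyed by the discrete energies:

```python
import math
import random

def multicanonical_step(state, propose, energy, ln_g):
    """One Metropolis-Hastings update with multicanonical weights
    w(E) proportional to 1/g(E): acceptance min(1, g(E)/g(E_new)), so
    the walk is ideally flat in energy and can cross barriers between
    local minima that would trap a canonical simulation."""
    new = propose(state)
    log_acc = ln_g[energy(state)] - ln_g[energy(new)]
    if log_acc >= 0 or random.random() < math.exp(log_acc):
        return new
    return state
```

Canonical expectation values are then recovered by reweighting each sampled state with the factor g(E) exp(−E/kT).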
In mathematics and physics, surface growth refers to models used in the dynamical study of the growth of a surface, usually by means of a stochastic differential equation of a field.
Replica cluster move in condensed matter physics refers to a family of non-local cluster algorithms used to simulate spin glasses. It is an extension of the Swendsen–Wang algorithm in that it generates non-trivial spin clusters informed by the interaction states on two replicas instead of just one. It differs from the replica exchange method in that it performs a non-local update on a fraction of the sites between two replicas at the same temperature, whereas parallel tempering directly exchanges all the spins between two replicas at different temperatures. However, the two are often used together to achieve state-of-the-art efficiency in simulating spin-glass models.
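A sketch of one such move in the Houdayer variant, assuming two ±1 spin replicas stored as nested lists on an L × L periodic lattice:

```python
import random
from collections import deque

def replica_cluster_move(s1, s2, L):
    """Houdayer-style replica cluster move for two same-temperature
    replicas: sites where the replicas disagree (overlap q = s1*s2 = -1)
    form clusters; one such cluster is picked at random and its spins
    are exchanged between the replicas. Every bond crossing the cluster
    boundary contributes zero to E1 + E2 before and after the exchange,
    so the move is rejection-free at equal temperatures."""
    diff = [(i, j) for i in range(L) for j in range(L)
            if s1[i][j] * s2[i][j] == -1]
    if not diff:
        return
    seed = random.choice(diff)
    cluster, queue = {seed}, deque([seed])
    while queue:                      # breadth-first growth over q = -1 sites
        i, j = queue.popleft()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            ni, nj = ni % L, nj % L
            if (ni, nj) not in cluster and s1[ni][nj] * s2[ni][nj] == -1:
                cluster.add((ni, nj))
                queue.append((ni, nj))
    for i, j in cluster:              # exchange the cluster between replicas
        s1[i][j], s2[i][j] = s2[i][j], s1[i][j]
```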