Coarse-grained modeling, using coarse-grained models, aims at simulating the behaviour of complex systems through simplified (coarse-grained) representations of them. Coarse-grained models are widely used for molecular modeling of biomolecules [1] [2] at various granularity levels.
A wide range of coarse-grained models have been proposed. They are usually dedicated to computational modeling of specific molecules: proteins, [1] [2] nucleic acids, [3] [4] lipid membranes, [2] [5] carbohydrates [6] or water. [7] In these models, molecules are represented not by individual atoms but by "pseudo-atoms" approximating groups of atoms, such as a whole amino acid residue. By decreasing the number of degrees of freedom, much longer simulation timescales can be studied at the expense of molecular detail. Coarse-grained models have found practical applications in molecular dynamics simulations. [1] Another case of interest is the simplification of a given discrete-state system, as very often descriptions of the same system at different levels of detail are possible. [8] [9] An example is given by the chemomechanical dynamics of a molecular machine, such as kinesin. [8] [10]
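As an illustration of the mapping step, the following minimal Python sketch (a hypothetical coarse_grain helper, not taken from any particular package) lumps the atoms of each residue into a single pseudo-atom placed at the residue's centre of mass:

```python
# Minimal sketch: map an all-atom structure to one "pseudo-atom" (bead)
# per residue, placed at the residue's centre of mass.
from collections import defaultdict

def coarse_grain(atoms):
    """atoms: iterable of (residue_id, mass, (x, y, z)) tuples."""
    groups = defaultdict(list)
    for residue_id, mass, xyz in atoms:
        groups[residue_id].append((mass, xyz))
    beads = {}
    for residue_id, members in groups.items():
        total_mass = sum(m for m, _ in members)
        com = tuple(sum(m * r[i] for m, r in members) / total_mass
                    for i in range(3))
        beads[residue_id] = (total_mass, com)   # one bead per residue
    return beads

# toy example: two "residues", each represented by two atoms
atoms = [
    (1, 12.0, (0.0, 0.0, 0.0)), (1, 1.0, (1.0, 0.0, 0.0)),
    (2, 14.0, (3.0, 0.0, 0.0)), (2, 16.0, (4.0, 0.0, 0.0)),
]
print(coarse_grain(atoms))
```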
Coarse-grained modeling originates from work by Michael Levitt and Ariel Warshel in the 1970s. [11] [12] [13] Coarse-grained models are presently often used as components of multiscale modeling protocols in combination with reconstruction tools [14] (from coarse-grained to atomistic representation) and atomistic-resolution models. [1] Atomistic-resolution models alone are presently not efficient enough to handle large system sizes and simulation timescales. [1] [2]
Coarse graining and fine graining in statistical mechanics address the subject of entropy, and thus the second law of thermodynamics. One has to realise that the concept of temperature cannot be attributed to an arbitrarily small microscopic particle, since such a particle does not radiate thermally like a macroscopic body or "black body". However, one can attribute a nonzero entropy to an object with as few as two states, like a "bit" (and nothing else). The entropies of the two cases are called thermal entropy and von Neumann entropy, respectively. [15] They are also distinguished by the terms coarse grained and fine grained, respectively. This latter distinction is related to the aspect spelled out above and is elaborated on below.
The Liouville theorem (sometimes also called the Liouville equation)

$$\frac{\mathrm{d}}{\mathrm{d}t}\left(\Delta q\,\Delta p\right) = 0$$

states that a phase space volume $\Delta q\,\Delta p$ (spanned by $\Delta q$ and $\Delta p$, here in one spatial dimension) remains constant in the course of time, no matter where the point contained in $\Delta q\,\Delta p$ moves. This is a consideration in classical mechanics. In order to relate this view to macroscopic physics one surrounds each point $(q, p)$ e.g. with a sphere of some fixed volume - a procedure called coarse graining which lumps together points or states of similar behaviour. The trajectory of this sphere in phase space then covers also other points and hence its volume in phase space grows. The entropy associated with this consideration, whether zero or not, is called coarse grained entropy or thermal entropy. A large number of such systems, i.e. the one under consideration together with many copies, is called an ensemble. If these systems do not interact with each other or anything else, and each has the same energy $E$, the ensemble is called a microcanonical ensemble. Each replica system appears with the same probability, and temperature does not enter.
Now suppose we define a probability density $\rho(q,p,t)$ describing the motion of the point with phase space element $\Delta q\,\Delta p$. In the case of equilibrium or steady motion the equation of continuity implies that the probability density $\rho$ is independent of time $t$. We take $\rho$ as nonzero only inside the phase space volume $\Gamma$. One then defines the entropy $S$ by the relation

$$S = -k\sum_{\Gamma}\rho\ln\rho, \qquad \sum_{\Gamma}\rho = 1,$$

where $k$ is the Boltzmann constant.
Then, by maximisation of $S$ for a given energy $E$, i.e. linking the variation of $-k\sum_{\Gamma}\rho\ln\rho$ with that of the normalisation constraint $\sum_{\Gamma}\rho - 1 = 0$ via a Lagrange multiplier $\lambda$, one obtains (as in the case of a lattice of spins or with a bit at each lattice point)

$$S = k\ln\Gamma,$$

the volume of $\Gamma$ being proportional to the exponential of $S/k$. This is again a consideration in classical mechanics.
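The maximisation can be made explicit; the following is the standard constrained-extremum calculation in the notation used above, included only to spell out the step:

$$\frac{\partial}{\partial\rho}\left[-k\sum_{\Gamma}\rho\ln\rho - \lambda\Big(\sum_{\Gamma}\rho - 1\Big)\right] = 0 \;\;\Rightarrow\;\; -k(\ln\rho + 1) - \lambda = 0 \;\;\Rightarrow\;\; \rho = \mathrm{const.} = \frac{1}{\Gamma},$$

so that $S = -k\sum_{\Gamma}\tfrac{1}{\Gamma}\ln\tfrac{1}{\Gamma} = k\ln\Gamma$.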
In quantum mechanics the phase space becomes a space of states, and the probability density $\rho$ an operator with a subspace of states of dimension or number of states $N$ specified by a projection operator $P$. Then the entropy is (obtained as above)

$$S = k\ln N$$

and is described as fine grained or von Neumann entropy. If $N = 1$, the entropy vanishes and the system is said to be in a pure state. Here the exponential of $S/k$ is proportional to the number of states. The microcanonical ensemble is again a large number of noninteracting copies of the given system, and $\rho$, the energy $E$ etc. become ensemble averages.
Now consider the interaction of a given system with another one - or, in ensemble terminology, the given system and the large number of replicas all immersed in a big one called a heat bath characterised by a probability density $\rho$. Since the systems interact only via the heat bath, the individual systems of the ensemble can have different energies, depending on which energy state they are in. This interaction is described as entanglement, and the ensemble as a canonical ensemble (the macrocanonical ensemble permits also exchange of particles).
The interaction of the ensemble elements via the heat bath leads to temperature $T$, as we now show. [16] Considering two elements with energies $E_1$ and $E_2$, the probability of finding these in the heat bath is proportional to $\rho(E_1)\,\rho(E_2)$, and this is proportional to $\rho(E_1+E_2)$ if we consider the binary system as a system in the same heat bath defined by the function $\rho$. It follows that $\rho(E)\propto e^{-\beta E}$ (the only way to satisfy the proportionality), where $\beta$ is a constant. Normalisation then implies

$$\rho = \frac{e^{-\beta E}}{Z}, \qquad Z = \mathrm{Tr}\,e^{-\beta E}, \qquad \mathrm{Tr}\,\rho = 1.$$
Then in terms of ensemble averages

$$\langle E\rangle = \mathrm{Tr}(\rho E), \qquad S = -k\,\mathrm{Tr}(\rho\ln\rho) = k\beta\,\langle E\rangle + k\ln Z,$$

or $\beta = 1/kT$ by comparison with the second law of thermodynamics. $S$ is now the entanglement entropy or fine grained von Neumann entropy. This is zero if the system is in a pure state, and is nonzero when in a mixed (entangled) state.
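For completeness, the intermediate steps behind the last relation (a standard manipulation, stated here only to make the comparison with the second law explicit) are

$$S = -k\,\mathrm{Tr}\big[\rho\,(-\beta E - \ln Z)\big] = k\beta\,\langle E\rangle + k\ln Z, \qquad \mathrm{d}S = k\beta\,\mathrm{d}\langle E\rangle$$

(the differential taken at fixed spectrum, using $\mathrm{d}\ln Z/\mathrm{d}\beta = -\langle E\rangle$), which compared with $\mathrm{d}S = \mathrm{d}\langle E\rangle/T$ identifies $\beta = 1/kT$.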
Above we considered a system immersed in another huge one called a heat bath, with the possibility of heat exchange between them. Frequently one considers a different situation, i.e. two systems A and B with a small hole in the partition between them. Suppose B is originally empty but A contains an explosive device which fills A instantaneously with photons. Originally A and B have energies $E_A$ and $E_B$ respectively, and there is no interaction. Hence originally both are in pure quantum states and have zero fine grained entropies. Immediately after the explosion A is filled with photons, its energy still being $E_A$ and that of B still $E_B$ (no photon has yet escaped). Since A is filled with photons, these obey a Planck distribution law and hence the coarse grained thermal entropy of A is nonzero (recall: there are many configurations of the photons in A, i.e. many states, with one maximal), although the fine grained quantum mechanical entropy is still zero (A occupies a single energy state), as is that of B. Now allow photons to leak slowly (i.e. with no disturbance of the equilibrium) from A to B. With fewer photons in A, its coarse grained entropy diminishes, but that of B increases. This entanglement of A and B implies that they are now quantum mechanically in mixed states, and so their fine grained entropies are no longer zero. Finally, when all photons are in B, the coarse grained entropy of A as well as its fine grained entropy vanish and A is again in a pure state but with new energy. On the other hand B now has an increased thermal entropy, but since the entanglement is over it is quantum mechanically again in a pure state, its ground state, which has zero fine grained von Neumann entropy. Consider B: in the course of the entanglement with A its fine grained or entanglement entropy started and ended in pure states (thus with zero entropies). Its coarse grained entropy, however, rose from zero to its final nonzero value. Roughly halfway through the procedure the entanglement entropy of B reaches a maximum and then decreases to zero at the end.
The classical coarse grained thermal entropy of the second law of thermodynamics is not the same as the (mostly smaller) quantum mechanical fine grained entropy. The difference is called information. As may be deduced from the foregoing arguments, this difference is roughly zero before the entanglement entropy (which is the same for A and B) attains its maximum. An example of coarse graining is provided by Brownian motion. [17]
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
Stimulated emission is the process by which an incoming photon of a specific frequency can interact with an excited atomic electron, causing it to drop to a lower energy level. The liberated energy transfers to the electromagnetic field, creating a new photon with a frequency, polarization, and direction of travel that are all identical to the photons of the incident wave. This is in contrast to spontaneous emission, which occurs at a characteristic rate for each of the atoms/oscillators in the upper energy state regardless of the external electromagnetic field.
An ideal gas is a theoretical gas composed of many randomly moving point particles that are not subject to interparticle interactions. The ideal gas concept is useful because it obeys the ideal gas law, a simplified equation of state, and is amenable to analysis under statistical mechanics. The requirement of zero interaction can often be relaxed if, for example, the interaction is perfectly elastic or regarded as point-like collisions.
In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
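As a concrete illustration (a hypothetical two-level toy system, not tied to any particular physical model), the following sketch checks numerically that the mean energy follows from the partition function via $\langle E\rangle = kT^{2}\,\partial\ln Z/\partial T$:

```python
# Two-level toy system: Z = sum_i exp(-E_i / kT); the mean energy obtained
# from d(ln Z)/dT matches the direct Boltzmann average.
import math

energies = [0.0, 1.0]    # two energy levels
k = 1.0                  # Boltzmann constant in reduced units

def partition_function(T):
    return sum(math.exp(-E / (k * T)) for E in energies)

def mean_energy_from_Z(T, dT=1e-6):
    # <E> = k T^2 d(ln Z)/dT, evaluated by a central finite difference
    return k * T * T * (math.log(partition_function(T + dT))
                        - math.log(partition_function(T - dT))) / (2 * dT)

def mean_energy_direct(T):
    Z = partition_function(T)
    return sum(E * math.exp(-E / (k * T)) for E in energies) / Z

T = 1.0
print(mean_energy_from_Z(T), mean_energy_direct(T))   # both ~ 0.2689
```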
In physics, Liouville's theorem, named after the French mathematician Joseph Liouville, is a key theorem in classical statistical and Hamiltonian mechanics. It asserts that the phase-space distribution function is constant along the trajectories of the system - that is, the density of system points in the vicinity of a given system point traveling through phase space is constant with time. This time-independent density is known in statistical mechanics as the classical a priori probability.
An isentropic process is an idealized thermodynamic process that is both adiabatic and reversible. The work transfers of the system are frictionless, and there is no net transfer of heat or matter. Such an idealized process is useful in engineering as a model of and basis of comparison for real processes. This process is idealized because reversible processes do not occur in reality; treating a process as both adiabatic and reversible shows that the initial and final entropies are the same, which is why it is called isentropic. Thermodynamic processes are named based on the effect they would have on the system. Even though in reality it is not necessarily possible to carry out an isentropic process, some may be approximated as such.
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics - a statement about fundamentally irreversible processes - from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
The equilibrium constant of a chemical reaction is the value of its reaction quotient at chemical equilibrium, a state approached by a dynamic chemical system after sufficient time has elapsed at which its composition has no measurable tendency towards further change. For a given set of reaction conditions, the equilibrium constant is independent of the initial analytical concentrations of the reactant and product species in the mixture. Thus, given the initial composition of a system, known equilibrium constant values can be used to determine the composition of the system at equilibrium. However, reaction parameters like temperature, solvent, and ionic strength may all influence the value of the equilibrium constant.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

$$S = -\mathrm{Tr}(\rho\ln\rho).$$
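A minimal numerical sketch (the example density matrices are illustrative only) evaluates this from the eigenvalues of ρ:

```python
# von Neumann entropy S = -Tr(rho ln rho), computed from the eigenvalues
# of a density matrix.
import numpy as np

def von_neumann_entropy(rho):
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]       # treat 0 * ln 0 as 0
    return float(-np.sum(eigvals * np.log(eigvals)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```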
The Wang and Landau algorithm, proposed by Fugao Wang and David P. Landau, is a Monte Carlo method designed to estimate the density of states of a system. The method performs a non-Markovian random walk that builds the density of states by quickly visiting the entire available energy spectrum. The Wang and Landau algorithm is an important method to obtain the density of states required to perform a multicanonical simulation.
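The following Python sketch illustrates the core update rule on a toy model whose density of states is known exactly; the model, the flatness threshold and the refinement schedule are illustrative choices, not prescriptions from the original algorithm:

```python
# Wang-Landau sketch for a toy model of N two-state units whose energy is
# the number of excited units; the exact density of states is C(N, E),
# so the estimate can be checked.
import math, random

N = 8                                    # number of two-state units
state = [0] * N                          # 0 = ground, 1 = excited
energy = 0                               # energy = number of excited units
ln_g = [0.0] * (N + 1)                   # running estimate of ln g(E)
hist = [0] * (N + 1)                     # visit histogram
ln_f = 1.0                               # modification factor, reduced over time

while ln_f > 1e-4:
    for _ in range(20000):
        i = random.randrange(N)
        new_energy = energy + (1 - 2 * state[i])        # effect of flipping unit i
        # accept with probability min(1, g(E_old) / g(E_new))
        if random.random() < math.exp(min(0.0, ln_g[energy] - ln_g[new_energy])):
            state[i] ^= 1
            energy = new_energy
        ln_g[energy] += ln_f                            # update estimate at current E
        hist[energy] += 1
    if min(hist) > 0.8 * sum(hist) / len(hist):         # crude flatness check
        hist = [0] * (N + 1)
        ln_f /= 2.0                                     # refine: ln f -> ln f / 2

# compare with the exact ln g(E) = ln C(N, E), up to an additive constant
exact = [math.log(math.comb(N, E)) for E in range(N + 1)]
shift = exact[0] - ln_g[0]
print([round(x + shift, 2) for x in ln_g])
print([round(x, 2) for x in exact])
```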
In astrophysics, what is referred to as "entropy" is actually the adiabatic constant $K$ appearing in the relation $P = K\rho^{\gamma}$ between the pressure $P$ and the density $\rho$ of a gas undergoing adiabatic changes.
The Widom insertion method is a statistical thermodynamic approach to the calculation of material and mixture properties. It is named for Benjamin Widom, who derived it in 1963. In general, there are two theoretical approaches to determining the statistical mechanical properties of materials. The first is the direct calculation of the overall partition function of the system, which directly yields the system free energy. The second approach, known as the Widom insertion method, instead derives from calculations centering on one molecule. The Widom insertion method directly yields the chemical potential of one component rather than the system free energy. This approach is most widely applied in molecular computer simulations but has also been applied in the development of analytical statistical mechanical models. The Widom insertion method can be understood as an application of the Jarzynski equality, since it measures the excess free-energy difference via the average work needed to change the system from a state with N molecules to a state with N+1 molecules. It therefore measures the excess chemical potential, since $\Delta F^{\mathrm{ex}} = F^{\mathrm{ex}}(N+1) - F^{\mathrm{ex}}(N) = \mu^{\mathrm{ex}}$, where $\mu^{\mathrm{ex}} = \mu - \mu_{\mathrm{id}}$ is the chemical potential in excess of its ideal-gas part.
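A minimal sketch of the test-particle estimate $\mu^{\mathrm{ex}} = -kT\ln\langle e^{-\Delta U/kT}\rangle$ follows; for brevity the "configurations" are random placements in a periodic box rather than the output of an equilibrated simulation, and the Lennard-Jones parameters are arbitrary:

```python
# Widom test-particle sketch: mu_ex = -kT ln < exp(-dU/kT) >, where dU is the
# interaction energy of a randomly inserted test particle.
import math, random

L, N, kT = 10.0, 20, 1.0                 # box length, particle count, temperature
eps, sigma = 1.0, 1.0                    # Lennard-Jones parameters (illustrative)

def lj(r2):
    s6 = (sigma * sigma / r2) ** 3
    return 4.0 * eps * (s6 * s6 - s6)

def insertion_energy(config, test):
    dU = 0.0
    for p in config:
        # minimum-image squared distance in a cubic periodic box
        d2 = sum(min(abs(a - b), L - abs(a - b)) ** 2 for a, b in zip(p, test))
        if d2 > 1e-12:
            dU += lj(d2)
    return dU

boltz_sum, n_insert = 0.0, 0
for _ in range(200):                      # "configurations" (here: random placements)
    config = [[random.uniform(0, L) for _ in range(3)] for _ in range(N)]
    for _ in range(50):                   # test insertions per configuration
        test = [random.uniform(0, L) for _ in range(3)]
        boltz_sum += math.exp(-insertion_energy(config, test) / kT)
        n_insert += 1

mu_ex = -kT * math.log(boltz_sum / n_insert)
print("estimated excess chemical potential:", mu_ex)
```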
In quantum mechanics, and especially quantum information theory, the purity of a normalized quantum state is a scalar defined as

$$\gamma \equiv \mathrm{Tr}(\rho^{2}),$$

where ρ is the density matrix of the state and Tr is the trace operation. The purity defines a measure on quantum states, giving information on how much a state is mixed.
The mass–action ratio, often denoted by $\Gamma$, is the ratio of the product concentrations, p, to the reactant concentrations, s. The concentrations may or may not be at equilibrium.
In quantum mechanics, negativity is a measure of quantum entanglement which is easy to compute. It is a measure deriving from the PPT criterion for separability. It has been shown to be an entanglement monotone and hence a proper measure of entanglement.
The entropy of entanglement is a measure of the degree of quantum entanglement between two subsystems constituting a two-part composite quantum system. Given a pure bipartite quantum state of the composite system, it is possible to obtain a reduced density matrix describing knowledge of the state of a subsystem. The entropy of entanglement is the von Neumann entropy of the reduced density matrix for either of the subsystems. If it is non-zero, i.e. the subsystem is in a mixed state, it indicates the two subsystems are entangled.
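As a concrete example, the following sketch computes the entropy of entanglement of the Bell state $(|00\rangle + |11\rangle)/\sqrt{2}$ by tracing out one qubit:

```python
# Entropy of entanglement of a Bell state: von Neumann entropy of the
# reduced density matrix of one qubit.
import numpy as np

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)  # amplitudes in basis |00>,|01>,|10>,|11>
rho = np.outer(psi, psi.conj())                       # pure-state density matrix of the pair

# partial trace over the second qubit: rho_A[i, j] = sum_k rho[(i,k), (j,k)]
rho_A = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

eigvals = np.linalg.eigvalsh(rho_A)
eigvals = eigvals[eigvals > 1e-12]
S = -np.sum(eigvals * np.log2(eigvals))
print(rho_A)   # [[0.5, 0.0], [0.0, 0.5]] -> maximally mixed qubit
print(S)       # 1.0 bit of entanglement
```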
A depletion force is an effective attractive force that arises between large colloidal particles that are suspended in a dilute solution of depletants, which are smaller solutes that are preferentially excluded from the vicinity of the large particles. One of the earliest reports of depletion forces that lead to particle coagulation is that of Bondy, who observed the separation or "creaming" of rubber latex upon addition of polymer depletant molecules to solution. More generally, depletants can include polymers, micelles, osmolytes, ink, mud, or paint dispersed in a continuous phase.
Maximal entropy random walk (MERW) is a popular type of biased random walk on a graph, in which transition probabilities are chosen according to the principle of maximum entropy, which says that the probability distribution which best represents the current state of knowledge is the one with the largest entropy. While a standard random walk chooses for every vertex a uniform probability distribution among its outgoing edges, locally maximizing the entropy rate, MERW maximizes it globally by assuming a uniform probability distribution among all paths in a given graph.
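In practice the MERW transition probabilities can be written as $P_{ij} = (A_{ij}/\lambda)(\psi_j/\psi_i)$, where $\lambda$ and $\psi$ are the leading eigenvalue and eigenvector of the adjacency matrix $A$. A short sketch (the example graph is arbitrary):

```python
# MERW transition matrix P[i, j] = A[i, j] * psi[j] / (lam * psi[i]),
# built from the leading eigenpair of the adjacency matrix.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # undirected, connected example graph

eigvals, eigvecs = np.linalg.eigh(A)
lam = eigvals[-1]                           # leading (Perron) eigenvalue
psi = np.abs(eigvecs[:, -1])                # corresponding positive eigenvector

P = (A / lam) * np.outer(1.0 / psi, psi)
print(P.sum(axis=1))                        # each row sums to 1
```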
The Ryu–Takayanagi conjecture is a conjecture within holography that posits a quantitative relationship between the entanglement entropy of a conformal field theory and the geometry of an associated anti-de Sitter spacetime. The formula characterizes "holographic screens" in the bulk; that is, it specifies which regions of the bulk geometry are "responsible to particular information in the dual CFT". The conjecture is named after Shinsei Ryu and Tadashi Takayanagi, who jointly published the result in 2006. As a result, the authors were awarded the 2015 New Horizons in Physics Prize for "fundamental ideas about entropy in quantum field theory and quantum gravity". The formula was generalized to a covariant form in 2007.
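In its commonly quoted form the relationship reads

$$S_A = \frac{\mathrm{Area}(\gamma_A)}{4G_N},$$

where $\gamma_A$ is the minimal bulk surface homologous to the boundary region $A$ and $G_N$ is the bulk Newton constant.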
The Guderley–Landau–Stanyukovich problem describes the time evolution of converging shock waves. The problem was discussed by G. Guderley in 1942 and independently by Lev Landau and K. P. Stanyukovich in 1944; the latter authors' analysis was published in 1955.