Coarse-grained modeling

Coarse-grained modeling aims at simulating the behaviour of complex systems using coarse-grained (simplified) representations of them. Coarse-grained models are widely used for molecular modeling of biomolecules [1] [2] at various levels of granularity.

A wide range of coarse-grained models have been proposed. They are usually dedicated to the computational modeling of specific classes of molecules: proteins, [1] [2] nucleic acids, [3] [4] lipid membranes, [2] [5] carbohydrates [6] or water. [7] In these models, molecules are represented not by individual atoms but by "pseudo-atoms" that approximate groups of atoms, such as a whole amino acid residue (see the sketch below). By decreasing the number of degrees of freedom, much longer simulation timescales can be reached, at the expense of molecular detail. Coarse-grained models have found practical applications in molecular dynamics simulations. [1] Another case of interest is the simplification of a given discrete-state system, since very often descriptions of the same system are possible at different levels of detail. [8] [9] An example is given by the chemomechanical dynamics of a molecular machine such as kinesin. [8] [10]
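To make the mapping step concrete, here is a minimal sketch in Python/NumPy of the most common reduction, collapsing each residue to a single pseudo-atom at its centre of mass. The coordinates, masses and one-bead-per-residue mapping are invented for the example and do not follow any particular force field:

```python
import numpy as np

# Hypothetical input: per-atom coordinates (nm), masses (amu) and residue ids.
# A real workflow would read these from a structure file (e.g. PDB).
coords = np.array([[0.0, 0.0, 0.0],   # residue 0
                   [0.1, 0.0, 0.0],
                   [0.5, 0.2, 0.1],   # residue 1
                   [0.6, 0.2, 0.0],
                   [0.6, 0.3, 0.1]])
masses = np.array([12.0, 16.0, 12.0, 14.0, 1.0])
resids = np.array([0, 0, 1, 1, 1])

def coarse_grain(coords, masses, resids):
    """Map each residue onto one pseudo-atom at its centre of mass."""
    beads = []
    for r in np.unique(resids):
        sel = resids == r
        m = masses[sel]
        beads.append(m @ coords[sel] / m.sum())
    return np.array(beads)

beads = coarse_grain(coords, masses, resids)
print(beads.shape)  # (2, 3): five atoms reduced to two pseudo-atoms
```

The interactions between the resulting pseudo-atoms then have to be described by an effective coarse-grained potential, which is where the individual models differ.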

Coarse-grained modeling originates from work by Michael Levitt and Arieh Warshel in the 1970s. [11] [12] [13] Coarse-grained models are presently often used as components of multiscale modeling protocols, in combination with reconstruction tools [14] (from coarse-grained to atomistic representation) and atomistic-resolution models. [1] Atomistic-resolution models alone are presently not efficient enough to handle large system sizes and long simulation timescales. [1] [2]

The concept in statistical mechanics

Coarse graining and fine graining in statistical mechanics address the subject of entropy $S$, and thus the second law of thermodynamics. One has to realise that the concept of temperature $T$ cannot be attributed to an arbitrarily microscopic particle, since such a particle does not radiate thermally like a macroscopic "black body". However, one can attribute a nonzero entropy $S$ to an object with as few as two states, like a "bit" (and nothing else). The entropies of the two cases are called thermal entropy and von Neumann entropy respectively. [15] They are also distinguished by the terms coarse grained and fine grained respectively. This latter distinction is related to the aspect spelled out above and is elaborated on below.

The Liouville theorem (sometimes also called the Liouville equation)

$$\frac{d}{dt}\left(\Delta q\,\Delta p\right) = 0, \qquad \Delta q\,\Delta p = \mathrm{const.},$$

states that a phase-space volume $\Delta q\,\Delta p$ (spanned by $\Delta q$ and $\Delta p$, here in one spatial dimension) remains constant in the course of time, no matter where the point contained in $\Delta q\,\Delta p$ moves. This is a consideration in classical mechanics. In order to relate this view to macroscopic physics one surrounds each point $q, p$ e.g. with a sphere of some fixed volume, a procedure called coarse graining which lumps together points or states of similar behaviour. The trajectory of this sphere in phase space then covers also other points, and hence its volume in phase space grows. The entropy $S$ associated with this consideration, whether zero or not, is called coarse grained entropy or thermal entropy. A large number of such systems, i.e. the one under consideration together with many copies, is called an ensemble. If these systems do not interact with each other or anything else, and each has the same energy $E$, the ensemble is called a microcanonical ensemble. Each replica system appears with the same probability, and temperature $T$ does not enter.
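The interplay between exact volume conservation and growing coarse-grained entropy can be watched numerically. The sketch below is a toy demonstration under stated assumptions (a pendulum Hamiltonian $H = p^2/2 - \cos q$, a symplectic Euler integrator, and a 40 x 40 grid playing the role of the coarse-graining spheres); it is an illustration, not part of the cited treatments:

```python
import numpy as np

# Toy demonstration: an initially small blob of 10,000 phase-space points
# evolving under the pendulum Hamiltonian H = p^2/2 - cos q.
rng = np.random.default_rng(0)
q = rng.uniform(1.0, 1.2, 10_000)
p = rng.uniform(0.0, 0.2, 10_000)

def coarse_entropy(q, p, bins=40):
    """Shannon entropy of the distribution binned on a coarse grid."""
    h, _, _ = np.histogram2d(q, p, bins=bins,
                             range=[[-np.pi, np.pi], [-2.5, 2.5]])
    rho = h[h > 0] / h.sum()
    return -(rho * np.log(rho)).sum()

dt = 0.01
for step in range(3001):
    if step % 1000 == 0:
        print(f"t = {step*dt:5.1f}   S_coarse = {coarse_entropy(q, p):.3f}")
    p -= dt * np.sin(q)                               # symplectic Euler: kick,
    q = (q + dt * p + np.pi) % (2 * np.pi) - np.pi    # then drift (q wrapped)

# The integrator is symplectic, so the fine-grained phase-space volume of the
# blob is conserved (Liouville), yet S_coarse grows as the blob stretches into
# filaments that occupy ever more coarse-graining cells.
```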

Now suppose we define a probability density $\rho(q,p,t)$ describing the motion of the point with phase-space element $\Delta q\,\Delta p$. In the case of equilibrium or steady motion, the equation of continuity implies that the probability density $\rho$ is independent of time $t$. We take $\rho$ as nonzero only inside the phase-space volume $V_\Gamma$. One then defines the entropy $S$ by the relation

$$S = -k \sum_i \rho_i \ln \rho_i,$$

where

$$\sum_i \rho_i = 1,$$

the sum running over the elements of $V_\Gamma$. Then, by maximisation of $S$ for a given energy $E$, i.e. linking $\delta S = 0$ with the variation of the normalisation sum (equal to zero) via a Lagrange multiplier $\lambda$, one obtains (as in the case of a lattice of spins, or with a bit at each lattice point)

$$\rho_i = \frac{1}{N} \qquad \text{and} \qquad S = k \ln N,$$

where $N$ is the number of phase-space elements in $V_\Gamma$, the volume of $V_\Gamma$ thus being proportional to the exponential of $S$. This is again a consideration in classical mechanics.
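For completeness, the variational step behind this result can be spelled out (a standard calculation, using only the normalisation constraint introduced above):

$$\delta\left[-k \sum_i \rho_i \ln \rho_i - \lambda \Big(\sum_i \rho_i - 1\Big)\right] = 0 \quad\Rightarrow\quad -k\left(\ln \rho_i + 1\right) - \lambda = 0 \quad\Rightarrow\quad \rho_i = e^{-1-\lambda/k} = \mathrm{const.}$$

Normalisation over the $N$ elements fixes the constant to $1/N$, and substituting back gives $S = -k \sum_i (1/N)\ln(1/N) = k \ln N$.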

In quantum mechanics the phase space becomes a space of states, and the probability density $\rho$ becomes an operator with a subspace of states of dimension or number of states $N$ specified by a projection operator $P_N$, i.e. $\rho = P_N/N$. Then the entropy is (obtained as above)

$$S = k \ln N,$$

and is described as fine grained or von Neumann entropy. If $N = 1$, the entropy vanishes and the system is said to be in a pure state. Here the exponential of $S$ is proportional to the number of states. The microcanonical ensemble is again a large number of noninteracting copies of the given system, and $S$, energy $E$, etc. become ensemble averages.
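A quick numerical check of these statements (plain NumPy; the density matrices are made up for the example): for $\rho = P_N/N$ the fine grained entropy $-\mathrm{Tr}\,\rho \ln \rho$ evaluates to $\ln N$ in units of $k$, and it vanishes for a pure state:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho) in units of k, via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * ln 0 = 0
    return -(evals * np.log(evals)).sum()

pure = np.diag([1.0, 0.0, 0.0, 0.0])     # projector onto one state (N = 1)
mixed = np.eye(4) / 4.0                  # rho = P_N / N with N = 4

print(von_neumann_entropy(pure))              # 0.0, a pure state
print(von_neumann_entropy(mixed), np.log(4))  # both ln 4 = 1.386...
```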

Now consider the interaction of a given system with another one, or, in ensemble terminology, the given system and the large number of replicas all immersed in a big one called a heat bath, characterised by a probability density $\rho$. Since the systems interact only via the heat bath, the individual systems of the ensemble can have different energies $E_i$, depending on which energy state they are in. This interaction is described as entanglement, and the ensemble is called a canonical ensemble (the macrocanonical, i.e. grand canonical, ensemble also permits exchange of particles).

The interaction of the ensemble elements via the heat bath leads to temperature $T$, as we now show. [16] Considering two elements with energies $E_1$ and $E_2$, the probability of finding these in the heat bath is proportional to $\rho(E_1)\,\rho(E_2)$, and this is proportional to $\rho(E_1+E_2)$ if we consider the binary system as a system in the same heat bath defined by the function $\rho(E)$. It follows that

$$\rho(E) \propto e^{-\beta E}$$

(the only way to satisfy the proportionality $\rho(E_1)\,\rho(E_2) \propto \rho(E_1+E_2)$), where $\beta$ is a constant. Normalisation then implies

$$\rho(E_i) = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}} \equiv \frac{e^{-\beta E_i}}{Z}, \qquad \sum_i \rho(E_i) = 1.$$

Then in terms of ensemble averages

$$\langle E \rangle = \sum_i E_i\,\rho(E_i) = -\frac{\partial}{\partial \beta} \ln Z,$$

and

$$S = -k \sum_i \rho(E_i) \ln \rho(E_i) = k\left(\ln Z + \beta \langle E \rangle\right),$$

or $\beta = 1/kT$ by comparison with the second law of thermodynamics. $S$ is now the entanglement entropy or fine grained von Neumann entropy. This is zero if the system is in a pure state, and nonzero when it is in a mixed (entangled) state.
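These relations are easy to verify numerically. The sketch below (NumPy; the three-level spectrum is arbitrary) builds $\rho(E_i) = e^{-\beta E_i}/Z$, computes $\langle E \rangle$ both directly and as $-\partial \ln Z/\partial \beta$ via a finite difference, and checks $S/k = \ln Z + \beta \langle E \rangle$ against $-\sum_i \rho \ln \rho$:

```python
import numpy as np

E = np.array([0.0, 1.0, 2.5])   # arbitrary three-level spectrum
beta = 0.7                      # beta = 1/kT in these units

def Z(b):
    """Canonical partition function."""
    return np.exp(-b * E).sum()

rho = np.exp(-beta * E) / Z(beta)       # Boltzmann weights, normalised

E_avg = (E * rho).sum()                 # <E> directly
h = 1e-6                                # <E> = -d ln Z / d beta (central diff.)
E_avg_fd = -(np.log(Z(beta + h)) - np.log(Z(beta - h))) / (2 * h)

S_direct = -(rho * np.log(rho)).sum()        # -sum rho ln rho   (units of k)
S_formula = np.log(Z(beta)) + beta * E_avg   # ln Z + beta <E>

print(E_avg, E_avg_fd)       # the two averages agree
print(S_direct, S_formula)   # the two entropies agree
```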

Above we considered a system immersed in another huge one called a heat bath, with the possibility of heat exchange between them. Frequently one considers a different situation, i.e. two systems A and B with a small hole in the partition between them. Suppose B is originally empty but A contains an explosive device which fills A instantaneously with photons. Originally A and B have energies $E_A$ and $E_B$ respectively, and there is no interaction. Hence originally both are in pure quantum states and have zero fine grained entropies. Immediately after the explosion A is filled with photons, its energy still being $E_A$ and that of B still $E_B$ (no photon has yet escaped). Since A is filled with photons, these obey a Planck distribution law, and hence the coarse grained thermal entropy of A is nonzero (recall: many configurations of the photons in A are possible, with one distribution maximal), although the fine grained quantum mechanical entropy is still zero (the system is in a single energy state), as is that of B.

Now allow photons to leak slowly (i.e. with no disturbance of the equilibrium) from A to B. With fewer photons in A, its coarse grained entropy diminishes, while that of B increases. This entanglement of A and B implies that they are now quantum mechanically in mixed states, and so their fine grained entropies are no longer zero. Finally, when all photons are in B, the coarse grained entropy of A as well as its fine grained entropy vanish, and A is again in a pure state, but with new energy. On the other hand, B now has an increased thermal entropy; but since the entanglement is over, it is quantum mechanically again in a pure state, its ground state, and that has zero fine grained von Neumann entropy. Consider B: in the course of the entanglement with A, B started and ended in a pure state (thus with zero fine grained or entanglement entropy), while its coarse grained entropy rose from zero to its final nonzero value. Roughly half way through the procedure the entanglement entropy of B reaches a maximum and then decreases to zero at the end.
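The rise and fall of the entanglement entropy described above can be illustrated with a deliberately simple toy model (an invented sketch, not the photon-gas calculation itself): take $n$ excitations, each independently transferred to B with probability $f$, so that the joint pure state $\sum_k c_k\,|k\rangle_A |n-k\rangle_B$ has Schmidt coefficients $|c_k|^2 = \binom{n}{k} f^k (1-f)^{n-k}$. The entanglement entropy of either subsystem is then the Shannon entropy of this distribution:

```python
import numpy as np
from math import comb

n = 50  # total number of excitations in the toy model
for f in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    # Schmidt spectrum of the bipartite pure state at leaked fraction f
    p = np.array([comb(n, k) * f**k * (1 - f)**(n - k) for k in range(n + 1)])
    p = p[p > 0]
    S = -(p * np.log(p)).sum()
    print(f"leaked fraction {f:4.2f}: S_entanglement/k = {S:.3f}")

# S vanishes at f = 0 and f = 1 (pure states of A and B separately) and
# peaks near f = 0.5, reproducing the behaviour described in the text.
```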

The classical coarse grained thermal entropy of the second law of thermodynamics is not the same as the (usually smaller) quantum mechanical fine grained entropy. The difference is called information. As may be deduced from the foregoing arguments, this difference is roughly zero before the entanglement entropy (which is the same for A and B) attains its maximum. An example of coarse graining is provided by Brownian motion. [17]

Related Research Articles

<span class="mw-page-title-main">Quantum entanglement</span> Correlation between quantum systems

Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.

<span class="mw-page-title-main">Stimulated emission</span> Release of a photon triggered by another

Stimulated emission is the process by which an incoming photon of a specific frequency can interact with an excited atomic electron, causing it to drop to a lower energy level. The liberated energy transfers to the electromagnetic field, creating a new photon with a frequency, polarization, and direction of travel that are all identical to the photons of the incident wave. This is in contrast to spontaneous emission, which occurs at a characteristic rate for each of the atoms/oscillators in the upper energy state regardless of the external electromagnetic field.

<span class="mw-page-title-main">Ideal gas</span> Mathematical model which approximates the behavior of real gases

An ideal gas is a theoretical gas composed of many randomly moving point particles that are not subject to interparticle interactions. The ideal gas concept is useful because it obeys the ideal gas law, a simplified equation of state, and is amenable to analysis under statistical mechanics. The requirement of zero interaction can often be relaxed if, for example, the interaction is perfectly elastic or regarded as point-like collisions.

<span class="mw-page-title-main">Partition function (statistical mechanics)</span> Function in thermodynamics and statistical physics

In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.

In physics, Liouville's theorem, named after the French mathematician Joseph Liouville, is a key theorem in classical statistical and Hamiltonian mechanics. It asserts that the phase-space distribution function is constant along the trajectories of the system—that is that the density of system points in the vicinity of a given system point traveling through phase-space is constant with time. This time-independent density is in statistical mechanics known as the classical a priori probability.

<span class="mw-page-title-main">Isentropic process</span> Thermodynamic process that is reversible and adiabatic

An isentropic process is an idealized thermodynamic process that is both adiabatic and reversible. The work transfers of the system are frictionless, and there is no net transfer of heat or matter. Such an idealized process is useful in engineering as a model of and basis of comparison for real processes. This process is idealized because reversible processes do not occur in reality; thinking of a process as both adiabatic and reversible would show that the initial and final entropies are the same, thus, the reason it is called isentropic. Thermodynamic processes are named based on the effect they would have on the system. Even though in reality it is not necessarily possible to carry out an isentropic process, some may be approximated as such.

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency to decrease in the quantity H in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

The equilibrium constant of a chemical reaction is the value of its reaction quotient at chemical equilibrium, a state approached by a dynamic chemical system after sufficient time has elapsed at which its composition has no measurable tendency towards further change. For a given set of reaction conditions, the equilibrium constant is independent of the initial analytical concentrations of the reactant and product species in the mixture. Thus, given the initial composition of a system, known equilibrium constant values can be used to determine the composition of the system at equilibrium. However, reaction parameters like temperature, solvent, and ionic strength may all influence the value of the equilibrium constant.

In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

The Wang and Landau algorithm, proposed by Fugao Wang and David P. Landau, is a Monte Carlo method designed to estimate the density of states of a system. The method performs a non-Markovian random walk to build the density of states by quickly visiting all the available energy spectrum. The Wang and Landau algorithm is an important method to obtain the density of states required to perform a multicanonical simulation.

In astrophysics, what is referred to as "entropy" is actually the adiabatic constant derived as follows.

The Widom insertion method is a statistical thermodynamic approach to the calculation of material and mixture properties. It is named for Benjamin Widom, who derived it in 1963. In general, there are two theoretical approaches to determining the statistical mechanical properties of materials. The first is the direct calculation of the overall partition function of the system, which directly yields the system free energy. The second approach, known as the Widom insertion method, instead derives from calculations centering on one molecule. The Widom insertion method directly yields the chemical potential of one component rather than the system free energy. This approach is most widely applied in molecular computer simulations but has also been applied in the development of analytical statistical mechanical models. The Widom insertion method can be understood as an application of the Jarzynski equality since it measures the excess free energy difference via the average work needed to perform, when changing the system from a state with N molecules to a state with N+1 molecules. Therefore it measures the excess chemical potential since , where .

In quantum mechanics, and especially quantum information theory, the purity of a normalized quantum state is a scalar defined as where is the density matrix of the state and is the trace operation. The purity defines a measure on quantum states, giving information on how much a state is mixed.

<span class="mw-page-title-main">Mass–action ratio</span> Definition of the mass-action ratio, as used in chemistry

The mass–action ratio, often denoted by , is the ratio of the product concentrations, p, to reactant concentrations, s. The concentrations may or may not be at equilibrium.

In quantum mechanics, negativity is a measure of quantum entanglement which is easy to compute. It is a measure deriving from the PPT criterion for separability. It has shown to be an entanglement monotone and hence a proper measure of entanglement.

The entropy of entanglement is a measure of the degree of quantum entanglement between two subsystems constituting a two-part composite quantum system. Given a pure bipartite quantum state of the composite system, it is possible to obtain a reduced density matrix describing knowledge of the state of a subsystem. The entropy of entanglement is the Von Neumann entropy of the reduced density matrix for any of the subsystems. If it is non-zero, i.e. the subsystem is in a mixed state, it indicates the two subsystems are entangled.

A depletion force is an effective attractive force that arises between large colloidal particles that are suspended in a dilute solution of depletants, which are smaller solutes that are preferentially excluded from the vicinity of the large particles. One of the earliest reports of depletion forces that lead to particle coagulation is that of Bondy, who observed the separation or "creaming" of rubber latex upon addition of polymer depletant molecules to solution. More generally, depletants can include polymers, micelles, osmolytes, ink, mud, or paint dispersed in a continuous phase.

Maximal entropy random walk (MERW) is a popular type of biased random walk on a graph, in which transition probabilities are chosen accordingly to the principle of maximum entropy, which says that the probability distribution which best represents the current state of knowledge is the one with largest entropy. While standard random walk chooses for every vertex uniform probability distribution among its outgoing edges, locally maximizing entropy rate, MERW maximizes it globally by assuming uniform probability distribution among all paths in a given graph.

The Ryu–Takayanagi conjecture is a conjecture within holography that posits a quantitative relationship between the entanglement entropy of a conformal field theory and the geometry of an associated anti-de Sitter spacetime. The formula characterizes "holographic screens" in the bulk; that is, it specifies which regions of the bulk geometry are "responsible to particular information in the dual CFT". The conjecture is named after Shinsei Ryu and Tadashi Takayanagi, who jointly published the result in 2006. As a result, the authors were awarded the 2015 New Horizons in Physics Prize for "fundamental ideas about entropy in quantum field theory and quantum gravity". The formula was generalized to a covariant form in 2007.

Guderley–Landau–Stanyukovich problem describes the time evolution of converging shock waves. The problem was discussed by G. Guderley in 1942 and independently by Lev Landau and K. P. Stanyukovich in 1944, where the later authors' analysis was published in 1955.

References

  1. Kmiecik S, Gront D, Kolinski M, Wieteska L, Dawid AE, Kolinski A (July 2016). "Coarse-Grained Protein Models and Their Applications". Chemical Reviews. 116 (14): 7898–7936. doi:10.1021/acs.chemrev.6b00163. PMID 27333362.
  2. Ingólfsson HI, Lopez CA, Uusitalo JJ, de Jong DH, Gopal SM, Periole X, Marrink SJ (May 2014). "The power of coarse graining in biomolecular simulations". Wiley Interdisciplinary Reviews: Computational Molecular Science. 4 (3): 225–248. doi:10.1002/wcms.1169. PMC 4171755. PMID 25309628.
  3. Boniecki MJ, Lach G, Dawson WK, Tomala K, Lukasz P, Soltysinski T, et al. (April 2016). "SimRNA: a coarse-grained method for RNA folding simulations and 3D structure prediction". Nucleic Acids Research. 44 (7): e63. doi:10.1093/nar/gkv1479. PMC 4838351. PMID 26687716.
  4. Potoyan DA, Savelyev A, Papoian GA (2013). "Recent successes in coarse-grained modeling of DNA". Wiley Interdisciplinary Reviews: Computational Molecular Science. 3 (1): 69–83. doi:10.1002/wcms.1114. ISSN 1759-0884. S2CID 12043343.
  5. Baron R, Trzesniak D, de Vries AH, Elsener A, Marrink SJ, van Gunsteren WF (February 2007). "Comparison of thermodynamic properties of coarse-grained and atomic-level simulation models". ChemPhysChem. 8 (3): 452–461. doi:10.1002/cphc.200600658. hdl:11370/92eedd39-1d54-45a4-bd8b-066349852bfb. PMID 17290360.
  6. López CA, Rzepiela AJ, de Vries AH, Dijkhuizen L, Hünenberger PH, Marrink SJ (December 2009). "Martini Coarse-Grained Force Field: Extension to Carbohydrates". Journal of Chemical Theory and Computation. 5 (12): 3195–3210. doi:10.1021/ct900313w. PMID 26602504.
  7. Hadley KR, McCabe C (July 2012). "Coarse-Grained Molecular Models of Water: A Review". Molecular Simulation. 38 (8–9): 671–681. doi:10.1080/08927022.2012.671942. PMC 3420348. PMID 22904601.
  8. Seiferth D, Sollich P, Klumpp S (December 2020). "Coarse graining of biochemical systems described by discrete stochastic dynamics". Physical Review E. 102 (6): 062149. arXiv:2102.13394. Bibcode:2020PhRvE.102f2149S. doi:10.1103/PhysRevE.102.062149. PMID 33466014. S2CID 231652939.
  9. Hummer G, Szabo A (July 2015). "Optimal Dimensionality Reduction of Multistate Kinetic and Markov-State Models". The Journal of Physical Chemistry B. 119 (29): 9029–9037. doi:10.1021/jp508375q. PMC 4516310. PMID 25296279.
  10. Liepelt S, Lipowsky R (June 2007). "Kinesin's network of chemomechanical motor cycles". Physical Review Letters. 98 (25): 258102. Bibcode:2007PhRvL..98y8102L. doi:10.1103/PhysRevLett.98.258102. PMID 17678059.
  11. Levitt M, Warshel A (February 1975). "Computer simulation of protein folding". Nature. 253 (5494): 694–698. Bibcode:1975Natur.253..694L. doi:10.1038/253694a0. PMID 1167625. S2CID 4211714.
  12. Warshel A, Levitt M (May 1976). "Theoretical studies of enzymic reactions: dielectric, electrostatic and steric stabilization of the carbonium ion in the reaction of lysozyme". Journal of Molecular Biology. 103 (2): 227–249. doi:10.1016/0022-2836(76)90311-9. PMID 985660.
  13. Levitt M (September 2014). "Birth and future of multiscale modeling for macromolecular systems (Nobel Lecture)". Angewandte Chemie. 53 (38): 10006–10018. doi:10.1002/anie.201403691. PMID 25100216. S2CID 3680673.
  14. Badaczewska-Dawid AE, Kolinski A, Kmiecik S (2020). "Computational reconstruction of atomistic protein structures from coarse-grained models". Computational and Structural Biotechnology Journal. 18: 162–176. doi:10.1016/j.csbj.2019.12.007. PMC 6961067. PMID 31969975.
  15. Susskind L, Lindesay J (2005). Black Holes, Information and the String Theory Revolution. World Scientific. pp. 69–77. ISBN 981-256-131-5.
  16. Müller-Kirsten HJ (2013). Basics of Statistical Physics (2nd ed.). World Scientific. pp. 28–31, 152–167. ISBN 978-981-4449-53-3.
  17. Muntean A, Rademacher JD, Zagaris A (2016). Macroscopic and Large Scale Phenomena: Coarse Graining, Mean Field Limits and Ergodicity. Springer. ISBN 978-3-319-26883-5.