Coarse-grained modeling


Coarse-grained modeling aims at simulating the behaviour of complex systems using coarse-grained (simplified) representations of them. Coarse-grained models are widely used for molecular modeling of biomolecules [1] [2] at various levels of granularity.

A wide range of coarse-grained models have been proposed. They are usually dedicated to computational modeling of specific molecules: proteins, [1] [2] nucleic acids, [3] [4] lipid membranes, [2] [5] carbohydrates [6] or water. [7] In these models, molecules are represented not by individual atoms but by "pseudo-atoms" approximating groups of atoms, such as a whole amino acid residue. By decreasing the number of degrees of freedom, much longer simulation timescales can be studied at the expense of molecular detail. Coarse-grained models have found practical applications in molecular dynamics simulations. [1] Another case of interest is the simplification of a given discrete-state system, since descriptions of the same system at different levels of detail are often possible. [8] [9] An example is given by the chemomechanical dynamics of a molecular machine, such as kinesin. [8] [10]
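In the simplest schemes, the mapping from atoms to pseudo-atoms is just a mass-weighted average of atomic coordinates. The following sketch (a minimal illustration, not any published force field; the atom records and the `coarse_grain` helper are hypothetical) collapses each residue to a single bead at its centre of mass:

```python
from collections import defaultdict

# Hypothetical all-atom records: (residue_id, atom_name, mass_amu, xyz).
atoms = [
    (1, "N",  14.007, (0.0, 0.0, 0.0)),
    (1, "CA", 12.011, (1.5, 0.0, 0.0)),
    (1, "C",  12.011, (2.1, 1.3, 0.0)),
    (2, "N",  14.007, (3.4, 1.5, 0.2)),
    (2, "CA", 12.011, (4.6, 0.8, 0.6)),
]

def coarse_grain(atoms):
    """One-bead-per-residue mapping: place each pseudo-atom (bead) at the
    mass-weighted average (centre of mass) of its residue's atoms."""
    by_residue = defaultdict(list)
    for res_id, _name, mass, xyz in atoms:
        by_residue[res_id].append((mass, xyz))
    beads = {}
    for res_id, members in by_residue.items():
        total_mass = sum(m for m, _ in members)
        com = tuple(sum(m * x[i] for m, x in members) / total_mass
                    for i in range(3))
        beads[res_id] = (total_mass, com)  # the bead lumps the group's mass
    return beads

print(coarse_grain(atoms))
```

A one-bead-per-residue mapping of this kind underlies many protein models; production coarse-grained force fields additionally parameterise the interaction potentials between the beads.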

Coarse-grained modeling originates from work by Michael Levitt and Arieh Warshel in the 1970s. [11] [12] [13] Coarse-grained models are presently often used as components of multiscale modeling protocols in combination with reconstruction tools [14] (from coarse-grained to atomistic representation) and atomistic resolution models. [1] Atomistic resolution models alone are presently not efficient enough to handle large system sizes and long simulation timescales. [1] [2]

Coarse graining and fine graining in statistical mechanics address the subject of entropy, and thus the second law of thermodynamics. One has to realise that the concept of temperature cannot be attributed to an arbitrarily microscopic particle, since such a particle does not radiate thermally like a macroscopic body or "black body". However, one can attribute a nonzero entropy to an object with as few as two states, like a "bit" (and nothing else). The entropies of the two cases are called thermal entropy and von Neumann entropy respectively. [15] They are also distinguished by the terms coarse grained and fine grained respectively. This latter distinction is related to the aspect spelled out above and is elaborated on below.

The Liouville theorem (sometimes also called the Liouville equation)

$$\frac{d}{dt}\left(\Delta q\,\Delta p\right) = 0$$

states that a phase space volume $\Delta q\,\Delta p$ (spanned by coordinate $q$ and momentum $p$, here in one spatial dimension) remains constant in the course of time, no matter where the point contained in $\Delta q\,\Delta p$ moves. This is a consideration in classical mechanics. In order to relate this view to macroscopic physics one surrounds each point $q, p$ e.g. with a sphere of some fixed volume, a procedure called coarse graining which lumps together points or states of similar behaviour. The trajectory of this sphere in phase space then also covers other points, and hence its volume in phase space grows. The entropy associated with this consideration, whether zero or not, is called coarse grained entropy or thermal entropy. A large number of such systems, i.e. the one under consideration together with many copies, is called an ensemble. If these systems do not interact with each other or anything else, and each has the same energy $E$, the ensemble is called a microcanonical ensemble. Each replica system appears with the same probability, and temperature does not enter.
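The growth of the coarse-grained volume can be made concrete with a small numerical experiment. The sketch below (illustrative only; the pendulum Hamiltonian, the initial blob, and the cell size are arbitrary choices, not from the source) evolves an ensemble of phase-space points with an area-preserving integrator and counts how many coarse-graining cells the ensemble occupies. The fine-grained volume is conserved, but the cell count, and with it a coarse grained entropy $k\ln(\text{cells})$, grows as the blob filaments:

```python
import numpy as np

# Pendulum H = p^2/2 - cos(q); symplectic Euler is area-preserving,
# so the fine-grained phase-space volume of the cloud is conserved.
def step(q, p, dt=0.05):
    p = p - dt * np.sin(q)
    q = q + dt * p
    return q, p

def occupied_cells(q, p, cell=0.1):
    """Coarse graining: count distinct phase-space cells of side `cell`
    containing at least one ensemble member."""
    return len(set(zip(np.floor(q / cell).astype(int),
                       np.floor(p / cell).astype(int))))

rng = np.random.default_rng(0)
n = 20000
q = rng.uniform(1.0, 1.2, n)   # small initial blob; nonlinear shearing
p = rng.uniform(1.0, 1.2, n)   # stretches it into thin filaments

for t in range(601):
    if t % 200 == 0:
        c = occupied_cells(q, p)
        print(f"t={t:3d}  occupied cells={c:5d}  S/k ~ ln(cells)={np.log(c):.2f}")
    q, p = step(q, p)
```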

Now suppose we define a probability density $\rho(q,p,t)$ describing the motion of the point with phase space element $\Delta q\,\Delta p$. In the case of equilibrium or steady motion, the equation of continuity implies that the probability density $\rho$ is independent of time $t$. We take $\rho$ as nonzero only inside the phase space volume $\Gamma$. One then defines the entropy $S$ by the relation

$$S = -k\sum \rho \ln \rho, \qquad \text{where} \qquad \sum \rho = 1,$$

the sum extending over the elementary cells of the phase space volume. Then, by maximisation of $S$ for a given energy $E$, i.e. linking $\delta S = 0$ with the variation of the other sum, $\delta \sum \rho E = 0$, via a Lagrange multiplier $\lambda$, one obtains (as in the case of a lattice of spins or with a bit at each lattice point)

$$\rho = \frac{1}{\Gamma} \qquad \text{and} \qquad S = k\ln\Gamma,$$

the volume of $\Gamma$ being proportional to the exponential of $S$. This is again a consideration in classical mechanics.
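For completeness, the variational step can be spelled out (a routine Lagrange-multiplier computation consistent with the text above; since $\rho$ is nonzero only where the energy equals $E$, the energy constraint merges with the normalisation):

```latex
% Maximise  S = -k \sum \rho \ln \rho  subject to  \sum \rho = 1:
\delta\Big( -k\sum\rho\ln\rho \;-\; \lambda\sum\rho \Big) = 0
\quad\Longrightarrow\quad -k\left(\ln\rho + 1\right) - \lambda = 0 ,
% so \rho is the same constant in every occupied cell. Normalisation
% over the \Gamma occupied cells then gives
\rho = \frac{1}{\Gamma},
\qquad
S = -k\sum_{\Gamma}\frac{1}{\Gamma}\ln\frac{1}{\Gamma} = k\ln\Gamma .
```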

In quantum mechanics the phase space becomes a space of states, and the probability density $\rho$ an operator with a subspace of states of dimension or number of states $N$ specified by a projection operator $P_N$. Then the entropy is (obtained as above)

$$S = k\ln N$$

and is described as fine grained or von Neumann entropy. If $N = 1$, the entropy vanishes and the system is said to be in a pure state. Here the exponential of $S$ is proportional to the number of states. The microcanonical ensemble is again a large number of noninteracting copies of the given system, and $S$, energy $E$, etc. become ensemble averages.
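As a concrete check (a minimal numpy sketch with arbitrary example density matrices), the von Neumann entropy $S = -k\,\mathrm{Tr}(\rho\ln\rho)$ can be evaluated from the eigenvalues of $\rho$; for $\rho = P_N/N$, a uniformly weighted projection onto $N$ states, it reduces to $k\ln N$, and for a pure state it vanishes:

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S = -k Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 ln 0 -> 0
    return -k * np.sum(evals * np.log(evals))

# Pure state (N = 1): projector onto a single state -> S = 0.
pure = np.diag([1.0, 0.0, 0.0, 0.0])
# Uniformly mixed over N = 4 states: rho = P_N / N -> S = k ln 4.
mixed = np.eye(4) / 4.0

print(von_neumann_entropy(pure))              # ~0.0
print(von_neumann_entropy(mixed), np.log(4))  # both ~1.386
```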

Now consider the interaction of a given system with another one, or in ensemble terminology, the given system and the large number of replicas all immersed in a big one called a heat bath, characterised by a probability density function $\rho$. Since the systems interact only via the heat bath, the individual systems of the ensemble can have different energies, depending on which energy state they are in. This interaction is described as entanglement, and the ensemble as a canonical ensemble (the grand canonical ensemble permits also exchange of particles).

The interaction of the ensemble elements via the heat bath leads to temperature $T$, as we now show. [16] Considering two elements with energies $E_1, E_2$, the probability of finding these in the heat bath is proportional to $\rho(E_1)\,\rho(E_2)$, and this is proportional to $\rho(E_1 + E_2)$ if we consider the binary system as a system in the same heat bath defined by the function $\rho$. It follows that $\rho(E) \propto e^{-\beta E}$ (the only way to satisfy the proportionality), where $\beta$ is a constant. Normalisation then implies

$$\rho(E) = \frac{e^{-\beta E}}{\sum e^{-\beta E}}, \qquad \sum \rho(E) = 1.$$

Then, in terms of ensemble averages,

$$\langle E \rangle = \sum E\,\rho(E) \qquad \text{and} \qquad S = -k\sum \rho \ln \rho.$$

Substituting $\rho(E)$ and using $d\ln\sum e^{-\beta E} = -\langle E\rangle\, d\beta$, one finds

$$dS = k\beta\, d\langle E \rangle,$$

so that comparison with the second law of thermodynamics, $dS = d\langle E\rangle/T$, identifies $\beta = 1/kT$. $S$ is now the entanglement entropy or fine grained von Neumann entropy. This is zero if the system is in a pure state, and nonzero when the system is in a mixed (entangled) state.
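The identification $\beta = 1/kT$ can be verified numerically. The sketch below (illustrative; a two-level system with unit level spacing and $k = 1$ is an arbitrary test case) builds the normalised Boltzmann weights, computes $\langle E\rangle$ and $S$, and checks that $dS \approx d\langle E\rangle / T$ for a small temperature change:

```python
import numpy as np

E = np.array([0.0, 1.0])   # two-level system; units with k = 1

def canonical(beta):
    w = np.exp(-beta * E)
    rho = w / w.sum()                # rho(E) = e^{-beta E} / sum e^{-beta E}
    E_avg = np.sum(E * rho)          # <E>
    S = -np.sum(rho * np.log(rho))   # S (in units of k)
    return E_avg, S

T, dT = 0.7, 1e-6
E1, S1 = canonical(1.0 / T)
E2, S2 = canonical(1.0 / (T + dT))

print("dS       =", S2 - S1)
print("d<E> / T =", (E2 - E1) / T)   # agrees to leading order in dT
```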

Above we considered a system immersed in another huge one called a heat bath, with the possibility of heat exchange between them. Frequently one considers a different situation, i.e. two systems A and B with a small hole in the partition between them. Suppose B is originally empty but A contains an explosive device which fills A instantaneously with photons. Originally A and B have energies $E$ and $0$ respectively, and there is no interaction. Hence originally both are in pure quantum states and have zero fine grained entropies. Immediately after the explosion A is filled with photons, its energy still being $E$ and that of B still $0$ (no photon has yet escaped). Since A is filled with photons, these obey a Planck distribution law, and hence the coarse grained thermal entropy of A is nonzero (recall: many configurations of the photons in A, many states with one maximal), although the fine grained quantum mechanical entropy is still zero (same energy state), as is that of B.

Now allow photons to leak slowly (i.e. with no disturbance of the equilibrium) from A to B. With fewer photons in A, its coarse grained entropy diminishes, but that of B increases. This entanglement of A and B implies that they are now quantum mechanically in mixed states, and so their fine grained entropies are no longer zero. Finally, when all photons are in B, the coarse grained entropy of A, as well as its fine grained entropy, vanishes, and A is again in a pure state but with new energy. On the other hand, B now has an increased thermal entropy, but since the entanglement is over it is quantum mechanically again in a pure state, its ground state, and that has zero fine grained von Neumann entropy. Consider B: in the course of the entanglement with A it started and ended in a pure state (thus with zero fine grained or entanglement entropy). Its coarse grained entropy, however, rose from zero to its final nonzero value. Roughly half way through the procedure the entanglement entropy of B reaches a maximum and then decreases to zero at the end.
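The rise and fall of B's entanglement entropy can be caricatured with a deliberately crude toy model (not from the source; every modelling choice below is an assumption). Each of $N$ photon modes is taken to sit in A with probability $p$, which falls from 1 to 0 as the leak proceeds; treating a mode as a two-level A/B degree of freedom in the state $\sqrt{p}\,|A\rangle + \sqrt{1-p}\,|B\rangle$, the entanglement entropy per mode is the binary entropy of $p$, vanishing at the endpoints (pure states) and peaking at $p = 1/2$, while B's photon count, and with it its coarse grained entropy, grows monotonically:

```python
import math

def binary_entropy(p):
    """Entanglement entropy (in units of k) of one A/B mode."""
    if p <= 0.0 or p >= 1.0:
        return 0.0               # pure state: zero entropy
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

N = 1000                         # number of photon modes (arbitrary)
for p in (1.0, 0.75, 0.5, 0.25, 0.0):
    s_ent = N * binary_entropy(p)    # rises, peaks at p = 1/2, falls
    n_b = N * (1 - p)                # photons in B grow monotonically
    print(f"p(in A)={p:4.2f}  S_ent/k={s_ent:7.1f}  photons in B={n_b:6.0f}")
```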

The classical coarse grained thermal entropy of the second law of thermodynamics is not the same as the (mostly smaller) quantum mechanical fine grained entropy. The difference is called information. As may be deduced from the foregoing arguments, this difference is roughly zero before the entanglement entropy (which is the same for A and B) attains its maximum. An example of coarse graining is provided by Brownian motion. [17]

References

1. Kmiecik S, Gront D, Kolinski M, Wieteska L, Dawid AE, Kolinski A (July 2016). "Coarse-Grained Protein Models and Their Applications". Chemical Reviews. 116 (14): 7898–936. doi:10.1021/acs.chemrev.6b00163. PMID 27333362.
2. Ingólfsson HI, Lopez CA, Uusitalo JJ, de Jong DH, Gopal SM, Periole X, Marrink SJ (May 2014). "The power of coarse graining in biomolecular simulations". Wiley Interdisciplinary Reviews: Computational Molecular Science. 4 (3): 225–248. doi:10.1002/wcms.1169. PMC 4171755. PMID 25309628.
3. Boniecki MJ, Lach G, Dawson WK, Tomala K, Lukasz P, Soltysinski T, et al. (April 2016). "SimRNA: a coarse-grained method for RNA folding simulations and 3D structure prediction". Nucleic Acids Research. 44 (7): e63. doi:10.1093/nar/gkv1479. PMC 4838351. PMID 26687716.
4. Potoyan DA, Savelyev A, Papoian GA (2013). "Recent successes in coarse-grained modeling of DNA". Wiley Interdisciplinary Reviews: Computational Molecular Science. 3 (1): 69–83. doi:10.1002/wcms.1114. ISSN 1759-0884. S2CID 12043343.
5. Baron R, Trzesniak D, de Vries AH, Elsener A, Marrink SJ, van Gunsteren WF (February 2007). "Comparison of thermodynamic properties of coarse-grained and atomic-level simulation models". ChemPhysChem. 8 (3): 452–61. doi:10.1002/cphc.200600658. hdl:11370/92eedd39-1d54-45a4-bd8b-066349852bfb. PMID 17290360.
6. López CA, Rzepiela AJ, de Vries AH, Dijkhuizen L, Hünenberger PH, Marrink SJ (December 2009). "Martini Coarse-Grained Force Field: Extension to Carbohydrates". Journal of Chemical Theory and Computation. 5 (12): 3195–210. doi:10.1021/ct900313w. PMID 26602504.
7. Hadley KR, McCabe C (July 2012). "Coarse-Grained Molecular Models of Water: A Review". Molecular Simulation. 38 (8–9): 671–681. doi:10.1080/08927022.2012.671942. PMC 3420348. PMID 22904601.
8. Seiferth D, Sollich P, Klumpp S (December 2020). "Coarse graining of biochemical systems described by discrete stochastic dynamics". Physical Review E. 102 (6–1): 062149. arXiv:2102.13394. Bibcode:2020PhRvE.102f2149S. doi:10.1103/PhysRevE.102.062149. PMID 33466014. S2CID 231652939.
9. Hummer G, Szabo A (July 2015). "Optimal Dimensionality Reduction of Multistate Kinetic and Markov-State Models". The Journal of Physical Chemistry B. 119 (29): 9029–37. doi:10.1021/jp508375q. PMC 4516310. PMID 25296279.
10. Liepelt S, Lipowsky R (June 2007). "Kinesin's network of chemomechanical motor cycles". Physical Review Letters. 98 (25): 258102. Bibcode:2007PhRvL..98y8102L. doi:10.1103/PhysRevLett.98.258102. PMID 17678059.
11. Levitt M, Warshel A (February 1975). "Computer simulation of protein folding". Nature. 253 (5494): 694–8. Bibcode:1975Natur.253..694L. doi:10.1038/253694a0. PMID 1167625. S2CID 4211714.
12. Warshel A, Levitt M (May 1976). "Theoretical studies of enzymic reactions: dielectric, electrostatic and steric stabilization of the carbonium ion in the reaction of lysozyme". Journal of Molecular Biology. 103 (2): 227–49. doi:10.1016/0022-2836(76)90311-9. PMID 985660.
13. Levitt M (September 2014). "Birth and future of multiscale modeling for macromolecular systems (Nobel Lecture)". Angewandte Chemie. 53 (38): 10006–18. doi:10.1002/anie.201403691. PMID 25100216. S2CID 3680673.
14. Badaczewska-Dawid AE, Kolinski A, Kmiecik S (2020). "Computational reconstruction of atomistic protein structures from coarse-grained models". Computational and Structural Biotechnology Journal. 18: 162–176. doi:10.1016/j.csbj.2019.12.007. PMC 6961067. PMID 31969975.
15. Susskind L, Lindesay J (2005). Black Holes, Information and the String Theory Revolution. World Scientific. pp. 69–77. ISBN 981-256-131-5.
16. Müller-Kirsten HJ (2013). Basics of Statistical Physics (2nd ed.). World Scientific. pp. 28–31, 152–167. ISBN 978-981-4449-53-3.
17. Muntean A, Rademacher JD, Zagaris A (2016). Macroscopic and Large Scale Phenomena: Coarse Graining, Mean Field Limits and Ergodicity. Springer. ISBN 978-3-319-26883-5.