Molecular demon

Fig.1 Schematic figure of Maxwell's demon thought experiment. The demon distinguishes fast-moving molecules from slow-moving ones and opens the small hatch selectively to let the fast-moving molecules pass from A to B and the slow-moving molecules from B to A. Compartment B heats up whereas A cools down with respect to the average temperature, although no work is done. This seems to contradict the second law of thermodynamics. But the ability to distinguish requires the gain of information, which is a form of energy; therefore, the system obeys the second law of thermodynamics inasmuch as information is first gained and then erased.

A molecular demon or biological molecular machine is a biological macromolecule that resembles, and seems to have the same properties as, Maxwell's demon. These macromolecules gather information in order to recognize their substrate or ligand among the myriad other molecules floating in the intracellular or extracellular fluid. This molecular recognition represents an information gain, which is equivalent to an energy gain or a decrease in entropy. When the demon is reset, i.e., when the ligand is released, the information is erased, energy is dissipated, and entropy increases, in keeping with the second law of thermodynamics. [1] The difference between biological molecular demons and the thought experiment of Maxwell's demon is the latter's apparent violation of the second law. [2] [3]

Fig.2 The protein demon (blue) and the substrate or ligand (orange) go through a cycle in which the electromagnetic interaction (1' --> 2) between the two, following the induced fit, causes a conformational change upon which the substrate is released (2'). Hydrolysis of ATP brings the protein back to its original state.

Cycle

The molecular demon switches mainly between two conformations. In the first, or basic, state, the demon recognizes and binds its ligand or substrate through an induced fit, then undergoes a conformational change that leads to the second, quasi-stable state: the protein–ligand complex. Resetting the protein to its original, basic state requires ATP. When ATP is hydrolyzed, the ligand is released and the demon reverts to its basic state, able to acquire information again. The cycle may then start anew. [1]
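The two-conformation cycle described above can be sketched as a toy state machine (an illustrative sketch, not from the source; the class and method names are invented):

```python
# Toy model of the demon's cycle: basic state -> bound complex -> ATP-driven reset.
# Illustrative only; real conformational dynamics are continuous and stochastic.
class MolecularDemon:
    def __init__(self):
        self.state = "basic"          # ready to recognize a ligand

    def bind(self, ligand_fits):
        # Induced fit: only a matching ligand triggers the conformational change
        if self.state == "basic" and ligand_fits:
            self.state = "complex"    # quasi-stable protein-ligand complex
            return True
        return False

    def reset(self, atp_available):
        # ATP hydrolysis releases the ligand and restores the basic state
        if self.state == "complex" and atp_available:
            self.state = "basic"
            return True
        return False

demon = MolecularDemon()
demon.bind(ligand_fits=True)      # recognition: information is gained
demon.reset(atp_available=True)   # reset: information erased, energy dissipated
print(demon.state)                # back in the basic state; cycle can repeat
```

The point of the sketch is that the reset transition is gated on an external energy input (ATP), mirroring the irreversible erasure step in the text.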

Ratchet

The second law of thermodynamics is a statistical law; hence, occasionally, single molecules may not obey it. All molecules are subject to the molecular storm, i.e., the random movement of molecules in the cytoplasm and the extracellular fluid. Molecular demons or molecular machines, whether biological or artificially constructed, are continuously pushed around by this random thermal motion, sometimes in a direction that violates the law. When this happens, and the macromolecule can be prevented from gliding back to its original state from the movement it has made or the conformational change it has undergone, as is the case with molecular demons, the molecule works as a ratchet. [4] [5] It is then possible to observe, for example, the creation of a gradient of ions or other molecules across the cell membrane, the movement of motor proteins along filament proteins, or the accumulation of products of an enzymatic reaction. Even some artificial molecular machines and experiments are capable of forming a ratchet, apparently defying the second law of thermodynamics. [6] [7] All these molecular demons have to be reset to their original state by consuming external energy, which is subsequently dissipated as heat. This final step, in which entropy increases, is therefore irreversible. If the demons were reversible, no work would be done. [5]
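A minimal simulation can illustrate how blocking back-slips rectifies random motion (a toy model, not any specific molecular machine; the perfect blocking rule is an idealization of the pawl):

```python
import random

# Toy ratchet: a particle is kicked +1 or -1 at random by the "molecular
# storm", but a pawl prevents it from sliding back below the highest
# position already reached, so unbiased kicks yield net forward progress.
def ratchet_walk(kicks, seed=0):
    rng = random.Random(seed)
    position = 0
    pawl = 0                                   # highest position locked in
    for _ in range(kicks):
        step = rng.choice([-1, 1])
        position = max(pawl, position + step)  # backward slips are blocked
        pawl = max(pawl, position)
    return position

print(ratchet_walk(10_000))  # net forward displacement despite unbiased kicks
```

Without the `pawl` line the same kicks would produce an ordinary random walk with zero average displacement; the rectification, not the noise, is what the ratchet adds, and in a real machine locking the pawl costs energy.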

Artificial

An example of an artificial ratchet is the work by Serreli et al. (2007). [6] They constructed a nanomachine, a rotaxane, consisting of a ring-shaped molecule that moves along a tiny molecular axle between two equal compartments, A and B. The normal, random movement of molecules sends the ring back and forth. Since the rings move freely, half of the rotaxanes have the ring at site A and the other half at site B. But the system used by Serreli et al. has a chemical gate on the rotaxane molecule, and the axle contains two sticky parts, one on either side of the gate. The gate opens when the ring is close by. The sticky part in B is close to the gate, so the rings pass more readily from B to A than from A to B. Serreli et al. obtained a deviation from equilibrium of 70:50 for A and B respectively, somewhat like Maxwell's demon. But this system works only when light is shone on it and thus needs external energy, just like molecular demons.
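The gated-hopping idea can be caricatured in a short Monte Carlo sketch (illustrative only; the crossing probabilities are invented for the sketch, not Serreli et al.'s measured values):

```python
import random

# Caricature of an information ratchet: rings hop between compartments A
# and B, but the gate opens more readily for rings sitting in B, so B->A
# crossings outpace A->B and the 50:50 distribution is driven off balance.
def gated_ratchet(n_rings=1000, steps=200,
                  p_open_from_b=0.5, p_open_from_a=0.2, seed=1):
    rng = random.Random(seed)
    rings = ["A" if i % 2 == 0 else "B" for i in range(n_rings)]  # 50:50 start
    for _ in range(steps):
        for i, side in enumerate(rings):
            p = p_open_from_b if side == "B" else p_open_from_a
            if rng.random() < p:                 # gate opens; ring crosses
                rings[i] = "A" if side == "B" else "B"
    return rings.count("A"), rings.count("B")

a, b = gated_ratchet()
print(a, b)   # more rings end up in A than in B
```

The asymmetry in the two gate-opening probabilities plays the role of the position-dependent gate; in the real system that asymmetry is paid for by the light that powers the gate, which is why no law is violated.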

Energy and information

Landauer stated that information is physical. [8] His principle sets fundamental thermodynamic constraints on classical and quantum information processing. Much effort has been dedicated to incorporating information into thermodynamics and to measuring the entropic and energetic costs of manipulating information. Gaining information decreases entropy, and this has an energy cost, which must be collected from the environment. [9] Landauer established the equivalence of one bit of information with an energy of kT ln 2, where k is the Boltzmann constant and T is the absolute temperature. This bound is called the Landauer limit. [10] Erasing information, conversely, dissipates energy and increases entropy. [11] Toyabe et al. (2010) demonstrated experimentally that information can be converted into free energy. Their elegant experiment consists of a microscopic particle on a spiral-staircase-like potential, each step of which has a height corresponding to kBT, where kB is the Boltzmann constant and T is the temperature. The particle jumps between steps due to random thermal motion. Since downward jumps, following the gradient, are more frequent than upward ones, the particle falls down the stairs on average. But when an upward jump is observed, a block is placed behind the particle to prevent it from falling, just like in a ratchet. This way the particle climbs the stairs. Information is gained by measuring the particle's location, which is equivalent to a gain in energy, i.e. a decrease in entropy. They used a generalized form of the second law that contains a variable for information:

⟨W⟩ ≥ ΔF − kT⟨I⟩

where ΔF is the free-energy difference between states, W is the work done on the system, k is the Boltzmann constant, T is the temperature, and I is the mutual information content obtained by measurements. The angle brackets indicate averages. [7] They could convert the equivalent of one bit of information into 0.28 kT ln 2 of energy or, in other words, they could exploit more than a quarter of the information's energy content. [12]
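Plugging in numbers gives a feel for the scales involved (a back-of-the-envelope sketch; taking T = 300 K as room temperature is an assumption):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # assumed room temperature, K

# Landauer limit: minimum energy associated with one bit, kT ln 2
landauer = k_B * T * math.log(2)

# Fraction of that bound recovered as work in the Toyabe et al. experiment
extracted = 0.28 * landauer

print(f"Landauer limit at 300 K: {landauer:.2e} J per bit")
print(f"Energy extracted per bit: {extracted:.2e} J")
```

At room temperature the Landauer limit is on the order of 10⁻²¹ J per bit, which is why such information-to-energy conversion is only observable for microscopic systems buffeted by thermal noise.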

Cognitive demons

In his book Chance and Necessity, Jacques Monod described the functions of proteins and other molecules capable of recognizing, with 'elective discrimination', a substrate, ligand, or other molecule. [2] In describing these molecules he introduced the term 'cognitive' functions, the same cognitive functions that Maxwell attributed to his demon. Werner Loewenstein goes further and names these molecules 'molecular demons', or 'demons' for short. [1]

Naming the biological molecular machines in this way makes it easier to understand the similarities between these molecules and Maxwell's demon.

Because of this genuinely discriminative, if not 'cognitive', property, Jacques Monod attributed a teleonomic function to these biological complexes. Teleonomy implies the idea of an oriented, coherent, and constructive activity. Proteins must therefore be considered essential molecular agents in the teleonomic performances of all living beings.


References

  1. Loewenstein, Werner R. (2013-01-29). Physics in Mind: A Quantum View of the Brain. New York: Basic Books. ISBN 9780465029846. OCLC 778420640.
  2. Monod, Jacques (1970). Le hasard et la nécessité. Essai sur la philosophie naturelle de la biologie moderne [Chance and Necessity: Essay on the Natural Philosophy of Modern Biology] (in French). Le Seuil.
  3. Maxwell, James Clerk (2009). Niven, W. D. (ed.). The Scientific Papers of James Clerk Maxwell. Cambridge: Cambridge University Press. doi:10.1017/cbo9780511698095. hdl:2027/msu.31293102595331. ISBN 9780511698095.
  4. Blomberg, Clas (2007). "Brownian Ratchet: Unidirectional Processes". Physics of Life. Elsevier. pp. 340–343. doi:10.1016/b978-044452798-1/50031-2. ISBN 9780444527981.
  5. Hoffmann, Peter M. (2012). Life's Ratchet: How Molecular Machines Extract Order from Chaos. Basic Books. ISBN 9780465022533. OCLC 808107321.
  6. Serreli, Viviana; Lee, Chin-Fa; Kay, Euan R.; Leigh, David A. (2007-02-01). "A molecular information ratchet". Nature. 445 (7127): 523–527. Bibcode:2007Natur.445..523S. doi:10.1038/nature05452. ISSN 1476-4687. PMID 17268466.
  7. Toyabe, Shoichi; Sagawa, Takahiro; Ueda, Masahito; Muneyuki, Eiro; Sano, Masaki (2010-11-14). "Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality". Nature Physics. 6 (12): 988–992. arXiv:1009.5287. Bibcode:2010NatPh...6..988T. doi:10.1038/nphys1821. ISSN 1745-2481.
  8. Landauer, Rolf (1991). "Information is Physical". Physics Today. 44 (5): 23–29. Bibcode:1991PhT....44e..23L. doi:10.1063/1.881299.
  9. Parrondo, Juan M. R.; Horowitz, Jordan M.; Sagawa, Takahiro (2015-02-03). "Thermodynamics of information". Nature Physics. 11 (2): 131–139. Bibcode:2015NatPh..11..131P. doi:10.1038/nphys3230. ISSN 1745-2481.
  10. Alfonso-Faus, Antonio (2013-06-30). "Fundamental Principle of Information-to-Energy Conversion". arXiv:1401.6052.
  11. Ball, Philip (2012). "The unavoidable cost of computation revealed". Nature News. doi:10.1038/nature.2012.10186.
  12. "Information converted to energy". Physics World. 2010-11-19. Retrieved 2019-01-30.