Lloyd Demetrius

Lloyd Demetrius in 2015
Alma mater: University of Cambridge; University of Chicago
Known for: Evolutionary entropy and directionality theory
Fields: Mathematics and theoretical biology
Institutions: Harvard University

Lloyd A. Demetrius is an American mathematician and theoretical biologist at the Department of Organismic and Evolutionary Biology, Harvard University. [1] He is best known for the discovery of the concept of evolutionary entropy, [2] a statistical parameter that characterizes Darwinian fitness in models of evolutionary processes at various levels of biological organization – molecular, organismic and social. Evolutionary entropy, a generalization of the Gibbs-Boltzmann entropy in statistical thermodynamics, is the cornerstone of directionality theory, an analytical study of evolution by variation and selection. [3] [4] [5] [6] The theory has applications to: a) the development of aging and the evolution of longevity; [7] [8] b) the origin and progression of age-related diseases such as cancer, and of neurodegenerative disorders such as Alzheimer's disease and Parkinson's disease; [9] [10] c) the evolution of cooperation and the spread of inequality. [11]

Education

Born in Jamaica, Demetrius carried out his undergraduate studies in mathematics at the University of Cambridge, UK. He received his PhD in mathematical biology from the University of Chicago in 1967, and was then a postdoctoral researcher at the University of California, Berkeley.

Career

From 1969 to 1979, Demetrius held faculty positions in a number of mathematics departments in the US, including the University of California, Berkeley; Brown University; and Rutgers University. He was then a research scientist at the Max Planck Institute for Biophysical Chemistry, Göttingen (1980–1989), and at the Max Planck Institute for Molecular Genetics, Berlin. Since 1990, he has been with the Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, first as a visiting professor (1990–1992) and then as an associate in population genetics. He has held visiting professorships at MIT and the University of Paris, and occupied a Chaire Municipale, a distinguished visiting professorship, at the University of Grenoble. His research includes the application of ergodic theory and the theory of dynamical systems to the study of evolutionary processes in biological and socio-economic systems. He has also pioneered the application of the methodology of quantum mechanics to the study of allometric relations between metabolic rate and generation time in cells; this work is the mathematical basis for the analysis of cancer and neurodegenerative disorders as metabolic and bioenergetic diseases.

Evolutionary Entropy and Directionality Theory

The primary achievements of Demetrius are the discovery of the concept of evolutionary entropy, and the development of directionality theory, a study of the collective properties and evolutionary dynamics of aggregates of organic matter – macromolecules, cells, higher organisms – on the basis of their microscopic structure. [2] [3] [5]

Demetrius has shown that evolutionary entropy is related to the thermodynamic entropy of Ludwig Boltzmann and J. W. Gibbs, and that directionality theory is a natural extension of statistical mechanics, the study of the collective behaviour of inorganic matter.
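
This correspondence can be made precise through the variational principles that underlie both theories. The following is a schematic statement following the evolutionary formalism of Arnold, Gundlach and Demetrius; [3] the precise hypotheses are given in that paper, and the notation here (μ an invariant measure, E the mean energy, Φ the reproductive potential) is assumed for exposition.

    % Gibbs variational principle: free energy F at inverse temperature beta
    -\beta F = \sup_{\mu} \left[ h(\mu) - \beta \, E(\mu) \right]

    % Evolutionary analogue: population growth rate r
    r = \sup_{\mu} \left[ H(\mu) + \Phi(\mu) \right]

In this analogy the growth rate plays the role of a free energy, and evolutionary entropy H plays the role of the Gibbs-Boltzmann entropy h.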

The statistical parameter thermodynamic entropy, discovered by Boltzmann, describes the number of instantaneous microstates corresponding to a given macroscopic state. Evolutionary entropy is related to the multiplicity of trajectories characterizing the temporal progression of instantaneous microstates.
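
In symbols, Boltzmann's entropy counts microstates, while evolutionary entropy is defined from the distribution of reproductive trajectories. A minimal sketch, assuming the discrete age-structured model used in Demetrius's demographic work (l_j the probability of surviving to age j, m_j the mean fecundity at age j, and r the growth rate determined by the Euler-Lotka equation):

    % Boltzmann entropy: W is the number of microstates of the macrostate
    S_{B} = k_{B} \log W

    % Evolutionary entropy: p_j = e^{-rj} l_j m_j is the probability that the
    % ancestor of a randomly sampled newborn reproduced at age j
    H = \frac{-\sum_{j} p_{j} \log p_{j}}{\sum_{j} j \, p_{j}}

The numerator measures the uncertainty in the age at which a lineage reproduces (the multiplicity of trajectories); the denominator is the mean generation time.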

Thermodynamic entropy describes the configuration of the instantaneous microstates and ignores the effect of interparticle forces. Evolutionary entropy describes the multiplicity of trajectories induced by interparticle forces, defined in terms of the temporal progression of instantaneous microstates. The two statistical measures of cooperativity are positively correlated when the number of microstates is large, effectively infinite.
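
The age-structured definition above can be computed directly from a life table. The following is a minimal numerical sketch, not code from the cited papers; the function name and the example life table are illustrative.

    import math

    def evolutionary_entropy(l, m):
        """Evolutionary entropy H = S / T for a discrete life table.

        l[j]: probability of surviving to age j (l[0] == 1)
        m[j]: mean number of offspring produced at age j
        """
        ages = range(len(l))

        # Euler-Lotka equation: sum_j exp(-r*j) * l[j] * m[j] = 1 defines
        # the growth rate r; the left side is decreasing in r.
        def euler_lotka(r):
            return sum(math.exp(-r * j) * l[j] * m[j] for j in ages) - 1.0

        # Solve for r by bisection (assumes the root lies in [-5, 5]).
        lo, hi = -5.0, 5.0
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if euler_lotka(mid) > 0:
                lo = mid
            else:
                hi = mid
        r = 0.5 * (lo + hi)

        # p[j]: probability that the ancestor of a randomly sampled
        # newborn reproduced at age j.
        p = [math.exp(-r * j) * l[j] * m[j] for j in ages]

        S = -sum(q * math.log(q) for q in p if q > 0)  # trajectory entropy
        T = sum(j * q for j, q in zip(ages, p))        # mean generation time
        return S / T

    # Illustrative three-age-class life table.
    print(evolutionary_entropy(l=[1.0, 0.8, 0.4], m=[0.0, 1.5, 1.0]))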

Statistical mechanics, one of the pillars of modern physics, is concerned with deducing the thermodynamic properties of aggregates of inanimate matter from their microstructure. The theory, which is based on the statistical measure thermodynamic entropy, is restricted to the study of collective behaviour in physical and chemical systems whose cooperativity can be effectively measured by thermodynamic entropy.

The directionality theory of Demetrius pertains to organic matter. It is a phenomenological and analytic theory based on evolutionary entropy as a measure of the cooperativity between the entities that compose the microstructure. It extends the methodology of statistical mechanics to the study of collective and evolutionary behaviour in biological systems.

A cornerstone of directionality theory is the Entropic Selection Principle: the changes in evolutionary entropy due to the process of variation and selection are determined by the resource endowment and the population size. [2] [5]

A corollary of the Entropic Selection Principle is the Fundamental Theorem of Evolution, restated in symbols after the two statements below: [2] [4]

I a) Evolutionary entropy increases when the resource endowment is scarce and constant.

I b) Evolutionary entropy decreases when the resource endowment is abundant and inconstant.
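
In symbols, writing ΔH for the change in evolutionary entropy under a cycle of variation and selection (notation assumed here for exposition, not fixed by the cited sources):

    \Delta H > 0 \quad \text{(resource endowment scarce and constant)}

    \Delta H < 0 \quad \text{(resource endowment abundant and inconstant)}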

Directionality Theory and the Second Law of Thermodynamics

Demetrius has exploited the Entropic Selection Principle to solve a long-standing problem at the interface of physics and biology. Rudolf Clausius showed that the phenomenological fact that heat flows spontaneously from hotter bodies to colder bodies – the Second Law of Thermodynamics – implies the existence of a property of matter which he called entropy. The major achievement of Boltzmann was the statistical-mechanical rationale for the Second Law. Boltzmann’s explanation was achieved by introducing the statistical parameter thermodynamic entropy, and relating this statistical measure of cooperativity to the Clausius entropy, a phenomenological construct. Boltzmann’s explanation of the Second Law was based on the celebrated theorem of statistical mechanics:

II) In isolated systems, that is, systems closed to the input of energy and matter, thermodynamic entropy increases.
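
In standard textbook notation (not specific to the sources cited here), the Clausius entropy and the Second Law read:

    % Clausius: entropy change for a reversible exchange of heat at temperature T
    dS = \frac{\delta Q_{\mathrm{rev}}}{T}

    % Second Law for an isolated system
    \Delta S \geq 0

    % Boltzmann's statistical rationale identifies S with the microstate count W
    S = k_{B} \log W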

Demetrius has reconciled the Second Law of Thermodynamics, which pertains to energy transformation in inorganic matter, with the Fundamental Theorem of Evolution, which refers to energy transformation in organic matter. [2] [5] This is achieved by establishing that the directionality principle for evolutionary entropy and the Second Law coincide when the resource production rate that drives the evolutionary system tends to zero, and the number of degrees of freedom of the evolutionary system tends to infinity. [2] [5] This relation, Demetrius has shown, provides a conceptual framework for understanding the origin of life: the transition from an abiotic system, defined by inorganic matter – solids, liquids and gases – to the emergence of organized chemical assemblies capable of Darwinian evolution. [2]
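
Schematically, writing R for the resource production rate and N for the number of degrees of freedom (symbols assumed here for exposition), the claim is that the evolutionary directionality principle passes over to the Second Law in a joint limit:

    \Delta H > 0 \;\longrightarrow\; \Delta S \geq 0
    \qquad \text{as } R \to 0, \; N \to \infty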

See also

Entropy
Statistical mechanics
Thermodynamics
Second law of thermodynamics
Non-equilibrium thermodynamics
Ludwig Boltzmann
Irreversible process
Laws of thermodynamics
Thermodynamic beta
Maximum entropy thermodynamics
Microstate (statistical mechanics)
Entropy in thermodynamics and information theory
Maximum power principle
Entropy (statistical thermodynamics)
Entropy (energy dispersal)
Introduction to entropy
Entropy (order and disorder)
Entropy and life
Boltzmann's entropy formula
Temperature

References

  1. "Lloyd Demetrius". Hwpi.harvard.edu.
  2. 1 2 3 4 5 6 7 Demetrius, Lloyd (September 1, 2013). "Boltzmann, Darwin and Directionality Theory". Physics Reports. 530 (1): 1–85. Bibcode:2013PhR...530....1D. doi:10.1016/j.physrep.2013.04.001.
  3. 1 2 Arnold, Ludwig; Gundlach, Volker Matthias; Demetrius, Lloyd (1994). "Evolutionary Formalism for Products of Positive Random Matrices". Ann. Appl. Probab. 4 (3): 859–901. doi: 10.1214/aoap/1177004975 .
  4. 1 2 Demetrius, Lloyd; Gundlach, Volker Matthias (October 20, 2014). "Directionality Theory and the Entropic Principle of Natural Selection". Entropy. 16 (10): 5428–5522. Bibcode:2014Entrp..16.5428D. doi: 10.3390/e16105428 .
  5. 1 2 3 4 5 Lloyd A., Demetrius; Christian, Wolf (2022). "Directionality Theory and the Second Law of Thermodynamics". Physica A. 598: 127325. Bibcode:2022PhyA..59827325D. doi:10.1016/j.physa.2022.127325. S2CID   247995422.
  6. Dietz, Klaus (2005). "Darwinian fitness, evolutionary entropy and directionality theory". BioEssays. 27 (11): 1097–1101. doi:10.1002/bies.20317. PMID   16237668.
  7. Demetrius, L. (2004). "Caloric Restriction, Metabolic Rate, and Entropy". The Journals of Gerontology Series A: Biological Sciences and Medical Sciences. 59 (9): B902–B915. doi: 10.1093/gerona/59.9.B902 . PMID   15472153.
  8. Shaw, Jonathan (November 1, 2004). "A New Theory on Longevity". Harvard Magazine.
  9. Müller-Jung, Joachim. "Das Streitgespräch: Alzheimer: Heilung – wie nah ist man wirklich dran?". Faz.net.
  10. "A new understanding of Alzheimer's". News.harvard.edu. February 25, 2015.
  11. Germano, Fabrizio (2022). "Entropy, directionality theory and the evolution of income inequality". Journal of Economic Behavior and Organization. 198: 15–43. doi: 10.1016/j.jebo.2022.03.017 . hdl: 10230/57605 .