Elementary Principles in Statistical Mechanics

[Title page]

Author: Josiah Willard Gibbs
Country: United States
Language: English
Subject: Statistical mechanics, mathematical physics
Genre: Science, physics
Publisher: Charles Scribner's Sons
Publication date: March 1902
Media type: Print (hardback)
Pages: 207

Elementary Principles in Statistical Mechanics, published in March 1902, is a work of scientific literature by Josiah Willard Gibbs which is considered to be the foundation of modern statistical mechanics. Its full title was Elementary Principles in Statistical Mechanics, developed with especial reference to the rational foundation of thermodynamics. [1]

Overview

In this book, Gibbs carefully showed how the laws of thermodynamics arise from a generic classical mechanical system, provided one allows for a certain natural uncertainty about the state of that system.

The connections between thermodynamics and statistical mechanics had been explored in the preceding decades by Clausius, Maxwell, and Boltzmann, who together wrote thousands of pages on the topic. [2] One of Gibbs' aims in writing the book was to distill these results into a cohesive and simple picture. Gibbs wrote in 1892 to his colleague Lord Rayleigh:

Just now I am trying to get ready for publication something on thermodynamics from the a-priori point of view, or rather on 'statistical mechanics' [...] I do not know that I shall have anything particularly new in substance, but shall be contented if I can so choose my standpoint (as seems to me possible) as to get a simpler view of the subject. [2]

He had been working on the subject for some time, at least since 1884, when he produced a paper (now lost except for its abstract) on statistical mechanics. [3]

Gibbs' book condensed statistical mechanics into a treatise of 207 pages. At the same time, it generalized and expanded statistical mechanics into the form in which it is known today. Gibbs showed how statistical mechanics could extend thermodynamics beyond its classical domain, to systems with any number of degrees of freedom (including microscopic systems) and to non-extensive systems.

At the time of the book's writing, the prevailing understanding of nature was purely classical: quantum mechanics had not yet been conceived, and even basic facts taken for granted today (such as the existence of atoms) were still contested among scientists. Gibbs was careful to assume as little as possible about the nature of the physical systems under study, and as a result the principles of statistical mechanics he laid down have retained their accuracy (with some changes in detail but not in theme), in spite of the major upheavals of modern physics during the early 20th century. [4]

Content

V. Kumaran wrote the following comment regarding Elementary Principles in Statistical Mechanics:

... In this, he introduced the now standard concept of ‘ensemble’, which is a collection of a large number of indistinguishable replicas of the system under consideration, which interact with each other, but which are isolated from the rest of the universe. The replicas could be in different microscopic states, as determined by the positions and momenta of the constituent molecules, for example, but the macroscopic state determined by the pressure, temperature and / or other thermodynamic variables are identical.

Gibbs argued that the properties of the system, averaged over time, is identical to an average over all the members of the ensemble if the ‘ergodic hypothesis’ is valid. The ergodic hypothesis, which states that all the microstates of the system are sampled with equal probability, is applicable to most systems, with the exception of systems such as quenched glasses which are in metastable states. Thus, the ensemble averaging method provides us an easy way to calculate the thermodynamic properties of the system, without having to observe it for long periods of time.

Gibbs also used this tool to obtain relationships between systems constrained in different ways, for example, to relate the properties of a system at constant volume and energy with those at constant temperature and pressure. Even today, the concept of ensemble is widely used for sampling in computer simulations of the thermodynamic properties of materials, and has subsequently found uses in other fields such as quantum theory. [5]
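Gibbs' book itself contains no algorithms, but the ensemble-sampling idea Kumaran describes is straightforward to illustrate. The following minimal sketch (an illustration under assumed details, not anything from the book) uses the Metropolis Monte Carlo method to sample the canonical ensemble of a toy one-dimensional harmonic oscillator and estimates the ensemble-average potential energy, which equipartition predicts to be kT/2:

```python
import math
import random

def metropolis_harmonic(beta, n_steps=200_000, step=1.0, seed=0):
    """Sample x from the canonical distribution p(x) ~ exp(-beta * U(x))
    for U(x) = x^2 / 2 using the Metropolis algorithm; return the mean U."""
    rng = random.Random(seed)
    x = 0.0
    u = 0.5 * x * x
    total_u = 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)  # propose a random move
        u_new = 0.5 * x_new * x_new
        # Accept with probability min(1, exp(-beta * (u_new - u))).
        if u_new <= u or rng.random() < math.exp(-beta * (u_new - u)):
            x, u = x_new, u_new
        total_u += u
    return total_u / n_steps

beta = 2.0  # inverse temperature 1/(kT), in units where k = 1
estimate = metropolis_harmonic(beta)
print(f"ensemble average <U> ~ {estimate:.3f}; equipartition predicts {0.5 / beta:.3f}")
```

The Metropolis acceptance rule keeps the chain distributed according to exp(−βU), so the running average converges to the canonical ensemble average without ever following the true time evolution of the system, which is precisely the practical advantage of ensemble averaging described above.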

Related Research Articles

Boltzmann distribution – Probability distribution of energy states of a system

In statistical mechanics and mathematics, a Boltzmann distribution is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form p_i ∝ exp(−ε_i / kT), where ε_i is the energy of state i, k is the Boltzmann constant, and T is the temperature.
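As a quick illustration of this form, the sketch below (a hypothetical three-level system with arbitrarily chosen energies) computes the normalized probabilities by dividing each weight exp(−ε_i / kT) by the partition function:

```python
import math

def boltzmann_probabilities(energies, kT):
    """Return p_i proportional to exp(-E_i / kT), normalized by Z."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

# Hypothetical three-level system, energies in units of kT.
levels = [0.0, 1.0, 2.0]
for e, p in zip(levels, boltzmann_probabilities(levels, kT=1.0)):
    print(f"E = {e:.1f} kT  ->  p = {p:.3f}")
```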

Entropy – Property of a thermodynamic system

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

Physical chemistry – Physics applied to chemical systems

Physical chemistry is the study of macroscopic and microscopic phenomena in chemical systems in terms of the principles, practices, and concepts of physics such as motion, energy, force, time, thermodynamics, quantum chemistry, statistical mechanics, analytical dynamics and chemical equilibria.

In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.

Thermodynamics – Physics of heat, work, and temperature

Thermodynamics is a branch of physics that deals with heat, work, and temperature, and their relation to energy, entropy, and the physical properties of matter and radiation. The behavior of these quantities is governed by the four laws of thermodynamics which convey a quantitative description using measurable macroscopic physical quantities, but may be explained in terms of microscopic constituents by statistical mechanics. Thermodynamics applies to a wide variety of topics in science and engineering, especially physical chemistry, biochemistry, chemical engineering and mechanical engineering, but also in other complex fields such as meteorology.

Josiah Willard Gibbs – American scientist (1839–1903)

Josiah Willard Gibbs was an American scientist who made significant theoretical contributions to physics, chemistry, and mathematics. His work on the applications of thermodynamics was instrumental in transforming physical chemistry into a rigorous inductive science. Together with James Clerk Maxwell and Ludwig Boltzmann, he created statistical mechanics, explaining the laws of thermodynamics as consequences of the statistical properties of ensembles of the possible states of a physical system composed of many particles. Gibbs also worked on the application of Maxwell's equations to problems in physical optics. As a mathematician, he invented modern vector calculus.

Timeline of thermodynamics

A timeline of events in the history of thermodynamics.

In physics, specifically statistical mechanics, an ensemble is an idealization consisting of a large number of virtual copies of a system, considered all at once, each of which represents a possible state that the real system might be in. In other words, a statistical ensemble is a set of systems of particles used in statistical mechanics to describe a single system. The concept of an ensemble was introduced by J. Willard Gibbs in 1902.

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. Because H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics: it claimed to derive the second law of thermodynamics, a statement about fundamentally irreversible processes, from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

Ludwig Boltzmann – Austrian physicist and philosopher (1844–1906)

Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, S = k_B ln Ω, where Ω is the number of microstates whose energy equals the system's energy, interpreted as a measure of the statistical disorder of a system. Max Planck named the constant k_B the Boltzmann constant.

In statistical mechanics, the microcanonical ensemble is a statistical ensemble that represents the possible states of a mechanical system whose total energy is exactly specified. The system is assumed to be isolated in the sense that it cannot exchange energy or particles with its environment, so that the energy of the system does not change with time.

In statistical mechanics, a canonical ensemble is the statistical ensemble that represents the possible states of a mechanical system in thermal equilibrium with a heat bath at a fixed temperature. The system can exchange energy with the heat bath, so that the states of the system will differ in total energy.

In physics, maximum entropy thermodynamics views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data. MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in Physical Review in 1957.

In probability theory and statistical mechanics, a Gibbs state is an equilibrium probability distribution which remains invariant under future evolution of the system. For example, a stationary or steady-state distribution of a Markov chain, such as that achieved by running a Markov chain Monte Carlo iteration for a sufficiently long time, is a Gibbs state.

Microstate (statistical mechanics) – Specific microscopic configuration of a thermodynamic system

In statistical mechanics, a microstate is a specific microscopic configuration of a thermodynamic system that the system may occupy with a certain probability in the course of its thermal fluctuations. In contrast, the macrostate of a system refers to its macroscopic properties, such as its temperature, pressure, volume and density. Treatments of statistical mechanics define a macrostate as follows: a particular set of values of energy, the number of particles, and the volume of an isolated thermodynamic system is said to specify a particular macrostate of it. In this description, microstates appear as different possible ways the system can achieve a particular macrostate.

The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy by Claude Shannon and Ralph Hartley, developed in the 1940s.
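The similarity is easy to see side by side: both expressions are −Σ p_i log p_i up to a constant factor, with the Gibbs entropy scaled by the Boltzmann constant and natural logarithms, and the Shannon entropy usually taken in bits. A small sketch (my own illustration, using an arbitrary example distribution):

```python
import math

def gibbs_entropy(probs, k=1.380649e-23):
    """Thermodynamic (Gibbs) entropy S = -k * sum(p_i * ln p_i), in J/K."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

def shannon_entropy(probs):
    """Information (Shannon) entropy H = -sum(p_i * log2 p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]      # arbitrary example distribution
print(gibbs_entropy(p))    # same sum, scaled by k and ln 2 ...
print(shannon_entropy(p))  # ... so the two differ only by a constant factor
```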

The concept of entropy was first developed by the German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in the 1870s by the Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.

Boltzmann's entropy formula – Equation in statistical mechanics

In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S, also written as S_B, of an ideal gas to the multiplicity W, the number of real microstates corresponding to the gas's macrostate: S = k_B ln W, where k_B is the Boltzmann constant.
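A worked toy example of this formula (my own illustration, not from the sources cited here): for N two-state spins constrained to have exactly n "up", the multiplicity is the binomial coefficient W = C(N, n), and the entropy follows directly:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(multiplicity):
    """S = k_B * ln W for a macrostate with W microstates."""
    return K_B * math.log(multiplicity)

# Toy macrostate: 100 two-state spins with exactly 50 "up"; W = C(100, 50).
W = math.comb(100, 50)
print(f"W = {W}  ->  S = {boltzmann_entropy(W):.3e} J/K")
```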

Lloyd Demetrius

Lloyd A. Demetrius is an American mathematician and theoretical biologist at the Department of Organismic and Evolutionary Biology, Harvard University. He is best known for the discovery of the concept of evolutionary entropy, a statistical parameter that characterizes Darwinian fitness in models of evolutionary processes at various levels of biological organization: molecular, organismic and social. Evolutionary entropy, a generalization of the Gibbs-Boltzmann entropy in statistical thermodynamics, is the cornerstone of directionality theory, an analytical study of evolution by variation and selection. The theory has applications to: a) the development of aging and the evolution of longevity; b) the origin and progression of age-related diseases such as cancer, and neurodegenerative disorders such as Alzheimer's disease and Parkinson's disease; c) the evolution of cooperation and the spread of inequality.

<span class="mw-page-title-main">Temperature</span> Physical quantity that expresses hot and cold

Temperature is a physical quantity that expresses quantitatively the perceptions of hotness and coldness. Temperature is measured with a thermometer.

References

  1. Hadamard, Jacques (1906). "Review of Elementary Principles in Statistical Mechanics, Developed with especial Reference to the Rational Foundations of Thermodynamics by J. Willard Gibbs" (PDF). Bull. Amer. Math. Soc. 12 (4): 194–210. doi:10.1090/s0002-9904-1906-01319-2. (In French.)
  2. Cercignani, Carlo (1998). Ludwig Boltzmann: The Man Who Trusted Atoms. Oxford University Press. ISBN 9780198501541.
  3. Gibbs, J.W. (1884). "On the Fundamental Formula of Statistical Mechanics, with Applications to Astronomy and Thermodynamics". Proceedings of the American Association for the Advancement of Science. 33: 57–58.
     Gibbs' original abstract is reproduced in "[Abstract] On the Fundamental Formula of Statistical Mechanics, with Applications to Astronomy and Thermodynamics". The Scientific Papers of J. Willard Gibbs. Vol. II. 1906. p. 16.
  4. Tolman, R.C. (1938). The Principles of Statistical Mechanics. Dover Publications. ISBN 9780486638966.
  5. Kumaran, V. (July 2007). "Josiah Willard Gibbs". Resonance. 12 (7): 4–11. doi:10.1007/s12045-007-0069-3. S2CID 121497834.