Sackur–Tetrode equation

From Wikipedia, The Free Encyclopedia

The Sackur–Tetrode equation is an expression for the entropy of a monatomic ideal gas. [1]

It is named for Hugo Martin Tetrode [2] (1895–1931) and Otto Sackur [3] (1880–1914), who developed it independently at about the same time, in 1912, as a solution of Boltzmann's gas statistics and entropy equations. [4]

Formula

The Sackur–Tetrode equation expresses the entropy $S$ of a monatomic ideal gas in terms of its thermodynamic state—specifically, its volume $V$, internal energy $U$, and the number of particles $N$: [1] [4]

$$\frac{S}{k_B N} = \ln\left[\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2}\right] + \frac{5}{2}$$

where $k_B$ is the Boltzmann constant, $m$ is the mass of a gas particle, and $h$ is the Planck constant.

The equation can also be expressed in terms of the thermal wavelength $\Lambda = h/\sqrt{2\pi m k_B T}$:

$$\frac{S}{k_B N} = \ln\left(\frac{V}{N\Lambda^3}\right) + \frac{5}{2}$$
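As a numeric sanity check, the thermal-wavelength form can be evaluated directly. The sketch below (the function name and the helium-4 mass value are illustrative choices, not from the source) computes the molar entropy of helium at room conditions:

```python
import math

# Exact SI values of the defining constants
k_B = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro constant, 1/mol
R = k_B * N_A           # molar gas constant, J/(mol*K)

def sackur_tetrode_molar_entropy(m, T, p):
    """Molar entropy R*[ln(V/(N*Lambda^3)) + 5/2] of a monatomic ideal gas
    with particle mass m (kg) at temperature T (K) and pressure p (Pa)."""
    thermal_wavelength = h / math.sqrt(2 * math.pi * m * k_B * T)
    volume_per_particle = k_B * T / p   # from the ideal gas law
    return R * (math.log(volume_per_particle / thermal_wavelength**3) + 2.5)

m_He = 4.002602 * 1.66053906892e-27   # helium-4 particle mass, kg
S_molar = sackur_tetrode_molar_entropy(m_He, 298.15, 101325.0)
print(f"S(He, 298.15 K, 1 atm) = {S_molar:.2f} J/(mol K)")   # ~ 126 J/(mol K)
```

The result, about 126 J/(mol·K), is close to helium's measured standard molar entropy, as expected for a nearly ideal monatomic gas.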

Entropy vs temperature curves of classical and quantum ideal gases (Fermi gas, Bose gas) in three dimensions. Though all are in close agreement at high temperature, they disagree at low temperatures, where the classical entropy (Sackur–Tetrode equation) starts to approach negative values.

For a derivation of the Sackur–Tetrode equation, see the Gibbs paradox. For the constraints placed upon the entropy of an ideal gas by thermodynamics alone, see the ideal gas article.

The above expressions assume that the gas is in the classical regime and is described by Maxwell–Boltzmann statistics (with "correct Boltzmann counting"). From the definition of the thermal wavelength, this means the Sackur–Tetrode equation is valid only when

$$\frac{V}{N\Lambda^3} \gg 1.$$
The entropy predicted by the Sackur–Tetrode equation approaches negative infinity as the temperature approaches zero.
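This unphysical low-temperature behavior can be made concrete with a short sketch (the constants and helium mass are illustrative; a real gas would leave the classical regime, and indeed condense, long before these temperatures):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s

def entropy_per_particle(m, T, p):
    """Dimensionless S/(k_B*N) = ln(V/(N*Lambda^3)) + 5/2."""
    thermal_wavelength = h / math.sqrt(2 * math.pi * m * k_B * T)
    return math.log(k_B * T / (p * thermal_wavelength**3)) + 2.5

m_He = 4.002602 * 1.66053906892e-27   # helium-4 mass, kg
for T in (300.0, 10.0, 1.0, 0.5):
    # At fixed pressure the argument of the log scales as T^(5/2),
    # so the entropy eventually crosses zero and diverges to -inf.
    print(f"T = {T:5.1f} K: S/(k_B N) = {entropy_per_particle(m_He, T, 101325.0):+.2f}")
```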

Sackur–Tetrode constant

The Sackur–Tetrode constant, written S0/R, is equal to S/kBN evaluated at a temperature of T = 1  kelvin, at standard pressure (100 kPa or 101.325 kPa, to be specified), for one mole of an ideal gas composed of particles of mass equal to the atomic mass constant (mu = 1.66053906892(52)×10−27 kg [5] ). Its 2018 CODATA recommended value is:

S0/R = −1.15170753706(45) for po = 100 kPa [6]
S0/R = −1.16487052358(45) for po = 101.325 kPa. [7]
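As a sketch (not an official evaluation), both quoted values can be reproduced from the thermal-wavelength form using the exact SI defining constants and the atomic mass constant:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K (exact)
h = 6.62607015e-34        # Planck constant, J*s (exact)
m_u = 1.66053906892e-27   # atomic mass constant, kg

def sackur_tetrode_constant(p):
    """S/(k_B*N) at T = 1 K and pressure p (Pa) for particles of mass m_u."""
    T = 1.0
    thermal_wavelength = h / math.sqrt(2 * math.pi * m_u * k_B * T)
    return math.log(k_B * T / (p * thermal_wavelength**3)) + 2.5

print(sackur_tetrode_constant(100e3))      # ~ -1.15170753706
print(sackur_tetrode_constant(101.325e3))  # ~ -1.16487052358
```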

Information-theoretic interpretation

In addition to the thermodynamic perspective of entropy, the tools of information theory can be used to provide an information-theoretic perspective on it. In particular, it is possible to derive the Sackur–Tetrode equation in information-theoretic terms. The overall entropy is represented as the sum of four individual entropies, i.e., four distinct sources of missing information: positional uncertainty, momentum uncertainty, the quantum mechanical uncertainty principle, and the indistinguishability of the particles. [8] Summing the four pieces, the Sackur–Tetrode equation is then given as

$$\frac{S}{k_B N} = \ln\left[\frac{V}{N}\left(\frac{2\pi m k_B T}{h^2}\right)^{3/2}\right] + \frac{5}{2}.$$

The derivation uses Stirling's approximation, $\ln N! \approx N \ln N - N$. Strictly speaking, using dimensioned arguments in the logarithms is incorrect; their use is a shortcut made for simplicity. If each logarithmic argument were divided by an unspecified standard value expressed in terms of an unspecified standard mass, length, and time, these standard values would cancel in the final result, yielding the same conclusion. The individual entropy terms would then not be absolute, but would depend on the standards chosen, differing by an additive constant from one choice of standards to another.
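One common way to write the four contributions per particle (a sketch consistent with this decomposition; the exact grouping of the constant terms varies between presentations) is:

```latex
\frac{S}{N k_B}
= \underbrace{\ln V}_{\text{position}}
+ \underbrace{\tfrac{3}{2}\ln\!\left(2\pi e\, m k_B T\right)}_{\text{momentum}}
+ \underbrace{\left(-3\ln h\right)}_{\text{uncertainty principle}}
+ \underbrace{\left(1 - \ln N\right)}_{\text{indistinguishability}}
= \ln\!\left(\frac{V}{N\Lambda^3}\right) + \frac{5}{2}
```

Here the momentum term is the differential entropy of the Maxwell–Boltzmann momentum distribution, and the indistinguishability term is $-\tfrac{1}{N}\ln N!$ evaluated with Stirling's approximation.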

References

  1. Schroeder, Daniel V. (1999). An Introduction to Thermal Physics. Addison Wesley Longman. ISBN 0-201-38027-7.
  2. H. Tetrode (1912). "Die chemische Konstante der Gase und das elementare Wirkungsquantum" (The chemical constant of gases and the elementary quantum of action). Annalen der Physik. 38: 434–442. See also: H. Tetrode (1912). "Berichtigung zu meiner Arbeit: 'Die chemische Konstante der Gase und das elementare Wirkungsquantum'" (Correction to my work: "The chemical constant of gases and the elementary quantum of action"). Annalen der Physik. 39: 255–256.
  3. Sackur published his findings in the following series of papers:
    1. O. Sackur (1911). "Die Anwendung der kinetischen Theorie der Gase auf chemische Probleme" (The application of the kinetic theory of gases to chemical problems). Annalen der Physik. 36: 958–980.
    2. O. Sackur (1912). "Die Bedeutung des elementaren Wirkungsquantums für die Gastheorie und die Berechnung der chemischen Konstanten" (The significance of the elementary quantum of action to gas theory and the calculation of the chemical constant). In: Festschrift W. Nernst zu seinem 25jährigen Doktorjubiläum gewidmet von seinen Schülern. Halle an der Saale, Germany: Wilhelm Knapp. pp. 405–423.
    3. O. Sackur (1913). "Die universelle Bedeutung des sog. elementaren Wirkungsquantums" (The universal significance of the so-called elementary quantum of action). Annalen der Physik. 40: 67–86.
  4. Grimus, Walter (2013). "100th anniversary of the Sackur–Tetrode equation". Annalen der Physik. 525 (3): A32–A35. arXiv:1112.3748. Bibcode:2013AnP...525A..32G. doi:10.1002/andp.201300720. ISSN 0003-3804.
  5. "2022 CODATA Value: atomic mass constant". The NIST Reference on Constants, Units, and Uncertainty. NIST. May 2024. Retrieved 2024-05-18.
  6. "2018 CODATA Value: Sackur–Tetrode constant". The NIST Reference on Constants, Units, and Uncertainty. NIST. 20 May 2019. Retrieved 2019-05-20.
  7. "2018 CODATA Value: Sackur–Tetrode constant". The NIST Reference on Constants, Units, and Uncertainty. NIST. 20 May 2019. Retrieved 2019-05-20.
  8. Ben-Naim, Arieh (2008). A Farewell to Entropy: Statistical Thermodynamics Based on Information. World Scientific. ISBN 978-981-270-706-2. Retrieved 2017-12-12.
