Boltzmann distribution

[Figure: Boltzmann's distribution is an exponential distribution.]
[Figure: Boltzmann factor pi/pj (vertical axis) as a function of temperature T for several energy differences εi − εj.]

In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution [1] ) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form

{\displaystyle p_{i}\propto \exp \left(-{\frac {\varepsilon _{i}}{kT}}\right)}

where pi is the probability of the system being in state i, exp is the exponential function, εi is the energy of that state, and the constant kT of the distribution is the product of the Boltzmann constant k and the thermodynamic temperature T. The symbol ∝ denotes proportionality (see § The distribution for the proportionality constant).

The term system here has a wide meaning; it can range from a collection of a 'sufficient number' of atoms or even a single atom [1] to a macroscopic system such as a natural gas storage tank. Because of this generality, the Boltzmann distribution can be used to solve a wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.

The ratio of the probabilities of two states is known as the Boltzmann factor and characteristically depends only on the states' energy difference:

{\displaystyle {\frac {p_{i}}{p_{j}}}=\exp \left({\frac {\varepsilon _{j}-\varepsilon _{i}}{kT}}\right)}
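
As an illustration with hypothetical numbers, take state i to lie 0.05 eV above state j at room temperature, where kT ≈ 0.0259 eV:

{\displaystyle {\frac {p_{i}}{p_{j}}}=\exp \left(-{\frac {0.05}{0.0259}}\right)\approx 0.14}

so the higher-energy state is occupied roughly seven times less often than the lower-energy one.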

The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium. [2] Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium". [3] The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902. [4]

The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution or Maxwell–Boltzmann statistics. The Boltzmann distribution gives the probability that a system will be in a certain state as a function of that state's energy, [5] while the Maxwell–Boltzmann distributions give the probabilities of particle speeds or energies in ideal gases. The distribution of energies in a one-dimensional gas, however, does follow the Boltzmann distribution.

The distribution

The Boltzmann distribution is a probability distribution that gives the probability of a certain state as a function of that state's energy and the temperature of the system to which the distribution is applied. [6] It is given as

{\displaystyle p_{i}={\frac {1}{Q}}\exp \left(-{\frac {\varepsilon _{i}}{kT}}\right)={\frac {\exp \left(-{\frac {\varepsilon _{i}}{kT}}\right)}{\sum _{j=1}^{M}\exp \left(-{\frac {\varepsilon _{j}}{kT}}\right)}}}

where pi is the probability of state i, εi is the energy of state i, k is the Boltzmann constant, T is the absolute temperature of the system, M is the number of states accessible to the system, and Q (sometimes denoted Z) is the normalization denominator, known as the partition function, which ensures the probabilities sum to one.

Using Lagrange multipliers, one can prove that the Boltzmann distribution is the distribution that maximizes the entropy

{\displaystyle S(p_{1},p_{2},\ldots ,p_{M})=-\sum _{i=1}^{M}p_{i}\log p_{i}}

subject to the normalization constraint {\displaystyle \textstyle \sum _{i=1}^{M}p_{i}=1} and the constraint that {\displaystyle \textstyle \sum _{i=1}^{M}p_{i}\varepsilon _{i}} equals a particular mean energy value.
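
A compact sketch of that constrained maximization (standard material; the multiplier names α and β are chosen here for illustration): form the Lagrangian

{\displaystyle {\mathcal {L}}=-\sum _{i}p_{i}\log p_{i}+\alpha \left(\sum _{i}p_{i}-1\right)-\beta \left(\sum _{i}p_{i}\varepsilon _{i}-\langle E\rangle \right)}

Setting ∂L/∂pi = 0 gives log pi = α − 1 − βεi, hence pi ∝ exp(−βεi). Enforcing normalization produces the partition function as the proportionality constant, and identifying β with 1/kT via the thermodynamic definition of temperature recovers the Boltzmann distribution.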

The partition function can be calculated if we know the energies of the states accessible to the system of interest. For atoms the partition function values can be found in the NIST Atomic Spectra Database. [7]
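
As a minimal numerical sketch of this calculation (the three energy levels and the temperature below are hypothetical, chosen purely for illustration), the partition function and the state probabilities follow directly from the definition:

    import numpy as np

    # Hypothetical energy levels (eV) and temperature -- illustration only.
    energies = np.array([0.0, 0.05, 0.10])  # state energies epsilon_i
    k_B = 8.617333262e-5                    # Boltzmann constant, eV/K
    T = 300.0                               # temperature, K

    weights = np.exp(-energies / (k_B * T))  # unnormalized Boltzmann factors
    Q = weights.sum()                        # partition function
    p = weights / Q                          # Boltzmann probabilities (sum to 1)

    print(f"Q = {Q:.4f}")
    for eps, prob in zip(energies, p):
        print(f"state at {eps:.2f} eV: p = {prob:.4f}")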

The distribution shows that states with lower energy will always have a higher probability of being occupied than states with higher energy. It also gives the quantitative relationship between the probabilities of two states being occupied. The ratio of the probabilities for states i and j is given as

{\displaystyle {\frac {p_{i}}{p_{j}}}=\exp \left({\frac {\varepsilon _{j}-\varepsilon _{i}}{kT}}\right)}

where pi and pj are the probabilities of states i and j, and εi and εj are the corresponding state energies.

The corresponding ratio of populations of energy levels must also take their degeneracies into account.
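
Explicitly, if gi and gj are the degeneracies of levels i and j (the number of distinct states sharing each energy), the level-population ratio becomes

{\displaystyle {\frac {N_{i}}{N_{j}}}={\frac {g_{i}}{g_{j}}}\exp \left({\frac {\varepsilon _{j}-\varepsilon _{i}}{kT}}\right)}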

The Boltzmann distribution is often used to describe the distribution of particles, such as atoms or molecules, over the bound states accessible to them. For a system of many particles, the probability of a particle being in state i is practically the probability that, if we pick a random particle from that system and check what state it is in, we will find it is in state i. This probability is equal to the number of particles in state i divided by the total number of particles in the system, that is, the fraction of particles that occupy state i:

{\displaystyle p_{i}={\frac {N_{i}}{N}}}

where Ni is the number of particles in state i and N is the total number of particles in the system. We may use the Boltzmann distribution to find this probability, which is, as we have seen, equal to the fraction of particles that are in state i. So the equation that gives the fraction of particles in state i as a function of the energy of that state is [5]

{\displaystyle {\frac {N_{i}}{N}}={\frac {\exp \left(-{\frac {\varepsilon _{i}}{kT}}\right)}{\sum _{j=1}^{M}\exp \left(-{\frac {\varepsilon _{j}}{kT}}\right)}}}

This equation is of great importance to spectroscopy. In spectroscopy we observe a spectral line of atoms or molecules undergoing transitions from one state to another. [5] [8] In order for this to be possible, there must be some particles in the first state to undergo the transition. We can check whether this condition is fulfilled by finding the fraction of particles in the first state. If it is negligible, the transition is very likely not observed at the temperature for which the calculation was done. In general, a larger fraction of molecules in the first state means a higher number of transitions to the second state, [9] and hence a stronger spectral line. However, there are other factors that influence the intensity of a spectral line, such as whether it is caused by an allowed or a forbidden transition.
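
A minimal sketch of that check, assuming a hypothetical two-level molecule with a 0.25 eV gap (numbers are illustrative only):

    import numpy as np

    k_B = 8.617333262e-5  # Boltzmann constant, eV/K

    def fraction_in_state(energies, T, i):
        """Boltzmann fraction of particles found in state i at temperature T."""
        weights = np.exp(-np.asarray(energies) / (k_B * T))
        return weights[i] / weights.sum()

    levels = [0.0, 0.25]  # hypothetical ground and excited state energies (eV)
    for T in (300.0, 3000.0):
        f = fraction_in_state(levels, T, i=1)
        print(f"T = {T:6.0f} K: fraction in upper state = {f:.3e}")

At 300 K the upper state is essentially unpopulated, so a transition originating from it would be too weak to observe; at 3000 K a substantial fraction of the molecules occupies it.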

The softmax function commonly used in machine learning is related to the Boltzmann distribution:

{\displaystyle (p_{1},\ldots ,p_{M})=\operatorname {softmax} \left(-{\frac {\varepsilon _{1}}{kT}},\ldots ,-{\frac {\varepsilon _{M}}{kT}}\right)}
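
A quick numerical check of this identity (reusing the hypothetical levels from the earlier sketch): softmax applied to the scaled negative energies reproduces the Boltzmann probabilities.

    import numpy as np

    def softmax(x):
        """Numerically stable softmax."""
        z = x - x.max()
        e = np.exp(z)
        return e / e.sum()

    k_B = 8.617333262e-5                    # Boltzmann constant, eV/K
    T = 300.0
    energies = np.array([0.0, 0.05, 0.10])  # hypothetical levels, eV

    boltzmann = np.exp(-energies / (k_B * T))
    boltzmann /= boltzmann.sum()

    assert np.allclose(boltzmann, softmax(-energies / (k_B * T)))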

Generalized Boltzmann distribution

A distribution of the form

{\displaystyle \Pr(\omega )\propto \exp \left(\sum _{\eta =1}^{n}{\frac {X_{\eta }x_{\eta }^{(\omega )}}{k_{B}T}}-{\frac {E^{(\omega )}}{k_{B}T}}\right)}

where each xη is an extensive quantity of microstate ω (such as particle number or volume), Xη is its conjugate generalized force (such as chemical potential or negative pressure), and E(ω) is the microstate energy, is called the generalized Boltzmann distribution by some authors. [10]

The Boltzmann distribution is a special case of the generalized Boltzmann distribution. The generalized Boltzmann distribution is used in statistical mechanics to describe the canonical ensemble, the grand canonical ensemble, and the isothermal–isobaric ensemble. It is usually derived from the principle of maximum entropy, but there are other derivations. [10] [11]
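
Concretely (standard special cases, written in the notation above): with no conjugate (X, x) pairs the canonical ensemble is recovered; choosing x = N with X = μ (the chemical potential) gives the grand canonical ensemble; choosing x = V with X = −P gives the isothermal–isobaric ensemble:

{\displaystyle \Pr(\omega )\propto e^{-E^{(\omega )}/k_{B}T},\qquad \Pr(\omega )\propto e^{\left(\mu N^{(\omega )}-E^{(\omega )}\right)/k_{B}T},\qquad \Pr(\omega )\propto e^{\left(-PV^{(\omega )}-E^{(\omega )}\right)/k_{B}T}}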

The generalized Boltzmann distribution has several notable properties; in particular, it is the only distribution for which the Gibbs–Shannon entropy equals the thermodynamic entropy. [10]

In statistical mechanics

The Boltzmann distribution appears in statistical mechanics when considering closed systems of fixed composition that are in thermal equilibrium (equilibrium with respect to energy exchange). The most general case is the probability distribution for the canonical ensemble. Some special cases (derivable from the canonical ensemble) show the Boltzmann distribution in different aspects:

Canonical ensemble (general case)
The canonical ensemble gives the probabilities of the various possible states of a closed system of fixed volume, in thermal equilibrium with a heat bath. The canonical ensemble has a state probability distribution with the Boltzmann form.
Statistical frequencies of subsystems' states (in a non-interacting collection)
When the system of interest is a collection of many non-interacting copies of a smaller subsystem, it is sometimes useful to find the statistical frequency of a given subsystem state among the collection. The canonical ensemble has the property of separability when applied to such a collection: as long as the non-interacting subsystems have fixed composition, then each subsystem's state is independent of the others and is also characterized by a canonical ensemble. As a result, the expected statistical frequency distribution of subsystem states has the Boltzmann form (written out explicitly in the sketch after this list).
Maxwell–Boltzmann statistics of classical gases (systems of non-interacting particles)
In particle systems, many particles share the same space and regularly change places with each other; the single-particle state space they occupy is a shared space. Maxwell–Boltzmann statistics give the expected number of particles found in a given single-particle state, in a classical gas of non-interacting particles at equilibrium. This expected number distribution has the Boltzmann form.
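
The factorization behind the separability property can be made explicit: for a collection of N non-interacting subsystems with fixed composition, the joint canonical distribution over the subsystem states s1, …, sN splits into a product of single-subsystem Boltzmann distributions,

{\displaystyle \Pr(s_{1},\ldots ,s_{N})=\prod _{m=1}^{N}{\frac {e^{-\varepsilon _{s_{m}}/kT}}{Q_{m}}}}

so the expected frequency of any given subsystem state across the collection again has the Boltzmann form.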

Although these cases have strong similarities, it is helpful to distinguish them, as they generalize in different ways when the crucial assumptions are changed.

In mathematics

In more general mathematical settings, the Boltzmann distribution is also known as the Gibbs measure, a generalization of the canonical ensemble to infinite systems.

In economics

The Boltzmann distribution can be used to allocate permits in emissions trading. [13] [14] This allocation method, based on the Boltzmann distribution, can describe the most probable, natural, and unbiased distribution of emissions permits among multiple countries.

The Boltzmann distribution has the same form as the multinomial logit model. As a discrete choice model, this is very well known in economics since Daniel McFadden made the connection to random utility maximization. [15]
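
In the discrete choice setting, with Vi denoting the deterministic utility of alternative i and λ a scale parameter, the multinomial logit choice probabilities take the same form, with −Vi playing the role of the energy εi and λ the role of kT:

{\displaystyle \Pr(i)={\frac {e^{V_{i}/\lambda }}{\sum _{j}e^{V_{j}/\lambda }}}}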

Related Research Articles

<span class="mw-page-title-main">Maxwell–Boltzmann distribution</span> Specific probability distribution function, important in physics

In physics, the Maxwell–Boltzmann distribution, or Maxwell(ian) distribution, is a particular probability distribution named after James Clerk Maxwell and Ludwig Boltzmann.

In physics, specifically statistical mechanics, an ensemble is an idealization consisting of a large number of virtual copies of a system, considered all at once, each of which represents a possible state that the real system might be in. In other words, a statistical ensemble is a set of systems of particles used in statistical mechanics to describe a single system. The concept of an ensemble was introduced by J. Willard Gibbs in 1902.

<span class="mw-page-title-main">Fermi–Dirac statistics</span> Statistical description for the behavior of fermions

Fermi–Dirac statistics is a type of quantum statistics that applies to the physics of a system consisting of many non-interacting, identical particles that obey the Pauli exclusion principle. A result is the Fermi–Dirac distribution of particles over energy states. It is named after Enrico Fermi and Paul Dirac, each of whom derived the distribution independently in 1926. Fermi–Dirac statistics is a part of the field of statistical mechanics and uses the principles of quantum mechanics.

<span class="mw-page-title-main">Maxwell–Boltzmann statistics</span> Statistical distribution used in many-particle mechanics

In statistical mechanics, Maxwell–Boltzmann statistics describes the distribution of classical material particles over various energy states in thermal equilibrium. It is applicable when the temperature is high enough or the particle density is low enough to render quantum effects negligible.

<span class="mw-page-title-main">Bose–Einstein statistics</span> Description of the behavior of bosons

In quantum statistics, Bose–Einstein statistics describes one of two possible ways in which a collection of non-interacting identical particles may occupy a set of available discrete energy states at thermodynamic equilibrium. The aggregation of particles in the same state, which is a characteristic of particles obeying Bose–Einstein statistics, accounts for the cohesive streaming of laser light and the frictionless creeping of superfluid helium. The theory of this behaviour was developed (1924–25) by Satyendra Nath Bose, who recognized that a collection of identical and indistinguishable particles can be distributed in this way. The idea was later adopted and extended by Albert Einstein in collaboration with Bose.

<span class="mw-page-title-main">Helmholtz free energy</span> Thermodynamic potential

In thermodynamics, the Helmholtz free energy is a thermodynamic potential that measures the useful work obtainable from a closed thermodynamic system at a constant temperature (isothermal). The change in the Helmholtz energy during a process is equal to the maximum amount of work that the system can perform in a thermodynamic process in which temperature is held constant. At constant temperature, the Helmholtz free energy is minimized at equilibrium.

In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
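
For example (standard relations), the Helmholtz free energy and the internal energy follow from the canonical partition function Q as

{\displaystyle F=-kT\ln Q,\qquad U=kT^{2}{\frac {\partial \ln Q}{\partial T}}}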

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

In statistical mechanics, the grand canonical ensemble is the statistical ensemble that is used to represent the possible states of a mechanical system of particles that are in thermodynamic equilibrium with a reservoir. The system is said to be open in the sense that the system can exchange energy and particles with a reservoir, so that various possible states of the system can differ in both their total energy and total number of particles. The system's volume, shape, and other external coordinates are kept the same in all possible states of the system.

In statistical mechanics, the microcanonical ensemble is a statistical ensemble that represents the possible states of a mechanical system whose total energy is exactly specified. The system is assumed to be isolated in the sense that it cannot exchange energy or particles with its environment, so that the energy of the system does not change with time.

In statistical mechanics, a canonical ensemble is the statistical ensemble that represents the possible states of a mechanical system in thermal equilibrium with a heat bath at a fixed temperature. The system can exchange energy with the heat bath, so that the states of the system will differ in total energy.

<span class="mw-page-title-main">Microstate (statistical mechanics)</span> Specific microscopic configuration of a thermodynamic system

In statistical mechanics, a microstate is a specific configuration of a system that describes the precise positions and momenta of all the individual particles or components that make up the system. Each microstate has a certain probability of occurring during the course of the system's thermal fluctuations.

In mathematics, the Gibbs measure, named after Josiah Willard Gibbs, is a probability measure frequently seen in many problems of probability theory and statistical mechanics. It is a generalization of the canonical ensemble to infinite systems. The canonical ensemble gives the probability of the system X being in state x as

{\displaystyle P(X=x)={\frac {e^{-\beta E(x)}}{Z(\beta )}}}

The isothermal–isobaric ensemble is a statistical mechanical ensemble that maintains constant temperature and constant applied pressure. It is also called the NPT ensemble, where the number of particles N is also kept constant. This ensemble plays an important role in chemistry, as chemical reactions are usually carried out under constant-pressure conditions. The NPT ensemble is also useful for measuring the equation of state of model systems whose virial expansion for pressure cannot be evaluated, or systems near first-order phase transitions.

The concept entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.

<span class="mw-page-title-main">Fundamental thermodynamic relation</span>

In thermodynamics, the fundamental thermodynamic relation comprises four fundamental equations which demonstrate how four important thermodynamic quantities depend on variables that can be controlled and measured experimentally. Thus, they are essentially equations of state, and using the fundamental equations, experimental data can be used to determine sought-after quantities like G (Gibbs free energy) or H (enthalpy). The relation is generally expressed as an infinitesimal change in internal energy in terms of infinitesimal changes in entropy and volume for a closed system in thermal equilibrium, in the following way:

{\displaystyle \mathrm {d} U=T\,\mathrm {d} S-P\,\mathrm {d} V}

The Crooks fluctuation theorem (CFT), sometimes known as the Crooks equation, is an equation in statistical mechanics that relates the work done on a system during a non-equilibrium transformation to the free energy difference between the final and the initial state of the transformation. During the non-equilibrium transformation the system is at constant volume and in contact with a heat reservoir. The CFT is named after the chemist Gavin E. Crooks (then at University of California, Berkeley) who discovered it in 1998.

The partition function or configuration integral, as used in probability theory, information theory and dynamical systems, is a generalization of the definition of a partition function in statistical mechanics. It is a special case of a normalizing constant in probability theory, for the Boltzmann distribution. The partition function occurs in many problems of probability theory because, in situations where there is a natural symmetry, its associated probability measure, the Gibbs measure, has the Markov property. This means that the partition function occurs not only in physical systems with translation symmetry, but also in such varied settings as neural networks, and applications such as genomics, corpus linguistics and artificial intelligence, which employ Markov networks, and Markov logic networks. The Gibbs measure is also the unique measure that has the property of maximizing the entropy for a fixed expectation value of the energy; this underlies the appearance of the partition function in maximum entropy methods and the algorithms derived therefrom.

<span class="mw-page-title-main">Thermal fluctuations</span> Random temperature-influenced deviations of particles from their average state

In statistical mechanics, thermal fluctuations are random deviations of an atomic system from its average state, that occur in a system at equilibrium. All thermal fluctuations become larger and more frequent as the temperature increases, and likewise they decrease as temperature approaches absolute zero.

The Gibbs rotational ensemble represents the possible states of a mechanical system in thermal and rotational equilibrium at temperature T and angular velocity ω. The Jaynes procedure can be used to obtain this ensemble. An ensemble is the set of microstates corresponding to a given macrostate.

References

  1. Landau, Lev Davidovich & Lifshitz, Evgeny Mikhailovich (1980) [1976]. Statistical Physics. Course of Theoretical Physics. Vol. 5 (3rd ed.). Oxford: Pergamon Press. ISBN 0-7506-3372-7. Translated by J.B. Sykes and M.J. Kearsley. See section 28.
  2. Boltzmann, Ludwig (1868). "Studien über das Gleichgewicht der lebendigen Kraft zwischen bewegten materiellen Punkten" [Studies on the balance of living force between moving material points]. Wiener Berichte. 58: 517–560.
  3. "Archived copy" (PDF). Archived from the original (PDF) on 2021-03-05. Retrieved 2017-05-11.
  4. Gibbs, Josiah Willard (1902). Elementary Principles in Statistical Mechanics. New York: Charles Scribner's Sons.
  5. Atkins, P. W. (2010). Quanta. New York: W. H. Freeman and Company.
  6. McQuarrie, A. (2000). Statistical Mechanics. Sausalito, CA: University Science Books. ISBN 1-891389-15-7.
  7. NIST Atomic Spectra Database Levels Form at nist.gov.
  8. Atkins, P. W.; de Paula, J. (2009). Physical Chemistry (9th ed.). Oxford: Oxford University Press. ISBN 978-0-19-954337-3.
  9. Skoog, D. A.; Holler, F. J.; Crouch, S. R. (2006). Principles of Instrumental Analysis. Boston, MA: Brooks/Cole. ISBN 978-0-495-12570-9.
  10. Gao, Xiang; Gallicchio, Emilio; Roitberg, Adrian (2019). "The generalized Boltzmann distribution is the only distribution in which the Gibbs–Shannon entropy equals the thermodynamic entropy". The Journal of Chemical Physics. 151 (3): 034113. arXiv:1903.02121. Bibcode:2019JChPh.151c4113G. doi:10.1063/1.5111333. PMID 31325924. S2CID 118981017.
  11. Gao, Xiang (March 2022). "The Mathematics of the Ensemble Theory". Results in Physics. 34: 105230. arXiv:2006.00485. Bibcode:2022ResPh..3405230G. doi:10.1016/j.rinp.2022.105230. S2CID 221978379.
  12. A classic example of this is magnetic ordering. Systems of non-interacting spins show paramagnetic behaviour that can be understood with a single-particle canonical ensemble (resulting in the Brillouin function). Systems of interacting spins can show much more complex behaviour such as ferromagnetism or antiferromagnetism.
  13. Park, J.-W.; Kim, C. U.; Isard, W. (2012). "Permit allocation in emissions trading using the Boltzmann distribution". Physica A. 391: 4883–4890.
  14. "The Thorny Problem of Fair Allocation". Technology Review blog. August 17, 2011. Cites and summarizes Park, Kim and Isard (2012).
  15. Amemiya, Takeshi (1985). "Multinomial Logit Model". Advanced Econometrics. Oxford: Basil Blackwell. pp. 295–299. ISBN 0-631-13345-3.