In statistical mechanics, the microcanonical ensemble is a statistical ensemble that represents the possible states of a mechanical system whose total energy is exactly specified. [1] The system is assumed to be isolated in the sense that it cannot exchange energy or particles with its environment, so that (by conservation of energy) the energy of the system does not change with time.
The primary macroscopic variables of the microcanonical ensemble are the total number of particles in the system (symbol: N), the system's volume (symbol: V), as well as the total energy in the system (symbol: E). Each of these is assumed to be constant in the ensemble. For this reason, the microcanonical ensemble is sometimes called the NVE ensemble.
In simple terms, the microcanonical ensemble is defined by assigning an equal probability to every microstate whose energy falls within a range centered at E. All other microstates are given a probability of zero. Since the probabilities must add up to 1, the probability P is the inverse of the number of microstates W within the range of energy: P = 1/W.
The range of energy is then reduced in width until it is infinitesimally narrow, still centered at E. In the limit of this process, the microcanonical ensemble is obtained. [1]
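As a toy illustration (the two-level model below is an assumption for demonstration, not an example from the cited sources), one can enumerate microstates, keep those within a narrow window around E, and assign each the probability 1/W:

```python
# Toy illustration: equal a priori probabilities in a narrow energy window.
# Assumed model (not from the article): N two-level units with level spacing eps.
from itertools import product

def microcanonical_weights(N, eps, E, omega):
    """Return {microstate: probability} for states with |energy - E| <= omega/2."""
    states = []
    for config in product((0, 1), repeat=N):      # each unit has energy 0 or eps
        energy = eps * sum(config)
        if abs(energy - E) <= omega / 2:
            states.append(config)
    W = len(states)                               # number of accessible microstates
    return {s: 1.0 / W for s in states} if W else {}

probs = microcanonical_weights(N=4, eps=1.0, E=2.0, omega=0.1)
print(len(probs), "microstates, each with probability", 1.0 / len(probs))
```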
Because of its connection with the elementary assumptions of equilibrium statistical mechanics (particularly the postulate of a priori equal probabilities), the microcanonical ensemble is an important conceptual building block in the theory. [2] It is sometimes considered to be the fundamental distribution of equilibrium statistical mechanics. It is also useful in some numerical applications, such as molecular dynamics. [3] [4] On the other hand, most nontrivial systems are mathematically cumbersome to describe in the microcanonical ensemble, and there are also ambiguities regarding the definitions of entropy and temperature. For these reasons, other ensembles are often preferred for theoretical calculations. [2] [5] [6]
The applicability of the microcanonical ensemble to real-world systems depends on the importance of energy fluctuations, which may result from interactions between the system and its environment as well as uncontrolled factors in preparing the system. Generally, fluctuations are negligible if a system is macroscopically large, or if it is manufactured with precisely known energy and thereafter maintained in near isolation from its environment. [7] In such cases the microcanonical ensemble is applicable. Otherwise, different ensembles are more appropriate – such as the canonical ensemble (fluctuating energy) or the grand canonical ensemble (fluctuating energy and particle number).
The fundamental thermodynamic potential of the microcanonical ensemble is entropy. There are at least three possible definitions, each given in terms of the phase volume function v(E). In classical mechanics, v(E) is the volume of the region of phase space where the energy is less than E. In quantum mechanics, v(E) is roughly the number of energy eigenstates with energy less than E; however, this must be smoothed so that we can take its derivative (see the Precise expressions section for details on how this is done). The definitions of microcanonical entropy are:
- the Boltzmann entropy SB = k log(ω dv/dE),
- the "volume entropy" Sv = k log v(E),
- the "surface entropy" Ss = k log(dv/dE).
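Given any smooth phase-volume function v(E), the three entropies can be compared numerically. The sketch below is a minimal illustration assuming an ideal-gas-like scaling v(E) = C E^(3N/2), with C, ω, and k = 1 chosen arbitrarily:

```python
# Compare the three microcanonical entropies for v(E) = C * E**(3N/2)
# (ideal-gas-like scaling; C, omega, and k = 1 are illustrative choices).
import math

def entropies(v, E, omega, k=1.0, dE=1e-6):
    dvdE = (v(E + dE) - v(E - dE)) / (2 * dE)     # numerical derivative dv/dE
    S_B = k * math.log(omega * dvdE)              # Boltzmann entropy
    S_v = k * math.log(v(E))                      # volume entropy
    S_s = k * math.log(dvdE)                      # surface entropy
    return S_B, S_v, S_s

N, C = 10, 1.0
v = lambda E: C * E ** (1.5 * N)
print(entropies(v, E=100.0, omega=0.01))
```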
In the microcanonical ensemble, the temperature is a derived quantity rather than an external control parameter. It is obtained by differentiating the chosen entropy with respect to energy. [8] For example, one can define the "temperatures" Tv and Ts as follows: 1/Tv = dSv/dE and 1/Ts = dSs/dE.
Like entropy, there are multiple ways to understand temperature in the microcanonical ensemble. More generally, the correspondence between these ensemble-based definitions and their thermodynamic counterparts is not perfect, particularly for finite systems.
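As a worked example (assuming the illustrative scaling v(E) = C E^(3N/2), not a formula taken from the cited sources), the volume and surface temperatures differ at finite N:

$$
S_v = k\log\!\big(C E^{3N/2}\big) \;\Rightarrow\; T_v = \Big(\tfrac{dS_v}{dE}\Big)^{-1} = \frac{2E}{3Nk},
\qquad
S_s = k\log\!\Big(\tfrac{3N}{2}\,C E^{3N/2-1}\Big) \;\Rightarrow\; T_s = \frac{2E}{(3N-2)k},
$$

so the two temperatures agree only up to corrections of order 1/N, consistent with the remark above that the correspondence is imperfect for finite systems.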
The microcanonical pressure and chemical potential are given by: [9]
P = T (∂S/∂V) at fixed E and N, and μ = −T (∂S/∂N) at fixed E and V.
Under their strict definition, phase transitions correspond to nonanalytic behavior in the thermodynamic potential or its derivatives. [10] Using this definition, phase transitions in the microcanonical ensemble can occur in systems of any size. This contrasts with the canonical and grand canonical ensembles, for which phase transitions can occur only in the thermodynamic limit – i.e., in systems with infinitely many degrees of freedom. [10] [11] Roughly speaking, the reservoirs defining the canonical or grand canonical ensembles introduce fluctuations that "smooth out" any nonanalytic behavior in the free energy of finite systems. This smoothing effect is usually negligible in macroscopic systems, which are sufficiently large that the free energy can approximate nonanalytic behavior exceedingly well. However, the technical difference in ensembles may be important in the theoretical analysis of small systems. [11]
For a given mechanical system (fixed N, V) and a given range of energy, the uniform distribution of probability P over microstates (as in the microcanonical ensemble) maximizes the ensemble average −⟨log P⟩. [1]
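A brief justification, sketched here as the standard maximum-entropy argument rather than quoted from the reference: for any probability distribution P supported on the W accessible microstates,

$$
-\langle \log P\rangle \;=\; -\sum_{i=1}^{W} P_i \log P_i \;\le\; \log W,
$$

with equality exactly when Pi = 1/W for every accessible microstate, by Gibbs' inequality (equivalently, the non-negativity of the Kullback–Leibler divergence from the uniform distribution).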
Early work in statistical mechanics by Ludwig Boltzmann led to his eponymous entropy equation for a system of a given total energy, S = k log W, where W is the number of distinct states accessible by the system at that energy. Boltzmann did not elaborate in much detail on what exactly constitutes the set of distinct states of a system, beyond the special case of an ideal gas. The topic was investigated thoroughly by Josiah Willard Gibbs, who developed the generalized statistical mechanics of arbitrary mechanical systems and defined the microcanonical ensemble described in this article. [1] Gibbs carefully investigated the analogies between the microcanonical ensemble and thermodynamics, especially how they break down for systems of few degrees of freedom. He introduced two further definitions of microcanonical entropy that do not depend on ω – the volume and surface entropy described above. (Note that the surface entropy differs from the Boltzmann entropy only by an ω-dependent offset.)
The volume entropy Sv and its associated temperature Tv are closely analogous to thermodynamic entropy and temperature. It is possible to show exactly that
dSv = (dE + ⟨P⟩ dV)/Tv
(⟨P⟩ is the ensemble average pressure), as expected for the first law of thermodynamics. A similar equation can be found for the surface entropy (or Boltzmann entropy) and its associated temperature Ts; however, the "pressure" in that equation is a complicated quantity unrelated to the average pressure. [1]
The microcanonical temperatures Tv and Ts are not entirely satisfactory in their analogy to temperature as defined using a canonical ensemble. Outside of the thermodynamic limit, a number of artefacts occur.
The preferred solution to these problems is to avoid using the microcanonical ensemble. In many realistic cases a system is thermostatted to a heat bath so that the energy is not precisely known. Then a more accurate description is the canonical ensemble or grand canonical ensemble, both of which have complete correspondence to thermodynamics. [14]
The precise mathematical expression for a statistical ensemble depends on the kind of mechanics under consideration – quantum or classical – since the notion of a "microstate" is considerably different in these two cases. In quantum mechanics, diagonalization provides a discrete set of microstates with specific energies. The classical mechanical case involves instead an integral over canonical phase space, and the size of microstates in phase space can be chosen somewhat arbitrarily.
To construct the microcanonical ensemble, it is necessary in both types of mechanics to first specify a range of energy. In the expressions below the function f((H − E)/ω) (a function of H, peaking at E with width ω) will be used to represent the range of energy in which to include states. An example of this function would be [1]
f(x) = 1 if |x| < 1/2, and f(x) = 0 otherwise,
or, more smoothly, a Gaussian profile such as f(x) = exp(−πx²).
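The sketch below implements both choices of window function; the Gaussian form is an assumed smooth alternative, and the parameter values are illustrative:

```python
# Two example energy-window functions f((H - E)/omega) used to select states
# near energy E: a sharp (top-hat) window and a smooth Gaussian window.
import math

def window_sharp(H, E, omega):
    """1 inside the window of width omega centred at E, else 0."""
    x = (H - E) / omega
    return 1.0 if abs(x) < 0.5 else 0.0

def window_gaussian(H, E, omega):
    """Smooth alternative: a Gaussian profile in x = (H - E)/omega."""
    x = (H - E) / omega
    return math.exp(-math.pi * x * x)

for H in (0.9, 1.0, 1.2):
    print(H, window_sharp(H, E=1.0, omega=0.5), window_gaussian(H, E=1.0, omega=0.5))
```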
A statistical ensemble in quantum mechanics is represented by a density matrix, denoted by ρ̂. The microcanonical ensemble can be written using bra–ket notation, in terms of the system's energy eigenstates and energy eigenvalues. Given a complete basis of energy eigenstates |ψi⟩, indexed by i, the microcanonical ensemble is
ρ̂ = (1/W) Σi f((Hi − E)/ω) |ψi⟩⟨ψi|,
where the Hi are the energy eigenvalues determined by Ĥ|ψi⟩ = Hi|ψi⟩ (here Ĥ is the system's total energy operator, i. e., Hamiltonian operator). The value of W is determined by demanding that ρ̂ is a normalized density matrix, and so
W = Σi f((Hi − E)/ω).
The state volume function (used to calculate entropy) is given by the number of eigenstates with energy less than E:
v(E) = Σi Θ(E − Hi),
where Θ is the Heaviside step function.
The microcanonical ensemble is defined by taking the limit of the density matrix as the energy width goes to zero; however, a problematic situation occurs once the energy width becomes smaller than the spacing between energy levels. For a very small energy width, the ensemble does not exist at all for most values of E, since no states fall within the range. When the ensemble does exist, it typically contains only one (or two) states, since in a complex system the energy levels are only ever equal by accident (see random matrix theory for more discussion on this point). Moreover, the state-volume function increases only in discrete increments, so its derivative is only ever infinite or zero, making it difficult to define the density of states. This problem can be solved by not taking the energy range completely to zero and by smoothing the state-volume function; however, this makes the definition of the ensemble more complicated, since it then becomes necessary to specify the energy range in addition to the other variables (together, an NVEω ensemble).
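As a numerical illustration (the random Hamiltonian, window width, and Gaussian smoothing below are assumptions for demonstration), one can diagonalize a small Hermitian matrix and assemble the smoothed microcanonical density matrix:

```python
# Build a smoothed microcanonical density matrix for a small random Hamiltonian.
# The Hamiltonian, target energy E, and window width omega are illustrative.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 50))
H = (A + A.T) / 2                      # Hermitian "Hamiltonian"
energies, vecs = np.linalg.eigh(H)     # eigenvalues H_i and eigenstates |psi_i>

E, omega = 0.0, 0.5                    # target energy and window width
weights = np.exp(-np.pi * ((energies - E) / omega) ** 2)   # f((H_i - E)/omega)
W = weights.sum()                      # normalization
rho = (vecs * (weights / W)) @ vecs.T  # rho = (1/W) sum_i f_i |psi_i><psi_i|

print("trace =", np.trace(rho))        # should be 1 (normalized)
print("v(E) =", np.sum(energies < E))  # state-volume function: # of states below E
```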
In classical mechanics, an ensemble is represented by a joint probability density function ρ(p1, ... pn, q1, ... qn) defined over the system's phase space. [1] The phase space has n generalized coordinates called q1, ... qn, and n associated canonical momenta called p1, ... pn.
The probability density function for the microcanonical ensemble is:
ρ = (1/(h^n C)) (1/W) f((H − E)/ω),
where
- H is the total energy (Hamiltonian) of the system, a function of the phase (p1, ... qn),
- h is an arbitrary but predetermined constant with the units of energy × time, setting the extent of one microstate and providing correct dimensions to ρ,
- C is an overcounting correction factor, commonly taken to be N! for a system of N identical particles (see Gibbs paradox).
Again, the value of W is determined by demanding that ρ is a normalized probability density function:
W = (1/(h^n C)) ∫ f((H − E)/ω) dp1 ... dqn.
This integral is taken over the entire phase space. The state volume function (used to calculate entropy) is defined by
v(E) = (1/(h^n C)) ∫ dp1 ... dqn,
the integral being restricted to the region of phase space where H < E.
As the energy width ω is taken to zero, the value of W decreases in proportion to ω as W = ω (dv/dE).
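For a concrete toy Hamiltonian, v(E) and W can be estimated by Monte Carlo integration. The sketch below assumes H = Σ(pi² + qi²)/2 with h = 1 and C = 1, which are illustrative choices, and checks the relation W ≈ ω (dv/dE):

```python
# Monte Carlo estimates of the phase-volume v(E) and the shell measure W for a
# toy Hamiltonian H = sum_i (p_i^2 + q_i^2)/2 (illustrative; units with h = 1, C = 1).
import numpy as np

rng = np.random.default_rng(1)

def box_samples(E_max, n, samples):
    """Uniform samples in a box that encloses the region H < E_max."""
    R = np.sqrt(2 * E_max)                       # H < E_max implies |p_i|, |q_i| < R
    x = rng.uniform(-R, R, size=(samples, 2 * n))
    return x, (2 * R) ** (2 * n)

def estimate_v(E, n=3, samples=400_000):
    """v(E): phase-space volume of the region H < E."""
    x, box = box_samples(E, n, samples)
    H = 0.5 * np.sum(x ** 2, axis=1)
    return box * np.mean(H < E)

def estimate_W(E, omega, n=3, samples=400_000):
    """W: phase-space measure of the shell |H - E| < omega/2."""
    x, box = box_samples(E + omega / 2, n, samples)
    H = 0.5 * np.sum(x ** 2, axis=1)
    return box * np.mean(np.abs(H - E) < omega / 2)

E, omega, n = 1.0, 0.01, 3
print("v(E)      ~", estimate_v(E, n))                  # exact: pi^3 (2E)^3 / 3! ~ 41.3
print("W / omega ~", estimate_W(E, omega, n) / omega)   # ~ dv/dE = 4 pi^3 ~ 124
```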
Based on the above definition, the microcanonical ensemble can be visualized as an infinitesimally thin shell in phase space, centered on a constant-energy surface. Although the microcanonical ensemble is confined to this surface, it is not necessarily uniformly distributed over that surface: if the gradient of energy in phase space varies, then the microcanonical ensemble is "thicker" (more concentrated) in some parts of the surface than others. This feature is an unavoidable consequence of requiring that the microcanonical ensemble is a steady-state ensemble.
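One standard way to make this precise (a textbook identity, not spelled out in the cited text) is the co-area formula: in the limit ω → 0, ensemble averages become surface integrals weighted by 1/|∇H|,

$$
\langle A \rangle \;=\; \frac{1}{W}\int \delta\big(E - H(p,q)\big)\,A(p,q)\,dp\,dq
\;=\; \frac{1}{W}\int_{H=E} \frac{A\,d\sigma}{\lvert\nabla H\rvert},
$$

where W is the normalizing constant. Regions of the energy surface where |∇H| is small therefore carry proportionally more weight, which is precisely what makes the ensemble stationary under the Hamiltonian flow.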
The fundamental quantity in the microcanonical ensemble is W(E, V, N), which is equal to the phase space volume compatible with given (E, V, N). From W, all thermodynamic quantities can be calculated. For an ideal gas, the energy is independent of the particle positions, which therefore contribute a factor of V^N to W. The momenta, by contrast, are constrained to a (3N − 1)-dimensional (hyper-)spherical shell of radius √(2mE); their contribution is equal to the surface volume of this shell. The resulting expression for W is: [15]
W = (V^N / N!) × [2π^(3N/2) / Γ(3N/2)] × (2mE)^((3N−1)/2),
where Γ is the gamma function, and the factor N! has been included to account for the indistinguishability of particles (see Gibbs paradox). In the large-N limit, the Boltzmann entropy is
S = kN log[(V/N) (4πmE / (3Nh^2))^(3/2)] + (5/2)kN.
This is also known as the Sackur–Tetrode equation.
The temperature is given by
1/T ≡ ∂S/∂E = (3/2) Nk/E, equivalently E = (3/2) NkT,
which agrees with the analogous result from the kinetic theory of gases. Calculating the pressure gives the ideal gas law:
P = T (∂S/∂V) = NkT/V, i.e., PV = NkT.
Finally, the chemical potential is
μ = −T (∂S/∂N) = kT log[(N/V) (3Nh^2 / (4πmE))^(3/2)].
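As a numerical cross-check (with illustrative parameter values in SI units; the particle mass is roughly that of a helium atom), the sketch below differentiates the Sackur–Tetrode entropy numerically and recovers the temperature and pressure relations above:

```python
# Numerically differentiate the Sackur-Tetrode entropy to recover T and P
# for an ideal gas (illustrative values in SI units).
import numpy as np

k = 1.380649e-23      # Boltzmann constant (J/K)
h = 6.62607015e-34    # Planck constant (J s)
m = 6.6e-27           # particle mass (~helium atom, kg)

def S(E, V, N):
    """Sackur-Tetrode entropy in the large-N limit."""
    return k * N * np.log((V / N) * (4 * np.pi * m * E / (3 * N * h**2))**1.5) + 2.5 * k * N

N = 1e22
V = 1e-3                   # m^3
E = 1.5 * N * k * 300.0    # energy corresponding to T = 300 K

dE, dV = E * 1e-6, V * 1e-6
T = 1.0 / ((S(E + dE, V, N) - S(E - dE, V, N)) / (2 * dE))    # 1/T = dS/dE
P = T * (S(E, V + dV, N) - S(E, V - dV, N)) / (2 * dV)        # P = T dS/dV
print("T ~", T, "K (expected 300)")
print("P ~", P, "Pa vs N k T / V =", N * k * T / V)
```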
The microcanonical phase volume can also be calculated explicitly for an ideal gas in a uniform gravitational field. [16]
The results are stated below for a 3-dimensional ideal gas of N particles, each with mass m, confined in a thermally isolated container that is infinitely long in the z-direction and has constant cross-sectional area A. The gravitational field is assumed to act in the minus z direction with strength g. The phase volume is
v(E) = (1/(N! h^(3N))) (A/(mg))^N (2πm)^(3N/2) E^(5N/2) / Γ(5N/2 + 1),
where E is the total energy, kinetic plus gravitational.
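As a sanity check of this formula for the simplest case N = 1 (with the illustrative choice m = g = A = h = 1), a Monte Carlo integration over one particle's phase space reproduces the analytic value (2π)^(3/2) E^(5/2) / Γ(7/2):

```python
# Monte Carlo check of the single-particle (N = 1) phase volume for an ideal gas
# in a uniform gravitational field. Units with m = g = A = h = 1 are an
# illustrative assumption; the analytic value is (2*pi)**1.5 * E**2.5 / Gamma(7/2).
import numpy as np
from math import gamma, pi, sqrt

rng = np.random.default_rng(2)
E, samples = 1.0, 500_000

p_max, z_max = sqrt(2 * E), E          # bounds: p^2/2 <= E and z <= E (m = g = 1)
p = rng.uniform(-p_max, p_max, size=(samples, 3))
z = rng.uniform(0.0, z_max, size=samples)
H = 0.5 * np.sum(p ** 2, axis=1) + z   # kinetic + gravitational energy

box = (2 * p_max) ** 3 * z_max         # volume of the sampling box (times A = 1)
v_mc = box * np.mean(H < E)
v_exact = (2 * pi) ** 1.5 * E ** 2.5 / gamma(3.5)
print("Monte Carlo v(E) ~", v_mc, " analytic ~", v_exact)   # both ~ 4.7
```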
The gas density as a function of height can be obtained by integrating over the phase volume coordinates. The result is:
Similarly, the distribution of the velocity magnitude (averaged over all heights) is
The analogues of these equations in the canonical ensemble are the barometric formula and the Maxwell–Boltzmann distribution, respectively. In the limit N → ∞, the microcanonical and canonical expressions coincide; however, they differ for finite N. In particular, in the microcanonical ensemble, the positions and velocities are not statistically independent. As a result, the kinetic temperature, defined as the average kinetic energy in a given volume dV, is nonuniform throughout the container.
By contrast, the temperature is uniform in the canonical ensemble, for any N. [17]