Microstate (statistical mechanics)

A diagram of the microstates and macrostates of flipping a coin twice. All four microstates are equally probable, but the macrostate consisting of the unordered outcome (H, T) is twice as probable as the macrostates (H, H) and (T, T).

In statistical mechanics, a microstate is a specific configuration of a system that describes the precise positions and momenta of all the individual particles or components that make up the system. Each microstate has a certain probability of occurring during the course of the system's thermal fluctuations.
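As an illustration of the two-coin example pictured above, the following short Python sketch (the names are illustrative only) enumerates the microstates and groups them into macrostates by the number of heads:

```python
from itertools import product
from collections import Counter

# Each microstate is an ordered sequence of coin outcomes, e.g. ('H', 'T').
microstates = list(product('HT', repeat=2))

# A macrostate ignores ordering; here it is labeled by the number of heads.
macrostates = Counter(state.count('H') for state in microstates)

for heads, count in sorted(macrostates.items()):
    # All microstates are equally probable, so a macrostate's probability is
    # its microstate count divided by the total number of microstates.
    print(f"{heads} heads: {count} microstate(s), p = {count / len(microstates)}")
```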


In contrast, the macrostate of a system refers to its macroscopic properties, such as its temperature, pressure, volume and density. [1] Treatments of statistical mechanics [2] [3] define a macrostate as follows: a particular set of values of energy, number of particles, and volume of an isolated thermodynamic system is said to specify a particular macrostate of it. In this description, microstates appear as different possible ways the system can achieve a particular macrostate.

A macrostate is characterized by a probability distribution over a certain statistical ensemble of all possible microstates. This distribution describes the probability of finding the system in a certain microstate. In the thermodynamic limit, the microstates visited by a macroscopic system during its fluctuations all have the same macroscopic properties.

In a quantum system, the microstate is simply the value of the wave function. [4]

Microscopic definitions of thermodynamic concepts

Statistical mechanics links the empirical thermodynamic properties of a system to the statistical distribution of an ensemble of microstates. All macroscopic thermodynamic properties of a system may be calculated from the partition function, which sums over all of its microstates.
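As a minimal sketch of this link, assuming an illustrative two-level system (the energies and temperature below are arbitrary choices, not from the cited sources), the canonical partition function and the microstate occupation probabilities can be computed as:

```python
import math

k_B = 1.380649e-23            # Boltzmann constant in J/K
energies = [0.0, 1.0e-21]     # illustrative microstate energies in joules
T = 300.0                     # temperature in kelvin

# Canonical partition function: a sum of Boltzmann factors over all microstates.
Z = sum(math.exp(-E / (k_B * T)) for E in energies)

# The probability of occupying each microstate follows directly from Z.
probabilities = [math.exp(-E / (k_B * T)) / Z for E in energies]
print(Z, probabilities)
```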

At any moment a system is distributed across an ensemble of microstates, each labeled by an index $i$, and having a probability of occupation $p_i$ and an energy $E_i$. If the microstates are quantum-mechanical in nature, then these microstates form a discrete set as defined by quantum statistical mechanics, and $E_i$ is an energy level of the system.

Internal energy

The internal energy of the macrostate is the mean over all microstates of the system's energy:

$$U = \langle E \rangle = \sum_i p_i \, E_i$$

This is a microscopic statement of the notion of energy associated with the first law of thermodynamics.
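Continuing the illustrative two-level sketch above, the internal energy is the probability-weighted mean of the microstate energies:

```python
# Continuing the two-level sketch above: internal energy as an ensemble mean.
U = sum(p * E for p, E in zip(probabilities, energies))
print(f"Internal energy U = {U:.3e} J")
```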

Entropy

For the more general case of the canonical ensemble, the absolute entropy depends exclusively on the probabilities of the microstates and is defined as

$$S = -k_B \sum_i p_i \ln p_i$$

where $k_B$ is the Boltzmann constant. For the microcanonical ensemble, consisting of only those microstates with energy equal to the energy of the macrostate, this simplifies to

$$S = k_B \ln W$$

with the number of microstates $W = 1/p_i$, since all microstates have the same probability of occupation. This form for entropy appears on Ludwig Boltzmann's gravestone in Vienna.
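A quick numerical check of these two expressions (an illustrative sketch, not from the cited sources): for a uniform distribution over $W$ microstates, the Gibbs sum reduces to Boltzmann's $k_B \ln W$.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probabilities):
    # General (canonical) form: S = -k_B * sum_i p_i ln p_i.
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

W = 1000
uniform = [1.0 / W] * W
# For equal probabilities the Gibbs form reduces to Boltzmann's S = k_B ln W.
print(gibbs_entropy(uniform), k_B * math.log(W))  # both ~9.54e-23 J/K
```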

The second law of thermodynamics describes how the entropy of an isolated system changes in time: it never decreases. The third law of thermodynamics is consistent with this definition, since zero entropy means that the macrostate of the system reduces to a single microstate.

Heat and work

Heat and work can be distinguished if we take the underlying quantum nature of the system into account.

For a closed system (no transfer of matter), heat in statistical mechanics is the energy transfer associated with a disordered, microscopic action on the system, involving jumps in the occupation numbers of the quantum energy levels of the system, without change in the values of the energy levels themselves. [2]

Work is the energy transfer associated with an ordered, macroscopic action on the system. If this action is performed very slowly, then the adiabatic theorem of quantum mechanics implies that this will not cause jumps between energy levels of the system. In this case, the internal energy of the system changes only due to a change of the system's energy levels. [2]

The microscopic, quantum definitions of heat and work are the following:

$$\delta W = \sum_i p_i \, dE_i$$

$$\delta Q = \sum_i E_i \, dp_i$$

so that

$$dU = \delta W + \delta Q.$$

The two above definitions of heat and work are among the few expressions of statistical mechanics where the thermodynamic quantities defined in the quantum case find no analogous definition in the classical limit. The reason is that classical microstates are not defined in relation to a precise associated quantum microstate, which means that when work changes the total energy available for distribution among the classical microstates of the system, the energy levels (so to speak) of the microstates do not follow this change.
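A minimal numerical sketch of this decomposition for a hypothetical two-level system: work comes from shifting the energy levels at fixed occupations, heat from redistributing the occupations at fixed levels (all numbers are arbitrary illustrative values).

```python
def energy_change(p, E, dp, dE):
    # Work: energy transfer from shifting the levels at fixed occupations.
    work = sum(p_i * dE_i for p_i, dE_i in zip(p, dE))
    # Heat: energy transfer from redistributing occupations at fixed levels.
    heat = sum(E_i * dp_i for E_i, dp_i in zip(E, dp))
    return work, heat

p, E = [0.7, 0.3], [0.0, 1.0]      # occupations and levels (arbitrary units)
dp, dE = [-0.1, 0.1], [0.0, 0.2]   # small changes; the dp must sum to zero
work, heat = energy_change(p, E, dp, dE)
print(work, heat, work + heat)     # dU = δW + δQ
```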

The microstate in phase space

Classical phase space

The description of a classical system of $F$ degrees of freedom may be stated in terms of a $2F$-dimensional phase space, whose coordinate axes consist of the $F$ generalized coordinates $q_i$ of the system and its $F$ generalized momenta $p_i$. The microstate of such a system will be specified by a single point in the phase space. But for a system with a huge number of degrees of freedom, its exact microstate usually is not important. So the phase space can be divided into cells of the size $h_0 = \Delta q_i \, \Delta p_i$, each treated as a microstate. Now the microstates are discrete and countable [5] and the internal energy $U$ no longer has an exact value but is between $U$ and $U + \delta U$, with $\delta U \ll U$.

The number of microstates $\Omega$ that a closed system can occupy is proportional to its phase space volume:

$$\Omega(U) = \frac{1}{h_0^F} \int \mathbf{1}_{[U,\,U+\delta U]}\big(H(x)\big) \, \prod_{i=1}^{F} dq_i \, dp_i$$

where $\mathbf{1}_{[U,\,U+\delta U]}\big(H(x)\big)$ is an indicator function: it is 1 if the Hamilton function $H(x)$ at the point $x = (q, p)$ in phase space is between $U$ and $U + \delta U$, and 0 if not. The constant $h_0^F$ makes $\Omega(U)$ dimensionless. For an ideal gas, $\Omega(U) \propto V^N U^{3N/2 - 1} \, \delta U$. [6]
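As a rough illustration (a sketch under assumed toy parameters, not from the cited sources), the following Python snippet estimates $\Omega(U)$ for a single free particle in a one-dimensional box by counting discretized phase-space cells whose energy lies in $[U, U + \delta U]$:

```python
import numpy as np

m, L = 1.0, 1.0             # particle mass and box length (arbitrary units)
h0 = 1.0e-4                 # phase-space cell size Δq·Δp (arbitrary choice)
U, dU = 1.0, 0.05           # energy window [U, U + δU]

dq = dp = np.sqrt(h0)       # split the cell size evenly between q and p
qs = np.arange(0.0, L, dq)
ps = np.arange(-3.0, 3.0, dp)   # momentum range wide enough to cover the shell

# Hamilton function of a free particle in a box: H = p^2/(2m), independent of q.
Q, P = np.meshgrid(qs, ps)
H = P**2 / (2.0 * m)

# Each cell counts as one microstate if its energy lies inside the window.
Omega = np.count_nonzero((H >= U) & (H <= U + dU))

# Rough analytic estimate of the same count, for comparison:
print(Omega, 2 * L * (np.sqrt(2*m*(U+dU)) - np.sqrt(2*m*U)) / h0)
```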

In this description, the particles are distinguishable. If the position and momentum of two particles are exchanged, the new state will be represented by a different point in phase space. In this case a single point will represent a microstate. If a subset of $M$ particles are indistinguishable from each other, then the $M!$ possible permutations or possible exchanges of these particles will be counted as part of a single microstate. The constraints upon the thermodynamic system are also reflected in the set of possible microstates.

For example, in the case of a simple gas of $N$ particles with total energy $U$ contained in a cube of volume $V$, in which a sample of the gas cannot be distinguished from any other sample by experimental means, a microstate will consist of the above-mentioned $N!$ points in phase space, and the set of microstates will be constrained so that all position coordinates lie inside the box and the momenta lie on a hyperspherical surface of radius $\sqrt{2mU}$ in momentum coordinates. If, on the other hand, the system consists of a mixture of two different gases, samples of which can be distinguished from each other, say A and B, then the number of microstates is increased, since two points in which an A and a B particle are exchanged in phase space are no longer part of the same microstate. Two particles that are identical may nevertheless be distinguishable based on, for example, their location. (See configurational entropy.) If the box contains identical particles, and is at equilibrium, and a partition is inserted, dividing the volume in half, particles in one box are now distinguishable from those in the second box. In phase space, the $N/2$ particles in each box are now restricted to a volume $V/2$, and their energy restricted to $U/2$, and the number of points describing a single microstate will change: the phase space description is not the same.

This has implications for both the Gibbs paradox and correct Boltzmann counting. With regard to Boltzmann counting, it is the multiplicity of points in phase space which effectively reduces the number of microstates and renders the entropy extensive. With regard to the Gibbs paradox, the important result is that the increase in the number of microstates (and thus the increase in entropy) resulting from the insertion of the partition is exactly matched by the decrease in the number of microstates (and thus the decrease in entropy) resulting from the reduction in volume available to each particle, yielding a net entropy change of zero.
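A small numerical sketch of correct Boltzmann counting (illustrative, not from the references): comparing the configurational part of the entropy with and without the $1/N!$ factor shows that only the corrected count is extensive, i.e. doubles when $N$ and $V$ are doubled.

```python
import math

def config_entropy(N, V, corrected=True):
    # Configurational part of ln(Omega): N ln V, minus ln N! when the
    # particles are counted as indistinguishable (correct Boltzmann counting).
    s = N * math.log(V)
    if corrected:
        s -= math.lgamma(N + 1)   # ln N! via the log-gamma function
    return s                       # entropy in units of k_B

N, V = 100, 1000.0
for corrected in (True, False):
    s1 = config_entropy(N, V, corrected)
    s2 = config_entropy(2 * N, 2 * V, corrected)
    print(corrected, s2 / s1)      # ratio ~2.0 only for the corrected count
```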


References

  1. "Macrostates and Microstates" Archived 2012-03-05 at the Wayback Machine
  2. Reif, Frederick (1965). Fundamentals of Statistical and Thermal Physics. McGraw-Hill. pp. 66–70. ISBN 978-0-07-051800-1.
  3. Pathria, R. K. (1965). Statistical Mechanics. Butterworth-Heinemann. p. 10. ISBN 0-7506-2469-8.
  4. Eastman. "The Statistical Description of Physical Systems". Stanford. Retrieved 13 August 2023.
  5. "The Statistical Description of Physical Systems".
  6. Bartelmann, Matthias (2015). Theoretische Physik. Springer Spektrum. pp. 1142–1145. ISBN 978-3-642-54617-4.