Molecular chaos

In the kinetic theory of gases in physics, the molecular chaos hypothesis (also called Stosszahlansatz in the writings of Paul Ehrenfest[1][2]) is the assumption that the velocities of colliding particles are uncorrelated and independent of position. This means the probability that a pair of particles with given velocities will collide can be calculated by considering each particle separately, ignoring any correlation between the probability of finding one particle with velocity v and the probability of finding another particle with velocity v′ in a small region δr. James Clerk Maxwell introduced this approximation in 1867,[3] although its origins can be traced back to his first work on the kinetic theory in 1860.[4][5]

The assumption of molecular chaos is the key ingredient that allows proceeding from the BBGKY hierarchy to Boltzmann's equation, by reducing the two-particle distribution function appearing in the collision term to a product of one-particle distributions. This in turn leads to Boltzmann's H-theorem of 1872,[6] which attempted to use kinetic theory to show that the entropy of a gas prepared in a state of less than complete disorder must inevitably increase as the gas molecules are allowed to collide. This drew the objection from Loschmidt that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong (Loschmidt's paradox). The resolution (1895) of this paradox is that the velocities of two particles after a collision are no longer truly uncorrelated. By asserting that it was acceptable to ignore these correlations in the population at times after the initial time, Boltzmann had introduced an element of time asymmetry through the formalism of his calculation.[citation needed]
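
The factorization can be illustrated numerically. The following sketch (illustrative Python, not drawn from the article's sources) samples two sets of velocities independently and checks that the empirical two-particle distribution agrees with the product of the one-particle marginals, which is precisely what the Stosszahlansatz assumes for pairs about to collide:

```python
import numpy as np

# Sketch of the Stosszahlansatz for statistically independent velocities:
# the two-particle distribution f2(v, v') factorizes as f1(v) * f1(v').
rng = np.random.default_rng(0)
v = rng.normal(size=(2, 50_000))          # two independently sampled velocity sets
edges = np.linspace(-3.0, 3.0, 13)        # coarse velocity grid

# Empirical two-particle distribution of the pairs (v, v')
f2, _, _ = np.histogram2d(v[0], v[1], bins=[edges, edges], density=True)

# Product of the one-particle marginals f1(v) * f1(v')
f1a, _ = np.histogram(v[0], bins=edges, density=True)
f1b, _ = np.histogram(v[1], bins=edges, density=True)
product = np.outer(f1a, f1b)

# For uncorrelated velocities the two agree up to sampling noise
print(float(np.max(np.abs(f2 - product))))
```

If the two velocity sets carried genuine correlations, the joint histogram would deviate from the outer product; ignoring that deviation for post-collision velocities is where the time asymmetry enters.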

Though the Stosszahlansatz is usually understood as a physically grounded hypothesis, it has recently been argued that it can also be interpreted as a heuristic hypothesis. This interpretation allows using the principle of maximum entropy to generalize the ansatz to higher-order distribution functions.[7]

Related Research Articles

Boltzmann distribution

In statistical mechanics and mathematics, a Boltzmann distribution is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution takes the form p_i ∝ exp(−ε_i/(kT)), where p_i is the probability of state i, ε_i is the energy of that state, k is the Boltzmann constant, and T is the temperature.
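
As a concrete sketch (illustrative code; the energy levels and temperature are arbitrary choices, not from the article), the normalized Boltzmann weights for a discrete set of levels can be computed directly:

```python
import math

def boltzmann_probabilities(energies, kT):
    """Return normalized Boltzmann weights p_i proportional to exp(-E_i / kT)."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)                       # partition function
    return [w / z for w in weights]

# Two-level system with an energy gap equal to kT:
# the population ratio p1/p0 is exp(-1).
p = boltzmann_probabilities([0.0, 1.0], kT=1.0)
print(p)
```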

Brownian motion

Brownian motion, or pedesis, is the random motion of particles suspended in a medium.

Maxwell–Boltzmann distribution

In physics, the Maxwell–Boltzmann distribution, or Maxwell(ian) distribution, is a particular probability distribution named after James Clerk Maxwell and Ludwig Boltzmann.
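
A minimal numerical sketch (illustrative, in units where m = k = T = 1): if each Cartesian velocity component is an independent Gaussian, the resulting speeds follow the Maxwell–Boltzmann distribution, whose mean speed is √(8/π):

```python
import numpy as np

# Sample 3D velocities whose components are independent Gaussians
# with standard deviation sqrt(kT/m) = 1 in these units.
rng = np.random.default_rng(1)
velocities = rng.normal(scale=1.0, size=(200_000, 3))
speeds = np.linalg.norm(velocities, axis=1)

# Sampled mean speed versus the analytic value sqrt(8/pi)
print(speeds.mean(), np.sqrt(8 / np.pi))
```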

In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.

Timeline of thermodynamics

A timeline of events in the history of thermodynamics.

Kinetic theory of gases

The kinetic theory of gases is a simple, historically significant classical model of the thermodynamic behavior of gases, with which many principal concepts of thermodynamics were established. The model describes a gas as a large number of identical submicroscopic particles, all of which are in constant, rapid, random motion. Their size is assumed to be much smaller than the average distance between the particles. The particles undergo random elastic collisions between themselves and with the enclosing walls of the container. The basic version of the model describes the ideal gas, and considers no other interactions between the particles.

Ergodic hypothesis

In physics and thermodynamics, the ergodic hypothesis says that, over long periods of time, the time spent by a system in some region of the phase space of microstates with the same energy is proportional to the volume of this region, i.e., that all accessible microstates are equiprobable over a long period of time.

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, since it claimed to derive the second law of thermodynamics, a statement about fundamentally irreversible processes, from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
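
A toy illustration (this stands in for, and is much simpler than, Boltzmann's actual collision integral): if a discrete velocity distribution relaxes toward equilibrium by simple BGK-style mixing, the quantity H = Σ p_i ln p_i decreases monotonically:

```python
import math

def H(p):
    """Boltzmann's H functional for a discrete distribution: sum of p_i * ln(p_i)."""
    return sum(x * math.log(x) for x in p if x > 0)

# Relax an out-of-equilibrium distribution toward the uniform equilibrium
# by mixing (a BGK-style surrogate for the collision term) and record H.
p = [0.7, 0.2, 0.05, 0.05]
uniform = [0.25] * 4
history = [H(p)]
for _ in range(30):
    p = [0.9 * a + 0.1 * b for a, b in zip(p, uniform)]
    history.append(H(p))

# H never increases along the relaxation
print(all(h2 <= h1 for h1, h2 in zip(history, history[1:])))
```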

Equipartition theorem

In classical statistical mechanics, the equipartition theorem relates the temperature of a system to its average energies. The equipartition theorem is also known as the law of equipartition, equipartition of energy, or simply equipartition. The original idea of equipartition was that, in thermal equilibrium, energy is shared equally among all of a system's various forms; for example, the average kinetic energy per degree of freedom in the translational motion of a molecule should equal that in rotational motion.
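
This can be checked with a short sketch (illustrative, in units where k = T = 1; the mass and moment of inertia are arbitrary): each quadratic degree of freedom, translational or rotational, carries an average energy of kT/2:

```python
import numpy as np

# Equilibrium samples for one translational and one rotational degree of freedom.
rng = np.random.default_rng(2)
m, I = 1.0, 2.5                                        # arbitrary mass and moment of inertia
v = rng.normal(scale=np.sqrt(1.0 / m), size=200_000)   # velocity, std = sqrt(kT/m)
w = rng.normal(scale=np.sqrt(1.0 / I), size=200_000)   # angular velocity, std = sqrt(kT/I)

# Both average energies come out near kT/2 = 0.5, independent of m and I.
print((0.5 * m * v**2).mean(), (0.5 * I * w**2).mean())
```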

Ludwig Boltzmann (1844–1906)

Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, S = k_B ln Ω, where Ω is the number of microstates whose energy equals the system's energy, which he interpreted as a measure of the statistical disorder of a system. Max Planck named the constant k_B the Boltzmann constant.

Johann Josef Loschmidt (1821–1895)

Johann Josef Loschmidt, who referred to himself mostly as Josef Loschmidt, was a notable Austrian scientist who performed ground-breaking work in chemistry, physics, and crystal forms.

Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.

Boltzmann equation

The Boltzmann equation or Boltzmann transport equation (BTE), devised by Ludwig Boltzmann in 1872, describes the statistical behaviour of a thermodynamic system not in a state of equilibrium. The classic example of such a system is a fluid with temperature gradients in space causing heat to flow from hotter regions to colder ones, by the random but biased transport of the particles making up that fluid. In the modern literature the term Boltzmann equation is often used in a more general sense, referring to any kinetic equation that describes the change of a macroscopic quantity in a thermodynamic system, such as energy, charge or particle number.

The principle of detailed balance can be used in kinetic systems which are decomposed into elementary processes. It states that at equilibrium, each elementary process is in equilibrium with its reverse process.
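
A minimal sketch of this principle (illustrative code; the three-state system and its Metropolis-style rates are hypothetical choices, not from the article): when every elementary process balances its reverse, the target distribution is automatically stationary:

```python
import numpy as np

# Build a 3-state transition matrix satisfying detailed balance
# pi_i * P_ij = pi_j * P_ji, using Metropolis-style acceptance rates.
pi = np.array([0.5, 0.3, 0.2])      # desired equilibrium distribution
c = 0.2                             # base attempt rate per elementary process
P = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        if i != j:
            P[i, j] = c * min(1.0, pi[j] / pi[i])   # elementary process i -> j
    P[i, i] = 1.0 - P[i].sum()                      # remain in state i otherwise

flows = pi[:, None] * P             # flow_ij = pi_i * P_ij
print(np.allclose(flows, flows.T))  # each elementary process balances its reverse
print(np.allclose(pi @ P, pi))      # hence pi is stationary under P
```

The second check follows from the first: summing the balanced flows over incoming states reproduces pi, so detailed balance is a stronger condition than stationarity.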

History of thermodynamics

The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general. Owing to the relevance of thermodynamics in much of science and technology, its history is finely woven with the developments of classical mechanics, quantum mechanics, magnetism, and chemical kinetics, to more distant applied fields such as meteorology, information theory, and biology (physiology), and to technological developments such as the steam engine, internal combustion engine, cryogenics and electricity generation. The development of thermodynamics both drove and was driven by atomic theory. It also, albeit in a subtle manner, motivated new directions in probability and statistics; see, for example, the timeline of thermodynamics.

The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.

The concept of entropy was first developed by the German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in the 1870s by the Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.

Entropy (order and disorder)

In thermodynamics, entropy is often associated with the amount of order or disorder in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced [reduction] to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that the internal work associated with these alterations is quantified energetically by a measure of "entropy" change.

Temperature

Temperature is a physical quantity that expresses quantitatively the perceptions of hotness and coldness. Temperature is measured with a thermometer.

Energy dissipation and entropy production extremal principles are ideas developed within non-equilibrium thermodynamics that attempt to predict the likely steady states and dynamical structures that a physical system might show. The search for extremum principles for non-equilibrium thermodynamics follows their successful use in other branches of physics. According to Kondepudi (2008) and to Grandy (2008), there is no general rule that provides an extremum principle governing the evolution of a far-from-equilibrium system to a steady state. According to Glansdorff and Prigogine, irreversible processes usually are not governed by global extremal principles, because description of their evolution requires differential equations which are not self-adjoint, but local extremal principles can be used for local solutions. Lebon, Jou and Casas-Vázquez (2008) state that "In non-equilibrium ... it is generally not possible to construct thermodynamic potentials depending on the whole set of variables". Šilhavý (1997) offers the opinion that "... the extremum principles of thermodynamics ... do not have any counterpart for [non-equilibrium] steady states." It follows that any general extremal principle for a non-equilibrium problem will need to refer in some detail to the constraints that are specific to the structure of the system considered in the problem.

References

  1. Ehrenfest, Paul; Ehrenfest, Tatiana (2002). The Conceptual Foundations of the Statistical Approach in Mechanics. Courier Corporation. ISBN 9780486495040.
  2. Brown, Harvey R.; Myrvold, Wayne (2008-09-08). "Boltzmann's H-theorem, its limitations, and the birth of (fully) statistical mechanics". arXiv:0809.1304 [physics.hist-ph].
  3. Maxwell, J. C. (1867). "On the Dynamical Theory of Gases". Philosophical Transactions of the Royal Society of London. 157: 49–88. doi:10.1098/rstl.1867.0004. S2CID 96568430.
  4. See:
  5. Gyenis, Balazs (2017). "Maxwell and the normal distribution: A colored story of probability, independence, and tendency towards equilibrium". Studies in History and Philosophy of Modern Physics. 57: 53–65. arXiv:1702.01411. Bibcode:2017SHPMP..57...53G. doi:10.1016/j.shpsb.2017.01.001. S2CID 38272381.
  6. L. Boltzmann, "Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen." Sitzungsberichte Akademie der Wissenschaften 66 (1872): 275–370.
     English translation: Boltzmann, L. (2003). "Further Studies on the Thermal Equilibrium of Gas Molecules". The Kinetic Theory of Gases. History of Modern Physical Sciences. Vol. 1. pp. 262–349. Bibcode:2003HMPS....1..262B. doi:10.1142/9781848161337_0015. ISBN 978-1-86094-347-8.
  7. Chliamovitch, G.; Malaspinas, O.; Chopard, B. (2017). "Kinetic theory beyond the Stosszahlansatz". Entropy. 19 (8): 381. Bibcode:2017Entrp..19..381C. doi:10.3390/e19080381.