In the kinetic theory of gases in physics, the molecular chaos hypothesis (also called Stosszahlansatz in the writings of Paul and Tatiana Ehrenfest [1][2]) is the assumption that the velocities of colliding particles are uncorrelated and independent of position. This means the probability that a pair of particles with given velocities will collide can be calculated by considering each particle separately, ignoring any correlation between the probability of finding one particle with velocity v and the probability of finding another particle with velocity v′ in a small region δr. James Clerk Maxwell introduced this approximation in 1867, [3] although its origins can be traced back to his first work on the kinetic theory in 1860. [4][5]
The assumption of molecular chaos is the key ingredient that allows proceeding from the BBGKY hierarchy to Boltzmann's equation, by reducing the 2-particle distribution function appearing in the collision term to a product of 1-particle distributions. This in turn leads to Boltzmann's H-theorem of 1872, [6] which attempted to use kinetic theory to show that the entropy of a gas prepared in a state of less than complete disorder must inevitably increase, as the gas molecules are allowed to collide. This drew the objection from Loschmidt that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong (Loschmidt's paradox). The resolution (1895) of this paradox is that the velocities of two particles after a collision are no longer truly uncorrelated. By asserting that it was acceptable to ignore these correlations in the population at times after the initial time, Boltzmann had introduced an element of time asymmetry through the formalism of his calculation. [citation needed]
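In symbols, and using standard notation for the reduced distribution functions (the symbols f^{(1)} and f^{(2)} are not introduced in the text above), the Stosszahlansatz amounts to the factorization

f^{(2)}(\mathbf{r}, \mathbf{v}_1, \mathbf{v}_2, t) \approx f^{(1)}(\mathbf{r}, \mathbf{v}_1, t)\, f^{(1)}(\mathbf{r}, \mathbf{v}_2, t),

which replaces the two-particle distribution in the collision term of the BBGKY hierarchy by a product of one-particle distributions evaluated at the same position.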
Though the Stosszahlansatz is usually understood as a physically grounded hypothesis, it has recently been pointed out that it can also be interpreted as a heuristic hypothesis. This interpretation allows the principle of maximum entropy to be used to generalize the ansatz to higher-order distribution functions. [7]
Brownian motion is the random motion of particles suspended in a medium.
In physics, the Maxwell–Boltzmann distribution, or Maxwell(ian) distribution, is a particular probability distribution named after James Clerk Maxwell and Ludwig Boltzmann.
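For reference (standard result, not stated in the blurb above), the Maxwell–Boltzmann distribution of molecular speeds for particles of mass m at absolute temperature T is

f(v) = 4\pi \left(\frac{m}{2\pi k_\mathrm{B} T}\right)^{3/2} v^2 \exp\!\left(-\frac{m v^2}{2 k_\mathrm{B} T}\right),

where k_\mathrm{B} is the Boltzmann constant.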
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.
A timeline of events in the history of thermodynamics.
The kinetic theory of gases is a simple classical model of the thermodynamic behavior of gases. It treats a gas as composed of numerous particles, too small to see with a microscope, which are constantly in random motion. Their collisions with each other and with the walls of their container are used to explain physical properties of the gas—for example, the relationship between its temperature, pressure, and volume. The particles are now known to be the atoms or molecules of the gas.
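A representative result of this model (standard kinetic theory, not spelled out in the blurb above) expresses the pressure of an ideal gas in terms of the number density n, molecular mass m, and mean-square speed:

p = \tfrac{1}{3}\, n\, m\, \langle v^2 \rangle,

which, combined with \tfrac{1}{2} m \langle v^2 \rangle = \tfrac{3}{2} k_\mathrm{B} T, reproduces the ideal gas law p = n k_\mathrm{B} T.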
In physics, mean free path is the average distance over which a moving particle travels before substantially changing its direction or energy, typically as a result of one or more successive collisions with other particles.
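As a concrete estimate (standard hard-sphere result, not given in the blurb above), for spheres of diameter d at number density n the mean free path is

\lambda = \frac{1}{\sqrt{2}\, \pi d^2 n},

where the factor of \sqrt{2} accounts for the relative motion of the colliding particles.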
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
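In Boltzmann's kinetic description (standard notation, not defined in the blurb above), H is the functional

H(t) = \int f(\mathbf{v}, t) \ln f(\mathbf{v}, t)\, \mathrm{d}^3 v,

and the theorem states that, under the assumption of molecular chaos, \mathrm{d}H/\mathrm{d}t \le 0.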
In classical statistical mechanics, the equipartition theorem relates the temperature of a system to its average energies. The equipartition theorem is also known as the law of equipartition, equipartition of energy, or simply equipartition. The original idea of equipartition was that, in thermal equilibrium, energy is shared equally among all of its various forms; for example, the average kinetic energy per degree of freedom in translational motion of a molecule should equal that in rotational motion.
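For concreteness (a standard statement of the theorem, added here rather than taken from the blurb above): each quadratic degree of freedom contributes an average energy

\langle E \rangle = \tfrac{1}{2} k_\mathrm{B} T,

so a monatomic ideal-gas molecule, with three translational degrees of freedom, carries an average kinetic energy of \tfrac{3}{2} k_\mathrm{B} T.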
Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics, and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, S = k_\mathrm{B} \ln \Omega, where Ω is the number of microstates whose energy equals the system's energy, interpreted as a measure of the statistical disorder of a system. Max Planck named the constant k_\mathrm{B} the Boltzmann constant.
Johann Josef Loschmidt, who mostly called himself Josef Loschmidt, was an Austrian scientist who performed ground-breaking work in chemistry, physics, and crystal forms.
In physics, Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox, or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.
The Boltzmann equation or Boltzmann transport equation (BTE) describes the statistical behaviour of a thermodynamic system not in a state of equilibrium; it was devised by Ludwig Boltzmann in 1872. The classic example of such a system is a fluid with temperature gradients in space causing heat to flow from hotter regions to colder ones, by the random but biased transport of the particles making up that fluid. In the modern literature the term Boltzmann equation is often used in a more general sense, referring to any kinetic equation that describes the change of a macroscopic quantity in a thermodynamic system, such as energy, charge or particle number.
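In its most common form (standard notation; the symbols below are not defined in the blurb above), the equation governs the one-particle distribution function f(\mathbf{r}, \mathbf{v}, t) via

\frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla_{\mathbf{r}} f + \frac{\mathbf{F}}{m} \cdot \nabla_{\mathbf{v}} f = \left(\frac{\partial f}{\partial t}\right)_{\mathrm{coll}},

where \mathbf{F} is an external force and the right-hand side is the collision term that the molecular chaos assumption renders closed in f alone.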
In statistical mechanics, the microcanonical ensemble is a statistical ensemble that represents the possible states of a mechanical system whose total energy is exactly specified. The system is assumed to be isolated in the sense that it cannot exchange energy or particles with its environment, so that the energy of the system does not change with time.
The principle of detailed balance can be used in kinetic systems which are decomposed into elementary processes. It states that at equilibrium, each elementary process is in equilibrium with its reverse process.
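As a minimal illustration (the reaction and rate constants here are hypothetical, chosen only for the example): for an elementary step A + B ⇌ C with forward rate k_+[A][B] and reverse rate k_-[C], detailed balance requires

k_+ [A]_{\mathrm{eq}} [B]_{\mathrm{eq}} = k_- [C]_{\mathrm{eq}},

i.e. at equilibrium each elementary process is balanced by its own reverse, not merely by the net rates summing to zero.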
The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general. Owing to the relevance of thermodynamics in much of science and technology, its history is finely woven with the developments of classical mechanics, quantum mechanics, magnetism, and chemical kinetics, extends to more distant applied fields such as meteorology, information theory, and biology (physiology), and connects to technological developments such as the steam engine, internal combustion engine, cryogenics and electricity generation. The development of thermodynamics both drove and was driven by atomic theory. It also, albeit in a subtle manner, motivated new directions in probability and statistics; see, for example, the timeline of thermodynamics.
The concept entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.
In thermodynamics, entropy is often associated with the amount of order or disorder in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced [reduction] to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, expressed differentially by Clausius' relation dS = \delta Q / T between the entropy change, the heat \delta Q transferred reversibly, and the absolute temperature T.
Plasma modeling refers to solving equations of motion that describe the state of a plasma. It is generally coupled with Maxwell's equations for electromagnetic fields or Poisson's equation for electrostatic fields. There are several main types of plasma models: single particle, kinetic, fluid, hybrid kinetic/fluid, gyrokinetic and as system of many particles.
In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S, also written as S_\mathrm{B}, of an ideal gas to the multiplicity W, the number of real microstates corresponding to the gas's macrostate:

S = k_\mathrm{B} \ln W
Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.