The principle of microscopic reversibility in physics and chemistry is twofold:
Corresponding to every individual process there is a reverse process, and in a state of equilibrium the average rate of every process is equal to the average rate of its reverse process. [1]
The idea of microscopic reversibility was born together with physical kinetics. In 1872, Ludwig Boltzmann represented the kinetics of gases as a statistical ensemble of elementary collisions. [2] The equations of mechanics are reversible in time; hence, reverse collisions obey the same laws. This reversibility of collisions is the first example of microreversibility. According to Boltzmann, this microreversibility implies the principle of detailed balance for collisions: in the equilibrium ensemble, each collision is equilibrated by its reverse collision. [2] These ideas of Boltzmann were analyzed in detail and generalized by Richard C. Tolman. [3]
In chemistry, J. H. van 't Hoff (1884) [4] came up with the idea that equilibrium has a dynamical nature and is the result of a balance between the forward and backward reaction rates. He did not study reaction mechanisms with many elementary reactions and could not formulate the principle of detailed balance for complex reactions. In 1901, Rudolf Wegscheider introduced the principle of detailed balance for complex chemical reactions. [5] He found that for a complex reaction the principle of detailed balance implies important and non-trivial relations between the rate constants of the different reactions. In particular, he demonstrated that irreversible reaction cycles are impossible and that for reversible cycles the product of the rate constants of the forward reactions (in the "clockwise" direction) is equal to the product of the rate constants of the reverse reactions (in the "anticlockwise" direction); an illustration is given after the quotation below. Lars Onsager (1931) used these relations in his well-known work, [6] without direct citation but with the following remark:
"Here, however, the chemists are accustomed to impose a very interesting additional restriction, namely: when the equilibrium is reached each individual reaction must balance itself. They require that the transition must take place just as frequently as the reverse transition etc."
The quantum theory of emission and absorption developed by Albert Einstein (1916, 1917) [7] gives an example of the application of microreversibility and detailed balance to the development of a new branch of kinetic theory.
Sometimes the principle of detailed balance is formulated in a narrow sense, for chemical reactions only, [8] but in the history of physics it has had broader use: it was invented for collisions, and then used for emission and absorption of quanta, for transport processes, [9] and for many other phenomena.
In its modern form, the principle of microreversibility was published by Lewis (1925). [1] The classical textbooks [3] [10] present the full theory and many examples of its applications.
The Newton and the Schrödinger equations, in the absence of macroscopic magnetic fields and in an inertial frame of reference, are T-invariant: if X(t) is a solution, then X(−t) is also a solution (here X is the vector of all dynamic variables, including all the coordinates of particles for the Newton equations and the wave function in the configuration space for the Schrödinger equation).
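A minimal sketch of this invariance for Newton's equations with a potential force (the potential U and coordinates q are generic notation, not taken from the cited sources): if q(t) satisfies
\[
m\,\ddot{q}(t) = -\nabla U\bigl(q(t)\bigr),
\]
then q(−t) satisfies the same equation as a function of t, because the second time derivative is unchanged under t → −t while the velocities merely change sign.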
There are two sources of the violation of this rule: first, the presence of macroscopic magnetic fields or of rotating (non-inertial) frames of reference, in which case T-invariance holds only if the magnetic field and the angular velocity are reversed as well; and second, the weak interaction in microphysics, for which T-invariance may be violated and only the combined CPT symmetry remains exact.
In physics and chemistry, there are two main macroscopic consequences of the time-reversibility of microscopic dynamics: the principle of detailed balance and the Onsager reciprocal relations.
The statistical description of macroscopic processes as ensembles of elementary indivisible events (collisions) was invented by L. Boltzmann and formalised in the Boltzmann equation. He discovered that the time-reversibility of Newtonian dynamics leads to detailed balance for collisions: in equilibrium, collisions are equilibrated by their reverse collisions. This principle allowed Boltzmann to deduce a simple and elegant formula for entropy production and to prove his famous H-theorem. [2] In this way, microscopic reversibility was used to prove macroscopic irreversibility and the convergence of ensembles of molecules to their thermodynamic equilibria.
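In the standard notation of the Boltzmann equation (conventions vary between textbooks), detailed balance for collisions states that the equilibrium distribution f^eq satisfies, for every collision (v, w) → (v', w'),
\[
f^{\mathrm{eq}}(v)\, f^{\mathrm{eq}}(w) = f^{\mathrm{eq}}(v')\, f^{\mathrm{eq}}(w'),
\]
so that each collision is exactly balanced by its reverse. For a general distribution f, the entropy production takes the form (up to normalization)
\[
\sigma = \frac{1}{4}\int W\,\bigl(f' f_1' - f f_1\bigr)\,\ln\frac{f' f_1'}{f f_1}\; d\Gamma \;\ge\; 0,
\]
where f = f(v), f_1 = f(w), primes denote post-collision velocities, W is the collision kernel (symmetric between direct and reverse collisions by microreversibility), and dΓ denotes integration over all velocities. The integrand is non-negative, and σ vanishes exactly under the detailed-balance condition above; this is the content of the H-theorem.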
Another macroscopic consequence of microscopic reversibility is the symmetry of kinetic coefficients, the so-called reciprocal relations. The reciprocal relations were discovered in the 19th century by Thomson and Helmholtz for some particular phenomena, but the general theory was proposed by Lars Onsager in 1931. [6] He also found the connection between the reciprocal relations and detailed balance. For the equations of the law of mass action, the reciprocal relations appear in the linear approximation near equilibrium as a consequence of the detailed balance conditions. According to the reciprocal relations, damped oscillations in homogeneous closed systems near thermodynamic equilibrium are impossible because the spectrum of symmetric operators is real. Therefore, the relaxation to equilibrium in such a system is monotone if the system is sufficiently close to equilibrium.
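In the usual linear-response notation, the reciprocal relations state that when the thermodynamic fluxes J_i are expressed as linear functions of the thermodynamic forces X_j, the matrix of kinetic coefficients is symmetric:
\[
J_i = \sum_j L_{ij} X_j, \qquad L_{ij} = L_{ji}.
\]
Near equilibrium the deviations x from the equilibrium state then obey a linear relaxation equation of the form dx/dt = −LGx, where G is the symmetric, positive definite matrix of second derivatives of the relevant thermodynamic potential. The matrix LG is similar to the symmetric matrix G^{1/2} L G^{1/2}, so its eigenvalues are real and non-negative, which is the spectral argument behind the absence of damped oscillations mentioned above.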
Entropy is a scientific concept as well as a measurable physical property that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.
In physical chemistry, the Arrhenius equation is a formula for the temperature dependence of reaction rates. The equation was proposed by Svante Arrhenius in 1889, based on the work of the Dutch chemist Jacobus Henricus van 't Hoff, who had noted in 1884 that the van 't Hoff equation for the temperature dependence of equilibrium constants suggests such a formula for the rates of both forward and reverse reactions. The equation has wide and important applications in determining the rates of chemical reactions and in calculating activation energies. Arrhenius provided a physical justification and interpretation for the formula. Currently, it is best seen as an empirical relationship. It can be used to model the temperature variation of diffusion coefficients, the population of crystal vacancies, creep rates, and many other thermally induced processes and reactions. The Eyring equation, developed in 1935, also expresses the relationship between rate and energy.
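In standard notation, the Arrhenius equation reads
\[
k = A\, e^{-E_a/(RT)},
\]
where k is the rate constant, A the pre-exponential factor, E_a the activation energy, R the gas constant and T the absolute temperature. Van 't Hoff's observation can be sketched as follows: since the equilibrium constant of a reversible reaction is the ratio K = k_f/k_r of the forward and reverse rate constants, his equation
\[
\frac{d\ln K}{dT} = \frac{\Delta H^{\circ}}{RT^2}
\]
is consistent with an Arrhenius-type temperature dependence of both k_f and k_r, with activation energies satisfying E_{a,f} − E_{a,r} = ΔH°.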
A timeline of events in the history of thermodynamics.
The second law of thermodynamics establishes the concept of entropy as a physical property of a thermodynamic system. Entropy change predicts the direction of spontaneous processes, and determines whether they are irreversible or impossible despite obeying the requirement of conservation of energy as expressed in the first law of thermodynamics. The second law may be formulated by the observation that the entropy of isolated systems left to spontaneous evolution cannot decrease, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest at the given internal energy. If all processes in the system are reversible, the entropy is constant. An increase in entropy accounts for the irreversibility of natural processes, often referred to in the concept of the arrow of time.
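In symbols, with S the entropy of a system, the Clausius definition and the statement for isolated systems read
\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S \ge 0 \ \text{(isolated system)},
\]
where δQ_rev is the heat exchanged along a reversible path and T the absolute temperature; equality in the second relation holds only when all processes are reversible.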
T-symmetry or time reversal symmetry is the theoretical symmetry of physical laws under the transformation of time reversal, T: t ↦ −t.
Scientific laws or laws of science are statements, based on repeated experiments or observations, that describe or predict a range of natural phenomena. The term law has diverse usage in many cases across all fields of natural science. Laws are developed from data and can be further developed through mathematics; in all cases they are directly or indirectly based on empirical evidence. It is generally understood that they implicitly reflect, though they do not explicitly assert, causal relationships fundamental to reality, and are discovered rather than invented.
In chemistry, the law of mass action is the proposition that the rate of the chemical reaction is directly proportional to the product of the activities or concentrations of the reactants. It explains and predicts behaviors of solutions in dynamic equilibrium. Specifically, it implies that for a chemical reaction mixture that is in equilibrium, the ratio between the concentration of reactants and products is constant.
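For an elementary reversible reaction written in standard notation as a A + b B ⇌ c C + d D, the law of mass action gives the forward and reverse rates
\[
r_f = k_f [A]^a [B]^b, \qquad r_r = k_r [C]^c [D]^d,
\]
and setting r_f = r_r at equilibrium yields the constant ratio
\[
K = \frac{[C]^c [D]^d}{[A]^a [B]^b} = \frac{k_f}{k_r}.
\]
(The exponents coincide with the stoichiometric coefficients only for elementary steps; for composite reactions they are empirical orders.)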
The fluctuation theorem (FT), which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium will increase or decrease over a given amount of time. While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
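In one standard form of the theorem (with entropy production measured in units of the Boltzmann constant), the statement is
\[
\frac{P(\bar{\sigma}_t = A)}{P(\bar{\sigma}_t = -A)} = e^{A t},
\]
where σ̄_t is the entropy production rate averaged over a time interval of length t; trajectories along which entropy decreases are possible, but become exponentially unlikely as A or t grows.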
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
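The quantity H is defined from the one-particle velocity distribution f(v, t) of the gas (normalization conventions differ between sources):
\[
H(t) = \int f(v,t)\,\ln f(v,t)\; d^3v, \qquad \frac{dH}{dt} \le 0,
\]
so that, up to sign and constant factors, H plays the role of a negative entropy, and its monotone decrease expresses the approach of the gas to the Maxwell–Boltzmann equilibrium distribution.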
In thermodynamics, the Onsager reciprocal relations express the equality of certain ratios between flows and forces in thermodynamic systems out of equilibrium, but where a notion of local equilibrium exists.
Non-equilibrium thermodynamics is a branch of thermodynamics that deals with physical systems that are not in thermodynamic equilibrium but can be described in terms of macroscopic quantities that represent an extrapolation of the variables used to specify the system in thermodynamic equilibrium. Non-equilibrium thermodynamics is concerned with transport processes and with the rates of chemical reactions.
In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics.
Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.
The laws of thermodynamics define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general, and are applicable in other natural sciences.
In physics, chemistry and related fields, master equations are used to describe the time evolution of a system that can be modelled as being in a probabilistic combination of states at any given time and the switching between states is determined by a transition rate matrix. The equations are a set of differential equations – over time – of the probabilities that the system occupies each of the different states.
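In a common notation (states labelled i and j, with w_{ij} the transition rate from state j to state i), a master equation reads
\[
\frac{dp_i}{dt} = \sum_{j \ne i} \bigl( w_{ij}\, p_j - w_{ji}\, p_i \bigr),
\]
where p_i(t) is the probability of finding the system in state i, the first term collects the probability flowing into state i, and the second term the probability flowing out of it.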
The principle of detailed balance can be used in kinetic systems which are decomposed into elementary processes. It states that at equilibrium, each elementary process is in equilibrium with its reverse process.
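In the master-equation notation above, detailed balance requires the equilibrium probabilities π_i to balance every pair of states separately,
\[
w_{ij}\, \pi_j = w_{ji}\, \pi_i \quad \text{for all } i, j,
\]
which is strictly stronger than stationarity, Σ_j (w_ij π_j − w_ji π_i) = 0: stationarity only demands that the net probability flow into each state vanish, while detailed balance also forbids circulating probability fluxes around cycles of states.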
The concept of entropy was first developed by the German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by the Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of the large ensembles of microstates that constitute thermodynamic systems.
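Boltzmann's statistical definition makes this linkage explicit: for a macrostate realized by W equally probable microstates,
\[
S = k_B \ln W,
\]
where k_B is the Boltzmann constant.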
Energy dissipation and entropy production extremal principles are ideas developed within non-equilibrium thermodynamics that attempt to predict the likely steady states and dynamical structures that a physical system might show. The search for extremum principles for non-equilibrium thermodynamics follows their successful use in other branches of physics. According to Kondepudi (2008) and to Grandy (2008), there is no general rule that provides an extremum principle governing the evolution of a far-from-equilibrium system to a steady state. According to Glansdorff and Prigogine, irreversible processes usually are not governed by global extremal principles because the description of their evolution requires differential equations which are not self-adjoint, but local extremal principles can be used for local solutions. Lebon, Jou and Casas-Vásquez (2008) state that "In non-equilibrium ... it is generally not possible to construct thermodynamic potentials depending on the whole set of variables". Šilhavý (1997) offers the opinion that "... the extremum principles of thermodynamics ... do not have any counterpart for [non-equilibrium] steady states." It follows that any general extremal principle for a non-equilibrium problem will need to refer in some detail to the constraints that are specific to the structure of the system considered in the problem.
Grigoriy Yablonsky is an expert in the area of chemical kinetics and chemical engineering, particularly in catalytic technology of complete and selective oxidation, which is one of the main driving forces of sustainable development.