History of entropy


The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.


In the early 1850s, Rudolf Clausius set forth the concept of the thermodynamic system and posited the argument that in any irreversible process a small amount of heat energy δQ is incrementally dissipated across the system boundary. Clausius continued to develop his ideas of lost energy, and coined the term entropy.

Since the mid-20th century the concept of entropy has found application in the field of information theory, describing an analogous loss of data in information transmission systems.

Classical thermodynamic views

In 1803, mathematician Lazare Carnot published a work entitled Fundamental Principles of Equilibrium and Movement. This work includes a discussion on the efficiency of fundamental machines, i.e. pulleys and inclined planes. Carnot saw through all the details of the mechanisms to develop a general discussion on the conservation of mechanical energy. Over the next three decades, Carnot's theorem was taken as a statement that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity, i.e. the useful work done. From this Carnot drew the inference that perpetual motion was impossible. This loss of moment of activity was the first-ever rudimentary statement of the second law of thermodynamics and the concept of 'transformation-energy' or entropy, i.e. energy lost to dissipation and friction. [1]

Carnot died in exile in 1823. During the following year his son Sadi Carnot, having graduated from the École Polytechnique training school for engineers, but now living on half-pay with his brother Hippolyte in a small apartment in Paris, wrote Reflections on the Motive Power of Fire. In this book, Sadi visualized an ideal engine in which any heat (i.e., caloric) converted into work could be reinstated by reversing the motion of the cycle, a concept subsequently known as thermodynamic reversibility. Building on his father's work, Sadi postulated that "some caloric is always lost" in the conversion into work, even in his idealized reversible heat engine, which excluded frictional losses and other losses due to the imperfections of any real machine. He also discovered that this idealized efficiency depended only on the temperatures of the heat reservoirs between which the engine was working, and not on the type of working fluid. No real heat engine could realize the Carnot cycle's reversibility, and every real engine was condemned to be even less efficient. This loss of usable caloric was a precursory form of the increase in entropy as we now know it. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics.

1854 definition

Rudolf Clausius, originator of the concept of "entropy"

In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. that "which the atoms of the body exert upon each other", and exterior work, i.e. that "which arise from foreign influences [to] which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three categories into which heat Q may be divided:

  1. Heat employed in increasing the heat actually existing in the body.
  2. Heat employed in producing the interior work.
  3. Heat employed in producing the exterior work.

Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presented the first-ever mathematical formulation of entropy, although at this point in the development of his theories he called it "equivalence-value", perhaps referring to the concept of the mechanical equivalent of heat which was developing at the time rather than entropy, a term which was to come into use later. [2] He stated: [3]

the second fundamental theorem in the mechanical theory of heat may thus be enunciated:

If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature T has the equivalence-value:

and the passage of the quantity of heat Q from the temperature T1 to the temperature T2, has the equivalence-value:

wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.
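In modern notation, the two equivalence-values in this statement are:

```latex
% Generation of the quantity of heat Q from work, at temperature T:
\frac{Q}{T}

% Passage of the quantity of heat Q from temperature T_1 to temperature T_2:
Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)
```

With T1 greater than T2, the second value is positive, in keeping with Clausius's convention that a transformation occurring of its own accord has a positive equivalence-value.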

In modern terminology, that is, the terminology introduced by Clausius himself in 1865, we think of this equivalence-value as "entropy", symbolized by S. Thus, using the above description, we can calculate the entropy change ΔS for the passage of the quantity of heat Q from the temperature T1, through the "working body" of fluid, which was typically a body of steam, to the temperature T2 as shown below:

Diagram of Sadi Carnot's heat engine, 1824

If we make the assignment:

Then, the entropy change or "equivalence-value" for this transformation is:

which equals:

and by factoring out Q, we have the following form, as was derived by Clausius:
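Written out in modern notation, the chain of steps described above is:

```latex
% The assignment:
S = \frac{Q}{T}

% Entropy change for the passage of the quantity of heat Q from T_1 to T_2:
\Delta S = S_{\mathrm{final}} - S_{\mathrm{initial}}

% which equals:
\Delta S = \frac{Q}{T_2} - \frac{Q}{T_1}

% and, by factoring out Q, Clausius's form:
\Delta S = Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)
```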

1856 definition

In 1856, Clausius stated what he called the "second fundamental theorem in the mechanical theory of heat" in the following form:
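In modern notation, this 1856 form of the theorem is usually rendered as:

```latex
\int \frac{\delta Q}{T} = -N
```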

where N is the "equivalence-value" of all uncompensated transformations involved in a cyclical process. This equivalence-value was a precursory formulation of entropy. [4]

1862 definition

In 1862, Clausius stated what he calls the "theorem respecting the equivalence-values of the transformations" or what is now known as the second law of thermodynamics, as such:

The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing.

Quantitatively, Clausius states that the mathematical expression for this theorem is as follows.

Let δQ be an element of the heat given up by the body to any reservoir of heat during its own changes, heat which it may absorb from a reservoir being here reckoned as negative, and T the absolute temperature of the body at the moment of giving up this heat, then the equation:

must be true for every reversible cyclical process, and the relation:

must hold good for every cyclical process which is in any way possible.
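With Clausius's sign convention above, in which heat given up by the body is counted as positive, the two statements read:

```latex
% For every reversible cyclical process:
\oint \frac{\delta Q}{T} = 0

% For every cyclical process which is in any way possible:
\oint \frac{\delta Q}{T} \geq 0
```

Under the modern convention, in which heat absorbed by the body is positive, the sign of the inequality reverses, giving the familiar Clausius inequality.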

This was an early formulation of the second law and one of the original forms of the concept of entropy.

1865 definition

In 1865, Clausius gave irreversible heat loss, or what he had previously been calling "equivalence-value", a name: [5] [6] [7]

I propose that S be taken from the Greek words, 'en-tropie' [intrinsic direction]. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate. [8]

Clausius did not specify why he chose the symbol "S" to represent entropy, and it is almost certainly untrue that Clausius chose "S" in honor of Sadi Carnot; the given names of scientists are rarely if ever used this way. [9]

Later developments

In 1876, physicist J. Willard Gibbs, building on the work of Clausius, Hermann von Helmholtz and others, proposed that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell (1871) and Max Planck (1903).
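In modern symbols this bookkeeping is the familiar free-energy relation:

```latex
\Delta G = \Delta H - T \, \Delta S
```

Here ΔG is the Gibbs free energy ("available energy"), ΔH the enthalpy change (the total energy change of the system), and TΔS the portion of the energy unavailable for useful work.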

Statistical thermodynamic views

In 1877, Ludwig Boltzmann developed a statistical mechanical evaluation of the entropy S of a body in its own given macrostate of internal thermodynamic equilibrium. It may be written as:
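Written out, this is the Boltzmann entropy formula:

```latex
S = k_B \ln \Omega
```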

where

kB denotes the Boltzmann constant and
Ω denotes the number of microstates consistent with the given equilibrium macrostate.

Boltzmann himself did not actually write this formula expressed with the named constant kB, which is due to Planck's reading of Boltzmann. [10]

Boltzmann saw entropy as a measure of statistical "mixedupness" or disorder. This concept was soon refined by J. Willard Gibbs, and is now regarded as one of the cornerstones of the theory of statistical mechanics.

Erwin Schrödinger made use of Boltzmann's work in his book What is Life? [11] to explain why living systems have far fewer replication errors than would be predicted from statistical thermodynamics. Schrödinger used the Boltzmann equation in a different form to show the increase of entropy:
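In What is Life? Boltzmann's relation appears in the form entropy = k log D, with the local decrease of entropy (Schrödinger's "negative entropy") obtained by inverting the argument of the logarithm:

```latex
% Boltzmann's relation in Schrödinger's notation:
S = k \log D

% Negative entropy ("order"), with 1/D in place of D:
-S = k \log \frac{1}{D}
```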

where D is the number of possible energy states in the system that can be randomly filled with energy. He postulated a local decrease of entropy for living systems when (1/D) represents the number of states that are prevented from randomly distributing, such as occurs in replication of the genetic code.

Without this correction, Schrödinger claimed, statistical thermodynamics would predict one thousand mutations per million replications, and ten mutations per hundred replications, following the rule for the square root of n, far more mutations than actually occur.
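Those figures follow the square-root-of-n rule directly:

```latex
\sqrt{10^6} = 1000 \qquad \sqrt{100} = 10
```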

Schrödinger's separation of random and non-random energy states is one of the few explanations for why entropy could be low in the past but is continually increasing now. It has been proposed as an explanation for the localized decrease of entropy [12] in radiant energy focusing in parabolic reflectors and during dark current in diodes, which would otherwise be in violation of statistical thermodynamics.

Information theory

An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories, electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of "lost information" in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Although the story varies, initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1939, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann. During their discussions, regarding what Shannon should call the "measure of uncertainty" or attenuation in phone-line signals with reference to his new information theory, according to one source: [13]

My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’

According to another source, when von Neumann asked him how he was getting on with his information theory, Shannon replied: [14]

The theory was in excellent shape, except that he needed a good name for "missing information". "Why don’t you call it entropy", von Neumann suggested. "In the first place, a mathematical development very much like yours already exists in Boltzmann's statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage."

In 1948 Shannon published his seminal paper A Mathematical Theory of Communication, in which he devoted a section to what he calls Choice, Uncertainty, and Entropy. [15] In this section, Shannon introduces an H function of the following form:
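In modern notation, for a source with n symbols of probabilities p1, …, pn, the function is:

```latex
H = -K \sum_{i=1}^{n} p_i \log p_i
```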

where K is a positive constant. Shannon then states that "any quantity of this form, where K merely amounts to a choice of a unit of measurement, plays a central role in information theory as measures of information, choice, and uncertainty." Then, as an example of how this expression applies in a number of different fields, he references R.C. Tolman's 1938 Principles of Statistical Mechanics, stating that "the form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where pi is the probability of a system being in cell i of its phase space ... H is then, for example, the H in Boltzmann's famous H theorem." As such, in the decades since this statement was made, the two concepts have frequently been conflated, or even stated to be exactly the same.

Shannon's information entropy is a much more general concept than statistical thermodynamic entropy. Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution. In a series of papers starting in 1957, [16] [17] E. T. Jaynes showed that statistical thermodynamic entropy can be seen as a particular application of Shannon's information entropy to the probabilities of the particular microstates of a system occurring so as to produce a particular macrostate.
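As a concrete illustration of information entropy as a property of a probability distribution, here is a minimal sketch in Python (the function name and the choice of base-2 logarithms, which give H in bits, are ours; K is taken as 1):

```python
import math

def shannon_entropy(probs, K=1.0):
    """Shannon's H = -K * sum(p_i * log2(p_i)), in bits when K = 1.

    Terms with p_i == 0 are skipped, since p * log p -> 0 as p -> 0.
    """
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~ 0.47 bits
```

Jaynes's point is that thermodynamic entropy is this same quantity evaluated over microstate probabilities, with K playing the role of the Boltzmann constant.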

The term entropy is often used in popular language to denote a variety of unrelated phenomena. One example is the concept of corporate entropy as put forward somewhat humorously by authors Tom DeMarco and Timothy Lister in their 1987 classic publication Peopleware, a book on growing and managing productive teams and successful software projects. Here, they view energy waste as red tape and business team inefficiency as a form of entropy, i.e. energy lost to waste. This concept has caught on and is now common jargon in business schools.

In another example, entropy is the central theme in Isaac Asimov's short story The Last Question (first copyrighted in 1956). The story plays with the idea that the most important question is how to stop the increase of entropy.

Terminology overlap

When necessary, to disambiguate between the statistical thermodynamic concept of entropy and entropy-like formulae put forward by different researchers, the statistical thermodynamic entropy is most properly referred to as the Gibbs entropy. The terms Boltzmann–Gibbs entropy or BG entropy, and Boltzmann–Gibbs–Shannon entropy or BGS entropy, are also seen in the literature.


References

  1. Mendoza, E. (1988). Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius. New York: Dover Publications. ISBN   0-486-44641-7.
  2. Mechanical Theory of Heat, by Rudolf Clausius, 1850-1865
  3. Published in Poggendoff's Annalen, December 1854, vol. xciii. p. 481; translated in the Journal de Mathematiques, vol. xx. Paris, 1855, and in the Philosophical Magazine, August 1856, s. 4. vol. xii, p. 81
  4. Clausius, Rudolf. (1856). "On the Application of the Mechanical theory of Heat to the Steam-Engine." as found in: Clausius, R. (1865). The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies. London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.
  5. Laidler, Keith J. (1995). The Physical World of Chemistry. Oxford University Press. pp. 104–105. ISBN   0-19-855919-4.
  6. OED, Second Edition, 1989, "Clausius (Pogg. Ann. CXXV. 390), assuming (unhistorically) the etymological sense of energy to be ‘work-contents’ (werk-inhalt), devised the term entropy as a corresponding designation for the ‘transformation-contents’ (Verwandlungsinhalt) of a system"
  7. Baierlein, Ralph (December 1992). "How entropy got its name". American Journal of Physics. 60 (12): 1151. Bibcode:1992AmJPh..60.1151B. doi:10.1119/1.16966.
  8. Clausius, Rudolf (1865). "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie (Vorgetragen in der naturforsch. Gesellschaft zu Zürich den 24. April 1865)". Annalen der Physik und Chemie. 125 (7): 353–400. Bibcode:1865AnP...201..353C. doi:10.1002/andp.18652010702. "Sucht man für S einen bezeichnenden Namen, so könnte man, ähnlich wie von der Gröſse U gesagt ist, sie sey der Wärme- und Werkinhalt des Körpers, von der Gröſse S sagen, sie sey der Verwandlungsinhalt des Körpers. Da ich es aber für besser halte, die Namen derartiger für die Wissenschaft wichtiger Gröſsen aus den alten Sprachen zu entnehmen, damit sie unverändert in allen neuen Sprachen angewandt werden können, so schlage ich vor, die Gröſse S nach dem griechischen Worte ἡ τροπή, die Verwandlung, die Entropie des Körpers zu nennen. Das Wort Entropie habei ich absichtlich dem Worte Energie möglichst ähnlich gebildet, denn die beiden Gröſsen, welche durch diese Worte benannt werden sollen, sind ihren physikalischen Bedeutungen nach einander so nahe verwandt, daſs eine gewisse Gleichartigkeit in der Benennung mir zweckmäſsig zu seyn scheint." (p. 390).
  9. Girolami, G. S. (2020). "A Brief History of Thermodynamics, As Illustrated by Books and People". J. Chem. Eng. Data. 65 (2): 298–311. doi:10.1021/acs.jced.9b00515. S2CID   203146340.
  10. Partington, J.R. (1949), An Advanced Treatise on Physical Chemistry, vol. 1, Fundamental Principles, The Properties of Gases, London: Longmans, Green and Co., p. 300
  11. Schrödinger, Erwin (2004). What is Life? (11th reprinting ed.). Cambridge: Canto. pp. 72–73. ISBN   0-521-42708-8.
  12. "Random and Non Random States". 27 August 2014.
  13. M. Tribus, E.C. McIrvine, "Energy and information", Scientific American, 224 (September 1971).
  14. Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN   981-238-400-6.
  15. C.E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, July and October 1948.
  16. E. T. Jaynes (1957). "Information Theory and Statistical Mechanics". Physical Review. 106: 620.
  17. E. T. Jaynes (1957). "Information Theory and Statistical Mechanics. II". Physical Review. 108: 171.