Entropy (energy dispersal)

In thermodynamics, the interpretation of entropy as a measure of energy dispersal has been developed against the background of the traditional view, introduced by Ludwig Boltzmann, of entropy as a quantitative measure of disorder. The energy dispersal approach avoids the ambiguous term 'disorder'. An early advocate of the energy dispersal conception was Edward A. Guggenheim, who in 1949 used the word 'spread'. [1] [2]

In this alternative approach, entropy is a measure of energy dispersal or spread at a specific temperature. Changes in entropy can be quantitatively related to the distribution or the spreading out of the energy of a thermodynamic system, divided by its temperature.
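For the simplest quantitative case, the entropy change for a reversible transfer of heat q at absolute temperature T is given by the classical relation ΔS = q_rev/T. The sketch below is a minimal illustration of that relation only; the worked numbers (one mole of ice, an enthalpy of fusion of about 6010 J/mol) are assumed textbook values, not figures from this article.

```python
# Entropy change for a reversible transfer of heat at (approximately) constant
# temperature: delta_S = q_rev / T, i.e. energy dispersed divided by temperature.

def entropy_change(q_rev_joules: float, temperature_kelvin: float) -> float:
    """Return the entropy change in J/K for heat q_rev absorbed at temperature T."""
    return q_rev_joules / temperature_kelvin

# Example: melting one mole of ice at 273.15 K.
# The enthalpy of fusion (~6010 J/mol) is an assumed textbook value.
delta_S_melting = entropy_change(6010.0, 273.15)
print(f"Entropy increase on melting 1 mol of ice: {delta_S_melting:.1f} J/K")  # about 22 J/K
```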

Some educators propose that the energy dispersal idea is easier to understand than the traditional approach. The concept has been used to facilitate teaching entropy to students beginning university chemistry and biology.

Comparisons with traditional approach

The term "entropy" has been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels.

Such descriptions have tended to be used alongside common terms such as disorder and randomness, which are ambiguous, [3] [4] [5] and whose everyday meaning is the opposite of what they are intended to mean in thermodynamics. Not only does this situation cause confusion, it also hampers the teaching of thermodynamics. Students were being asked to grasp meanings directly contradicting their normal usage, with equilibrium being equated to "perfect internal disorder" and the mixing of milk in coffee, from apparent chaos to uniformity, being described as a transition from an ordered state into a disordered state.[citation needed]

The description of entropy as the amount of "mixedupness" or "disorder," as well as the abstract nature of the statistical mechanics grounding this notion, can lead to confusion and considerable difficulty for those beginning the subject. [6] [7] Even though courses emphasised microstates and energy levels, most students could not get beyond simplistic notions of randomness or disorder. Many of those who learned by practising calculations did not understand well the intrinsic meanings of equations, and there was a need for qualitative explanations of thermodynamic relationships. [8] [9]

Arieh Ben-Naim recommends abandonment of the word entropy, rejecting both the 'dispersal' and the 'disorder' interpretations; instead he proposes the notion of "missing information" about microstates as considered in statistical mechanics, which he regards as commonsensical. [10]

Description

Increase of entropy in a thermodynamic process can be described in terms of "energy dispersal" and the "spreading of energy," while avoiding mention of "disorder" except when explaining misconceptions. All explanations of where and how energy is dispersing or spreading have been recast in terms of energy dispersal, so as to emphasise the underlying qualitative meaning. [6]

In this approach, the second law of thermodynamics is introduced as "Energy spontaneously disperses from being localized to becoming spread out if it is not hindered from doing so," often in the context of common experiences such as a rock falling, a hot frying pan cooling down, iron rusting, air leaving a punctured tyre and ice melting in a warm room. Entropy is then depicted as a sophisticated kind of "before and after" yardstick — measuring how much energy is spread out over time as a result of a process such as heating a system, or how widely spread out the energy is after something happens in comparison with its previous state, in a process such as gas expansion or fluids mixing (at a constant temperature). The equations are explored with reference to the common experiences, with emphasis that in chemistry the energy that entropy measures as dispersing is the internal energy of molecules.
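The "before and after" bookkeeping can be made concrete with a minimal sketch of the hot-frying-pan example. The heat quantity and temperatures below are hypothetical round numbers chosen only to show that when heat leaves a hotter body and spreads into cooler surroundings, the combined entropy change is positive.

```python
# "Before and after" bookkeeping for a hot frying pan cooling in a room.
# Heat q leaves the pan at T_hot and spreads into the room at T_cold, so the
# total entropy change is q/T_cold - q/T_hot, positive whenever T_hot > T_cold.

q = 1000.0      # J of heat transferred (hypothetical amount)
T_hot = 400.0   # K, pan temperature (hypothetical)
T_cold = 300.0  # K, room temperature (hypothetical)

dS_pan = -q / T_hot          # the pan loses energy, so its entropy decreases
dS_room = q / T_cold         # the room gains the same energy at a lower temperature
dS_total = dS_pan + dS_room  # net result: energy has dispersed, total entropy rises

print(f"pan {dS_pan:+.2f} J/K, room {dS_room:+.2f} J/K, total {dS_total:+.2f} J/K")
```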

The statistical interpretation is related to quantum mechanics in describing the way that energy is distributed (quantized) amongst molecules on specific energy levels, with all the energy of the macrostate always in only one microstate at one instant. Entropy is described as measuring the energy dispersal for a system by the number of accessible microstates, the number of different arrangements of all its energy at the next instant. Thus, an increase in entropy means a greater number of microstates for the final state than for the initial state, and hence more possible arrangements of a system's total energy at any one instant. Here, the greater 'dispersal of the total energy of a system' means the existence of many possibilities.[citation needed] [11]
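This counting can be made quantitative with the Boltzmann relation S = k_B ln W, where W is the number of accessible microstates. The sketch below uses arbitrary illustrative microstate counts, not values from the article, simply to show that a larger W means a larger entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates: float) -> float:
    """S = k_B * ln(W): entropy from the number of accessible microstates W."""
    return k_B * math.log(num_microstates)

# Purely illustrative microstate counts before and after some process:
W_initial, W_final = 1e20, 1e25
delta_S = boltzmann_entropy(W_final) - boltzmann_entropy(W_initial)
# Equivalently, delta_S = k_B * ln(W_final / W_initial): more microstates, higher entropy.
print(f"delta_S = {delta_S:.3e} J/K")
```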

Continuous movement and molecular collisions, visualised as being like bouncing balls blown by air as used in a lottery, can then lead on to showing the possibilities of many Boltzmann distributions and the continually changing "distribution of the instant", and on to the idea that when the system changes, dynamic molecules will have a greater number of accessible microstates. In this approach, all everyday spontaneous physical happenings and chemical reactions are depicted as involving some type of energy flow from being localized or concentrated to becoming spread out to a larger space, always to a state with a greater number of microstates. [12]
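To illustrate how energy spreads over quantized levels, the sketch below computes Boltzmann populations p_i proportional to exp(−E_i/k_B T) and the corresponding Gibbs entropy S = −k_B Σ p_i ln p_i. The evenly spaced energy levels are made up for illustration; the point is only that at higher temperature the energy is spread over more levels and the entropy is larger.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_populations(levels_joules, temperature_kelvin):
    """Fractional populations p_i proportional to exp(-E_i / k_B T)."""
    weights = [math.exp(-E / (k_B * temperature_kelvin)) for E in levels_joules]
    partition_function = sum(weights)
    return [w / partition_function for w in weights]

def gibbs_entropy(populations):
    """S = -k_B * sum(p_i * ln p_i); larger when energy is spread over more levels."""
    return -k_B * sum(p * math.log(p) for p in populations if p > 0.0)

# Hypothetical, evenly spaced energy levels 1e-21 J apart (illustration only).
levels = [i * 1e-21 for i in range(5)]
for T in (100.0, 300.0, 1000.0):
    p = boltzmann_populations(levels, T)
    print(f"T = {T:6.1f} K   S = {gibbs_entropy(p):.3e} J/K")
```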

This approach provides a good basis for understanding the conventional approach, except in very complex cases where the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot. [12] Thus in situations such as the entropy of mixing, when the two or more different substances being mixed are at the same temperature and pressure so that there is no net exchange of heat or work, the entropy increase is due to the literal spreading out of the motional energy of each substance in the larger combined final volume. Each component's energetic molecules become more separated from one another than they were in the pure state, in which they collided only with identical adjacent molecules, leading to an increase in the number of accessible microstates for each component. [13]
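For ideal mixing at constant temperature and pressure, this increase is commonly written as ΔS_mix = −nR Σ x_i ln x_i. The sketch below evaluates that standard textbook expression, which is an idealization rather than a formula taken from the cited notes, for an equimolar two-component mixture.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_entropy_of_mixing(mole_fractions, total_moles=1.0):
    """Ideal entropy of mixing: delta_S = -n * R * sum(x_i * ln x_i), never negative."""
    return -total_moles * R * sum(x * math.log(x) for x in mole_fractions if x > 0.0)

# Equimolar mixture of two components at the same temperature and pressure:
delta_S_mix = ideal_entropy_of_mixing([0.5, 0.5])
print(f"Entropy of mixing: {delta_S_mix:.2f} J/K per mole of mixture")  # about 5.76 J/K
```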

Current adoption

Variants of the energy dispersal approach have been adopted in a number of undergraduate chemistry texts,[citation needed] mainly in the United States. One respected text states:

The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and the 'dispersal' of matter and energy that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of micro-states associated with the same total energy. — Atkins & de Paula (2006), p. 81 [14]

History

The concept of 'dissipation of energy' was used in Lord Kelvin's 1852 article "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy." [15] He distinguished between two types or "stores" of mechanical energy: "statical" and "dynamical." He discussed how these two types of energy can change from one form to the other during a thermodynamic transformation. When heat is created by any irreversible process (such as friction), or when heat is diffused by conduction, mechanical energy is dissipated, and it is impossible to restore the initial state. [16] [17]

Using the word 'spread', an early advocate of the energy dispersal concept was Edward Armand Guggenheim. [1] [2] In the mid-1950s, with the development of quantum theory, researchers began speaking about entropy changes in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels, such as by the reactants and products of a chemical reaction. [18]

In 1984, the Oxford physical chemist Peter Atkins, in his book The Second Law, written for a general audience, presented a nonmathematical interpretation of what he called the "infinitely incomprehensible entropy" in simple terms, describing the second law of thermodynamics as "energy tends to disperse". His analogies included an imaginary intelligent being called "Boltzmann's Demon," who runs around reorganizing and dispersing energy, in order to show how the W in Boltzmann's entropy formula relates to energy dispersion. This dispersion is transmitted via atomic vibrations and collisions. Atkins wrote: "each atom carries kinetic energy, and the spreading of the atoms spreads the energy…the Boltzmann equation therefore captures the aspect of dispersal: the dispersal of the entities that are carrying the energy." [19]: 78–79

In 1997, John Wrigglesworth described spatial particle distributions as represented by distributions of energy states. According to the second law of thermodynamics, isolated systems will tend to redistribute the energy of the system into a more probable arrangement, or a maximum-probability energy distribution, i.e. from being concentrated to being spread out. By virtue of the first law of thermodynamics, the total energy does not change; instead, the energy tends to disperse over the space to which it has access. [20] In his 1999 Statistical Thermodynamics, M.C. Gupta defined entropy as a function that measures how energy disperses when a system changes from one state to another. [21] Other authors who define entropy in a way that embodies energy dispersal include Cecie Starr [22] and Andrew Scott. [23]

In a 1996 article, the physicist Harvey S. Leff set out what he called "the spreading and sharing of energy." [24] Another physicist, Daniel F. Styer, published an article in 2000 arguing that "entropy as disorder" is an inadequate description. [25] In a 2002 article in the Journal of Chemical Education, Frank L. Lambert argued that portraying entropy as "disorder" is confusing and should be abandoned. He has gone on to develop detailed resources for chemistry instructors, describing entropy increase as the spontaneous dispersal of energy, namely how much energy is spread out in a process, or how widely dispersed it becomes, at a specific temperature. [6] [26]


References

  1. Dugdale, J.S. (1996). Entropy and its Physical Meaning. Taylor & Francis, London. ISBN 0-7484-0568-2. Dugdale cites only Guggenheim, on page 101.
  2. Guggenheim, E.A. (1949). "Statistical basis of thermodynamics". Research: A Journal of Science and its Applications 2. Butterworths, London. pp. 450–454.
  3. Denbigh, K. (1981). The Principles of Chemical Equilibrium: With Applications in Chemistry and Chemical Engineering. London: Cambridge University Press. pp. 55–56.
  4. Jaynes, E.T. (1989). "Clearing up mysteries — the original goal". In Maximum Entropy and Bayesian Methods, J. Skilling (ed.), Kluwer Academic Publishers, Dordrecht, pp. 1–27; see page 24.
  5. Grandy, Walter T. Jr. (2008). Entropy and the Time Evolution of Macroscopic Systems. Oxford University Press. pp. 55–58. ISBN 978-0-19-954617-6.
  6. Lambert, Frank L. (2002). "Disorder – A Cracked Crutch for Supporting Entropy Discussions". Journal of Chemical Education 79: 187.
  7. Lambert, Frank L. (2011). "The Second Law of Thermodynamics (6)."
  8. Carson, E. M.; Watson, J. R. (Department of Educational and Professional Studies, King's College, London) (2002). "Undergraduate students' understandings of entropy and Gibbs free energy". University Chemistry Education – 2002 Papers, Royal Society of Chemistry.
  9. Sozbilir, Mustafa (2001). A Study of Undergraduates' Understandings of Key Chemical Ideas in Thermodynamics. Ph.D. thesis, Department of Educational Studies, The University of York.
  10. Review of "Entropy and the Second Law: Interpretation and Misss-Interpretationsss" in Chemistry World.
  11. Lambert, Frank L. (2005). The Molecular Basis for Understanding Simple Entropy Change.
  12. Lambert, Frank L. (2005). Entropy Is Simple, Qualitatively.
  13. Lambert, Frank L. (2005). Notes for a "Conversation About Entropy": a brief discussion of both thermodynamic and "configurational" ("positional") entropy in chemistry.
  14. Atkins, Peter; de Paula, Julio (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 0-19-870072-5.
  15. Jensen, William (2004). "Entropy and Constraint of Motion". Journal of Chemical Education 81: 693 (May).
  16. Thomson, William (1852). "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy". Proceedings of the Royal Society of Edinburgh, April 19.
  17. Thomson, William (1874). "Kinetic Theory of the Dissipation of Energy". Nature 9: 441–444 (April 9).
  18. Denbigh, Kenneth (1981). The Principles of Chemical Equilibrium (4th ed.). Cambridge University Press. ISBN 0-521-28150-4.
  19. Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.
  20. Wrigglesworth, John (1997). Energy and Life (Modules in Life Sciences). CRC. ISBN 0-7484-0433-3.
  21. Gupta, M.C. (1999). Statistical Thermodynamics. New Age Publishers. ISBN 81-224-1066-9.
  22. Starr, Cecie; Taggart, R. (1992). Biology – the Unity and Diversity of Life. Wadsworth Publishing Co. ISBN 0-534-16566-4.
  23. Scott, Andrew (2001). 101 Key Ideas in Chemistry. Teach Yourself Books. ISBN 0-07-139665-9.
  24. Leff, H. S. (1996). "Thermodynamic entropy: The spreading and sharing of energy". American Journal of Physics 64: 1261–1271.
  25. Styer, D. F. (2000). American Journal of Physics 68: 1090–1096.
  26. Lambert, Frank L. (2006). A Student's Approach to the Second Law and Entropy.

Further reading

Texts using the energy dispersal approach