Entropy (order and disorder)

Boltzmann's molecules (1896) shown at a "rest position" in a solid

In thermodynamics, entropy is often associated with the amount of order or disorder in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced [reduction] to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression: [1]

dS = δQ/T

where Q = motional energy ("heat") that is transferred reversibly to the system from the surroundings and T = the absolute temperature at which the transfer occurs.
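As a minimal numerical illustration of this expression (a sketch with assumed values, not drawn from Clausius' memoirs): for heat transferred reversibly at a constant temperature, the entropy change is simply the heat divided by the absolute temperature.

```python
# Entropy change for a reversible, isothermal heat transfer: dS = dQ/T.
# The heat and temperature below are illustrative values only.

def entropy_change_isothermal(q_joules: float, temperature_kelvin: float) -> float:
    """Return dS = Q/T for heat Q transferred reversibly at constant absolute temperature T."""
    if temperature_kelvin <= 0:
        raise ValueError("Absolute temperature must be positive.")
    return q_joules / temperature_kelvin

if __name__ == "__main__":
    q = 1500.0  # joules of heat absorbed reversibly by the system
    t = 300.0   # kelvin, temperature at which the transfer occurs
    print(f"dS = {entropy_change_isothermal(q, t):.2f} J/K")  # 5.00 J/K
```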

In the years to follow, Ludwig Boltzmann translated these 'alterations of arrangement' into a probabilistic view of order and disorder in gas-phase molecular systems. In the context of entropy, "perfect internal disorder" has often been regarded as describing thermodynamic equilibrium, but since the thermodynamic concept is so far from everyday thinking, the use of the term in physics and chemistry has caused much confusion and misunderstanding.

In recent years, to interpret the concept of entropy, by further describing the 'alterations of arrangement', there has been a shift away from the words 'order' and 'disorder', to words such as 'spread' and 'dispersal'.

History

This "molecular ordering" entropy perspective traces its origins to molecular movement interpretations developed by Rudolf Clausius in the 1850s, particularly with his 1862 visual conception of molecular disgregation. Similarly, in 1859, after reading a paper on the diffusion of molecules by Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics. [2]

In 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his long and distinguished life developing the subject further. Later, Boltzmann, in efforts to develop a kinetic theory for the behavior of a gas, applied the laws of probability to Maxwell's and Clausius' molecular interpretation of entropy so as to begin to interpret entropy in terms of order and disorder. Similarly, in 1882 Hermann von Helmholtz used the word "Unordnung" (disorder) to describe entropy. [3]

Overview

To highlight the fact that order and disorder are commonly understood to be measured in terms of entropy, below are current science encyclopedia and science dictionary definitions of entropy:

Entropy and disorder also have associations with equilibrium. [8] Technically, entropy, from this perspective, is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium—that is, to perfect internal disorder. [9] Likewise, the value of the entropy of a distribution of atoms and molecules in a thermodynamic system is a measure of the disorder in the arrangements of its particles. [10] In a stretched-out piece of rubber, for example, the arrangement of the molecules of its structure has an "ordered" distribution and has zero entropy, while the "disordered" kinky distribution of the atoms and molecules in the rubber in the non-stretched state has positive entropy. Similarly, in a gas, the order is perfect and the entropy of the system has its lowest value when all the molecules are in one place, whereas the more positions the molecules occupy, the more disordered the gas is and the larger its entropy. [10]

In systems ecology, as another example, the entropy of a collection of items comprising a system is defined as a measure of their disorder or equivalently the relative likelihood of the instantaneous configuration of the items. [11] Moreover, according to theoretical ecologist and chemical engineer Robert Ulanowicz, "that entropy might provide a quantification of the heretofore subjective notion of disorder has spawned innumerable scientific and philosophical narratives." [11] [12] In particular, many biologists have taken to speaking in terms of the entropy of an organism, or about its antonym negentropy, as a measure of the structural order within an organism. [11]

The mathematical basis of the association between entropy and order and disorder began, essentially, with the famous Boltzmann formula, S = k ln W, which relates entropy S to the number of possible states W in which a system can be found. [13] As an example, consider a box that is divided into two sections. What is the probability that a certain number, or all of the particles, will be found in one section versus the other when the particles are randomly allocated to different places within the box? If you only have one particle, then that system of one particle can exist in two states, one side of the box versus the other. If you have more than one particle, or define states as being further locational subdivisions of the box, the entropy is larger because the number of states is greater (a numerical counting sketch of this follows the quotation below). The relationship between entropy, order, and disorder in the Boltzmann equation is so clear among physicists that according to the views of thermodynamic ecologists Sven Jorgensen and Yuri Svirezhev, "it is obvious that entropy is a measure of order or, most likely, disorder in the system." [13] In this direction, the second law of thermodynamics, as famously enunciated by Rudolf Clausius in 1865, states that:

The entropy of the universe tends to a maximum.
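To make the two-section box argument above concrete, here is a minimal counting sketch. It assumes each particle independently occupies any one of the available sections with equal probability; the particle numbers and subdivision counts are illustrative choices, not figures from the cited sources.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def num_states(particles: int, sections: int) -> int:
    """Each particle can sit in any section independently, so W = sections ** particles."""
    return sections ** particles

def boltzmann_entropy(w: int) -> float:
    """Boltzmann formula S = k ln W for W accessible microstates."""
    return K_B * math.log(w)

for n in (1, 2, 10, 100):
    w = num_states(n, sections=2)
    print(f"{n:>3} particle(s), 2 sections: W = {w}, S = {boltzmann_entropy(w):.3e} J/K")

# Subdividing the box more finely (more locational states per particle) also raises W and S:
w_fine = num_states(10, sections=8)
print(f" 10 particles, 8 sections: W = {w_fine}, S = {boltzmann_entropy(w_fine):.3e} J/K")
```

Both more particles and finer subdivision increase the state count W, and S = k ln W grows accordingly, which is the sense in which the entropy is said to be larger.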

Thus, if entropy is associated with disorder and if the entropy of the universe is headed towards a maximum, then many are often puzzled as to the nature of the "ordering" process and operation of evolution in relation to Clausius' most famous version of the second law, which states that the universe is headed towards maximal "disorder". In the 2003 book SYNC – the Emerging Science of Spontaneous Order by Steven Strogatz, for example, we find "Scientists have often been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures—galaxies, cells, ecosystems, human beings—that have all somehow managed to assemble themselves." [14]

The common argument used to explain this is that, locally, entropy can be lowered by external action, e.g. solar heating action, and that this applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, to growing crystals, and to living organisms. [9] This local increase in order is, however, only possible at the expense of an entropy increase in the surroundings; here more disorder must be created. [9] [15] The condition on this statement is that living systems are open systems, in which heat, mass, and/or work may transfer into or out of the system. Unlike temperature, the putative entropy of a living system would drastically change if the organism were thermodynamically isolated. If an organism were in this type of "isolated" situation, its entropy would increase markedly as the once-living components of the organism decayed to an unrecognizable mass. [11]
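A small entropy balance illustrates the refrigerator case (a sketch with assumed figures, not values from the cited sources): the cold chamber's entropy falls, but the heat rejected to the warmer room, enlarged by the work driving the cycle, raises the entropy of the surroundings by more.

```python
# Entropy bookkeeping for an idealized refrigerator (illustrative numbers only).
# Heat q_cold is withdrawn from the cold chamber at t_cold; the compressor does
# work w; the sum q_cold + w is rejected to the room at t_hot.

q_cold = 1000.0   # J removed from the cold chamber
w = 400.0         # J of work driving the cycle (assumed value)
t_cold = 275.0    # K, cold-chamber temperature
t_hot = 295.0     # K, room temperature

ds_chamber = -q_cold / t_cold        # local entropy decrease inside the refrigerator
ds_room = (q_cold + w) / t_hot       # entropy increase of the surroundings
ds_total = ds_chamber + ds_room

print(f"Cold chamber: {ds_chamber:+.3f} J/K")
print(f"Surroundings: {ds_room:+.3f} J/K")
print(f"Total:        {ds_total:+.3f} J/K (non-negative, as the second law requires)")
```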

Phase change

Owing to these early developments, the typical example of entropy change ΔS is that associated with phase change. Solids, for example, which are typically ordered on the molecular scale, usually have smaller entropy than liquids, liquids have smaller entropy than gases, and colder gases have smaller entropy than hotter gases. Moreover, according to the third law of thermodynamics, at absolute zero temperature, crystalline structures are approximated to have perfect "order" and zero entropy. This correlation occurs because the number of different microscopic quantum energy states available to an ordered system is usually much smaller than the number of states available to a system that appears to be disordered.
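A standard worked example of such a phase-change entropy step (a sketch using common textbook values for water, assumed here for illustration): at the transition temperature, the entropy change is the latent heat divided by that temperature.

```python
# Phase-change entropy: dS = dH / T at the transition temperature.
# The enthalpies below are standard textbook figures for water (assumed for illustration).

dh_fusion = 6010.0   # J/mol, molar enthalpy of fusion of ice
t_melt = 273.15      # K, melting point of ice at 1 atm
print(f"dS(ice -> water)   = {dh_fusion / t_melt:.1f} J/(mol*K)")   # about 22 J/(mol*K)

dh_vap = 40650.0     # J/mol, molar enthalpy of vaporization of water
t_boil = 373.15      # K, normal boiling point of water
print(f"dS(water -> steam) = {dh_vap / t_boil:.1f} J/(mol*K)")      # about 109 J/(mol*K)
```

The much larger step on vaporization matches the qualitative ordering above: gases carry more entropy than liquids, which carry more than solids.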

From his famous 1896 Lectures on Gas Theory, Boltzmann diagrams the structure of a solid body, as shown above, by postulating that each molecule in the body has a "rest position". According to Boltzmann, if it approaches a neighbor molecule it is repelled by it, but if it moves farther away there is an attraction. This, of course, was a revolutionary perspective in its time; many, during these years, did not believe in the existence of either atoms or molecules (see: history of the molecule). [16] According to these early views, and others such as those developed by William Thomson, if energy in the form of heat is added to a solid, so as to make it into a liquid or a gas, a common depiction is that the ordering of the atoms and molecules becomes more random and chaotic with an increase in temperature:

Molecular ordering in the solid, liquid, and gas states

Thus, according to Boltzmann, owing to increases in thermal motion, whenever heat is added to a working substance, the rest positions of the molecules will be pushed apart, the body will expand, and this will create more molar-disordered distributions and arrangements of molecules. These disordered arrangements subsequently correlate, via probability arguments, to an increase in the measure of entropy. [17]
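In modern terms, the same qualitative picture can be illustrated with the ideal-gas entropy change on heating and expansion (a sketch with assumed values; this is a textbook formula, not Boltzmann's own calculation):

```python
import math

# Ideal-gas entropy change on heating and expansion:
#   dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1)
# Heating the gas and letting it spread into more volume both raise the entropy.
# The amount, temperatures, and volumes below are assumed for illustration.

R = 8.314        # J/(mol*K), gas constant
CV = 1.5 * R     # molar heat capacity of a monatomic ideal gas at constant volume

n = 1.0                    # mol
t1, t2 = 300.0, 400.0      # K, initial and final temperatures
v1, v2 = 0.020, 0.030      # m^3, initial and final volumes

ds = n * CV * math.log(t2 / t1) + n * R * math.log(v2 / v1)
print(f"dS = {ds:.2f} J/K")  # positive: hotter and more spread out means higher entropy
```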

Entropy-driven order

Entropy has historically been associated with disorder, e.g. by Clausius and Helmholtz. However, in common speech, order is used to describe organization, structural regularity, or form, like that found in a crystal compared with a gas. This commonplace notion of order is described quantitatively by Landau theory. In Landau theory, the development of order in the everyday sense coincides with the change in the value of a mathematical quantity, a so-called order parameter. An example of an order parameter for crystallization is "bond orientational order", describing the development of preferred directions (the crystallographic axes) in space. For many systems, phases with more structural (e.g. crystalline) order exhibit less entropy than fluid phases under the same thermodynamic conditions. In these cases, labeling phases as ordered or disordered according to the relative amount of entropy (per the Clausius/Helmholtz notion of order/disorder) or via the existence of structural regularity (per the Landau notion of order/disorder) produces matching labels.

However, there is a broad class [18] of systems that manifest entropy-driven order, in which phases with organization or structural regularity, e.g. crystals, have higher entropy than structurally disordered (e.g. fluid) phases under the same thermodynamic conditions. In these systems phases that would be labeled as disordered by virtue of their higher entropy (in the sense of Clausius or Helmholtz) are ordered in both the everyday sense and in Landau theory.

Under suitable thermodynamic conditions, entropy has been predicted or discovered to induce systems to form ordered liquid crystals, crystals, and quasicrystals. [19] [20] [21] In many systems, directional entropic forces drive this behavior. More recently, it has been shown that it is possible to precisely engineer particles for target ordered structures. [22]

Adiabatic demagnetization

In the quest for ultra-cold temperatures, a temperature-lowering technique called adiabatic demagnetization is used, which exploits atomic entropy considerations that can be described in order-disorder terms. [23] In this process, a sample of a solid such as chrome alum salt, whose molecules are equivalent to tiny magnets, is placed inside an insulated enclosure cooled to a low temperature, typically 2 or 4 kelvins, and a strong magnetic field is applied to the container using a powerful external magnet, so that the tiny molecular magnets are aligned, forming a well-ordered "initial" state at that low temperature. This magnetic alignment means that the magnetic energy of each molecule is minimal. [24] The external magnetic field is then reduced, a reduction that is considered to be closely reversible. Following this reduction, the atomic magnets assume random, less-ordered orientations, owing to thermal agitations, in the "final" state:

Entropy "order"/"disorder" considerations in the process of adiabatic demagnetization Adiabatic-demagnitization.svg
Entropy "order"/"disorder" considerations in the process of adiabatic demagnetization

The "disorder" and hence the entropy associated with the change in the atomic alignments has clearly increased. [23] In terms of energy flow, the movement from a magnetically aligned state requires energy from the thermal motion of the molecules, converting thermal energy into magnetic energy. [24] Yet, according to the second law of thermodynamics, because no heat can enter or leave the container, due to its adiabatic insulation, the system should exhibit no change in entropy, i.e. ΔS = 0. The increase in disorder, however, associated with the randomizing directions of the atomic magnets represents an entropy increase? To compensate for this, the disorder (entropy) associated with the temperature of the specimen must decrease by the same amount. [23] The temperature thus falls as a result of this process of thermal energy being converted into magnetic energy. If the magnetic field is then increased, the temperature rises and the magnetic salt has to be cooled again using a cold material such as liquid helium. [24]

Difficulties with the term "disorder"

In recent years the long-standing use of the term "disorder" to discuss entropy has met with some criticism. [25] [26] [27] [28] [29] [30] Critics of the terminology state that entropy is not a measure of 'disorder' or 'chaos', but rather a measure of energy's diffusion or dispersal to more microstates. Shannon's use of the term 'entropy' in information theory refers to the most compressed, or least dispersed, amount of code needed to encompass the content of a signal. [31] [32] [33]
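For comparison with that information-theoretic usage, a minimal sketch (the sample strings are arbitrary): the Shannon entropy of a message, computed from its symbol frequencies, is the average number of bits per symbol an optimal code needs, so lower entropy means a more predictable and more compressible signal.

```python
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Average bits per symbol, H = -sum(p * log2(p)), estimated from symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy_bits("aaaaaaaa"))  # 0.0 -> perfectly predictable, maximally compressible
print(shannon_entropy_bits("aabbccdd"))  # 2.0 -> four equally likely symbols need 2 bits each
print(shannon_entropy_bits("abcdefgh"))  # 3.0 -> eight equally likely symbols need 3 bits each
```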


Related Research Articles

Chemical thermodynamics is the study of the interrelation of heat and work with chemical reactions or with physical changes of state within the confines of the laws of thermodynamics. Chemical thermodynamics involves not only laboratory measurements of various thermodynamic properties, but also the application of mathematical methods to the study of chemical questions and the spontaneity of processes.

<span class="mw-page-title-main">Entropy</span> Property of a thermodynamic system

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

<span class="mw-page-title-main">Thermodynamics</span> Physics of heat, work, and temperature

Thermodynamics is a branch of physics that deals with heat, work, and temperature, and their relation to energy, entropy, and the physical properties of matter and radiation. The behavior of these quantities is governed by the four laws of thermodynamics which convey a quantitative description using measurable macroscopic physical quantities, but may be explained in terms of microscopic constituents by statistical mechanics. Thermodynamics applies to a wide variety of topics in science and engineering, especially physical chemistry, biochemistry, chemical engineering and mechanical engineering, but also in other complex fields such as meteorology.

<span class="mw-page-title-main">Thermodynamic free energy</span> State function whose change relates to the systems maximal work output

In thermodynamics, the thermodynamic free energy is one of the state functions of a thermodynamic system. The change in the free energy is the maximum amount of work that the system can perform in a process at constant temperature, and its sign indicates whether the process is thermodynamically favorable or forbidden. Since free energy usually contains potential energy, it is not absolute but depends on the choice of a zero point. Therefore, only relative free energy values, or changes in free energy, are physically meaningful.

<span class="mw-page-title-main">Timeline of thermodynamics</span>

A timeline of events in the history of thermodynamics.

<span class="mw-page-title-main">Ideal gas</span> Mathematical model which approximates the behavior of real gases

An ideal gas is a theoretical gas composed of many randomly moving point particles that are not subject to interparticle interactions. The ideal gas concept is useful because it obeys the ideal gas law, a simplified equation of state, and is amenable to analysis under statistical mechanics. The requirement of zero interaction can often be relaxed if, for example, the interaction is perfectly elastic or regarded as point-like collisions.

<span class="mw-page-title-main">Second law of thermodynamics</span> Physical law for entropy and heat

The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

<span class="mw-page-title-main">Ludwig Boltzmann</span> Austrian physicist and philosopher (1844–1906)

Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics, and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, S = kB ln Ω, where Ω is the number of microstates whose energy equals the system's energy, interpreted as a measure of statistical disorder of a system. Max Planck named the constant kB the Boltzmann constant.

<span class="mw-page-title-main">Irreversible process</span> Process that cannot be undone

In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.

In thermodynamics, the entropy of mixing is the increase in the total entropy when several initially separate systems of different composition, each in a thermodynamic state of internal equilibrium, are mixed without chemical reaction by the thermodynamic operation of removal of impermeable partition(s) between them, followed by a time for establishment of a new thermodynamic state of internal equilibrium in the new unpartitioned closed system.

The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.

The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Thus, entropy measurement is a way of distinguishing the past from the future. In thermodynamic systems that are not isolated, local entropy can decrease over time, accompanied by a compensating entropy increase in the surroundings; examples include objects undergoing cooling, living systems, and the formation of typical crystals.

In thermodynamics, the interpretation of entropy as a measure of energy dispersal has been exercised against the background of the traditional view, introduced by Ludwig Boltzmann, of entropy as a quantitative measure of disorder. The energy dispersal approach avoids the ambiguous term 'disorder'. An early advocate of the energy dispersal conception was Edward A. Guggenheim in 1949, using the word 'spread'.

<span class="mw-page-title-main">Introduction to entropy</span> Non-technical introduction to entropy

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.

Research concerning the relationship between the thermodynamic quantity entropy and both the origin and evolution of life began around the turn of the 20th century. In 1910, American historian Henry Adams printed and distributed to university libraries and history professors the small volume A Letter to American Teachers of History proposing a theory of history based on the second law of thermodynamics and on the principle of entropy.

<span class="mw-page-title-main">Boltzmann's entropy formula</span> Equation in statistical mechanics

In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S, also written as S_B, of an ideal gas to the multiplicity W, the number of real microstates corresponding to the gas's macrostate: S = kB ln W

<span class="mw-page-title-main">Lloyd Demetrius</span>

Lloyd A. Demetrius is an American mathematician and theoretical biologist at the Department of Organismic and Evolutionary biology, Harvard University. He is best known for the discovery of the concept evolutionary entropy, a statistical parameter that characterizes Darwinian fitness in models of evolutionary processes at various levels of biological organization – molecular, organismic and social. Evolutionary entropy, a generalization of the Gibbs-Boltzmann entropy in statistical thermodynamics, is the cornerstone of directionality theory, an analytical study of evolution by variation and selection. The theory has applications to: a) the development of aging and the evolution of longevity; b) the origin and progression of age related diseases such as cancer, and neurodegenerative disorders such as Alzheimer's disease and Parkinson's disease; c) the evolution of cooperation and the spread of inequality.

<span class="mw-page-title-main">Temperature</span> Physical quantity of hot and cold

Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.

References

  1. Mechanical Theory of Heat – Nine Memoirs on the development of the concept of "Entropy" by Rudolf Clausius [1850–1865]
  2. Mahon, Basil (2003). The Man Who Changed Everything – the Life of James Clerk Maxwell. Hoboken, NJ: Wiley. ISBN   0-470-86171-1.
  3. Anderson, Greg (2005). Thermodynamics of Natural Systems. Cambridge University Press. ISBN   0-521-84772-9.
  4. Oxford Dictionary of Science, 2005
  5. Oxford Dictionary of Chemistry, 2004
  6. Barnes & Noble's Essential Dictionary of Science, 2004
  7. Gribbin's Encyclopedia of Particle Physics, 2000
  8. Landsberg, P.T. (1984). "Is Equilibrium always an Entropy Maximum?" J. Stat. Physics 35: 159–69.
  9. Microsoft Encarta 2006. © 1993–2005 Microsoft Corporation. All rights reserved.
  10. Greven, Andreas; Keller, Gerhard; Warnecke, Gerald (2003). Entropy – Princeton Series in Applied Mathematics. Princeton University Press. ISBN 0-691-11338-6.
  11. Ulanowicz, Robert E. (2000). Growth and Development – Ecosystems Phenomenology. toExcel Press. ISBN 0-595-00145-9.
  12. Kubat, L.; Zeman, J. (1975). Entropy and Information in Science and Philosophy. Elsevier.
  13. Jorgensen, Sven E.; Svirezhev, Yuri M. (2004). Towards a Thermodynamic Theory for Ecological Systems. Elsevier. ISBN 0-08-044167-X.
  14. Strogatz, Steven (2003). Sync – the Emerging Science of Spontaneous Order. Theia. ISBN 0-7868-6844-9.
  15. Brooks, Daniel R.; Wiley, E.O. (1988). Evolution as Entropy – Towards a Unified Theory of Biology. University of Chicago Press. ISBN 0-226-07574-5.
  16. Cercignani, Carlo (1998). Ludwig Boltzmann: The Man Who Trusted Atoms. Oxford University Press. ISBN 978-0-19-850154-1.
  17. Boltzmann, Ludwig (1896). Lectures on Gas Theory. Dover (reprint). ISBN   0-486-68455-5.
  18. van Anders, Greg; Klotsa, Daphne; Ahmed, N. Khalid; Engel, Michael; Glotzer, Sharon C. (2014). "Understanding shape entropy through local dense packing". Proc Natl Acad Sci USA. 111 (45): E4812–E4821. arXiv: 1309.1187 . Bibcode:2014PNAS..111E4812V. doi: 10.1073/pnas.1418159111 . PMC   4234574 . PMID   25344532.
  19. Onsager, Lars (1949). "The effects of shape on the interaction of colloidal particles". Annals of the New York Academy of Sciences. 51 (4): 627. Bibcode:1949NYASA..51..627O. doi:10.1111/j.1749-6632.1949.tb27296.x. S2CID   84562683.
  20. Haji-Akbari, Amir; Engel, Michael; Keys, Aaron S.; Zheng, Xiaoyu; Petschek, Rolfe G.; Palffy-Muhoray, Peter; Glotzer, Sharon C. (2009). "Disordered, quasicrystalline and crystalline phases of densely packed tetrahedra". Nature. 462 (7274): 773–777. arXiv: 1012.5138 . Bibcode:2009Natur.462..773H. doi:10.1038/nature08641. PMID   20010683. S2CID   4412674.
  21. Damasceno, Pablo F.; Engel, Michael; Glotzer, Sharon C. (2012). "Predictive Self-Assembly of Polyhedra into Complex Structures". Science. 337 (6093): 453–457. arXiv: 1202.2177 . Bibcode:2012Sci...337..453D. doi:10.1126/science.1220869. PMID   22837525. S2CID   7177740.
  22. Geng, Yina; van Anders, Greg; Dodd, Paul M.; Dshemuchadse, Julia; Glotzer, Sharon C. (2019). "Engineering Entropy for the Inverse Design of Colloidal Crystals from Hard Shapes". Science Advances. 5 (7): eaaw0514. arXiv:1712.02471. Bibcode:2019SciA....5..514G. doi:10.1126/sciadv.aaw0514. PMC 6611692. PMID 31281885.
  23. Halliday, David; Resnick, Robert (1988). Fundamentals of Physics, Extended 3rd ed. Wiley. ISBN 0-471-81995-6.
  24. NASA – How does an Adiabatic Demagnetization Refrigerator Work?
  25. Denbigh K. (1981). The Principles of Chemical Equilibrium: With Applications in Chemistry and Chemical Engineering. London: Cambridge University Press. pp. 55–56.
  26. Jaynes, E.T. (1989). Clearing up mysteries—the original goal, in Maximum Entropy and Bayesian Methods, J. Skilling, Editor, Kluwer Academic Publishers, Dordrecht, pp. 1–27, page 24.
  27. Grandy, Walter T. Jr. (2008). Entropy and the Time Evolution of Macroscopic Systems. Oxford University Press. pp. 55–58. ISBN   978-0-19-954617-6.
  28. Frank L. Lambert, 2002, "Disorder—A Cracked Crutch for Supporting Entropy Discussions," Journal of Chemical Education 79: 187.
  29. Carson, E. M., and Watson, J. R., (Department of Educational and Professional Studies, King's College, London), 2002, "Undergraduate students' understandings of entropy and Gibbs Free energy," University Chemistry Education – 2002 Papers, Royal Society of Chemistry.
  30. Sozbilir, Mustafa, PhD studies: Turkey, A Study of Undergraduates' Understandings of Key Chemical Ideas in Thermodynamics, Ph.D. Thesis, Department of Educational Studies, The University of York, 2001.
  31. Shannon, C.E. (1945). A mathematical theory of cryptography, Memorandum for file, MM-45-110-98, 135 pages, page 20; found in File 24 at page 203 in Claude Elwood Shannon: Miscellaneous Writings edited by N.J.A. Sloane, and Aaron D. Wyner (revision of 2013), Mathematical Sciences Research Center, AT&T Bell Laboratories, Murray Hill, NJ; previously partly published by IEEE Press.
  32. Gray, R.M. (2010). Entropy and Information Theory, Springer, New York NY, 2nd edition, p. 296.
  33. Mark Nelson (24 August 2006). "The Hutter Prize". Archived from the original on 2018-03-01. Retrieved 2008-11-27.