Third law of thermodynamics

The third law of thermodynamics states that the entropy of a closed system at thermodynamic equilibrium approaches a constant value when its temperature approaches absolute zero. This constant value cannot depend on any other parameters characterizing the system, such as pressure or applied magnetic field. At absolute zero (zero kelvins) the system must be in a state with the minimum possible energy.

Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy. [1] In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then there may remain some finite entropy as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum energy state is non-unique. The constant value is called the residual entropy of the system. [2]

Formulations

The third law has many formulations, some more general than others, some equivalent, and some neither more general nor equivalent. [3]

The Planck statement applies only to perfect crystalline substances:

As temperature falls to zero, the entropy of any pure crystalline substance tends to a universal constant.

That is, lim(T→0) S(T) = S0, where S0 is a universal constant that applies for all possible crystals, of all possible sizes, under all possible external constraints. So it can be taken as zero, giving lim(T→0) S(T) = 0.

The Nernst statement concerns thermodynamic processes at a fixed, low temperature, for condensed systems, which are liquids and solids:

The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.

That is, lim(T→0) ΔS = 0. Or equivalently,

At absolute zero, the entropy change becomes independent of the process path.

That is,

lim(T→0) ΔS(T, ΔX) = 0

where ΔX represents a change in the state variable X.

The unattainability principle of Nernst: [4]

It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations. [5]

This principle implies that cooling a system to absolute zero would require an infinite number of steps or an infinite amount of time.

The statement in terms of adiabatic accessibility:

It is impossible to start from a state of positive temperature, and adiabatically reach a state with zero temperature.

The Einstein statement:

The entropy of any substance approaches a finite value as the temperature approaches absolute zero.

That is, lim(T→0) S(T, X) = S0, where S is the entropy, the zero-point entropy S0 is finite-valued, T is the temperature, and X represents other relevant state variables.

This implies that the heat capacity of a substance must (uniformly) vanish at absolute zero, as otherwise the entropy would diverge.

There is also a formulation as the impossibility of "perpetual motion machines of the third kind". [3]

History

The third law was developed by chemist Walther Nernst during the years 1906 to 1912 and is therefore often referred to as the Nernst heat theorem, or sometimes the Nernst-Simon heat theorem [6] to include the contribution of Nernst's doctoral student Francis Simon. The third law of thermodynamics states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps." [7]

An alternative version of the third law of thermodynamics was enunciated by Gilbert N. Lewis and Merle Randall in 1923:

If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

This version states not only that ΔS will reach zero at 0 K, but that S itself will also reach zero, as long as the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome. [8]

With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:

S = kB ln Ω

where S is entropy, kB is the Boltzmann constant, and Ω is the number of microstates consistent with the macroscopic configuration. The counting of states is from the reference state of absolute zero, which corresponds to the entropy of S0 = 0.

Explanation

In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the location and orientation of each part of the crystal. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, and the crystal becomes the same everywhere.

Figure: (a) Single possible configuration for a system at absolute zero, i.e., only one microstate is accessible. Thus S = k ln W = 0. (b) At temperatures greater than absolute zero, multiple microstates are accessible due to atomic vibration (exaggerated in the figure). Since the number of accessible microstates is greater than 1, S = k ln W > 0.

The third law provides an absolute reference point for the determination of entropy at any other temperature. The entropy of a closed system, determined relative to this zero point, is then the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural logarithm of the number of ground states times the Boltzmann constant kB = 1.38×10⁻²³ J K⁻¹.

The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because ln(1) = 0. If the system is composed of one billion atoms that are all alike and lie within the matrix of a perfect crystal, the number of combinations of one billion identical things taken one billion at a time is Ω = 1. Hence:

S − S0 = kB ln Ω = kB ln 1 = 0

The difference is zero; hence the initial entropy S0 can be any selected value so long as all other such calculations include that as the initial entropy. As a result, the initial entropy value S0 = 0 is selected for convenience.

Example: Entropy change of a crystal lattice heated by an incoming photon

Consider a system consisting of a crystal lattice with volume V containing N identical atoms at T = 0 K, and an incoming photon of wavelength λ and energy ε.

Initially, there is only one accessible microstate:

Si = kB ln Ωi = kB ln 1 = 0

Let us assume the crystal lattice absorbs the incoming photon. There is a unique atom in the lattice that interacts with and absorbs this photon. So after absorption, there are N possible microstates accessible by the system, each corresponding to one excited atom while the other atoms remain in the ground state.

The entropy, energy, and temperature of the closed system rise and can be calculated. The entropy change is

ΔS = Sf − Si = kB ln Ωf − kB ln Ωi = kB ln N

From the second law of thermodynamics:

ΔS = δQ/T

Hence

T = δQ/(kB ln N)

We assume N = 3×10²² and λ = 1 cm. Calculating the entropy change:

ΔS = kB ln N = (1.38×10⁻²³ J/K) × ln(3×10²²) ≈ 7×10⁻²² J/K

The energy change of the system as a result of absorbing the single photon whose energy is ε:

δQ = ε = hc/λ = (6.63×10⁻³⁴ J·s × 3×10⁸ m/s) / (10⁻² m) ≈ 2×10⁻²³ J

The temperature of the closed system rises by

ΔT ≈ δQ/ΔS = (2×10⁻²³ J) / (7×10⁻²² J/K) ≈ 0.03 K

This can be interpreted as the average temperature of the system over the range 0 < S < 7×10⁻²² J/K. [9] A single atom is assumed to absorb the photon, but the temperature and entropy change characterize the entire system.
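
The arithmetic of this example is short enough to verify directly. Here is a minimal Python sketch using the same assumed values (N = 3×10²² atoms, λ = 1 cm):

```python
# Reproduce the worked example: entropy change, photon energy, and the
# resulting temperature scale for a lattice absorbing one photon.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J·s
c = 2.99792458e8     # speed of light, m/s

N = 3e22             # number of lattice atoms (value assumed in the text)
wavelength = 1e-2    # photon wavelength, m (value assumed in the text)

delta_S = k_B * math.log(N)    # one microstate before, N after absorption
delta_Q = h * c / wavelength   # energy delivered by the photon
T = delta_Q / delta_S          # average temperature, from ΔS = δQ/T

print(f"ΔS ≈ {delta_S:.1e} J/K, δQ ≈ {delta_Q:.1e} J, T ≈ {T:.3f} K")
```

Running it reproduces the figures quoted above: ΔS ≈ 7×10⁻²² J/K, δQ ≈ 2×10⁻²³ J, and T ≈ 0.03 K.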

Systems with non-zero entropy at absolute zero

An example of a system that does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least kB ln(2) (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.

Glasses and solid solutions retain significant entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium.[ citation needed ] Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".

For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. However, ferromagnetic materials do not, in fact, have zero entropy at zero temperature, because the spins of the unpaired electrons are all aligned and this gives a ground-state spin degeneracy. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).[ citation needed ]
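
For a sense of scale, the residual entropies mentioned here are easy to evaluate per mole: R ln 2 for a fully frozen-in two-fold degeneracy per atom, and Pauling's classic estimate R ln(3/2) for the proton disorder of ice Ih. A small sketch (the Pauling estimate is a standard result quoted for illustration, not derived in this article):

```python
# Molar residual entropies for two textbook cases of frozen-in disorder.
import math

R = 8.314462618  # molar gas constant, J/(mol·K)

# One two-fold degenerate degree of freedom per atom, frozen at 0 K:
spin_disorder = R * math.log(2)       # ≈ 5.76 J/(mol·K)

# Pauling's estimate for proton disorder in ice Ih:
ice_Ih = R * math.log(3 / 2)          # ≈ 3.37 J/(mol·K)

print(f"frozen doublet: {spin_disorder:.2f} J/(mol·K)")
print(f"ice Ih (Pauling): {ice_Ih:.2f} J/(mol·K)")
```

Note the contrast with the net-spin case above: a single two-fold degenerate ground state of the whole system contributes only kB ln 2 in total, which is macroscopically negligible, whereas frozen-in disorder of every atom is extensive.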

Consequences

Fig. 1: Left side: Absolute zero can be reached in a finite number of steps if S(0, X1) ≠ S(0, X2). Right: An infinite number of steps is needed, since S(0, X1) = S(0, X2).

Absolute zero

The third law is equivalent to the statement that

It is impossible by any procedure, no matter how idealized, to reduce the temperature of any closed system to zero temperature in a finite number of finite operations. [10]

The reason that T = 0 cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter X from X2 to X1. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way. [11] If there were an entropy difference at absolute zero, T = 0 could be reached in a finite number of steps. However, at T = 0 there is no entropy difference, so an infinite number of steps would be needed. The process is illustrated in Fig. 1.

Example: magnetic refrigeration

Figure: Gadolinium alloy heats up inside the magnetic field and loses thermal energy to the environment, so it exits the field and becomes cooler than when it entered.

To be concrete, we imagine that we are refrigerating magnetic material. Suppose we have a large bulk of paramagnetic salt and an adjustable external magnetic field in the vertical direction.

Let the parameter X represent the external magnetic field. At the same temperature, if the external magnetic field is strong, then the internal atoms in the salt strongly align with the field, so the disorder (entropy) decreases. Therefore, in Fig. 1, the curve for X1 is the curve for lower magnetic field, and the curve for X2 is the curve for higher magnetic field.

The refrigeration process repeats the following two steps:

  • Isothermal process. Here, we have a chunk of salt in magnetic field X1 at temperature T. We divide the chunk into two parts: a large part playing the role of "environment", and a small part playing the role of "system". We slowly increase the magnetic field on the system to X2, but keep the magnetic field constant on the environment. The atoms in the system lose directional degrees of freedom (DOF), and the energy in the directional DOF is squeezed out into the vibrational DOF. This makes the system slightly hotter, and it then loses thermal energy to the environment, so as to remain at the same temperature T.
  • (The environment is now discarded.)
  • Isentropic cooling. Here, the system is wrapped in adiathermal covering, and the external magnetic field is slowly lowered to X1. This frees up the directional DOF, absorbing some energy from the vibrational DOF. The effect is that the system has the same entropy but reaches a lower temperature T′.

With each two-step cycle, the mass of the system decreases, as we discard more and more salt as the "environment". However, if the equation of state for this salt is as shown in Fig. 1 (left), then we could start with a large but finite amount of salt and end up with a small piece of salt that has T = 0.
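
The unattainability argument can be made concrete with a toy model. The sketch below assumes an ideal (non-interacting) spin-1/2 paramagnet, whose entropy depends on B and T only through the ratio B/T. Both entropy curves then meet at S = 0 at T = 0, as in Fig. 1 (right), so each two-step cycle merely multiplies the temperature by the constant ratio of low to high field: absolute zero is approached geometrically but never reached in finitely many steps. The field and temperature values are arbitrary illustrations:

```python
# Toy model of the two-step refrigeration cycle above.  For an ideal
# spin-1/2 paramagnet the entropy per spin is
#     S/(N k_B) = ln(2 cosh x) - x tanh x,   with x = mu*B/(k_B*T),
# a function of B/T only, and S -> 0 as T -> 0 at any fixed field.
import math

def entropy_per_spin(x):
    """S / (N k_B) for an ideal spin-1/2 paramagnet, x = mu*B/(k_B*T)."""
    return math.log(2.0 * math.cosh(x)) - x * math.tanh(x)

mu_over_kB = 0.672   # Bohr magneton / Boltzmann constant, K/T
B1, B2 = 0.1, 1.0    # low and high magnetic field, tesla (arbitrary)
T = 1.0              # starting temperature, kelvin (arbitrary)

for cycle in range(1, 6):
    S_high = entropy_per_spin(mu_over_kB * B2 / T)  # after isothermal magnetization at T
    T *= B1 / B2                                    # adiabatic demagnetization B2 -> B1
    S_low = entropy_per_spin(mu_over_kB * B1 / T)   # entropy unchanged along the adiabat
    assert abs(S_high - S_low) < 1e-12
    print(f"after cycle {cycle}: T = {T:.0e} K")
```

Each pass prints a temperature ten times smaller than the last; no finite number of passes reaches T = 0, in line with Nernst's unattainability principle.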

Specific heat

A non-quantitative description of his third law that Nernst gave at the very beginning was simply that the specific heat of a material can always be made zero by cooling it down far enough. [12] A modern, quantitative analysis follows.

Suppose that the heat capacity of a sample in the low temperature region has the form of a power law C(T, X) = C0 T^α asymptotically as T → 0, and we wish to find which values of α are compatible with the third law. We have

ΔS = S(T, X) − S(T0, X) = ∫[T0→T] C(T′, X)/T′ dT′ = (C0/α)(T^α − T0^α)

By the discussion of the third law above, this integral must be bounded as T0 → 0, which is only possible if α > 0. So the heat capacity must go to zero at absolute zero,

lim(T→0) C(T, X) = 0,

if it has the form of a power law. The same argument shows that it cannot be bounded below by a positive constant, even if we drop the power-law assumption.[ citation needed ]
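
The boundedness requirement is easy to probe numerically. In this sketch, C0 and the upper temperature T are normalized to 1; the entropy integral stays finite as T0 → 0 for α > 0, diverges logarithmically for α = 0 (the classical ideal gas case discussed next), and diverges as a power for α < 0:

```python
# Entropy change on heating from T0 to T with heat capacity C = C0 * T**alpha,
# evaluated analytically to show which exponents keep it bounded as T0 -> 0.
import math

def delta_S(alpha, T0, T=1.0, C0=1.0):
    """Integral of C(T')/T' dT' from T0 to T for C(T') = C0 * T'**alpha."""
    if alpha == 0:
        return C0 * math.log(T / T0)                 # logarithmic divergence
    return (C0 / alpha) * (T**alpha - T0**alpha)     # finite iff alpha > 0

for T0 in (1e-2, 1e-4, 1e-8):
    print(f"T0 = {T0:.0e}:  "
          f"alpha=1.5 -> {delta_S(1.5, T0):.4f}   "
          f"alpha=0 -> {delta_S(0.0, T0):.1f}   "
          f"alpha=-0.5 -> {delta_S(-0.5, T0):.1e}")
```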

On the other hand, the molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by CV = (3/2)R, with R the molar ideal gas constant. But clearly a constant heat capacity does not satisfy the requirement that the heat capacity vanish at absolute zero. That is, a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics. We can verify this more fundamentally by substituting CV into the entropy integral above, which yields

S(T) − S(T0) = (3/2) R ln(T/T0)

In the limit T0 → 0 this expression diverges, again contradicting the third law of thermodynamics.

The conflict is resolved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi–Dirac statistics and Bose particles follow Bose–Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases

CV = (π²/2) R (T/TF)

with the Fermi temperature TF given by

TF = (ħ²/(2 m kB)) (3π² NA/Vm)^(2/3)

Here NA is the Avogadro constant, Vm the molar volume, M the molar mass, and m = M/NA the mass of one atom.

For Bose gases

CV = 1.93 R (T/TB)^(3/2)

with TB given by

TB = (2π ħ²/(m kB)) (NA/(2.612 Vm))^(2/3)

where 2.612 ≈ ζ(3/2).

The specific heats given by these two expressions both satisfy the requirement derived above: indeed, they are power laws with α = 1 and α = 3/2 respectively.
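
Plugging rough liquid-helium parameters into these expressions gives degeneracy temperatures of a few kelvin, which is why quantum statistics takes over long before the constant classical heat capacity could conflict with the third law. In the sketch below, the molar volumes (about 36.8 cm³/mol for liquid 3He and 27.6 cm³/mol for liquid 4He) are illustrative round figures:

```python
# Degeneracy temperatures of the helium isotopes from the ideal-gas formulas.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J·s
k_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro constant, 1/mol
ZETA_3_2 = 2.612         # Riemann zeta(3/2)

def fermi_T(M, Vm):
    """T_F = (hbar^2 / (2 m k_B)) * (3 pi^2 n)^(2/3), m = M/N_A, n = N_A/Vm."""
    m, n = M / N_A, N_A / Vm
    return hbar**2 / (2 * m * k_B) * (3 * math.pi**2 * n) ** (2 / 3)

def bose_T(M, Vm):
    """T_B = (2 pi hbar^2 / (m k_B)) * (n / zeta(3/2))^(2/3)."""
    m, n = M / N_A, N_A / Vm
    return 2 * math.pi * hbar**2 / (m * k_B) * (n / ZETA_3_2) ** (2 / 3)

print(f"T_F(3He) ≈ {fermi_T(3.016e-3, 36.8e-6):.1f} K")   # ≈ 5 K
print(f"T_B(4He) ≈ {bose_T(4.0026e-3, 27.6e-6):.1f} K")   # ≈ 3 K
```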

Even within a purely classical setting, the density of a classical ideal gas at fixed particle number and pressure becomes arbitrarily high as T goes to zero, so the interparticle spacing goes to zero. The assumption of non-interacting particles presumably breaks down when they are sufficiently close together, so the value of CV gets modified away from its ideal constant value.[ citation needed ]

Vapor pressure

The only liquids near absolute zero are 3He and 4He. Their heat of evaporation has a limiting value given by

L = L0 + Cp T

with L0 and Cp constant. If we consider a container partly filled with liquid and partly with gas, the entropy of the liquid–gas mixture is

S(T) = Sl(T) + x (L0/T + Cp)

where Sl(T) is the entropy of the liquid and x is the gas fraction. Clearly the entropy change during the liquid–gas transition (x from 0 to 1) diverges in the limit of T → 0. This violates the Nernst statement that ΔS → 0 as T → 0. Nature solves this paradox as follows: at temperatures below about 100 mK, the vapor pressure is so low that the gas density is lower than the best vacuum in the universe. In other words, below 100 mK there is simply no gas above the liquid. [13] :91
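
A rough Clausius–Clapeyron estimate shows how fast the vapor disappears. The sketch below models the vapor pressure as p ≈ p0 exp(−L0/RT); the values L0 ≈ 60 J/mol and p0 ≈ 10⁵ Pa are crude illustrative figures for 4He, not fitted data:

```python
# Vapor pressure and vapor density estimate as T -> 0.
import math

R = 8.314462618    # molar gas constant, J/(mol·K)
k_B = 1.380649e-23 # Boltzmann constant, J/K
L0 = 60.0          # limiting heat of evaporation, J/mol (rough figure)
p0 = 1e5           # exponential prefactor, Pa (rough figure)

for T in (1.0, 0.3, 0.1):
    p = p0 * math.exp(-L0 / (R * T))   # Clausius-Clapeyron-type estimate
    n = p / (k_B * T)                  # ideal-gas number density, atoms/m^3
    print(f"T = {T:>3} K:  p ≈ {p:.1e} Pa,  n ≈ {n:.1e} atoms/m^3")
```

At 0.1 K this estimate gives far fewer than one atom per cubic metre of vapor, consistent with the statement that no gas remains above the liquid.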

Miscibility

If liquid helium with mixed 3He and 4He were cooled to absolute zero, the liquid would have to have zero entropy. This means either that the atoms are perfectly ordered as a mixed liquid, which is impossible for a liquid, or that they fully separate out into two layers of pure liquid. This is precisely what happens.

For example, if a solution with three 3He atoms for every two 4He atoms were cooled, the separation would start at 0.9 K, purifying more and more, until at absolute zero the upper layer becomes purely 3He and the lower layer becomes purely 4He. [13] :129

Surface tension

Let γ be the surface tension of the liquid; then the entropy per unit area is −dγ/dT. So if a liquid can exist down to absolute zero, then, since its entropy is constant no matter its shape at absolute zero, its entropy per unit area must converge to zero. That is, its surface tension would become constant at low temperatures. [13] :87 In particular, the surface tension of 3He is well-approximated by γ ≈ γ0 (1 − (T/T0)²) for some parameters γ0 and T0. [14]

Latent heat of melting

The melting curves of 3He and 4He both extend down to absolute zero at finite pressure. At the melting pressure, liquid and solid are in equilibrium. The third law demands that the entropies of the solid and liquid are equal at T = 0. As a result, the latent heat of melting is zero, and the slope of the melting curve extrapolates to zero as a result of the Clausius–Clapeyron equation. [13] :140
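
The Clausius–Clapeyron step of this argument is simple enough to sketch: if the third law forces the melting entropy to vanish as T → 0 (modeled below as ΔS = aT) while the volume change on melting stays finite, the slope of the melting curve vanishes with it. The coefficients are arbitrary model values, not helium data:

```python
# dp/dT = ΔS/ΔV along the melting curve; with ΔS -> 0 the slope vanishes.
a = 1.0     # model coefficient in ΔS(T) = a*T, J/(mol·K^2) (arbitrary)
dV = 1e-6   # volume change on melting, m^3/mol (assumed constant)

for T in (1.0, 0.1, 0.01):
    slope = a * T / dV     # Clausius-Clapeyron: dp/dT = ΔS / ΔV
    print(f"T = {T:>4} K: dp/dT ≈ {slope:.1e} Pa/K")
```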

Thermal expansion coefficient

The thermal expansion coefficient is defined as

αV = (1/Vm) (∂Vm/∂T)p

With the Maxwell relation

(∂Vm/∂T)p = −(∂Sm/∂p)T

and the Nernst statement lim(T→0) ΔS = 0 applied with X = p, it is shown that

lim(T→0) αV = 0

So the thermal expansion coefficient of all materials must go to zero at zero kelvin.


References

  1. J. Wilks, The Third Law of Thermodynamics, Oxford University Press (1961).[ page needed ]
  2. Kittel and Kroemer, Thermal Physics (2nd ed.), page 49.
  3. Klimenko, A. Y. (29 June 2012). "Teaching the third law of thermodynamics". The Open Thermodynamics Journal. 6 (1): 1–14. arXiv:1208.4189. doi:10.2174/1874396X01206010001.
  4. Masanes, Lluís; Oppenheim, Jonathan (14 March 2017). "A general derivation and quantification of the third law of thermodynamics". Nature Communications. 8 (1): 14538. arXiv:1412.3828. Bibcode:2017NatCo...814538M. doi:10.1038/ncomms14538. ISSN 2041-1723. PMC 5355879. PMID 28290452.
  5. Wilks, J. (1971). The Third Law of Thermodynamics, Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry. An Advanced Treatise, Academic Press, New York, page 477.
  6. Wheeler, John C. (1 May 1991). "Nonequivalence of the Nernst-Simon and unattainability statements of the third law of thermodynamics". Physical Review A. 43 (10): 5289–5295. Bibcode:1991PhRvA..43.5289W. doi:10.1103/PhysRevA.43.5289. PMID 9904841.
  7. Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics, New York, ISBN 0-88318-797-3, page 342.
  8. Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy. 10 (3): 274–84. Bibcode:2008Entrp..10..274K. doi:10.3390/e10030274.
  9. Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw Hill. p. 438. ISBN 978-0-07-052046-2.
  10. Guggenheim, E.A. (1967). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth revised edition, North-Holland Publishing Company, Amsterdam, page 157.
  11. Pobell, Frank (2007). Matter and Methods at Low Temperatures. Berlin: Springer-Verlag. ISBN 978-3-662-08580-6.[ page needed ]
  12. Stone, A. Douglas (2013). Einstein and the Quantum. Princeton University Press.
  13. Pippard, Alfred B. (1981). Elements of Classical Thermodynamics: For Advanced Students of Physics (repr. ed.). Cambridge: Cambridge University Press. ISBN 978-0-521-09101-5.
  14. Suzuki, M.; Okuda, Y.; Ikushima, A. J.; Iino, M. (15 February 1988). "Surface Tension of Liquid 3He from 0.4 K down to 15 mK". Europhysics Letters (EPL). 5 (4): 333–337. Bibcode:1988EL......5..333S. doi:10.1209/0295-5075/5/4/009. ISSN 0295-5075.
