Thermodynamic beta

[Figure: ColdnessScale.svg] SI temperature/coldness conversion scale: temperatures in the Kelvin scale are shown in blue (Celsius scale in green, Fahrenheit scale in red), coldness values in gigabytes per nanojoule are shown in black. Infinite temperature (coldness zero) is shown at the top of the diagram; positive values of coldness/temperature are on the right-hand side, negative values on the left-hand side.

In statistical thermodynamics, thermodynamic beta, also known as coldness, is the reciprocal of the thermodynamic temperature of a system:

β = 1/(kB T)

(where T is the temperature and kB is the Boltzmann constant). [1]

It was originally introduced in 1971 (as Kältefunktion, "coldness function") by Ingo Müller, one of the proponents of the rational thermodynamics school of thought, [2][3] based on earlier proposals for a "reciprocal temperature" function. [4][5]

Thermodynamic beta has units reciprocal to those of energy (in SI units, reciprocal joules, J−1). In non-thermal units, it can also be measured in bytes per joule, or more conveniently, gigabytes per nanojoule; [6] 1 K−1 is equivalent to about 13,062 gigabytes per nanojoule. At room temperature (T = 300 K), β ≈ 44 GB/nJ ≈ 39 eV−1 ≈ 2.4×1020 J−1. The conversion factor is 1 GB/nJ = 8 ln(2) × 1018 J−1 ≈ 5.5×1018 J−1.
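These conversions can be checked with a short Python sketch (the 300 K room-temperature example and the GB/nJ unit follow the text above; the constant values are the exact SI definitions):

```python
import math

K_B = 1.380649e-23                   # Boltzmann constant in J/K (exact SI value)
EV = 1.602176634e-19                 # 1 eV in joules (exact SI value)
GB_PER_NJ = 8e9 * math.log(2) * 1e9  # 1 GB/nJ in J^-1: 8e9 bits * ln(2) nats/bit per 1e-9 J

def beta_from_T(T):
    """Thermodynamic beta in J^-1 for a temperature T in kelvin."""
    return 1.0 / (K_B * T)

beta = beta_from_T(300.0)
print(beta)                          # ~2.4e20 J^-1
print(beta / GB_PER_NJ)              # ~44 GB/nJ
print(beta * EV)                     # ~39 eV^-1
print(beta_from_T(1.0) / GB_PER_NJ)  # ~13,062 GB/nJ, i.e. the 1 K^-1 equivalence
```

Note that expressing β in GB/nJ amounts to measuring entropy in bits rather than in units of kB, which is why the factor ln(2) appears in the conversion.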

Description

Thermodynamic beta is essentially the connection between the information-theoretic and statistical-mechanical interpretations of a physical system through its entropy and the thermodynamics associated with its energy. It expresses the response of entropy to an increase in energy: if a system is challenged with a small amount of energy, β describes how much the system will randomize in response.

Via the statistical definition of temperature as a function of entropy, the coldness function can be calculated in the microcanonical ensemble from the formula

β = (1/kB) (∂S/∂E)V,N

(i.e., the partial derivative of the entropy S with respect to the energy E at constant volume V and particle number N).
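As a minimal numerical illustration of this formula, consider a hypothetical system of N two-state spins with n excited spins, energy E = nε, and Ω(n) = C(N, n) microstates (this model and the parameter values are chosen for illustration only). Working in units where kB = ε = 1, a finite-difference derivative of S = ln Ω reproduces the analytic coldness β = ln((N − n)/n):

```python
from math import lgamma, log

def ln_omega(N, n):
    """ln of the multiplicity C(N, n): N two-state spins with n excited."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

# Units with k_B = 1 and level spacing epsilon = 1, so S = ln(Omega) and E = n.
N, n = 1000, 300
beta_numeric = (ln_omega(N, n + 1) - ln_omega(N, n - 1)) / 2.0  # central difference dS/dE
beta_exact = log((N - n) / n)                                    # analytic derivative
print(beta_numeric, beta_exact)                                  # both ~0.85
```

The `lgamma` function is used so the binomial coefficient never overflows for large N.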

Advantages

Though completely equivalent in conceptual content to temperature, β is generally considered a more fundamental quantity than temperature owing to the phenomenon of negative temperature, in which β is continuous as it crosses zero whereas T has a singularity. [7]

In addition, β has the advantage of being easier to understand causally: if a small amount of heat is added to a system, β is the increase in entropy divided by the increase in heat. Temperature is difficult to interpret in the same sense, as it is not possible to "add entropy" to a system except indirectly, by modifying other quantities such as temperature, volume, or number of particles.

Statistical interpretation

From the statistical point of view, β is a numerical quantity relating two macroscopic systems in equilibrium. The exact formulation is as follows. Consider two systems, 1 and 2, in thermal contact, with respective energies E1 and E2. We assume E1 + E2 = some constant E. The number of microstates of each system will be denoted by Ω1 and Ω2. Under our assumptions Ωi depends only on Ei. We also assume that any microstate of system 1 consistent with E1 can coexist with any microstate of system 2 consistent with E2. Thus, the number of microstates for the combined system is

Ω = Ω1(E1) Ω2(E2) = Ω1(E1) Ω2(E − E1).

We will derive β from the fundamental assumption of statistical mechanics:

When the combined system reaches equilibrium, the number Ω is maximized.

(In other words, the system naturally seeks the maximum number of microstates.) Therefore, at equilibrium,

d/dE1 [Ω1(E1) Ω2(E2)] = 0.

But E1 + E2 = E implies

dE2/dE1 = −1.

So

Ω2(E2) dΩ1/dE1 − Ω1(E1) dΩ2/dE2 = 0,

i.e.

(1/Ω1) dΩ1/dE1 = (1/Ω2) dΩ2/dE2, that is, d ln Ω1/dE1 = d ln Ω2/dE2.

The above relation motivates a definition of β:

β := d ln Ω / dE.
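This maximization can be checked numerically. The sketch below uses two hypothetical Einstein solids (multiplicity Ω(N, q) = C(q + N − 1, q) for N oscillators sharing q energy quanta; the sizes and the total energy are arbitrary illustrative choices): the split q1 that maximizes Ω1·Ω2 is also the point where the slopes d ln Ω/dE of the two systems agree, i.e. where their β values match.

```python
from math import lgamma

def ln_omega(N, q):
    """ln multiplicity of an Einstein solid: C(q + N - 1, q)."""
    return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

N1, N2, Q = 300, 200, 100   # two solids sharing Q energy quanta (arbitrary example)

# Total multiplicity (in log form) for every possible split of the energy.
ln_total = [ln_omega(N1, q1) + ln_omega(N2, Q - q1) for q1 in range(Q + 1)]
q1_star = max(range(Q + 1), key=lambda q1: ln_total[q1])
print(q1_star)              # equilibrium split, close to Q*N1/(N1+N2) = 60

# At the maximum, the discrete slopes d ln(Omega)/dE of the two systems agree:
beta1 = ln_omega(N1, q1_star + 1) - ln_omega(N1, q1_star)
beta2 = ln_omega(N2, Q - q1_star + 1) - ln_omega(N2, Q - q1_star)
print(beta1, beta2)         # approximately equal
```

Energy flows toward the larger solid until both sides report the same coldness, which is exactly the equilibrium condition derived above.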

Connection of statistical view with thermodynamic view

When two systems are in equilibrium, they have the same thermodynamic temperature T. Thus intuitively, one would expect β (as defined via microstates) to be related to T in some way. This link is provided by Boltzmann's fundamental assumption written as

S = kB ln Ω,

where kB is the Boltzmann constant, S is the classical thermodynamic entropy, and Ω is the number of microstates. So

d ln Ω/dE = (1/kB) dS/dE.

Substituting into the definition of β from the statistical definition above gives

β = (1/kB) dS/dE.

Comparing with the thermodynamic formula

dS/dE = 1/T,

we have

β = 1/(kB T) = 1/τ,

where τ = kB T is called the fundamental temperature of the system, and has units of energy.

References

  1. Meixner, J. (1975). "Coldness and Temperature". Archive for Rational Mechanics and Analysis. 57 (3): 281–290.
  2. Müller, Ingo (1971). "Die Kältefunktion, eine universelle Funktion in der Thermodynamik wärmeleitender Flüssigkeiten" [The coldness function, a universal function in the thermodynamics of heat-conducting liquids]. Archive for Rational Mechanics and Analysis. 40: 1–36. doi:10.1007/BF00281528.
  3. Müller, Ingo (1971). "The Coldness, a Universal Function in Thermoelastic Bodies". Archive for Rational Mechanics and Analysis. 41: 319–332. doi:10.1007/BF00281870.
  4. Day, W. A.; Gurtin, Morton E. (1969). "On the symmetry of the conductivity tensor and other restrictions in the nonlinear theory of heat conduction". Archive for Rational Mechanics and Analysis. 33 (1): 26–32.
  5. Castle, J.; Emmenish, W.; Henkes, R.; Miller, R.; Rayne, J. (1965). Science by Degrees: Temperature from Zero to Zero. Westinghouse Search Book Series. New York: Walker and Company.
  6. Fraundorf, P. (2003). "Heat capacity in bits". American Journal of Physics. 71 (11): 1142–1151.
  7. Kittel, Charles; Kroemer, Herbert (1980). Thermal Physics (2nd ed.). W. H. Freeman and Company. ISBN 978-0471490302.