# Boltzmann constant

Boltzmann constant
Symbol: kB
Value in joules per kelvin: 1.380649×10−23 J⋅K−1 [1]

The Boltzmann constant (kB or k) is the proportionality factor that relates the average relative kinetic energy of particles in a gas with the thermodynamic temperature of the gas. [2] It occurs in the definitions of the kelvin and the gas constant, and in Planck's law of black-body radiation and Boltzmann's entropy formula, and is used in calculating thermal noise in resistors. The Boltzmann constant has dimensions of energy divided by temperature, the same as entropy. It is named after the Austrian scientist Ludwig Boltzmann.


As part of the 2019 redefinition of SI base units, the Boltzmann constant is one of the seven "defining constants" that have been given exact definitions. They are used in various combinations to define the seven SI base units. The Boltzmann constant is defined to be exactly 1.380649×10−23 J⋅K−1. [1]

## Roles of the Boltzmann constant

Macroscopically, the ideal gas law states that, for an ideal gas, the product of pressure p and volume V is proportional to the product of amount of substance n (in moles) and absolute temperature T:

${\displaystyle pV=nRT,}$

where R is the molar gas constant (8.31446261815324 J⋅K−1⋅mol−1). [3] Introducing the Boltzmann constant as the gas constant per molecule [4] k = R/NA transforms the ideal gas law into an alternative form:

${\displaystyle pV=NkT,}$

where N is the number of molecules of gas.
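As a quick numerical sketch (the pressure, volume, and temperature below are illustrative choices, not values from the text), this form of the law counts molecules directly:

```python
# Count the molecules in one litre of an ideal gas at atmospheric pressure
# and room temperature by rearranging pV = NkT to N = pV/(kT).
k = 1.380649e-23   # Boltzmann constant, J/K (exact by definition)
p = 101_325.0      # pressure, Pa (one standard atmosphere)
V = 1.0e-3         # volume, m^3 (one litre)
T = 300.0          # temperature, K

N = p * V / (k * T)
print(f"N ≈ {N:.3e} molecules")   # about 2.45e22
```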

### Role in the equipartition of energy

Given a thermodynamic system at an absolute temperature T, the average thermal energy carried by each microscopic degree of freedom in the system is 1/2kT (i.e., about 2.07×10−21 J, or 0.013 eV, at room temperature). This is generally true only for classical systems with a large number of particles, in which quantum effects are negligible.

In classical statistical mechanics, this average is predicted to hold exactly for homogeneous ideal gases. Monatomic ideal gases (the six noble gases) possess three degrees of freedom per atom, corresponding to the three spatial directions. According to the equipartition of energy, this means that there is a thermal energy of 3/2kT per atom. This agrees very well with experimental data. The thermal energy can be used to calculate the root-mean-square speed of the atoms, which turns out to be inversely proportional to the square root of the atomic mass. The root-mean-square speeds found at room temperature accurately reflect this, ranging from 1370 m/s for helium down to 240 m/s for xenon.
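The speeds quoted above can be reproduced from v_rms = √(3kT/m); a minimal sketch, taking room temperature as 300 K and approximate atomic masses:

```python
import math

# v_rms = sqrt(3kT/m), which follows from (1/2) m v^2 = (3/2) kT.
k = 1.380649e-23    # Boltzmann constant, J/K
u = 1.660539e-27    # atomic mass unit, kg (approximate)
T = 300.0           # room temperature, K

def v_rms(mass_in_u):
    """Root-mean-square speed (m/s) for an atom of the given mass."""
    return math.sqrt(3 * k * T / (mass_in_u * u))

print(f"helium: {v_rms(4.0026):.0f} m/s")   # near 1370 m/s
print(f"xenon:  {v_rms(131.293):.0f} m/s")  # near 240 m/s
```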

Kinetic theory gives the average pressure p for an ideal gas as

${\displaystyle p={\frac {1}{3}}{\frac {N}{V}}m{\overline {v^{2}}}.}$

Combination with the ideal gas law

${\displaystyle pV=NkT}$

shows that the average translational kinetic energy is

${\displaystyle {\tfrac {1}{2}}m{\overline {v^{2}}}={\tfrac {3}{2}}kT.}$

Since the translational velocity vector v has three degrees of freedom (one for each dimension), the average energy per degree of freedom is one third of this, i.e. 1/2kT.

The ideal gas equation is also obeyed closely by molecular gases; but the form for the heat capacity is more complicated, because the molecules possess additional internal degrees of freedom, as well as the three degrees of freedom for movement of the molecule as a whole. Diatomic gases, for example, possess a total of six degrees of simple freedom per molecule that are related to atomic motion (three translational, two rotational, and one vibrational). At lower temperatures, not all these degrees of freedom may fully participate in the gas heat capacity, due to quantum mechanical limits on the availability of excited states at the relevant thermal energy per molecule.
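As an illustration of how the count of active degrees of freedom shows up in the heat capacity: with f active quadratic degrees of freedom per molecule, equipartition gives a molar heat capacity CV = (f/2)R. The diatomic value below assumes vibration is frozen out, as the paragraph above notes is typical near room temperature; this is a sketch, not a precise model of any particular gas.

```python
# Molar heat capacity at constant volume from equipartition:
# C_V = (f/2) R, with f active quadratic degrees of freedom per molecule.
k = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol
R = k * N_A           # molar gas constant, J/(mol*K)

C_monatomic = (3 / 2) * R   # translation only (f = 3)
C_diatomic = (5 / 2) * R    # translation + rotation; vibration frozen out
print(f"monatomic: {C_monatomic:.2f} J/(mol*K)")  # about 12.47
print(f"diatomic:  {C_diatomic:.2f} J/(mol*K)")   # about 20.79
```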

### Role in Boltzmann factors

More generally, systems in equilibrium at temperature T have probability Pi of occupying a state i with energy Ei, weighted by the corresponding Boltzmann factor:

${\displaystyle P_{i}={\frac {\exp \left(-{\frac {E_{i}}{kT}}\right)}{Z}},}$

where Z is the partition function. Again, it is the energy-like quantity kT that takes central importance.

Consequences of this include (in addition to the results for ideal gases above) the Arrhenius equation in chemical kinetics.
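A small numerical sketch of the Boltzmann factor, using a hypothetical three-level system (the energy levels are arbitrary choices for illustration):

```python
import math

# P_i = exp(-E_i / kT) / Z for a hypothetical three-level system.
k = 1.380649e-23              # Boltzmann constant, J/K
T = 300.0                     # K
eV = 1.602176634e-19          # J per electronvolt
energies = [0.0, 0.025 * eV, 0.05 * eV]   # arbitrary example levels

weights = [math.exp(-E / (k * T)) for E in energies]
Z = sum(weights)              # partition function
probs = [w / Z for w in weights]
print([round(p, 3) for p in probs])   # ground state is most occupied
```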

### Role in the statistical definition of entropy

In statistical mechanics, the entropy S of an isolated system at thermodynamic equilibrium is defined as k times the natural logarithm of W, the number of distinct microscopic states available to the system given the macroscopic constraints (such as a fixed total energy E):

${\displaystyle S=k\,\ln W.}$

This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the central idea of statistical mechanics. Such is its importance that it is inscribed on Boltzmann's tombstone.

The constant of proportionality k serves to make the statistical mechanical entropy equal to the classical thermodynamic entropy of Clausius:

${\displaystyle \Delta S=\int {\frac {{\rm {d}}Q}{T}}.}$

One could choose instead a rescaled dimensionless entropy in microscopic terms such that

${\displaystyle {S'=\ln W},\quad \Delta S'=\int {\frac {\mathrm {d} Q}{kT}}.}$

In this more natural form, the rescaled entropy corresponds exactly to Shannon's subsequent information entropy.

The characteristic energy kT is thus the energy required to increase the rescaled entropy by one nat.
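As a toy illustration of S = k ln W (the system and spin count below are hypothetical): a collection of N independent two-state spins has W = 2^N equally likely microstates, so its Boltzmann entropy is k N ln 2:

```python
import math

# S = k ln W for N independent two-state spins: W = 2**N, so S = k N ln 2.
k = 1.380649e-23     # Boltzmann constant, J/K
N = 1.0e23           # hypothetical number of spins

S = k * N * math.log(2)       # ln(2**N) = N ln 2
S_rescaled = N * math.log(2)  # dimensionless S' = ln W, in nats
print(f"S ≈ {S:.3f} J/K")     # about 0.957 J/K
```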

### The thermal voltage

In semiconductors, the Shockley diode equation—the relationship between the flow of electric current and the electrostatic potential across a p–n junction—depends on a characteristic voltage called the thermal voltage, denoted by VT. The thermal voltage depends on absolute temperature T as

${\displaystyle V_{\mathrm {T} }={kT \over q},}$

where q is the magnitude of the electrical charge on the electron, with a value of 1.602176634×10−19 C. [5] Equivalently,

${\displaystyle {V_{\mathrm {T} } \over T}={k \over q}\approx 8.617333\times 10^{-5}\ \mathrm {V/K} .}$

At room temperature 300 K (27 °C; 80 °F), VT is approximately 25.85 mV, [6] [7] which can be derived by substituting the values as follows:

${\displaystyle V_{\mathrm {T} }={kT \over q}={\frac {1.38\times 10^{-23}\ \mathrm {J{\cdot }K^{-1}} \times 300\ \mathrm {K} }{1.6\times 10^{-19}\ \mathrm {C} }}\simeq 25.85\ \mathrm {mV} }$

At the standard state temperature of 298.15 K (25.00 °C; 77.00 °F), it is approximately 25.69 mV. The thermal voltage is also important in plasmas and electrolyte solutions (e.g. the Nernst equation); in both cases it provides a measure of how much the spatial distribution of electrons or ions is affected by a boundary held at a fixed voltage. [8] [9]
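Both thermal-voltage values quoted above follow directly from VT = kT/q with the exact SI values of k and q; a minimal check:

```python
# V_T = kT/q with the exact SI values of k and q.
k = 1.380649e-23        # Boltzmann constant, J/K
q = 1.602176634e-19     # elementary charge, C

def thermal_voltage(T):
    """Thermal voltage in volts at absolute temperature T (kelvin)."""
    return k * T / q

print(f"300 K:    {thermal_voltage(300.0) * 1e3:.2f} mV")   # 25.85 mV
print(f"298.15 K: {thermal_voltage(298.15) * 1e3:.2f} mV")  # 25.69 mV
```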

## History

The Boltzmann constant is named after its 19th century Austrian discoverer, Ludwig Boltzmann. Although Boltzmann first linked entropy and probability in 1877, the relation was never expressed with a specific constant until Max Planck first introduced k, and gave a more precise value for it (1.346×10−23 J/K, about 2.5% lower than today's figure), in his derivation of the law of black-body radiation in 1900–1901. [10] Before 1900, equations involving Boltzmann factors were not written using the energies per molecule and the Boltzmann constant, but rather using a form of the gas constant R, and macroscopic energies for macroscopic quantities of the substance. The iconic terse form of the equation S = k ln W on Boltzmann's tombstone is in fact due to Planck, not Boltzmann. Planck actually introduced it in the same work as his eponymous h. [11]

In 1920, Planck wrote in his Nobel Prize lecture: [12]

This constant is often referred to as Boltzmann's constant, although, to my knowledge, Boltzmann himself never introduced it – a peculiar state of affairs, which can be explained by the fact that Boltzmann, as appears from his occasional utterances, never gave thought to the possibility of carrying out an exact measurement of the constant.

This "peculiar state of affairs" is illustrated by reference to one of the great scientific debates of the time. There was considerable disagreement in the second half of the nineteenth century as to whether atoms and molecules were real or whether they were simply a heuristic tool for solving problems. There was no agreement whether chemical molecules, as measured by atomic weights, were the same as physical molecules, as measured by kinetic theory. Planck's 1920 lecture continued: [12]

Nothing can better illustrate the positive and hectic pace of progress which the art of experimenters has made over the past twenty years, than the fact that since that time, not only one, but a great number of methods have been discovered for measuring the mass of a molecule with practically the same accuracy as that attained for a planet.

In versions of SI prior to the 2019 redefinition of the SI base units, the Boltzmann constant was a measured quantity rather than a fixed value. Its exact definition also varied over the years due to redefinitions of the kelvin (see Kelvin § History) and other SI base units (see Joule § History).

In 2017, the most accurate measurements of the Boltzmann constant were obtained by acoustic gas thermometry, which determines the speed of sound of a monatomic gas in a triaxial ellipsoid chamber using microwave and acoustic resonances. [13] [14] This decade-long effort was undertaken with different techniques by several laboratories; [note a] it is one of the cornerstones of the 2019 redefinition of SI base units. Based on these measurements, CODATA recommended 1.380649×10−23 J/K as the final fixed value of the Boltzmann constant to be used for the International System of Units. [15]

## Value in different units

| Value | Units | Comments |
|---|---|---|
| 1.380649×10−23 | J/K | SI, by definition; J/K = m2⋅kg/(s2⋅K) in SI base units |
| 8.617333262×10−5 | eV/K | [note 1] |
| 2.083661912×1010 | Hz/K | (k/h) [note 1] |
| 1.380649×10−16 | erg/K | CGS system; 1 erg = 1×10−7 J |
| 3.297623483×10−24 | cal/K | 1 calorie = 4.1868 J [note 1] |
| 1.832013046×10−24 | cal/°R | [note 1] |
| 5.657302466×10−24 | ft⋅lb/°R | [note 1] |
| 0.695034800 | cm−1/K | (k/(hc)) [note 1] |
| 3.166811563×10−6 | Eh/K | (Eh = hartree) |
| 1.987204259×10−3 | kcal/(mol⋅K) | (k⋅NA) [note 1] |
| 8.314462618×10−3 | kJ/(mol⋅K) | (k⋅NA) [note 1] |
| −228.5991672 | dB(W/K/Hz) | 10 log10(k/(1 W/K/Hz)); used for thermal noise calculations [note 1] |
| 1.536179187×10−40 | kg/K | geometrized units, k/c2, where c is the speed of light [16] |

Since k is a proportionality factor between temperature and energy, its numerical value depends on the choice of units for energy and temperature. The small numerical value of the Boltzmann constant in SI units reflects the fact that a change in temperature of 1 K changes a particle's energy by only a small amount. A change of 1 °C is defined to be the same as a change of 1 K. The characteristic energy kT is a term encountered in many physical relationships.

The Boltzmann constant sets up a relationship between wavelength and temperature (dividing hc/k by a wavelength gives a temperature), with one micrometer being related to 14387.777 K, and also a relationship between voltage and temperature (kT in units of eV corresponds to a voltage), with one volt being related to 11604.518 K. The ratio of these two temperatures, 14387.777 K / 11604.518 K ≈ 1.239842, is the numerical value of hc in units of eV⋅μm.
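These temperature relationships can be checked numerically from the exact SI defining constants (the results agree with the figures above to five significant digits; tiny differences in the trailing digits come from older constant values):

```python
# Temperature scales set by k: hc/k per unit wavelength and (1 eV)/k.
h = 6.62607015e-34    # Planck constant, J*s (exact)
c = 299_792_458.0     # speed of light, m/s (exact)
k = 1.380649e-23      # Boltzmann constant, J/K (exact)
q = 1.602176634e-19   # J per eV (exact)

T_per_um = h * c / (k * 1.0e-6)   # K, for a 1 micrometre wavelength
T_per_eV = q / k                  # K, for 1 eV (i.e. one volt)
ratio = T_per_um / T_per_eV       # numerical value of hc in eV*um
print(f"{T_per_um:.1f} K, {T_per_eV:.1f} K, ratio ≈ {ratio:.6f}")
```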

### Natural units

The Boltzmann constant provides a mapping from a characteristic microscopic energy E to the macroscopic temperature scale T = E/k. In fundamental physics, this mapping is often simplified by adopting natural units in which k is set to unity. This convention means that temperature and energy quantities have the same dimensions. [17] [18] In particular, the SI unit kelvin becomes superfluous, being defined in terms of joules as 1 K = 1.380649×10−23 J. [19] With this convention, temperature is always given in units of energy, and the Boltzmann constant is not explicitly needed in formulas. [17]

This convention simplifies many physical relationships and formulas. For example, the equipartition formula for the energy associated with each classical degree of freedom (${\displaystyle {\tfrac {1}{2}}kT}$ above) becomes

${\displaystyle E_{\mathrm {dof} }={\tfrac {1}{2}}T.}$

As another example, the definition of thermodynamic entropy coincides with the form of information entropy:

${\displaystyle S=-\sum _{i}P_{i}\ln P_{i},}$

where Pi is the probability of each microstate.
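In these units the entropy is a pure number (measured in nats); a small sketch of the formula, using an arbitrary example distribution:

```python
import math

# Dimensionless (k = 1) Gibbs/Shannon entropy: S = -sum_i P_i ln P_i.
def entropy(probs):
    """Entropy in nats of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

print(f"{entropy([0.25] * 4):.4f}")       # uniform over 4 states: ln 4
print(f"{entropy([0.7, 0.2, 0.1]):.4f}")  # peaked distribution: lower entropy
```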

## Notes

a. Independent techniques exploited: acoustic gas thermometry, dielectric-constant gas thermometry, and Johnson noise thermometry. Laboratories involved, as cited by CODATA in 2017: LNE-Cnam (France), NPL (UK), INRIM (Italy), PTB (Germany), NIST (USA), NIM (China).
1. The value is exact but not expressible as a finite decimal; it is approximated to 9 decimal places only.


## References

1. Newell, David B.; Tiesinga, Eite (2019). The International System of Units (SI). NIST Special Publication 330. Gaithersburg, Maryland: National Institute of Standards and Technology. doi:10.6028/nist.sp.330-2019. S2CID   242934226.
2. Richard Feynman (1970). The Feynman Lectures on Physics Vol I. Addison Wesley Longman. ISBN   978-0-201-02115-8.
3. "Proceedings of the 106th meeting" (PDF). 16–20 October 2017.
4. Petrucci, Ralph H.; Harwood, William S.; Herring, F. Geoffrey (2002). GENERAL CHEMISTRY: Principles and Modern Applications (8th ed.). Prentice Hall. p. 785. ISBN   0-13-014329-4.
5. "2018 CODATA Value: elementary charge". The NIST Reference on Constants, Units, and Uncertainty. NIST. 20 May 2019. Retrieved 20 May 2019.
6. Rashid, Muhammad H. (2016). Microelectronic circuits: analysis and design (Third ed.). Cengage Learning. pp. 183–184. ISBN   9781305635166.
7. Cataldo, Enrico; Lieto, Alberto Di; Maccarrone, Francesco; Paffuti, Giampiero (18 August 2016). "Measurements and analysis of current-voltage characteristic of a pn diode for an undergraduate physics laboratory". arXiv: [physics.ed-ph].
8. Kirby, Brian J. (2009). Micro- and Nanoscale Fluid Mechanics: Transport in Microfluidic Devices (PDF). Cambridge University Press. ISBN   978-0-521-11903-0.
9. Tabeling, Patrick (2006). Oxford University Press. ISBN 978-0-19-856864-3.
10. Planck, Max (1901), "Ueber das Gesetz der Energieverteilung im Normalspectrum", Ann. Phys. , 309 (3): 553–63, Bibcode:1901AnP...309..553P, doi:. English translation: "On the Law of Distribution of Energy in the Normal Spectrum". Archived from the original on 17 December 2008.
11. Gearhart, Clayton A. (2002). "Planck, the Quantum, and the Historians". Physics in Perspective. 4 (2): 177. Bibcode:2002PhP.....4..170G. doi:10.1007/s00016-002-8363-7. ISSN   1422-6944. S2CID   26918826.
12. Pitre, L; Sparasci, F; Risegari, L; Guianvarc’h, C; Martin, C; Himbert, M E; Plimmer, M D; Allard, A; Marty, B; Giuliano Albo, P A; Gao, B; Moldover, M R; Mehl, J B (1 December 2017). "New measurement of the Boltzmann constant by acoustic thermometry of helium-4 gas" (PDF). Metrologia. 54 (6): 856–873. Bibcode:2017Metro..54..856P. doi:10.1088/1681-7575/aa7bf5. hdl:11696/57295. S2CID   53680647. Archived from the original (PDF) on 5 March 2019.
13. de Podesta, Michael; Mark, Darren F; Dymock, Ross C; Underwood, Robin; Bacquart, Thomas; Sutton, Gavin; Davidson, Stuart; Machin, Graham (1 October 2017). "Re-estimation of argon isotope ratios leading to a revised estimate of the Boltzmann constant" (PDF). Metrologia. 54 (5): 683–692. Bibcode:2017Metro..54..683D. doi:10.1088/1681-7575/aa7880. S2CID   125912713.
14. Newell, D. B.; Cabiati, F.; Fischer, J.; Fujii, K.; Karshenboim, S. G.; Margolis, H. S.; Mirandés, E. de; Mohr, P. J.; Nez, F. (2018). "The CODATA 2017 values of h, e, k, and N A for the revision of the SI". Metrologia. 55 (1): L13. Bibcode:2018Metro..55L..13N. doi:. ISSN   0026-1394.
15. Kalinin, M; Kononogov, S (2005), "Boltzmann's Constant, the Energy Meaning of Temperature, and Thermodynamic Irreversibility", Measurement Techniques, 48 (7): 632–36, doi:10.1007/s11018-005-0195-9, S2CID   118726162
16. Kittel, Charles; Kroemer, Herbert (1980). Thermal physics (2nd ed.). San Francisco: W.H. Freeman. p. 41. ISBN   0716710889. We prefer to use a more natural temperature scale [...] the fundamental temperature has the units of energy.
17. Mohr, Peter J; Shirley, Eric L; Phillips, William D; Trott, Michael (1 October 2022). "On the dimension of angles and their units". Metrologia. 59 (5): 053001. arXiv:. Bibcode:2022Metro..59e3001M. doi:.