Thermal reservoir

A thermal reservoir, also thermal energy reservoir or thermal bath, is a thermodynamic system with a heat capacity so large that the temperature of the reservoir changes relatively little when a significant amount of heat is added or extracted.[1] As a conceptual simplification, it effectively functions as an infinite pool of thermal energy at a given, constant temperature. Since it can act as an inertial source and sink of heat, it is often also referred to as a heat reservoir or heat bath.

Lakes, oceans and rivers often serve as thermal reservoirs in geophysical processes, such as the weather. In atmospheric science, large air masses in the atmosphere often function as thermal reservoirs.
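
A rough back-of-the-envelope illustration (all numbers assumed for the example): a small lake holding $10^{6}\ \mathrm{m^{3}}$ of water has a heat capacity of roughly

$$C \approx (10^{9}\ \mathrm{kg})(4184\ \mathrm{J\,kg^{-1}\,K^{-1}}) \approx 4\times 10^{12}\ \mathrm{J\,K^{-1}},$$

so absorbing a full gigajoule of heat raises its temperature by only $\Delta T = Q/C \approx 2.4\times 10^{-4}\ \mathrm{K}$, which is why such bodies of water approximate ideal reservoirs.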

Since the temperature of a thermal reservoir $T$ does not change during the heat transfer, the change of entropy in the reservoir is

$$\Delta S_\text{reservoir} = \frac{Q}{T},$$

where $Q$ is the heat supplied to the reservoir.
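
As a minimal numerical sketch (not from the cited reference; the heat quantity and reservoir temperatures are arbitrary example values), the relation can be applied to heat flowing between two idealized reservoirs:

```python
# Entropy change of ideal thermal reservoirs: dS = Q / T, with T constant.
# Example: 1000 J flows from a 500 K reservoir into a 300 K reservoir.
Q = 1000.0      # heat transferred, J
T_hot = 500.0   # hot reservoir temperature, K
T_cold = 300.0  # cold reservoir temperature, K

dS_hot = -Q / T_hot    # hot reservoir loses heat: -2.00 J/K
dS_cold = Q / T_cold   # cold reservoir gains heat: +3.33 J/K

print(f"dS_hot   = {dS_hot:+.2f} J/K")
print(f"dS_cold  = {dS_cold:+.2f} J/K")
print(f"dS_total = {dS_hot + dS_cold:+.2f} J/K  (> 0, consistent with the second law)")
```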

The microcanonical partition sum $Z(E)$ of a heat bath of temperature $T$ has the property

$$Z(E + \Delta E) = Z(E)\, e^{\Delta E/(k_\mathrm{B} T)},$$

where $k_\mathrm{B}$ is the Boltzmann constant. It thus changes by the same factor when a given amount of energy is added. The exponential factor in this expression can be identified with the reciprocal of the Boltzmann factor.
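
One way to see this property (a standard argument, not quoted from the article) is to combine Boltzmann's entropy formula $S = k_\mathrm{B} \ln Z(E)$ with the constancy of the reservoir temperature, $\mathrm{d}S/\mathrm{d}E = 1/T$:

$$Z(E + \Delta E) = e^{S(E + \Delta E)/k_\mathrm{B}} = e^{\left[S(E) + \Delta E/T\right]/k_\mathrm{B}} = Z(E)\, e^{\Delta E/(k_\mathrm{B} T)}.$$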

For an engineering application, see geothermal heat pump.

See also

Related Research Articles

<span class="mw-page-title-main">Entropy</span> Property of a thermodynamic system

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

<span class="mw-page-title-main">Heat engine</span> System that converts heat or thermal energy to mechanical work

A heat engine is a system that converts heat to usable energy, particularly mechanical energy, which can then be used to do mechanical work. While originally conceived in the context of mechanical energy, the concept of the heat engine has been applied to various other kinds of energy, particularly electrical, since at least the late 19th century. The heat engine does this by bringing a working substance from a higher temperature state to a lower temperature state. A heat source generates thermal energy that brings the working substance to the higher temperature state. The working substance generates work in the working body of the engine while transferring heat to the colder sink until it reaches a lower temperature state. During this process some of the thermal energy is converted into work by exploiting the properties of the working substance. The working substance can be any system with a non-zero heat capacity, but it usually is a gas or liquid. During this process, some heat is normally lost to the surroundings and is not converted to work. Also, some energy is unusable because of friction and drag.
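
In symbols (standard textbook notation, not taken from this article), the thermal efficiency of such an engine over one cycle is

$$\eta = \frac{W}{Q_\mathrm{H}} = \frac{Q_\mathrm{H} - Q_\mathrm{C}}{Q_\mathrm{H}},$$

where $Q_\mathrm{H}$ is the heat drawn from the hot source, $Q_\mathrm{C}$ the heat rejected to the cold sink, and $W$ the net work delivered.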

<span class="mw-page-title-main">Calorimeter</span> Instrument for measuring heat

A calorimeter is an object used for calorimetry, or the process of measuring the heat of chemical reactions or physical changes as well as heat capacity. Differential scanning calorimeters, isothermal micro calorimeters, titration calorimeters and accelerated rate calorimeters are among the most common types. A simple calorimeter just consists of a thermometer attached to a metal container full of water suspended above a combustion chamber. It is one of the measurement devices used in the study of thermodynamics, chemistry, and biochemistry.

In physical chemistry, the Arrhenius equation is a formula for the temperature dependence of reaction rates. The equation was proposed by Svante Arrhenius in 1889, based on the work of Dutch chemist Jacobus Henricus van 't Hoff, who had noted in 1884 that the van 't Hoff equation for the temperature dependence of equilibrium constants suggests such a formula for the rates of both forward and reverse reactions. The equation has vast and important applications in determining the rates of chemical reactions and in calculating the activation energy. Arrhenius provided a physical justification and interpretation for the formula. Currently, it is best seen as an empirical relationship. It can be used to model the temperature variation of diffusion coefficients, population of crystal vacancies, creep rates, and many other thermally induced processes and reactions. The Eyring equation, developed in 1935, also expresses the relationship between rate and energy.
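
For reference, the equation in its usual molar form (standard notation) is

$$k = A\, e^{-E_\mathrm{a}/(R T)},$$

where $k$ is the rate constant, $A$ the pre-exponential factor, $E_\mathrm{a}$ the activation energy, $R$ the molar gas constant, and $T$ the absolute temperature; an equivalent form uses $k_\mathrm{B}$ with a per-molecule activation energy.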

<span class="mw-page-title-main">Boltzmann constant</span> Physical constant relating particle kinetic energy with temperature

The Boltzmann constant is the proportionality factor that relates the average relative thermal energy of particles in a gas with the thermodynamic temperature of the gas. It occurs in the definitions of the kelvin and the gas constant, and in Planck's law of black-body radiation and Boltzmann's entropy formula, and is used in calculating thermal noise in resistors. The Boltzmann constant has dimensions of energy divided by temperature, the same as entropy. It is named after the Austrian scientist Ludwig Boltzmann.
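
For reference, the 2019 revision of the SI fixed its value exactly:

$$k_\mathrm{B} = 1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}}, \qquad \text{so that } k_\mathrm{B}T \approx 25.9\ \mathrm{meV} \ \text{at } T = 300\ \mathrm{K}.$$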

Conduction is the process by which heat is transferred from the hotter end to the colder end of an object. The ability of the object to conduct heat is known as its thermal conductivity, and is denoted k.
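
Quantitatively, this is expressed by Fourier's law of heat conduction (standard form, added here for context):

$$\mathbf{q} = -k\,\nabla T,$$

where $\mathbf{q}$ is the heat flux density in $\mathrm{W\,m^{-2}}$ and $k$ has units of $\mathrm{W\,m^{-1}\,K^{-1}}$.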

<span class="mw-page-title-main">Second law of thermodynamics</span> Physical law for entropy and heat

The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
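
Expressed in terms of entropy (a standard formulation added here for context, not a quotation from the article), the law requires that for any process in an isolated system

$$\Delta S \geq 0,$$

with equality holding only in the idealized reversible limit.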

<span class="mw-page-title-main">Johnson–Nyquist noise</span> Electronic noise due to thermal vibration within a conductor

Johnson–Nyquist noise is the electronic noise generated by the thermal agitation of the charge carriers inside an electrical conductor at equilibrium, which happens regardless of any applied voltage. Thermal noise is present in all electrical circuits, and in sensitive electronic equipment can drown out weak signals, and can be the limiting factor on sensitivity of electrical measuring instruments. Thermal noise increases with temperature. Some sensitive electronic equipment such as radio telescope receivers are cooled to cryogenic temperatures to reduce thermal noise in their circuits. The generic, statistical physical derivation of this noise is called the fluctuation-dissipation theorem, where generalized impedance or generalized susceptibility is used to characterize the medium.
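
A minimal sketch of the standard noise-voltage formula $v_\text{rms} = \sqrt{4 k_\mathrm{B} T R\, \Delta f}$; the resistance, temperature, and bandwidth below are arbitrary example inputs:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_vrms(resistance_ohm: float, temperature_K: float, bandwidth_Hz: float) -> float:
    """RMS Johnson-Nyquist noise voltage across a resistor, in volts."""
    return math.sqrt(4.0 * kB * temperature_K * resistance_ohm * bandwidth_Hz)

# Example: a 1 kOhm resistor at 300 K measured over a 10 kHz bandwidth.
print(f"{thermal_noise_vrms(1e3, 300.0, 1e4) * 1e6:.2f} uV")  # about 0.41 uV
```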

<span class="mw-page-title-main">Heat capacity</span> Physical property describing the energy required to change a materials temperature

Heat capacity or thermal capacity is a physical property of matter, defined as the amount of heat to be supplied to an object to produce a unit change in its temperature. The SI unit of heat capacity is joule per kelvin (J/K).
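
As a worked example (material and numbers chosen only for illustration): a 0.5 kg aluminium block with a specific heat of roughly $900\ \mathrm{J\,kg^{-1}\,K^{-1}}$ has a heat capacity

$$C = m\,c \approx 450\ \mathrm{J\,K^{-1}},$$

so raising its temperature by 10 K requires $Q = C\,\Delta T \approx 4.5\ \mathrm{kJ}$.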

<span class="mw-page-title-main">Helmholtz free energy</span> Thermodynamic potential

In thermodynamics, the Helmholtz free energy is a thermodynamic potential that measures the useful work obtainable from a closed thermodynamic system at a constant temperature (isothermal). The change in the Helmholtz energy during a process is equal to the maximum amount of work that the system can perform in a thermodynamic process in which temperature is held constant. At constant temperature, the Helmholtz free energy is minimized at equilibrium.
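
In symbols (writing the Helmholtz free energy as $F$; some texts use $A$):

$$F = U - TS, \qquad \mathrm{d}F = -S\,\mathrm{d}T - p\,\mathrm{d}V, \qquad W_\text{max} = -\Delta F \ \ \text{at constant } T.$$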

In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
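
As a small illustrative sketch (the two-level system, its 25 meV gap, and the 300 K bath temperature are assumptions made for this example, not taken from the article above), the canonical partition function $Z = \sum_i e^{-E_i/(k_\mathrm{B}T)}$ of a system coupled to a heat bath can be evaluated directly:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def two_level_stats(gap_J: float, T: float):
    """Canonical partition function and derived quantities for a hypothetical
    two-level system with energies 0 and gap_J, in contact with a heat bath at T."""
    beta = 1.0 / (kB * T)
    Z = 1.0 + math.exp(-beta * gap_J)        # Z = sum_i exp(-E_i / (kB*T))
    p_excited = math.exp(-beta * gap_J) / Z  # occupation probability of the upper level
    mean_E = gap_J * p_excited               # average energy <E>
    F = -kB * T * math.log(Z)                # Helmholtz free energy, F = -kB*T*ln(Z)
    return Z, p_excited, mean_E, F

# Example: a 25 meV gap (roughly kB * 290 K) at a 300 K bath temperature.
gap = 25e-3 * 1.602176634e-19  # 25 meV converted to joules
Z, p, E, F = two_level_stats(gap, 300.0)
print(f"Z = {Z:.3f}, p_excited = {p:.3f}, <E> = {E:.3e} J, F = {F:.3e} J")
```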

<span class="mw-page-title-main">Equipartition theorem</span> Theorem in classical statistical mechanics

In classical statistical mechanics, the equipartition theorem relates the temperature of a system to its average energies. The equipartition theorem is also known as the law of equipartition, equipartition of energy, or simply equipartition. The original idea of equipartition was that, in thermal equilibrium, energy is shared equally among all of its various forms; for example, the average kinetic energy per degree of freedom in translational motion of a molecule should equal that in rotational motion.
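
For quadratic degrees of freedom the theorem gives (standard results):

$$\langle E \rangle = \tfrac{1}{2} k_\mathrm{B} T \ \text{per degree of freedom}, \qquad \langle E_\text{trans} \rangle = \tfrac{3}{2} k_\mathrm{B} T, \qquad C_{V,\mathrm{m}} = \tfrac{3}{2} R \approx 12.5\ \mathrm{J\,mol^{-1}\,K^{-1}}$$

for a monatomic ideal gas.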

In thermal engineering, the logarithmic mean temperature difference (LMTD) is used to determine the temperature driving force for heat transfer in flow systems, most notably in heat exchangers. The LMTD is a logarithmic average of the temperature difference between the hot and cold feeds at each end of the double pipe exchanger. For a given heat exchanger with constant area and heat transfer coefficient, the larger the LMTD, the more heat is transferred. The use of the LMTD arises straightforwardly from the analysis of a heat exchanger with constant flow rate and fluid thermal properties.
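
A minimal sketch of the formula $\Delta T_\mathrm{lm} = (\Delta T_1 - \Delta T_2)/\ln(\Delta T_1/\Delta T_2)$; the end temperature differences used below are arbitrary example values:

```python
import math

def lmtd(dT_1: float, dT_2: float) -> float:
    """Logarithmic mean of the temperature differences at the two ends of a heat exchanger."""
    if math.isclose(dT_1, dT_2):
        return dT_1  # the log-mean tends to the common value when both ends are equal
    return (dT_1 - dT_2) / math.log(dT_1 / dT_2)

# Example: a counterflow exchanger with end differences of 60 K and 20 K.
print(f"LMTD = {lmtd(60.0, 20.0):.1f} K")  # about 36.4 K; heat duty then follows from Q = U*A*LMTD
```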

<span class="mw-page-title-main">Thermodynamic cycle</span> Linked cyclic series of thermodynamic processes

A thermodynamic cycle consists of linked sequences of thermodynamic processes that involve transfer of heat and work into and out of the system, while varying pressure, temperature, and other state variables within the system, and that eventually returns the system to its initial state. In the process of passing through a cycle, the working fluid (system) may convert heat from a warm source into useful work, and dispose of the remaining heat to a cold sink, thereby acting as a heat engine. Conversely, the cycle may be reversed and use work to move heat from a cold source and transfer it to a warm sink, thereby acting as a heat pump. If at every point in the cycle the system is in thermodynamic equilibrium, the cycle is reversible. Whether carried out reversibly or irreversibly, the net entropy change of the system is zero, as entropy is a state function.

In classical thermodynamics, entropy is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy. The definition of entropy is central to the establishment of the second law of thermodynamics, which states that the entropy of isolated systems cannot decrease with time, as they always tend to arrive at a state of thermodynamic equilibrium, where the entropy is highest. Entropy is therefore also considered to be a measure of disorder in the system.

The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.

<span class="mw-page-title-main">Carnot cycle</span> Idealized thermodynamic cycle

A Carnot cycle is an ideal thermodynamic cycle proposed by French physicist Sadi Carnot in 1824 and expanded upon by others in the 1830s and 1840s. By Carnot's theorem, it provides an upper limit on the efficiency of any classical thermodynamic engine during the conversion of heat into work, or conversely, the efficiency of a refrigeration system in creating a temperature difference through the application of work to the system.
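
By Carnot's theorem this limit depends only on the two reservoir temperatures; for example (temperatures chosen arbitrarily):

$$\eta_\text{Carnot} = 1 - \frac{T_\mathrm{C}}{T_\mathrm{H}}, \qquad T_\mathrm{H} = 500\ \mathrm{K},\ T_\mathrm{C} = 300\ \mathrm{K} \ \Rightarrow\ \eta_\text{Carnot} = 0.4.$$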

<span class="mw-page-title-main">Heat</span> Type of energy transfer

In thermodynamics, heat is the thermal energy transferred between systems due to a temperature difference. In colloquial use, heat sometimes refers to thermal energy itself. Thermal energy is the kinetic energy of vibrating and colliding atoms in a substance.

<span class="mw-page-title-main">Temperature</span> Physical quantity of hot and cold

Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.

Phonon noise, also known as thermal fluctuation noise, arises from the random exchange of energy between a thermal mass and its surrounding environment. This energy is quantized in the form of phonons. Each phonon has an energy of order $k_\mathrm{B}T$, where $k_\mathrm{B}$ is the Boltzmann constant and $T$ is the temperature. The random exchange of energy leads to fluctuations in temperature. This occurs even when the thermal mass and the environment are in thermal equilibrium, i.e. at the same time-average temperature. If a device has a temperature-dependent electrical resistance, then these fluctuations in temperature lead to fluctuations in resistance. Examples of devices where phonon noise is important include bolometers and calorimeters. The superconducting transition edge sensor (TES), which can be operated either as a bolometer or a calorimeter, is an example of a device for which phonon noise can significantly contribute to the total noise.
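
The size of these fluctuations follows from the canonical ensemble (a standard statistical-mechanics result, stated here for context):

$$\langle (\Delta E)^2 \rangle = k_\mathrm{B} T^2 C,$$

where $C$ is the heat capacity of the thermal mass coupled to the bath.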

References

  1. Cengel, Yunus A.; Boles, Michael A. (2002). Thermodynamics: An Engineering Approach. Boston: McGraw-Hill. p. 247. ISBN 0-07-121688-X.