Table of thermodynamic equations

Common thermodynamic equations and quantities, expressed in mathematical notation, are as follows:

Definitions

Many of the definitions below are also used in the thermodynamics of chemical reactions.

General basic quantities

Quantity (common name/s) | (Common) symbol/s | SI unit | Dimension
Number of molecules | N | 1 | 1
Amount of substance | n | mol | N
Temperature | T | K | Θ
Heat energy | Q, q | J | ML2T−2
Latent heat | QL | J | ML2T−2

General derived quantities

Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension
Thermodynamic beta, inverse temperature | β | β = 1/(kB T) | J−1 | M−1L−2T2
Thermodynamic temperature | τ | τ = kB T | J | ML2T−2
Entropy | S | S = −(∂A/∂T)V, S = −kB Σi Pi ln Pi | J⋅K−1 | ML2T−2Θ−1
Pressure | P | P = −(∂A/∂V)T = −(∂U/∂V)S | Pa | ML−1T−2
Internal energy | U | U = Σi Ei (sum of all energies of the system) | J | ML2T−2
Enthalpy | H | H = U + pV | J | ML2T−2
Partition function | Z | Z = Σi e−Ei/(kB T) | 1 | 1
Gibbs free energy | G | G = H − TS | J | ML2T−2
Chemical potential (of component i in a mixture) | μi | μi = (∂U/∂Ni)S,V,Nj≠i = (∂G/∂Ni)T,p,Nj≠i | J | ML2T−2

For μi = (∂U/∂Ni)S,V,Nj≠i, U is not proportional to Ni because μi depends on pressure. For μi = (∂G/∂Ni)T,p,Nj≠i, G is proportional to Ni (as long as the molar ratio composition of the system remains the same) because μi depends only on temperature, pressure and composition.

Helmholtz free energy | A, F | A = U − TS | J | ML2T−2
Landau potential, Landau free energy, grand potential | Ω, ΦG | Ω = U − TS − μN | J | ML2T−2
Massieu potential, Helmholtz free entropy | Φ | Φ = S − U/T = −A/T | J⋅K−1 | ML2T−2Θ−1
Planck potential, Gibbs free entropy | Ξ | Ξ = S − H/T = −G/T | J⋅K−1 | ML2T−2Θ−1
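The potential definitions above (H = U + pV, A = U − TS, G = H − TS) can be checked numerically. A minimal sketch with illustrative (hypothetical) state values in SI units:

```python
# Hypothetical state values for illustration (SI units)
U = 5000.0    # internal energy, J
p = 101325.0  # pressure, Pa
V = 0.024     # volume, m^3
T = 298.15    # temperature, K
S = 10.0      # entropy, J/K

H = U + p * V   # enthalpy
A = U - T * S   # Helmholtz free energy
G = H - T * S   # Gibbs free energy

# Consistency check: G = A + pV follows directly from the definitions
assert abs(G - (A + p * V)) < 1e-9
print(H, A, G)
```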

Thermal properties of matter

Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension
General heat/thermal capacity | C | C = ∂Q/∂T | J⋅K−1 | ML2T−2Θ−1
Heat capacity (isobaric) | Cp | Cp = (∂H/∂T)p | J⋅K−1 | ML2T−2Θ−1
Specific heat capacity (isobaric) | Cmp | Cmp = Cp/m | J⋅kg−1⋅K−1 | L2T−2Θ−1
Molar specific heat capacity (isobaric) | Cnp | Cnp = Cp/n | J⋅K−1⋅mol−1 | ML2T−2Θ−1N−1
Heat capacity (isochoric/volumetric) | CV | CV = (∂U/∂T)V | J⋅K−1 | ML2T−2Θ−1
Specific heat capacity (isochoric) | CmV | CmV = CV/m | J⋅kg−1⋅K−1 | L2T−2Θ−1
Molar specific heat capacity (isochoric) | CnV | CnV = CV/n | J⋅K−1⋅mol−1 | ML2T−2Θ−1N−1
Specific latent heat | L | L = Q/m | J⋅kg−1 | L2T−2
Ratio of isobaric to isochoric heat capacity, heat capacity ratio, adiabatic index, Laplace coefficient | γ | γ = Cp/CV | 1 | 1
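The heat capacity ratio γ = Cp/CV can be computed from equipartition for an ideal gas (a sketch; the degree-of-freedom counts are the standard values for monatomic and diatomic gases):

```python
R = 8.314  # molar gas constant, J/(mol·K)

def gamma(dof):
    """Heat capacity ratio of an ideal gas with `dof` quadratic
    degrees of freedom per molecule: Cv = (dof/2)R, Cp = Cv + R."""
    c_v = dof / 2 * R
    c_p = c_v + R        # Mayer's relation, per mole
    return c_p / c_v

print(gamma(3))  # monatomic: 5/3 ≈ 1.667
print(gamma(5))  # diatomic:  7/5 = 1.4
```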

Thermal transfer

Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension
Temperature gradient | No standard symbol | ∇T | K⋅m−1 | ΘL−1
Thermal conduction rate, thermal current, thermal/heat flux, thermal power transfer | P | P = dQ/dt | W | ML2T−3
Thermal intensity | I | I = P/A | W⋅m−2 | MT−3
Thermal/heat flux density (vector analogue of thermal intensity above) | q | P = ∬ q · dA | W⋅m−2 | MT−3

Equations

The equations in this article are classified by subject.

Thermodynamic processes

Physical situation | Equations
Isentropic process (adiabatic and reversible) | For an ideal gas: p1V1^γ = p2V2^γ, T1V1^(γ−1) = T2V2^(γ−1), p1^(1−γ)T1^γ = p2^(1−γ)T2^γ
Isothermal process | For an ideal gas, T = constant, so p1V1 = p2V2 and W = nRT ln(V2/V1)
Isobaric process | p1 = p2, p = constant; W = p(V2 − V1)
Isochoric process | V1 = V2, V = constant; W = 0
Free expansion | ΔU = 0 (so, for an ideal gas, T1 = T2)
Work done by an expanding gas | Process: W = ∫(V1→V2) p dV
Net work done in cyclic processes | W = ∮ p dV
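The isothermal-work expression W = nRT ln(V2/V1) can be checked against a direct numerical integration of ∫p dV (a sketch, assuming an ideal gas and illustrative values):

```python
import math

R = 8.314              # molar gas constant, J/(mol·K)
n, T = 1.0, 300.0      # 1 mol held at 300 K
V1, V2 = 0.010, 0.020  # m^3: the volume doubles

# Closed form: W = nRT ln(V2/V1)
W_closed = n * R * T * math.log(V2 / V1)

# Midpoint-rule integration of p(V) = nRT/V over [V1, V2]
N = 100000
dV = (V2 - V1) / N
W_num = sum(n * R * T / (V1 + (i + 0.5) * dV) * dV for i in range(N))

assert abs(W_closed - W_num) < 1e-3  # agreement to ~mJ
print(W_closed)
```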

Kinetic theory

Ideal gas equations
Physical situation | Nomenclature | Equations
Ideal gas law
pV = nRT = NkBT
Pressure of an ideal gas
  • m = mass of one molecule
  • Mm = molar mass
p = Nm⟨v²⟩/(3V) = nMm⟨v²⟩/(3V) = ρ⟨v²⟩/3
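A quick numerical check of the two forms of the ideal gas law, pV = nRT = NkBT (a sketch using the standard SI constants):

```python
R = 8.314462618     # molar gas constant, J/(mol·K)
k_B = 1.380649e-23  # Boltzmann constant, J/K
N_A = 6.02214076e23 # Avogadro constant, 1/mol

n, T, V = 1.0, 273.15, 0.022414  # ~1 mol at 0 °C in ~22.4 L

p = n * R * T / V         # pressure from pV = nRT
N = n * N_A               # number of molecules
p_alt = N * k_B * T / V   # same pressure from pV = N*kB*T

# Both forms agree because R = N_A * k_B
assert abs(p - p_alt) / p < 1e-9
print(p)  # ≈ 101325 Pa (1 atm)
```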

Ideal gas

Quantity: Work, W
  General equation: W = ∫(V1→V2) p dV
  Isobaric (Δp = 0): W = p(V2 − V1)
  Isochoric (ΔV = 0): W = 0
  Isothermal (ΔT = 0): W = nRT ln(V2/V1)
  Adiabatic (Q = 0): W = −ΔU = −nCV(T2 − T1) = (p1V1 − p2V2)/(γ − 1)

Quantity: Heat capacity, C (as for real gas)
  Isobaric: Cp = (5/2)nR (for monatomic ideal gas), Cp = (7/2)nR (for diatomic ideal gas)
  Isochoric: CV = (3/2)nR (for monatomic ideal gas), CV = (5/2)nR (for diatomic ideal gas)

Quantity: Internal energy, ΔU
  General equation: ΔU = Q − W = nCVΔT (for any process of an ideal gas)
  Isobaric: ΔU = Q − pΔV
  Isochoric: ΔU = Q
  Isothermal: ΔU = 0, so Q = W
  Adiabatic: ΔU = −W

Quantity: Enthalpy, ΔH
  General equation: ΔH = ΔU + Δ(pV) = nCpΔT

Quantity: Entropy, ΔS
  General equation: ΔS = nCp ln(T2/T1) − nR ln(p2/p1) = nCV ln(T2/T1) + nR ln(V2/V1) [1]
  Isobaric: ΔS = nCp ln(T2/T1)
  Isochoric: ΔS = nCV ln(T2/T1)
  Isothermal: ΔS = nR ln(V2/V1) = Q/T
  Adiabatic: ΔS = 0 (isentropic)

Quantity: Constant
  Isobaric: V/T = constant
  Isochoric: p/T = constant
  Isothermal: pV = constant
  Adiabatic: pV^γ = constant
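The general ideal-gas entropy change ΔS = nCp ln(T2/T1) − nR ln(p2/p1) reduces to the isothermal form nR ln(V2/V1) when T2 = T1; a sketch checking that reduction numerically:

```python
import math

R = 8.314  # molar gas constant, J/(mol·K)

def delta_S(n, c_p, T1, p1, T2, p2):
    """Ideal-gas entropy change between states (T1, p1) and (T2, p2):
    dS = n*Cp*ln(T2/T1) - n*R*ln(p2/p1)."""
    return n * c_p * math.log(T2 / T1) - n * R * math.log(p2 / p1)

# Isothermal expansion of 1 mol of monatomic gas to half the pressure
dS = delta_S(1.0, 5 / 2 * R, 300.0, 2e5, 300.0, 1e5)

# At constant T, halving p doubles V, so dS must equal nR ln 2
assert abs(dS - R * math.log(2)) < 1e-12
print(dS)
```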

Entropy

S = kB ln Ω, where Ω is the number of accessible microstates, and, for a reversible process, dS = δQ/T.

Statistical physics

Below are useful results from the Maxwell–Boltzmann distribution for an ideal gas, and some implications of the entropy quantity. The distribution is valid for atoms or molecules constituting ideal gases.

Physical situation | Nomenclature | Equations
Maxwell–Boltzmann distribution
  • v = velocity of atom/molecule,
  • m = mass of each molecule (all molecules are identical in kinetic theory),
  • γ(p) = Lorentz factor as function of momentum (see below)
  • Ratio of thermal to rest mass-energy of each molecule: θ = kBT/(mc²)

K2 is the modified Bessel function of the second kind.

Non-relativistic speeds: P(v) = 4π [m/(2πkBT)]^(3/2) v² e^(−mv²/(2kBT))

Relativistic speeds (Maxwell–Jüttner distribution): f(p) = 1/[4πm³c³θ K2(1/θ)] e^(−γ(p)/θ)

Entropy, logarithm of the density of states
  • Pi = probability of system in microstate i
  • Ω = total number of microstates
S = −kB Σi Pi ln Pi = kB ln Ω,
where the second equality holds if all microstates are equally probable (Pi = 1/Ω).

Entropy change: ΔS = ∫(1→2) dQ/T; for an ideal gas, ΔS = nCV ln(T2/T1) + nR ln(V2/V1)

Entropic force: F = T∇S (at constant internal energy)

Equipartition theorem (df = degree of freedom)
  Average kinetic energy per degree of freedom: ⟨Ek⟩ = (1/2)kBT
  Internal energy: U = (df/2)NkBT = (df/2)nRT

Corollaries of the non-relativistic Maxwell–Boltzmann distribution are below.

Physical situation | Nomenclature | Equations
Mean speed: ⟨v⟩ = √(8kBT/(πm))
Root mean square speed: vrms = √(3kBT/m)
Modal speed: vmode = √(2kBT/m)
Mean free path: ℓ = 1/(√2 nσ)
  • σ = effective cross-section
  • n = volume density of number of target particles
  • ℓ = mean free path
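The three characteristic speeds above satisfy vmode < ⟨v⟩ < vrms for any gas. A sketch evaluating them for a nitrogen-like molecule at room temperature (the molecular mass is an approximate value):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
m = 4.65e-26        # approximate mass of an N2 molecule, kg
T = 300.0           # temperature, K

v_mean = math.sqrt(8 * k_B * T / (math.pi * m))  # mean speed
v_rms  = math.sqrt(3 * k_B * T / m)              # root mean square speed
v_mode = math.sqrt(2 * k_B * T / m)              # modal (most probable) speed

# The Maxwell-Boltzmann distribution fixes this ordering
assert v_mode < v_mean < v_rms
print(v_mode, v_mean, v_rms)  # each a few hundred m/s
```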

Quasi-static and reversible processes

For quasi-static and reversible processes, the first law of thermodynamics is:

dU = δQ − δW

where δQ is the heat supplied to the system and δW is the work done by the system.

Thermodynamic potentials

The following energies are called the thermodynamic potentials,

Name | Symbol | Formula | Natural variables
Internal energy | U | U = ∫(T dS − p dV + Σi μi dNi) | S, V, {Ni}
Helmholtz free energy | A | A = U − TS | T, V, {Ni}
Enthalpy | H | H = U + pV | S, p, {Ni}
Gibbs free energy | G | G = U + pV − TS | T, p, {Ni}
Landau potential, or grand potential | Ω, ΦG | Ω = U − TS − Σi μiNi | T, V, {μi}

and the corresponding fundamental thermodynamic relations or "master equations" [2] are:

Potential | Differential
Internal energy | dU = T dS − p dV + Σi μi dNi
Enthalpy | dH = T dS + V dp + Σi μi dNi
Helmholtz free energy | dA = −S dT − p dV + Σi μi dNi
Gibbs free energy | dG = −S dT + V dp + Σi μi dNi
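The master equations identify the partial derivatives of each potential; for example, dA = −S dT − p dV gives p = −(∂A/∂V)T. A sketch checking this by finite differences for a model Helmholtz energy of a monatomic ideal gas (additive constants are dropped, which does not affect the derivative):

```python
import math

R, n = 8.314, 1.0  # gas constant (J/(mol·K)) and amount (mol)

def A(T, V):
    # Helmholtz free energy of a monatomic ideal gas, up to additive
    # constants and terms linear in T (illustrative model form)
    return -n * R * T * math.log(V) - 1.5 * n * R * T * math.log(T)

T, V, h = 300.0, 0.01, 1e-7
p_fd = -(A(T, V + h) - A(T, V - h)) / (2 * h)  # -(dA/dV) at constant T
p_eos = n * R * T / V                          # ideal-gas equation of state

assert abs(p_fd - p_eos) / p_eos < 1e-6
print(p_fd)
```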

Maxwell's relations

The four most common Maxwell's relations follow from the equality of mixed second derivatives of the thermodynamic potentials expressed in their natural variables, U(S, V), H(S, p), A(T, V) and G(T, p):

(∂T/∂V)S = −(∂p/∂S)V
(∂T/∂p)S = (∂V/∂S)p
(∂S/∂V)T = (∂p/∂T)V
(∂S/∂p)T = −(∂V/∂T)p

Other differential equations are the Gibbs–Helmholtz equations:

H = −T²(∂(G/T)/∂T)p
U = −T²(∂(A/T)/∂T)V
G = H + T(∂G/∂T)p
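Maxwell's relation (∂S/∂V)T = (∂p/∂T)V can be spot-checked for an ideal gas, where both sides equal nR/V. A sketch using the monatomic ideal-gas entropy S = nR ln V + (3/2)nR ln T, with additive constants dropped:

```python
import math

R, n = 8.314, 1.0

def S(T, V):
    # Entropy of a monatomic ideal gas, up to an additive constant
    return n * R * math.log(V) + 1.5 * n * R * math.log(T)

def p(T, V):
    return n * R * T / V  # ideal-gas equation of state

T0, V0, h = 300.0, 0.01, 1e-7
dS_dV = (S(T0, V0 + h) - S(T0, V0 - h)) / (2 * h)  # (dS/dV) at constant T
dp_dT = (p(T0 + h, V0) - p(T0 - h, V0)) / (2 * h)  # (dp/dT) at constant V

assert abs(dS_dV - dp_dT) / dp_dT < 1e-6  # both sides ≈ nR/V
```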

Quantum properties

where N is the number of particles, h is the Planck constant, I is the moment of inertia, and Z is the partition function, in various forms:

Degree of freedom | Partition function
Translation | Zt = V/Λ³, where Λ = h/√(2πmkBT) is the thermal de Broglie wavelength
Vibration | Zv = 1/(1 − e^(−hν/(kBT)))
Rotation | Zr = 2IkBT/(σħ²), where σ is the symmetry number and ħ = h/2π
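A sketch evaluating the translational partition function Zt = V/Λ³ for a helium-like atom (the atomic mass is an approximate value); the large result confirms the classical regime, where quantum effects are negligible:

```python
import math

h   = 6.62607015e-34  # Planck constant, J·s
k_B = 1.380649e-23    # Boltzmann constant, J/K

def Z_translational(m, T, V):
    """Single-particle translational partition function Z = V / L**3,
    where L = h / sqrt(2*pi*m*k_B*T) is the thermal de Broglie wavelength."""
    L = h / math.sqrt(2 * math.pi * m * k_B * T)
    return V / L**3

# Helium atom (m ≈ 6.6e-27 kg) in a 1 L box at 300 K
Z = Z_translational(6.6e-27, 300.0, 1e-3)
assert Z > 1e25  # Z >> 1: many thermally accessible states
print(Z)
```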

Thermal properties of matter

Coefficients | Equation
Joule–Thomson coefficient | μJT = (∂T/∂p)H
Compressibility (constant temperature) | κT = −(1/V)(∂V/∂p)T
Coefficient of thermal expansion (constant pressure) | α = (1/V)(∂V/∂T)p
Heat capacity (constant pressure) | Cp = (∂H/∂T)p = T(∂S/∂T)p
Heat capacity (constant volume) | CV = (∂U/∂T)V = T(∂S/∂T)V

Thermal transfer

Physical situation | Nomenclature | Equations
Net intensity emission/absorption
  • Texternal = external temperature (outside of system)
  • Tsystem = internal temperature (inside system)
  • ε = emissivity
I = εσ(Texternal⁴ − Tsystem⁴)

Internal energy of a substance
  • CV = isovolumetric heat capacity of substance
  • ΔT = temperature change of substance
ΔU = CVΔT

Mayer's equation
  • Cp = isobaric heat capacity
  • CV = isovolumetric heat capacity
  • n = amount of substance
Cp − CV = nR

Effective thermal conductivities
  • λi = thermal conductivity of substance i
  • λnet = equivalent thermal conductivity
Series (n layers of equal thickness along the heat flow): λnet = n / Σi(1/λi)

Parallel (n slabs of equal cross-section): λnet = (1/n) Σi λi
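The series/parallel combinations are easiest to handle through thermal resistances R = L/(λA): resistances add in series, and conductances (1/R) add in parallel. A sketch with hypothetical layer values:

```python
def R_thermal(L, lam, A):
    """Thermal resistance of a slab: R = L / (lambda * A), in K/W."""
    return L / (lam * A)

# Two hypothetical layers, heat flowing through both in sequence
R1 = R_thermal(L=0.02, lam=0.8, A=1.0)   # e.g. a brick-like layer
R2 = R_thermal(L=0.05, lam=0.04, A=1.0)  # e.g. an insulation-like layer

R_series = R1 + R2              # resistances add in series
R_parallel = 1 / (1/R1 + 1/R2)  # conductances add in parallel

# Heat current for a 20 K temperature difference across the series stack
P = 20.0 / R_series  # P = dT / R_net, watts
print(P)
```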

Thermal efficiencies

Physical situation | Nomenclature | Equations
Thermodynamic engines
  • η = efficiency
  • W = work done by engine
  • QH = heat energy in higher temperature reservoir
  • QL = heat energy in lower temperature reservoir
  • TH = temperature of higher temp. reservoir
  • TL = temperature of lower temp. reservoir

Thermodynamic engine: η = |W|/QH = 1 − QL/QH

Carnot engine efficiency: ηC = 1 − TL/TH

Refrigeration (K = coefficient of refrigeration performance)

Refrigeration performance: K = QL/|W|

Carnot refrigeration performance: KC = TL/(TH − TL)
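The Carnot bounds above are straightforward to evaluate; a sketch with illustrative reservoir temperatures:

```python
def carnot_efficiency(T_H, T_L):
    """Maximum engine efficiency between reservoirs at T_H > T_L (kelvin)."""
    return 1 - T_L / T_H

def carnot_cop_refrigerator(T_H, T_L):
    """Maximum coefficient of refrigeration performance: T_L / (T_H - T_L)."""
    return T_L / (T_H - T_L)

eta = carnot_efficiency(500.0, 300.0)      # engine between 500 K and 300 K
K = carnot_cop_refrigerator(300.0, 250.0)  # fridge between 300 K and 250 K

assert 0 < eta < 1  # no engine converts all heat to work
print(eta, K)
```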


References

  1. J. H. Keenan, Thermodynamics, Wiley, New York, 1947.
  2. P. W. Atkins, Physical Chemistry, Oxford University Press, 1978, ISBN 0-19-855148-7.