Common thermodynamic equations and quantities, expressed in mathematical notation, are as follows:
Many of the definitions below are also used in the thermodynamics of chemical reactions.
Quantity (common name/s) | (Common) symbol/s | SI unit | Dimension |
---|---|---|---|
Number of molecules | N | 1 | 1 |
Amount of substance | n | mol | N |
Temperature | T | K | Θ |
Heat energy | Q, q | J | ML2T−2 |
Latent heat | QL | J | ML2T−2 |
Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension |
---|---|---|---|---|
Thermodynamic beta, inverse temperature | β | β = 1/(kB T) | J−1 | T2M−1L−2 |
Thermodynamic temperature | τ | τ = kB T | J | ML2T−2 |
Entropy | S | S = −(∂F/∂T)V , dS = δQrev/T | J⋅K−1 | ML2T−2Θ−1 |
Pressure | P | P = −(∂F/∂V)T = −(∂U/∂V)S | Pa | ML−1T−2 |
Internal energy | U | U = Σi Ei (sum of all energies of the system) | J | ML2T−2 |
Enthalpy | H | H = U + pV | J | ML2T−2 |
Partition function | Z | Z = Σi exp(−Ei/(kB T)) (canonical ensemble) | 1 | 1 |
Gibbs free energy | G | G = H − TS | J | ML2T−2 |
Chemical potential (of component i in a mixture) | μi | μi = (∂F/∂Ni)T,V,Nj≠i , where F is not proportional to Ni because μi depends on pressure; μi = (∂G/∂Ni)T,p,Nj≠i , where G is proportional to Ni (as long as the molar ratio composition of the system remains the same) because μi depends only on temperature, pressure and composition | J | ML2T−2 |
Helmholtz free energy | A, F | F = U − TS | J | ML2T−2 |
Landau potential, Landau free energy, Grand potential | Ω, ΦG | Ω = U − TS − Σi μiNi | J | ML2T−2 |
Massieu potential, Helmholtz free entropy | Φ | Φ = S − U/T = −F/T | J⋅K−1 | ML2T−2Θ−1 |
Planck potential, Gibbs free entropy | Ξ | Ξ = S − H/T = −G/T | J⋅K−1 | ML2T−2Θ−1 |
Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension |
---|---|---|---|---|
General heat/thermal capacity | C | C = ∂Q/∂T | J⋅K−1 | ML2T−2Θ−1 |
Heat capacity (isobaric) | Cp | Cp = (∂H/∂T)p | J⋅K−1 | ML2T−2Θ−1 |
Specific heat capacity (isobaric) | Cmp | Cmp = Cp/m | J⋅kg−1⋅K−1 | L2T−2Θ−1 |
Molar specific heat capacity (isobaric) | Cnp | Cnp = Cp/n | J⋅K−1⋅mol−1 | ML2T−2Θ−1N−1 |
Heat capacity (isochoric/volumetric) | CV | CV = (∂U/∂T)V | J⋅K−1 | ML2T−2Θ−1 |
Specific heat capacity (isochoric) | CmV | CmV = CV/m | J⋅kg−1⋅K−1 | L2T−2Θ−1 |
Molar specific heat capacity (isochoric) | CnV | CnV = CV/n | J⋅K−1⋅mol−1 | ML2T−2Θ−1N−1 |
Specific latent heat | L | L = Q/m | J⋅kg−1 | L2T−2 |
Ratio of isobaric to isochoric heat capacity, heat capacity ratio, adiabatic index, Laplace coefficient | γ | γ = Cp/CV | 1 | 1 |
Quantity (common name/s) | (Common) symbol/s | Defining equation | SI unit | Dimension |
---|---|---|---|---|
Temperature gradient | No standard symbol | ∇T | K⋅m−1 | ΘL−1 |
Thermal conduction rate, thermal current, thermal/heat flux, thermal power transfer | P | P = dQ/dt | W | ML2T−3 |
Thermal intensity | I | I = dP/dA | W⋅m−2 | MT−3 |
Thermal/heat flux density (vector analogue of thermal intensity above) | q | P = ∫S q ⋅ dA (thermal power through a surface S) | W⋅m−2 | MT−3 |
The equations in this article are classified by subject.
Physical situation | Equations |
---|---|
Isentropic process (adiabatic and reversible) | δQ = 0, ΔS = 0. For an ideal gas: p1 V1^γ = p2 V2^γ, T1 V1^(γ−1) = T2 V2^(γ−1), p1^(1−γ) T1^γ = p2^(1−γ) T2^γ |
Isothermal process | ΔT = 0, T = constant. For an ideal gas: p1V1 = p2V2, ΔU = 0 |
Isobaric process | p1 = p2, p = constant. W = pΔV, Q = ΔU + pΔV |
Isochoric process | V1 = V2, V = constant. W = 0, Q = ΔU |
Free expansion | Q = 0, W = 0, ΔU = 0 (ΔT = 0 for an ideal gas) |
Work done by an expanding gas | Process: W = ∫V1→V2 p dV. Net work done in cyclic processes: W = ∮ p dV |
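As a numerical illustration of the work integrals in the table above (a minimal sketch with arbitrary example values, not part of the source tables), the isothermal and isobaric cases can be evaluated directly:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def isothermal_work(n, T, V1, V2):
    """Work done BY n moles of ideal gas expanding isothermally from V1 to V2:
    W = integral of p dV with p = nRT/V, which evaluates to nRT ln(V2/V1)."""
    return n * R * T * math.log(V2 / V1)

def isobaric_work(p, V1, V2):
    """Work done by the gas at constant pressure: W = p (V2 - V1)."""
    return p * (V2 - V1)

# Example values only: 1 mol at 300 K doubling its volume, and a 1 atm isobaric case
print(isothermal_work(1.0, 300.0, 1.0e-3, 2.0e-3))  # ~1.7e3 J
print(isobaric_work(101325.0, 1.0e-3, 2.0e-3))      # ~1.0e2 J
```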
Physical situation | Nomenclature | Equations |
---|---|---|
Ideal gas law | p = pressure, V = volume, n = amount of substance, R = molar gas constant, T = temperature, N = number of molecules, kB = Boltzmann constant | pV = nRT = NkBT |
Pressure of an ideal gas | m = mass of one molecule, ⟨v^2⟩ = mean square speed, ρ = mass density | p = Nm⟨v^2⟩/(3V) = (1/3)ρ⟨v^2⟩ |
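A quick sanity check of the ideal gas law above (a hedged sketch; the near-STP molar volume is used only as a familiar example):

```python
R = 8.314  # molar gas constant, J/(mol K)

def pressure_ideal_gas(n, T, V):
    """Pressure from pV = nRT, with n in mol, T in K, V in m^3; result in Pa."""
    return n * R * T / V

# 1 mol at 273.15 K in 22.4 L should come out close to 1 atm (~101 kPa)
print(pressure_ideal_gas(1.0, 273.15, 22.4e-3))
```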
Quantity | General Equation | Isobaric Δp = 0 | Isochoric ΔV = 0 | Isothermal ΔT = 0 | Adiabatic |
---|---|---|---|---|---|
Work W | W = ∫V1→V2 p dV | W = pΔV = p(V2 − V1) | W = 0 | W = nRT ln(V2/V1) | W = −ΔU = (p1V1 − p2V2)/(γ − 1) |
Heat capacity C | (as for real gas) | Cp = (5/2)nR (for a monatomic ideal gas) | CV = (3/2)nR (for a monatomic ideal gas) | | |
Internal energy ΔU | ΔU = CVΔT = Q − W | ΔU = CVΔT = Q − pΔV | ΔU = CVΔT = Q | ΔU = 0 (so Q = W) | ΔU = CVΔT = −W |
Enthalpy ΔH | H = U + pV, ΔH = CpΔT | ΔH = CpΔT = Q | ΔH = CpΔT | ΔH = 0 | ΔH = CpΔT |
Entropy ΔS | ΔS = CV ln(T2/T1) + nR ln(V2/V1) = Cp ln(T2/T1) − nR ln(p2/p1) [1] | ΔS = Cp ln(T2/T1) | ΔS = CV ln(T2/T1) | ΔS = nR ln(V2/V1) = Q/T | ΔS = 0 (reversible) |
Constant | | p = constant | V = constant | pV = constant | pV^γ = constant, TV^(γ−1) = constant |
Below are useful results from the Maxwell–Boltzmann distribution for an ideal gas, and the implications of the entropy quantity. The distribution is valid for atoms or molecules constituting ideal gases.
Physical situation | Nomenclature | Equations |
---|---|---|
Maxwell–Boltzmann distribution | v = speed of an atom/molecule, m = mass of each molecule, T = temperature, kB = Boltzmann constant, K2 = modified Bessel function of the second kind, θ = kBT/(mc^2) | Non-relativistic speeds: f(v) = 4π (m/(2πkBT))^(3/2) v^2 exp(−mv^2/(2kBT)). Relativistic speeds (Maxwell–Jüttner distribution): f(γ) = γ^2 β/(θ K2(1/θ)) exp(−γ/θ), where γ is the Lorentz factor and β = v/c |
Entropy, logarithm of the density of states | Ω = number of microstates (density of states), Pi = probability of microstate i | S = kB ln Ω = −kB Σi Pi ln Pi |
Entropy change | | ΔS = ∫Q1→Q2 dQ/T; for an ideal gas ΔS = CV ln(T2/T1) + nR ln(V2/V1) |
Entropic force | | F = T (∂S/∂x) (for a purely entropic interaction) |
Equipartition theorem | df = degree of freedom | Average kinetic energy per degree of freedom: ⟨Ek⟩ = (1/2)kBT. Internal energy: U = (df/2) NkBT = (df/2) nRT |
Corollaries of the non-relativistic Maxwell–Boltzmann distribution are below.
Physical situation | Nomenclature | Equations |
---|---|---|
Mean speed | m = mass of one molecule, kB = Boltzmann constant, T = temperature | ⟨v⟩ = √(8kBT/(πm)) |
Root mean square speed | | vrms = √(3kBT/m) |
Modal speed | | vmode = √(2kBT/m) |
Mean free path | d = effective molecular diameter, n = N/V = number density, σ = πd^2 = effective cross-section, p = pressure | ℓ = 1/(√2 nσ) = kBT/(√2 πd^2 p) |
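The corollary formulas above can be evaluated numerically; the following sketch uses nitrogen at 300 K and an assumed effective molecular diameter of about 3.7 Å (approximate textbook-style values, not taken from this article):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def mean_speed(m, T):
    """<v> = sqrt(8 k_B T / (pi m)) for a Maxwell-Boltzmann gas."""
    return math.sqrt(8.0 * k_B * T / (math.pi * m))

def rms_speed(m, T):
    """v_rms = sqrt(3 k_B T / m)."""
    return math.sqrt(3.0 * k_B * T / m)

def modal_speed(m, T):
    """Most probable speed: v_mode = sqrt(2 k_B T / m)."""
    return math.sqrt(2.0 * k_B * T / m)

def mean_free_path(T, p, d):
    """Mean free path = k_B T / (sqrt(2) pi d^2 p) for effective molecular diameter d."""
    return k_B * T / (math.sqrt(2.0) * math.pi * d ** 2 * p)

m_N2 = 28.014e-3 / 6.02214076e23    # mass of one N2 molecule, kg
T, p, d = 300.0, 101325.0, 3.7e-10  # d is an assumed effective diameter, m
print(mean_speed(m_N2, T), rms_speed(m_N2, T), modal_speed(m_N2, T))  # ~476, ~517, ~422 m/s
print(mean_free_path(T, p, d))                                        # on the order of 1e-7 m
```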
For quasi-static and reversible processes, the first law of thermodynamics is:
dU = δQ − δW
where δQ is the heat supplied to the system and δW is the work done by the system.
The following energies are called the thermodynamic potentials,
Name | Symbol | Formula | Natural variables |
---|---|---|---|
Internal energy | U | U = ∫(T dS − p dV + Σi μi dNi) | S, V, {Ni} |
Helmholtz free energy | F | F = U − TS | T, V, {Ni} |
Enthalpy | H | H = U + pV | S, p, {Ni} |
Gibbs free energy | G | G = U + pV − TS | T, p, {Ni} |
Landau potential, or grand potential | Ω, ΦG | Ω = U − TS − Σi μiNi | T, V, {μi} |
and the corresponding fundamental thermodynamic relations or "master equations" [2] are:
Potential | Differential |
---|---|
Internal energy | dU = T dS − p dV + Σi μi dNi |
Enthalpy | dH = T dS + V dp + Σi μi dNi |
Helmholtz free energy | dF = −S dT − p dV + Σi μi dNi |
Gibbs free energy | dG = −S dT + V dp + Σi μi dNi |
The four most common Maxwell's relations are:
Physical situation | Nomenclature | Equations |
---|---|---|
Thermodynamic potentials as functions of their natural variables | U(S,V), H(S,p), F(T,V), G(T,p) | (∂T/∂V)S = −(∂P/∂S)V ; (∂T/∂P)S = (∂V/∂S)P ; (∂S/∂V)T = (∂P/∂T)V ; (∂S/∂P)T = −(∂V/∂T)P |
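As a brief reminder of where these identities come from, equality of the mixed second derivatives of a potential in its natural variables yields a Maxwell relation; for example, starting from the Helmholtz differential dF = −S dT − p dV given above:

```latex
% From dF = -S\,dT - p\,dV it follows that
% S = -(\partial F/\partial T)_V and p = -(\partial F/\partial V)_T.
% Equality of mixed partial derivatives of F then gives a Maxwell relation:
\frac{\partial^2 F}{\partial V\,\partial T}
  = \frac{\partial^2 F}{\partial T\,\partial V}
\quad\Longrightarrow\quad
\left(\frac{\partial S}{\partial V}\right)_T
  = \left(\frac{\partial p}{\partial T}\right)_V
```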
More relations include the following, obtained from the first derivatives of the potentials with respect to their natural variables: T = (∂U/∂S)V = (∂H/∂S)p, p = −(∂U/∂V)S = −(∂F/∂V)T, V = (∂H/∂p)S = (∂G/∂p)T, and S = −(∂F/∂T)V = −(∂G/∂T)p.
Other differential equations are:
Name | H | U | G |
---|---|---|---|
Gibbs–Helmholtz equation | H = −T^2 (∂(G/T)/∂T)p | U = −T^2 (∂(F/T)/∂T)V | G = −V^2 (∂(F/V)/∂V)T |
In the following, N is the number of particles, h is the Planck constant, I is the moment of inertia, and Z is the partition function, in its various forms:
Degree of freedom | Partition function |
---|---|
Translation | Zt = V (2πmkBT)^(3/2) / h^3 |
Vibration | Zv = 1/(1 − exp(−hν/(kBT))) (energies measured from the vibrational ground state) |
Rotation | Zr = 2IkBT/(σħ^2), where σ = 1 for heteronuclear and 2 for homonuclear diatomic molecules |
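The tabulated partition functions can be evaluated numerically; the sketch below uses argon in a 1 L volume and an illustrative vibrational frequency (both assumed example values):

```python
import math

# Physical constants (SI)
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J s

def z_translational(m, T, V):
    """Single-particle translational partition function: Z = V (2 pi m k_B T)^(3/2) / h^3."""
    return V * (2.0 * math.pi * m * k_B * T) ** 1.5 / h ** 3

def z_vibrational(nu, T):
    """Harmonic-oscillator partition function with energies measured from the ground state."""
    return 1.0 / (1.0 - math.exp(-h * nu / (k_B * T)))

m_Ar = 39.948e-3 / 6.02214076e23              # mass of one argon atom, kg
print(z_translational(m_Ar, 300.0, 1.0e-3))   # a very large, dimensionless number
print(z_vibrational(6.5e13, 300.0))           # illustrative vibrational frequency, Hz
```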
Coefficients | Equation |
---|---|
Joule–Thomson coefficient | μJT = (∂T/∂p)H |
Compressibility (constant temperature) | κT = −(1/V)(∂V/∂p)T |
Coefficient of thermal expansion (constant pressure) | α = (1/V)(∂V/∂T)p |
Heat capacity (constant pressure) | Cp = (∂Qrev/∂T)p = (∂H/∂T)p = T(∂S/∂T)p |
Heat capacity (constant volume) | CV = (∂Qrev/∂T)V = (∂U/∂T)V = T(∂S/∂T)V |
Derivation of heat capacity (constant pressure) |
---|
Since H = U + pV and, for a reversible process, δQrev = dU + p dV, we have dH = δQrev + V dp. At constant pressure dp = 0, so δQrev = dH and Cp = (∂Qrev/∂T)p = (∂H/∂T)p; using dS = δQrev/T this also gives Cp = T(∂S/∂T)p. |
Derivation of heat capacity (constant volume) |
---|
Since δQrev = dU + δWrev = dU + p dV (where δWrev is the work done by the system), at constant volume dV = 0, so δQrev = dU and CV = (∂Qrev/∂T)V = (∂U/∂T)V; using dS = δQrev/T this also gives CV = T(∂S/∂T)V. |
Physical situation | Nomenclature | Equations |
---|---|---|
Net intensity emission/absorption | Text = external (surroundings) temperature, ε = emissivity, σ = Stefan–Boltzmann constant | I = σε(T^4 − Text^4) |
Internal energy of a substance | CV = isochoric heat capacity, ΔT = temperature change of the substance | ΔU = CVΔT |
Mayer's equation | Cp = isobaric heat capacity, CV = isochoric heat capacity, n = amount of substance, R = molar gas constant | Cp − CV = nR |
Effective thermal conductivities | λi = thermal conductivity of substance i, λnet = equivalent thermal conductivity | Series (layers of thickness li, equal cross-section): (Σi li)/λnet = Σi (li/λi). Parallel (layers of cross-section Ai, equal thickness): λnet Σi Ai = Σi λiAi |
Physical situation | Nomenclature | Equations |
---|---|---|
Thermodynamic engines | η = efficiency, W = work done by engine, QH = heat input from the hot reservoir, QC = heat rejected to the cold reservoir, TH = temperature of the hot reservoir, TC = temperature of the cold reservoir | Thermodynamic engine: η = W/QH = 1 − QC/QH (magnitudes). Carnot engine efficiency: ηC = 1 − TC/TH |
Refrigeration | K = coefficient of refrigeration performance | Refrigeration performance: K = QC/W = QC/(QH − QC) (magnitudes). Carnot refrigeration performance: KC = TC/(TH − TC) |
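The engine and refrigeration formulas above reduce to one-liners in code; the reservoir temperatures below are illustrative only:

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum (Carnot) engine efficiency between reservoir temperatures in kelvin."""
    return 1.0 - T_cold / T_hot

def carnot_cop_refrigerator(T_hot, T_cold):
    """Maximum coefficient of performance of a refrigerator between the same reservoirs."""
    return T_cold / (T_hot - T_cold)

# Example: reservoirs at 600 K and 300 K (illustrative values)
print(carnot_efficiency(600.0, 300.0))        # 0.5
print(carnot_cop_refrigerator(600.0, 300.0))  # 1.0
```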
Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change and information systems including the transmission of information in telecommunication.
In physical chemistry, the Arrhenius equation is a formula for the temperature dependence of reaction rates. The equation was proposed by Svante Arrhenius in 1889, based on the work of Dutch chemist Jacobus Henricus van 't Hoff, who had noted in 1884 that the van 't Hoff equation for the temperature dependence of equilibrium constants suggests such a formula for the rates of both forward and reverse reactions. The equation is widely used to determine the rates of chemical reactions and to calculate activation energies. Arrhenius provided a physical justification and interpretation for the formula. Currently, it is best seen as an empirical relationship. It can be used to model the temperature variation of diffusion coefficients, population of crystal vacancies, creep rates, and many other thermally induced processes and reactions. The Eyring equation, developed in 1935, also expresses the relationship between rate and energy.
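The Arrhenius formula itself is k = A·exp(−Ea/(RT)). A minimal numerical sketch, with a pre-exponential factor and activation energy chosen purely for illustration, is:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def arrhenius_rate(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R T)).
    A: pre-exponential factor (same units as k), Ea: activation energy in J/mol, T in K."""
    return A * math.exp(-Ea / (R * T))

# Illustrative values only: A = 1e13 s^-1, Ea = 75 kJ/mol
for T in (298.0, 350.0, 400.0):
    print(T, arrhenius_rate(1e13, 75e3, T))
```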
An ideal gas is a theoretical gas composed of many randomly moving point particles that are not subject to interparticle interactions. The ideal gas concept is useful because it obeys the ideal gas law, a simplified equation of state, and is amenable to analysis under statistical mechanics. The requirement of zero interaction can often be relaxed if, for example, the interaction is perfectly elastic or regarded as point-like collisions.
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
In statistical mechanics, Maxwell–Boltzmann statistics describes the distribution of classical material particles over various energy states in thermal equilibrium. It is applicable when the temperature is high enough or the particle density is low enough to render quantum effects negligible.
The van der Waals equation, named for its originator, the Dutch physicist Johannes Diderik van der Waals, is an equation of state that extends the ideal gas law to include the non-zero size of gas molecules and the interactions between them. As a result the equation is able to model the liquid–vapor phase change; it is the first equation that did this, and consequently it had a substantial impact on physics at that time. It also produces simple analytic expressions for the properties of real substances that shed light on their behavior. One way to write this equation is (p + a n^2/V^2)(V − nb) = nRT, where a and b are substance-specific constants characterizing, respectively, the intermolecular attraction and the volume excluded by the molecules.
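For a rough comparison of the two equations of state, the sketch below uses approximate van der Waals constants for CO2 (assumed literature values, not given in this article):

```python
R = 8.314  # molar gas constant, J/(mol K)

def pressure_van_der_waals(n, T, V, a, b):
    """Pressure from the van der Waals equation: (p + a n^2/V^2)(V - n b) = n R T."""
    return n * R * T / (V - n * b) - a * n ** 2 / V ** 2

def pressure_ideal(n, T, V):
    """Ideal-gas pressure for comparison."""
    return n * R * T / V

# Approximate van der Waals constants for CO2 (assumed values):
a_CO2, b_CO2 = 0.364, 4.27e-5   # Pa m^6 mol^-2, m^3 mol^-1
n, T, V = 1.0, 300.0, 1.0e-3    # 1 mol in 1 L at 300 K (illustrative state)
print(pressure_ideal(n, T, V), pressure_van_der_waals(n, T, V, a_CO2, b_CO2))
```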
In thermodynamics, the Gibbs free energy is a thermodynamic potential that can be used to calculate the maximum amount of work, other than pressure–volume work, that may be performed by a thermodynamically closed system at constant temperature and pressure. It also provides a necessary condition for processes such as chemical reactions that may occur under these conditions. The Gibbs free energy is expressed as G = H − TS = U + pV − TS, where U is the internal energy, H is the enthalpy, T is the absolute temperature, S is the entropy, p is the pressure and V is the volume.
A thermodynamic potential is a scalar quantity used to represent the thermodynamic state of a system. Just as in mechanics, where potential energy is defined as capacity to do work, similarly different potentials have different meanings. The concept of thermodynamic potentials was introduced by Pierre Duhem in 1886. Josiah Willard Gibbs in his papers used the term fundamental functions. While thermodynamic potentials cannot be measured directly, they can be predicted using computational chemistry.
In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
An isentropic process is an idealized thermodynamic process that is both adiabatic and reversible. The work transfers of the system are frictionless, and there is no net transfer of heat or matter. Such an idealized process is useful in engineering as a model of and basis of comparison for real processes. This process is idealized because reversible processes do not occur in reality; however, treating a process as both adiabatic and reversible shows that the initial and final entropies are equal, which is why it is called isentropic. Thermodynamic processes are named for the effect they would have on the system. Even though a truly isentropic process cannot be carried out in practice, some processes may be approximated as such.
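For an ideal gas the isentropic relations reduce to pV^γ = constant and TV^(γ−1) = constant; a short numerical sketch (with an assumed diatomic γ = 1.4 and illustrative initial conditions) is:

```python
def adiabatic_final_temperature(T1, V1, V2, gamma=1.4):
    """Final temperature of a reversible adiabatic (isentropic) ideal-gas process,
    from T V^(gamma-1) = constant. gamma = 1.4 is typical of diatomic gases."""
    return T1 * (V1 / V2) ** (gamma - 1.0)

def adiabatic_final_pressure(p1, V1, V2, gamma=1.4):
    """Final pressure from p V^gamma = constant."""
    return p1 * (V1 / V2) ** gamma

# Example: compressing a diatomic ideal gas to one tenth of its volume (illustrative)
print(adiabatic_final_temperature(300.0, 1.0, 0.1))   # ~754 K
print(adiabatic_final_pressure(1.0e5, 1.0, 0.1))      # ~2.5e6 Pa
```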
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
The laws of thermodynamics are a set of scientific laws which define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general and are applicable in other natural sciences.
Thermodynamics is expressed by a mathematical framework of thermodynamic equations which relate various thermodynamic quantities and physical properties measured in a laboratory or production process. Thermodynamics is based on a fundamental set of postulates that became the laws of thermodynamics.
In thermodynamics, the entropy of mixing is the increase in the total entropy when several initially separate systems of different composition, each in a thermodynamic state of internal equilibrium, are mixed without chemical reaction by the thermodynamic operation of removal of impermeable partition(s) between them, followed by a time for establishment of a new thermodynamic state of internal equilibrium in the new unpartitioned closed system.
In statistical mechanics, a microstate is a specific configuration of a system that describes the precise positions and momenta of all the individual particles or components that make up the system. Each microstate has a certain probability of occurring during the course of the system's thermal fluctuations.
In the history of physics, the concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and Nicolas-Joseph Cugnot's steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.
The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.
In thermodynamics, the fundamental thermodynamic relations are four fundamental equations which demonstrate how four important thermodynamic quantities depend on variables that can be controlled and measured experimentally. Thus, they are essentially equations of state, and using the fundamental equations, experimental data can be used to determine sought-after quantities like G or H (enthalpy). The relation is generally expressed as an infinitesimal change in internal energy in terms of infinitesimal changes in entropy and volume for a closed system in thermal equilibrium, in the following way: dU = T dS − p dV.
In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or of a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion.
The Langmuir adsorption model explains adsorption by assuming an adsorbate behaves as an ideal gas at isothermal conditions. According to the model, adsorption and desorption are reversible processes. This model even explains the effect of pressure; i.e., at these conditions the adsorbate's partial pressure is related to its volume V adsorbed onto a solid adsorbent. The adsorbent is assumed to be an ideal solid surface composed of a series of distinct sites capable of binding the adsorbate. The adsorbate binding is treated as a chemical reaction between a gaseous adsorbate molecule Ag and an empty sorption site S, Ag + S ⇌ Aad, which yields an adsorbed species Aad with an associated equilibrium constant Keq.
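The standard result of this model is the Langmuir isotherm for fractional surface coverage, θ = Keq·p/(1 + Keq·p); a minimal sketch with an illustrative equilibrium constant is:

```python
def langmuir_coverage(K_eq, p):
    """Fractional surface coverage from the Langmuir isotherm: theta = K p / (1 + K p).
    K_eq: adsorption equilibrium constant (reciprocal pressure units), p: adsorbate partial pressure."""
    return K_eq * p / (1.0 + K_eq * p)

# Illustrative equilibrium constant of 0.01 per kPa
for p in (1.0, 10.0, 100.0, 1000.0):  # kPa
    print(p, langmuir_coverage(0.01, p))
```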