Entropy production (or generation) is the amount of entropy that is produced during a heat process. It quantifies the irreversibility of the process and is used to evaluate the efficiency of the process.
Entropy is produced in irreversible processes. The importance of avoiding irreversible processes (hence reducing the entropy production) was recognized as early as 1824 by Carnot. [1] In 1865 Rudolf Clausius expanded his previous work from 1854 [2] on the concept of "unkompensierte Verwandlungen" (uncompensated transformations), which, in our modern nomenclature, would be called the entropy production. In the same article in which he introduced the name entropy, [3] Clausius gives the expression for the entropy production for a cyclical process in a closed system, which he denotes by N, in equation (71), which reads

N = S - S_0 - \int \frac{dQ}{T}
Here S is the entropy in the final state and S_0 the entropy in the initial state; S_0 − S is the entropy difference for the backward part of the process. The integral is to be taken from the initial state to the final state, giving the entropy difference for the forward part of the process. From the context, it is clear that N = 0 if the process is reversible and N > 0 in case of an irreversible process.
The laws of thermodynamics apply to well-defined systems. Fig. 1 is a general representation of a thermodynamic system. We consider systems which, in general, are inhomogeneous. Heat and mass are transferred across the boundaries (nonadiabatic, open systems), and the boundaries are moving (usually through pistons). In our formulation we assume that heat and mass transfer and volume changes take place only separately, at well-defined regions of the system boundary. The expressions given here are not the most general formulations of the first and second law. For example, kinetic-energy and potential-energy terms are missing, and exchange of matter by diffusion is excluded.
The rate of entropy production, denoted by \dot{S}_i, is a key element of the second law of thermodynamics for open inhomogeneous systems, which reads

\frac{dS}{dt} = \sum_k \frac{\dot{Q}_k}{T_k} + \sum_k \dot{S}_k + \dot{S}_i \qquad \text{with} \qquad \dot{S}_i \geq 0
Here S is the entropy of the system; T_k is the temperature at which the heat enters the system at heat flow rate \dot{Q}_k; \dot{S}_k = \dot{n}_k S_{mk} = \dot{m}_k s_k represents the entropy flow into the system at position k, due to matter flowing into the system (\dot{n}_k and \dot{m}_k are the molar flow rate and mass flow rate, and S_{mk} and s_k are the molar entropy (i.e. entropy per unit amount of substance) and specific entropy (i.e. entropy per unit mass) of the matter flowing into the system, respectively); \dot{S}_i represents the entropy production rate due to internal processes. The subscript 'i' in \dot{S}_i refers to the fact that the entropy is produced due to irreversible processes. The entropy-production rate of every process in nature is always positive or zero. This is an essential aspect of the second law.
The Σ's indicate the algebraic sum of the respective contributions if there are more heat flows, matter flows, and internal processes.
In order to demonstrate the impact of the second law, and the role of entropy production, it has to be combined with the first law, which reads

\frac{dU}{dt} = \sum_k \dot{Q}_k + \sum_k \dot{H}_k - \sum_k p_k \frac{dV_k}{dt} + P
with U the internal energy of the system; \dot{H}_k = \dot{n}_k H_{mk} = \dot{m}_k h_k the enthalpy flows into the system due to the matter that flows into the system (H_{mk} its molar enthalpy, h_k the specific enthalpy (i.e. enthalpy per unit mass)); dV_k/dt the rates of change of the volume of the system due to a moving boundary at position k, while p_k is the pressure behind that boundary; P represents all other forms of power application (such as electrical).
The first and second law have been formulated in terms of time derivatives of U and S rather than in terms of the total differentials dU and dS, where it is tacitly assumed that dt > 0. The formulation in terms of time derivatives is not only more elegant; an even bigger advantage is that it emphasizes that heat flow rate and power are the basic thermodynamic properties, and that heat and work are derived quantities, being the time integrals of the heat flow rate and the power, respectively.
Entropy is produced in irreversible processes. Some important irreversible processes are:

- heat flow through a thermal resistance,
- fluid flow through a flow resistance, such as in the Joule expansion or the Joule–Thomson effect,
- diffusion,
- chemical reactions,
- Joule heating,
- friction between solid surfaces,
- fluid viscosity within a system.
The expression for the rate of entropy production in the first two cases will be derived in separate sections.
Most heat engines and refrigerators are closed cyclic machines. [4] In the steady state the internal energy and the entropy of the machines after one cycle are the same as at the start of the cycle. Hence, on average, dU/dt = 0 and dS/dt = 0, since U and S are functions of state. Furthermore, they are closed systems (\dot{n}_k = 0) and the volume is fixed (dV/dt = 0). This leads to a significant simplification of the first and second law:

0 = \sum_k \dot{Q}_k + P

and

0 = \sum_k \frac{\dot{Q}_k}{T_k} + \dot{S}_i
The summation is over the (two) places where heat is added or removed.
For a heat engine (Fig. 2a) the first and second law obtain the form

0 = \dot{Q}_H - \dot{Q}_a - P

and

0 = \frac{\dot{Q}_H}{T_H} - \frac{\dot{Q}_a}{T_a} + \dot{S}_i

Here \dot{Q}_H is the heat flow rate supplied at the high temperature T_H, \dot{Q}_a is the heat flow rate removed at ambient temperature T_a, and P is the power delivered by the engine. Eliminating \dot{Q}_a gives

P = \frac{T_H - T_a}{T_H}\dot{Q}_H - T_a\dot{S}_i
The efficiency is defined by

\eta = \frac{P}{\dot{Q}_H}

If \dot{S}_i = 0, the performance of the engine is at its maximum and the efficiency is equal to the Carnot efficiency

\eta_C = 1 - \frac{T_a}{T_H}
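As a numerical illustration (the temperatures, heat flow rate, and entropy production rate here are assumed values chosen for the example, not taken from the cited sources): for T_H = 600 K, T_a = 300 K, \dot{Q}_H = 1000 W, and \dot{S}_i = 0.5 W/K,

P = \frac{600 - 300}{600}\cdot 1000 - 300\cdot 0.5 = 500 - 150 = 350\ \text{W}

so the entropy production reduces the output power from the Carnot value of 500 W (\eta_C = 0.5) to 350 W (\eta = 0.35).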
For refrigerators (Fig. 2b) holds

0 = \dot{Q}_L - \dot{Q}_a + P

and

0 = \frac{\dot{Q}_L}{T_L} - \frac{\dot{Q}_a}{T_a} + \dot{S}_i

Here P is the power supplied to produce the cooling power \dot{Q}_L at the low temperature T_L. Eliminating \dot{Q}_a now gives

\dot{Q}_L = \frac{T_L}{T_a - T_L}\left(P - T_a\dot{S}_i\right)
The coefficient of performance (COP) of refrigerators is defined by

\xi = \frac{\dot{Q}_L}{P}

If \dot{S}_i = 0, the performance of the cooler is at its maximum. The COP is then given by the Carnot coefficient of performance

\xi_C = \frac{T_L}{T_a - T_L}
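Again as an assumed numerical illustration: for T_L = 260 K and T_a = 300 K the Carnot coefficient of performance is \xi_C = 260/40 = 6.5, so an input power of P = 100 W could at best provide \dot{Q}_L = 650 W of cooling power. With an entropy production rate of \dot{S}_i = 0.1 W/K the cooling power drops to

\dot{Q}_L = \frac{260}{40}\left(100 - 300\cdot 0.1\right) = 455\ \text{W}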
In both cases we find a contribution T_a\dot{S}_i which reduces the system performance. This product of the ambient temperature and the (average) entropy production rate is called the dissipated power.
It is interesting to investigate how the above mathematical formulation of the second law relates to other well-known formulations of the second law.
We first look at a heat engine, assuming that \dot{Q}_a = 0. In other words: the heat flow rate \dot{Q}_H is completely converted into power. In this case the second law would reduce to

0 = \frac{\dot{Q}_H}{T_H} + \dot{S}_i

Since \dot{Q}_H > 0 and T_H > 0, this would result in \dot{S}_i < 0, which violates the condition that the entropy production is always positive or zero. Hence: No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work. This is the Kelvin statement of the second law.
Now look at the case of the refrigerator and assume that the input power is zero. In other words: heat is transported from a low temperature to a high temperature without doing work on the system. The first law with P = 0 would give

\dot{Q}_a = \dot{Q}_L

and the second law then yields

0 = \frac{\dot{Q}_L}{T_L} - \frac{\dot{Q}_L}{T_a} + \dot{S}_i

or

\dot{S}_i = \dot{Q}_L\left(\frac{1}{T_a} - \frac{1}{T_L}\right)

Since \dot{Q}_L > 0 and T_a > T_L, this would result in \dot{S}_i < 0, which again violates the condition that the entropy production is always positive or zero. Hence: No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature. This is the Clausius statement of the second law.
In case of a heat flow rate \dot{Q} from T_1 to T_2 (with T_1 ≥ T_2) the rate of entropy production is given by

\dot{S}_i = \dot{Q}\left(\frac{1}{T_2} - \frac{1}{T_1}\right)
If the heat flows through a bar with length L, cross-sectional area A, and thermal conductivity κ, so that

\dot{Q} = \frac{\kappa A}{L}(T_1 - T_2)

and the temperature difference is small (T_1 - T_2 \ll T_1), the entropy production rate is

\dot{S}_i = \frac{\kappa A}{L}\frac{(T_1 - T_2)^2}{T_1 T_2} \approx \frac{\kappa A}{L}\frac{(T_1 - T_2)^2}{T^2}

with T ≈ T_1 ≈ T_2.
In case of a volume flow rate \dot{V} from a pressure p_1 to p_2,

\dot{S}_i = -\int_{p_1}^{p_2} \frac{\dot{V}}{T}\,dp

For small pressure drops, and defining the flow conductance C by \dot{V} = C(p_1 - p_2), we get

\dot{S}_i = C\frac{(p_1 - p_2)^2}{T}
The dependences of \dot{S}_i on T_1 − T_2 and on p_1 − p_2 are quadratic.
This is typical for expressions of the entropy production rates in general. They guarantee that the entropy production is positive.
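A minimal Python sketch of these two rate expressions, illustrating the quadratic dependence (the bar dimensions, thermal conductivity, and flow conductance are assumed illustrative values, not taken from the text):

def S_dot_heat(T1, T2, kappa=400.0, A=1e-4, L=0.1):
    """Entropy production rate (W/K) of heat conduction through a bar."""
    Q_dot = kappa * A / L * (T1 - T2)      # heat flow rate, W
    return Q_dot * (1.0 / T2 - 1.0 / T1)   # = kappa*A/L * (T1-T2)^2 / (T1*T2)

def S_dot_flow(p1, p2, T, C=1e-9):
    """Entropy production rate (W/K) of a volume flow for a small pressure drop."""
    return C * (p1 - p2) ** 2 / T

# Doubling the driving force roughly quadruples the entropy production:
print(S_dot_heat(310.0, 300.0))         # ~4.3e-4 W/K
print(S_dot_heat(320.0, 300.0))         # ~1.7e-3 W/K, about 4x larger
print(S_dot_flow(2.0e5, 1.0e5, 300.0))  # ~3.3e-2 W/K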
In this section we calculate the entropy of mixing when two ideal gases diffuse into each other. Consider a volume V_t divided in two volumes V_a and V_b so that V_t = V_a + V_b. The volume V_a contains amount of substance n_a of an ideal gas a, and V_b contains amount of substance n_b of gas b. The total amount of substance is n_t = n_a + n_b. The temperature and pressure in the two volumes are the same. The entropy at the start is given by

S_1 = n_a\left(C_V\ln T + R\ln V_a\right) + n_b\left(C_V\ln T + R\ln V_b\right) + \text{const}
When the division between the two gases is removed, the two gases expand, comparable to a Joule expansion. In the final state the temperature is the same as initially, but the two gases now both occupy the volume V_t. The relation for the entropy of an amount of substance n of an ideal gas is

S = nC_V\ln T + nR\ln V + \text{const}
where C_V is the molar heat capacity at constant volume and R is the molar gas constant. The system is an adiabatic closed system, so the entropy increase during the mixing of the two gases is equal to the entropy production. It is given by

S_i = \Delta S = n_a\left(C_V\ln T + R\ln V_t\right) + n_b\left(C_V\ln T + R\ln V_t\right) - n_a\left(C_V\ln T + R\ln V_a\right) - n_b\left(C_V\ln T + R\ln V_b\right)
As the initial and final temperature are the same, the temperature terms cancel, leaving only the volume terms. The result is

S_i = n_a R\ln\frac{V_t}{V_a} + n_b R\ln\frac{V_t}{V_b}
Introducing the mole fraction (concentration) x = n_a/n_t = V_a/V_t we arrive at the well-known expression

S_i = -n_t R\left[x\ln x + (1 - x)\ln(1 - x)\right]
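For example, for an equimolar mixture (x = 1/2) this gives S_i = n_t R\ln 2, i.e. about 5.76 J/K per mole of total gas.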
The Joule expansion is similar to the mixing described above. It takes place in an adiabatic system consisting of a gas and two rigid vessels a and b of equal volume, connected by a valve. Initially, the valve is closed. Vessel a contains the gas, while the other vessel b is empty. When the valve is opened, the gas flows from vessel a into b until the pressures in the two vessels are equal. The volume taken by the gas is doubled, while the internal energy of the system is constant (adiabatic and no work done). Assuming that the gas is ideal, the molar internal energy is given by U_m = C_V T. As C_V is constant, constant U means constant T. The molar entropy of an ideal gas, as a function of the molar volume V_m and T, is given by

S_m = C_V\ln T + R\ln V_m + \text{const}
The system consisting of the two vessels and the gas is closed and adiabatic, so the entropy production during the process is equal to the increase of the entropy of the gas. So, doubling the volume with T constant gives that the molar entropy produced is

\Delta S_m = R\ln 2
The Joule expansion provides an opportunity to explain the entropy production in statistical mechanical (i.e., microscopic) terms. At the expansion, the volume that the gas can occupy is doubled. This means that, for every molecule, there are now two possibilities: it can be placed in container a or b. If the gas has amount of substance n, the number of molecules is equal to n·N_A, where N_A is the Avogadro constant. The number of microscopic possibilities increases by a factor of 2 per molecule due to the doubling of volume, so in total the factor is 2^{nN_A}. Using the well-known Boltzmann expression for the entropy

S = k\ln\Omega
where k is the Boltzmann constant and Ω is the number of microscopic possibilities to realize the macroscopic state, this gives a change in molar entropy of

\Delta S_m = k\ln 2^{N_A} = kN_A\ln 2 = R\ln 2
So, in an irreversible process, the number of microscopic possibilities to realize the macroscopic state is increased by a certain factor.
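A minimal Python check that the Boltzmann counting reproduces the thermodynamic result R ln 2 (only the standard values of the physical constants are used; nothing else is assumed):

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol
R = k_B * N_A        # molar gas constant, J/(mol K)

# Doubling the volume gives each molecule 2 possible containers, so for
# one mole Omega grows by a factor 2**N_A and S = k ln(Omega) grows by:
dS_statistical = k_B * N_A * math.log(2)   # k * ln(2**N_A)
dS_thermodynamic = R * math.log(2)         # R * ln 2 from the gas entropy

print(dS_statistical, dS_thermodynamic)    # both ~5.76 J/(mol K)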
In this section we derive the basic inequalities and stability conditions for closed systems. For closed systems the first law reduces to

\frac{dU}{dt} = \dot{Q} - p\frac{dV}{dt} + P
The second law we write as

\frac{dS}{dt} \geq \frac{\dot{Q}}{T}
For adiabatic systems \dot{Q} = 0, so dS/dt ≥ 0. In other words: the entropy of adiabatic systems cannot decrease. In equilibrium the entropy is at its maximum. Isolated systems are a special case of adiabatic systems, so this statement is also valid for isolated systems.
Now consider systems with constant temperature and volume. In most cases T is the temperature of the surroundings, with which the system is in good thermal contact. Since V is constant, the first law gives \dot{Q} = dU/dt - P. Substitution in the second law, and using that T is constant, gives

\frac{d(U - TS)}{dt} \leq P
With the Helmholtz free energy, defined as

F = U - TS

we get

\frac{dF}{dt} \leq P
If P = 0, this is the mathematical formulation of the general property that the free energy of systems with fixed temperature and volume tends to a minimum. The expression can be integrated from the initial state i to the final state f, resulting in

W_S \leq F_i - F_f

where W_S is the work done by the system. If the process inside the system is completely reversible, the equality sign holds. Hence the maximum work that can be extracted from the system is equal to the free energy of the initial state minus the free energy of the final state.
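As a worked instance of this bound (the reversible isothermal ideal-gas expansion is a standard textbook example, not taken from the cited sources): for n moles of ideal gas expanding isothermally from V_i to V_f, we have \Delta U = 0 and \Delta S = nR\ln(V_f/V_i), so F_f - F_i = -nRT\ln(V_f/V_i) and the maximum extractable work is

W_S \leq F_i - F_f = nRT\ln\frac{V_f}{V_i}

with equality only when the expansion is carried out reversibly.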
Finally we consider systems with constant temperature and pressure and take P = 0. As p is constant, the first law gives

\frac{dU}{dt} = \dot{Q} - \frac{d(pV)}{dt}

Combining with the second law, and using that T is constant, gives

\frac{d(U + pV - TS)}{dt} \leq 0
With the Gibbs free energy, defined as

G = U + pV - TS

we get

\frac{dG}{dt} \leq 0
In homogeneous systems the temperature and pressure are well-defined and all internal processes are reversible. Hence \dot{S}_i = 0. As a result, the second law, multiplied by T, reduces to

T\frac{dS}{dt} = \dot{Q} + \sum_k \dot{n}_k TS_{mk}
With P = 0 the first law becomes

\frac{dU}{dt} = \dot{Q} + \sum_k \dot{n}_k H_{mk} - p\frac{dV}{dt}
Eliminating \dot{Q} and multiplying with dt gives

dU = TdS + \sum_k \left(H_{mk} - TS_{mk}\right)dn_k - pdV
Since

H_{mk} - TS_{mk} = G_{mk} = \mu_k

with G_{mk} the molar Gibbs free energy and μ_k the molar chemical potential, we obtain the well-known result

dU = TdS - pdV + \sum_k \mu_k dn_k
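A short consequence worth noting (a standard rearrangement, not an addition from the cited sources): combining this result with the definition G = U + pV − TS gives

dG = -SdT + Vdp + \sum_k \mu_k dn_k

so that, at constant T and p, dG = \sum_k \mu_k dn_k.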
Since physical processes can be described by stochastic processes, such as Markov chains and diffusion processes, entropy production can be defined mathematically in such processes. [5]
For a continuous-time Markov chain with instantaneous probability distribution p_i(t) and transition rate q_{ij}, the instantaneous entropy production rate is

e_p(t) = \frac{1}{2}\sum_{i,j}\left[p_i(t)q_{ij} - p_j(t)q_{ji}\right]\ln\frac{p_i(t)q_{ij}}{p_j(t)q_{ji}}
The long-time behavior of the entropy production is preserved after a proper lifting of the process. This approach provides a dynamic explanation for the Kelvin statement and the Clausius statement of the second law of thermodynamics. [6]
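A minimal Python sketch of this formula (the 3-state rate matrix below is an assumed example, chosen so that detailed balance is violated; it is not taken from references [5] or [6]):

import math

def entropy_production_rate(p, Q):
    """e_p = 1/2 * sum over i,j of (p_i q_ij - p_j q_ji) * ln[(p_i q_ij)/(p_j q_ji)]."""
    ep = 0.0
    n = len(p)
    for i in range(n):
        for j in range(n):
            if i != j and Q[i][j] > 0 and Q[j][i] > 0:
                flux = p[i] * Q[i][j] - p[j] * Q[j][i]
                ep += 0.5 * flux * math.log((p[i] * Q[i][j]) / (p[j] * Q[j][i]))
    return ep

# A driven 3-state cycle: clockwise rates (2.0) exceed counterclockwise
# rates (1.0), so the stationary state carries a circular probability flux.
Q = [[0.0, 2.0, 1.0],
     [1.0, 0.0, 2.0],
     [2.0, 1.0, 0.0]]
p = [1/3, 1/3, 1/3]  # stationary distribution of this chain

print(entropy_production_rate(p, Q))  # ~0.693 > 0: the chain is irreversible

Each term of the sum pairs a net probability flux with the logarithm of the corresponding rate ratio, so every contribution is nonnegative and e_p vanishes exactly when detailed balance holds.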
Entropy production in diffusive-reactive systems has also been studied, with interesting results emerging from diffusion, cross-diffusion, and reactions. [7]