The Joule expansion (a subset of free expansion) is an irreversible process in thermodynamics in which a volume of gas is kept in one side of a thermally isolated container (via a small partition), with the other side of the container being evacuated. The partition between the two parts of the container is then opened, and the gas fills the whole container.
The Joule expansion, treated as a thought experiment involving ideal gases, is a useful exercise in classical thermodynamics. It provides a convenient example for calculating changes in thermodynamic quantities, including the resulting increase in entropy of the universe (entropy production) that results from this inherently irreversible process. An actual Joule expansion experiment necessarily involves real gases; the temperature change in such a process provides a measure of intermolecular forces.
This type of expansion is named after James Prescott Joule, who used it in 1845 in his study of the mechanical equivalent of heat. The expansion was known long before Joule, however, e.g. to John Leslie at the beginning of the 19th century, and was studied by Joseph-Louis Gay-Lussac in 1807 with results similar to those obtained by Joule. [1] [2]
The Joule expansion should not be confused with the Joule–Thomson expansion or throttling process which refers to the steady flow of a gas from a region of higher pressure to one of lower pressure via a valve or porous plug.
The process begins with gas under some pressure, $P_\mathrm{i}$, at temperature $T_\mathrm{i}$, confined to one half of a thermally isolated container (see the top part of the drawing at the beginning of this article). The gas occupies an initial volume $V_0$, mechanically separated from the other part of the container, which has a volume $V_0$ and is under near-zero pressure. The tap (solid line) between the two halves of the container is then suddenly opened, and the gas expands to fill the entire container, which has a total volume of $2V_0$ (see the bottom part of the drawing). A thermometer inserted into the compartment on the left (not shown in the drawing) measures the temperature of the gas before and after the expansion.
The system in this experiment consists of both compartments; that is, the entire region occupied by the gas at the end of the experiment. Because this system is thermally isolated, it cannot exchange heat with its surroundings. Also, since the system's total volume is kept constant, the system cannot perform work on its surroundings. [3] As a result, the change in internal energy, $\Delta U$, is zero. Internal energy consists of internal kinetic energy (due to the motion of the molecules) and internal potential energy (due to intermolecular forces). When the molecular motion is random, temperature is the measure of the internal kinetic energy. In this case, the internal kinetic energy is called heat. If the chambers have not reached equilibrium, there will be some kinetic energy of flow, which is not detectable by a thermometer (and therefore is not a component of heat). Thus, a change in temperature indicates a change in kinetic energy, and some of this change will not appear as heat until and unless thermal equilibrium is reestablished. When heat is transferred into kinetic energy of flow, the temperature decreases. [4]

In practice, the simple two-chamber free-expansion experiment often incorporates a porous plug through which the expanding air must flow to reach the lower-pressure chamber. The purpose of this plug is to inhibit directional flow, thereby hastening the reestablishment of thermal equilibrium. Since the total internal energy does not change, the stagnation of flow in the receiving chamber converts kinetic energy of flow back into random motion (heat), so that the temperature climbs to its predicted value. If the initial air temperature is low enough that non-ideal gas properties cause condensation, some internal energy is converted into latent heat (an offsetting change in potential energy) in the liquid products. Thus, at low temperatures the Joule expansion provides information on intermolecular forces.
If the gas is ideal, both the initial ($P_\mathrm{i}$, $V_\mathrm{i}$, $T_\mathrm{i}$) and final ($P_\mathrm{f}$, $V_\mathrm{f} = 2V_\mathrm{i}$, $T_\mathrm{f}$) conditions follow the Ideal Gas Law, so that initially $P_\mathrm{i} V_\mathrm{i} = n R T_\mathrm{i}$ and then, after the tap is opened, $P_\mathrm{f} V_\mathrm{f} = n R T_\mathrm{f}$.
Here $n$ is the number of moles of gas and $R$ is the molar ideal gas constant. Because the internal energy does not change, and the internal energy of an ideal gas is solely a function of temperature, the temperature of the gas does not change; therefore $T_\mathrm{i} = T_\mathrm{f}$. This implies that $P_\mathrm{i} V_\mathrm{i} = P_\mathrm{f} V_\mathrm{f}$.
Therefore if the volume doubles, the pressure halves.
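As a quick numerical check, this ideal-gas bookkeeping can be sketched in a few lines of Python (the numbers below are illustrative, not from Joule's experiment):

```python
# Ideal gas law P = nRT/V applied before and after a Joule expansion
# in which the volume doubles at unchanged temperature.
R = 8.314  # molar gas constant, J/(mol K)

def ideal_gas_pressure(n, T, V):
    """Return the pressure of n moles of ideal gas at temperature T in volume V."""
    return n * R * T / V

n, T = 1.0, 300.0                         # 1 mol at 300 K (illustrative values)
V_i = 0.01                                # initial volume in m^3 (illustrative)
P_i = ideal_gas_pressure(n, T, V_i)
P_f = ideal_gas_pressure(n, T, 2 * V_i)   # same T, doubled volume
assert abs(P_f - P_i / 2) < 1e-9          # the pressure halves
```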
The fact that the temperature does not change makes it easy to compute the change in entropy of the universe for this process.
Unlike that of an ideal gas, the temperature of a real gas changes during a Joule expansion. At temperatures below their inversion temperature, gases cool during Joule expansion, while at higher temperatures they warm. [5] [6] The inversion temperature of a gas is typically much higher than room temperature; exceptions are helium, with an inversion temperature of about 40 K, and hydrogen, with an inversion temperature of about 200 K. Since the internal energy of the gas is constant during Joule expansion, cooling must be due to the conversion of internal kinetic energy to internal potential energy, with the opposite being the case for warming.
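To illustrate the size of this effect, here is a hedged sketch using the van der Waals model (not a calculation from the source): its internal energy is $U = nC_VT - an^2/V$, and setting $U$ constant across the expansion gives $\Delta T = (an/C_V)(1/V_\mathrm{f} - 1/V_\mathrm{i})$, negative for an expansion. The van der Waals constant for nitrogen and the diatomic $C_V$ are standard literature values; the volumes are illustrative.

```python
# Hedged sketch: Joule-expansion cooling of a van der Waals gas.
# Internal energy: U = n*Cv*T - a*n**2/V.  Constant U implies
#   dT = (a*n/Cv) * (1/V_f - 1/V_i)   (negative for an expansion).
R = 8.314            # molar gas constant, J/(mol K)
Cv = 2.5 * R         # molar heat capacity of a diatomic gas such as N2
a = 0.137            # van der Waals 'a' for N2, Pa m^6/mol^2

def joule_dT(n, V_i, V_f):
    """Temperature change of n moles expanding from V_i to V_f at constant U."""
    return (a * n / Cv) * (1.0 / V_f - 1.0 / V_i)

dT = joule_dT(n=1.0, V_i=0.001, V_f=0.002)   # 1 mol doubling its volume
print(f"temperature change: {dT:.2f} K")     # a drop of a few kelvin
```

Under these illustrative conditions (roughly 25 bar initially) the predicted drop is a few kelvin, the same order of magnitude as the roughly 3 °C calculated for air in Joule's historical experiment.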
Intermolecular forces are repulsive at short range and attractive at long range (for example, see the Lennard-Jones potential). Since distances between gas molecules are large compared to molecular diameters, the energy of a gas is usually influenced mainly by the attractive part of the potential. As a result, expanding a gas usually increases the potential energy associated with intermolecular forces. Some textbooks say that for gases this must always be the case and that a Joule expansion must always produce cooling. [7] [8] When molecules are close together, however, repulsive interactions are much more important and it is thus possible to get an increase in temperature during a Joule expansion. [9]
It is theoretically predicted that, at sufficiently high temperature, all gases will warm during a Joule expansion. [5] The reason is that at any moment, a very small number of molecules will be undergoing collisions; for those few molecules, repulsive forces dominate and the potential energy is positive. As the temperature rises, both the frequency of collisions and the energy involved in the collisions increase, so the positive potential energy associated with collisions increases strongly. If the temperature is high enough, that can make the total potential energy positive, in spite of the much larger number of molecules experiencing weak attractive interactions. When the potential energy is positive, a constant-energy expansion reduces potential energy and increases kinetic energy, resulting in an increase in temperature. This behavior has been observed only for hydrogen and helium, which have very weak attractive interactions. For other gases this "Joule inversion temperature" appears to be extremely high. [6]
Entropy is a function of state, and therefore the entropy change can be computed directly from knowledge of the final and initial equilibrium states. For an ideal gas, the change in entropy [10] is the same as for isothermal expansion where all heat is converted to work:

$\Delta S = nR \ln\frac{V_\mathrm{f}}{V_\mathrm{i}} = nR \ln 2$
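This isothermal-expansion formula, $\Delta S = nR \ln(V_\mathrm{f}/V_\mathrm{i})$, is easy to evaluate numerically; a minimal sketch with illustrative values:

```python
import math

# Entropy change of an ideal gas between two states at the same temperature:
#   dS = n * R * ln(V_f / V_i)
R = 8.314  # molar gas constant, J/(mol K)

def delta_S(n, V_i, V_f):
    return n * R * math.log(V_f / V_i)

dS = delta_S(n=1.0, V_i=1.0, V_f=2.0)  # doubling the volume
print(round(dS, 3))                    # nR ln 2, about 5.763 J/K
```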
For an ideal monatomic gas, the entropy as a function of the internal energy $U$, volume $V$, and number of moles $n$ is given by the Sackur–Tetrode equation: [11]

$S = nR\left[\ln\!\left(\frac{V}{nN_\mathrm{A}}\left(\frac{4\pi m U}{3 n N_\mathrm{A} h^2}\right)^{3/2}\right) + \frac{5}{2}\right]$
In this expression $m$ is the particle mass, $h$ is the Planck constant, and $N_\mathrm{A}$ is the Avogadro constant. For a monatomic ideal gas $U = \frac{3}{2}nRT = nC_VT$, with $C_V$ the molar heat capacity at constant volume. Evaluating the Sackur–Tetrode expression before and after the expansion, with $U$ unchanged and $V$ doubled, again gives $\Delta S = nR \ln 2$.
A second way to evaluate the entropy change is to choose a route from the initial state to the final state along which all the intermediate states are in equilibrium. Such a route can be realized only in the limit where the changes happen infinitely slowly; such routes are referred to as quasistatic. Some books demand that a quasistatic route be reversible; here we do not add this extra condition. The net entropy change from the initial state to the final state is independent of the particular choice of quasistatic route, as entropy is a function of state.
Here is how we can effect the quasistatic route. Instead of letting the gas undergo a single free expansion in which the volume is doubled, a free expansion is allowed in which the volume expands by a very small amount $\delta V$. After thermal equilibrium is reached, we let the gas undergo another free expansion by $\delta V$ and again wait until thermal equilibrium is reached. We repeat this until the volume has been doubled. In the limit of $\delta V$ to zero, this becomes an ideal quasistatic process, albeit an irreversible one. Now, according to the fundamental thermodynamic relation, we have:

$\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V$
As this equation relates changes in thermodynamic state variables, it is valid for any quasistatic change, regardless of whether it is irreversible or reversible. For the path defined above we have $\mathrm{d}U = 0$ and thus $T\,\mathrm{d}S = P\,\mathrm{d}V$, and hence the increase in entropy for the Joule expansion is

$\Delta S = \int \frac{P}{T}\,\mathrm{d}V = \int_{V_\mathrm{i}}^{V_\mathrm{f}} \frac{nR}{V}\,\mathrm{d}V = nR \ln\frac{V_\mathrm{f}}{V_\mathrm{i}} = nR \ln 2$
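The quasistatic route of many tiny free expansions can also be mimicked numerically: summing the small entropy increments $nR\,\delta V/V$ step by step approaches $nR \ln 2$. A sketch with an illustrative step count:

```python
import math

# Quasistatic route: many tiny free expansions, each contributing
# dS = P dV / T = nR dV / V (since dU = 0 implies T dS = P dV).
R = 8.314            # molar gas constant, J/(mol K)
n = 1.0
V = 1.0              # starting volume (arbitrary units; only the ratio matters)
steps = 100_000
dV = V / steps       # expand in small increments until the volume has doubled
S = 0.0
for _ in range(steps):
    S += n * R * dV / V
    V += dV
print(S, n * R * math.log(2))  # the sum approaches nR ln 2
```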
A third way to compute the entropy change involves a route consisting of reversible adiabatic expansion followed by heating. We first let the system undergo a reversible adiabatic expansion in which the volume is doubled. During the expansion, the system performs work and the gas temperature goes down, so we have to supply heat to the system equal to the work performed to bring it to the same final state as in case of Joule expansion.
During the reversible adiabatic expansion, we have $\mathrm{d}S = 0$. From the classical expression for the entropy it can be derived that the temperature after the doubling of the volume at constant entropy is given by

$T = T_\mathrm{i}\left(\frac{V_\mathrm{i}}{V_\mathrm{f}}\right)^{2/3} = \frac{T_\mathrm{i}}{2^{2/3}}$

for the monatomic ideal gas. Heating the gas back up to the initial temperature $T_\mathrm{i}$ increases the entropy by the amount

$\Delta S = \int_{T}^{T_\mathrm{i}} \frac{n C_V\,\mathrm{d}T'}{T'} = \frac{3}{2}nR \ln 2^{2/3} = nR \ln 2$
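The two steps of this route can be checked numerically. In this hedged sketch the adiabatic temperature drop uses $TV^{2/3} = \mathrm{const}$ for a monatomic ideal gas, and the heating step integrates $nC_V\,\mathrm{d}T'/T'$ in closed form:

```python
import math

# Route 3: reversible adiabatic doubling of the volume, then heating back to T_i.
R = 8.314                        # molar gas constant, J/(mol K)
n = 1.0
Cv = 1.5 * R                     # molar heat capacity of a monatomic ideal gas
T_i = 300.0                      # initial temperature, K (illustrative)
T_ad = T_i / 2 ** (2.0 / 3.0)    # from T V^(2/3) = const at dS = 0
dS_heat = n * Cv * math.log(T_i / T_ad)   # entropy gained reheating from T_ad to T_i
assert abs(dS_heat - n * R * math.log(2)) < 1e-9  # again nR ln 2
```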
We might ask what the work would be if, once the Joule expansion has occurred, the gas is put back into the left-hand side by compressing it. The best method (i.e. the method involving the least work) is that of a reversible isothermal compression, which takes work $W$ given by

$W = -\int_{V_\mathrm{f}}^{V_\mathrm{i}} P\,\mathrm{d}V = nRT \ln\frac{V_\mathrm{f}}{V_\mathrm{i}} = nRT \ln 2$
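For concreteness, the minimum compression work $nRT \ln 2$ can be evaluated with illustrative numbers (1 mol at 300 K):

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def compression_work(n, T):
    """Reversible isothermal work to halve the volume: W = nRT ln 2."""
    return n * R * T * math.log(2)

W = compression_work(1.0, 300.0)
print(f"{W:.0f} J")  # about 1729 J
```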
During the Joule expansion the surroundings do not change, i.e. the entropy of the surroundings is constant. Therefore the entropy change of the so-called "universe" is equal to the entropy change of the gas which is nR ln 2.
Joule performed his experiment with air at room temperature, expanded from a pressure of about 22 bar. Air, under these conditions, is almost an ideal gas, but not quite, so the real temperature change is not exactly zero. With our present knowledge of the thermodynamic properties of air, [12] we can calculate that the temperature of the air should drop by about 3 degrees Celsius when the volume is doubled under adiabatic conditions. However, due to the low heat capacity of the air and the high heat capacity of the strong copper containers and the water of the calorimeter, the observed temperature drop is much smaller, so Joule found that the temperature change was zero within his measuring accuracy.
The majority of good undergraduate textbooks deal with this expansion in great depth; see e.g. Concepts in Thermal Physics, Blundell & Blundell, OUP ISBN 0-19-856770-7