In thermodynamics, the enthalpy of mixing (also heat of mixing and excess enthalpy) is the enthalpy liberated or absorbed when one substance is mixed with another. [1] When a substance or compound is combined with any other substance or compound, the enthalpy of mixing is the consequence of the new interactions between the two substances or compounds. [1] If this enthalpy is released exothermically, an extreme case can result in an explosion.
Enthalpy of mixing can often be ignored in calculations for mixtures where other, much larger heat terms are present, or in cases where the mixture is ideal. [2] The sign convention is the same as for enthalpy of reaction: a positive enthalpy of mixing indicates endothermic mixing, while a negative enthalpy of mixing indicates exothermic mixing. In ideal mixtures, the enthalpy of mixing is zero. In non-ideal mixtures, the thermodynamic activity of each component differs from its concentration by a factor of its activity coefficient.
One approximation for calculating the heat of mixing is Flory–Huggins solution theory for polymer solutions.
For a liquid, the enthalpy of mixing can be defined as follows: [2]

$\Delta H_{\text{mix}} = H_{\text{mixture}} - \sum_i x_i H_i$

Where:
$H_{\text{mixture}}$ is the total (molar) enthalpy of the system after mixing,
$\Delta H_{\text{mix}}$ is the enthalpy of mixing,
$x_i$ is the mole fraction of component $i$, and
$H_i$ is the (molar) enthalpy of pure component $i$.
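As an illustrative sketch (not taken from the cited sources), the definition above can be applied directly once the mixture enthalpy and the pure-component enthalpies are known; the numerical values in the following Python fragment are placeholders.

```python
# Minimal sketch of the definition above: the enthalpy of mixing is the measured
# molar enthalpy of the mixture minus the mole-fraction-weighted sum of the
# pure-component molar enthalpies.  All numbers are illustrative placeholders.

def enthalpy_of_mixing(h_mixture, mole_fractions, pure_enthalpies):
    """Return ΔH_mix = H_mixture − Σ x_i·H_i (same units as the inputs, e.g. J/mol)."""
    if abs(sum(mole_fractions) - 1.0) > 1e-9:
        raise ValueError("mole fractions must sum to 1")
    ideal_part = sum(x * h for x, h in zip(mole_fractions, pure_enthalpies))
    return h_mixture - ideal_part

# Hypothetical binary example at x = (0.4, 0.6): a negative result means
# the mixing is exothermic.
print(enthalpy_of_mixing(-285.0, [0.4, 0.6], [-250.0, -300.0]))  # -> -5.0 J/mol
```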
Enthalpy of mixing can also be defined using the Gibbs free energy of mixing:

$\Delta H_{\text{mix}} = \Delta G_{\text{mix}} + T\,\Delta S_{\text{mix}}$
However, Gibbs free energy of mixing and entropy of mixing tend to be more difficult to determine experimentally. [3] As such, enthalpy of mixing tends to be determined experimentally in order to calculate entropy of mixing, rather than the reverse.
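For example, given a calorimetric value of the enthalpy of mixing and a model-based Gibbs free energy of mixing, the entropy of mixing follows by rearranging the relation above; the following sketch uses placeholder values.

```python
# Rearranging ΔH_mix = ΔG_mix + T·ΔS_mix gives ΔS_mix = (ΔH_mix − ΔG_mix) / T.
# The inputs below are illustrative, not measured data.

def entropy_of_mixing(dH_mix, dG_mix, T):
    """Return ΔS_mix in J/(mol·K) for ΔH_mix, ΔG_mix in J/mol and T in K."""
    return (dH_mix - dG_mix) / T

print(entropy_of_mixing(dH_mix=-500.0, dG_mix=-1700.0, T=298.15))  # ≈ 4.02 J/(mol·K)
```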
Enthalpy of mixing is defined exclusively for the continuum regime, which excludes molecular-scale effects (however, first-principles calculations have been made for some metal-alloy systems such as Al–Co–Cr [4] or β-Ti [5]).
When two substances are mixed, the resulting enthalpy is not an addition of the pure-component enthalpies unless the substances form an ideal mixture. [6] The interactions between each set of molecules determine the final change in enthalpy. For example, when compound “x” has a strong attractive interaction with compound “y”, the mixing is exothermic. [6] In the case of an alcohol and its interactions with a hydrocarbon, the alcohol molecule participates in hydrogen bonding with other alcohol molecules, and these hydrogen-bonding interactions are much stronger than alcohol–hydrocarbon interactions, which results in an endothermic heat of mixing. [7]
Enthalpy of mixing is often determined experimentally using calorimetry. A bomb calorimeter is built as an isolated system, with an insulated frame and a reaction chamber, and is used to transfer the heat of mixing into surrounding water, whose temperature is measured. A typical calculation uses the defining equation above in conjunction with the experimentally determined total-mixture enthalpy and tabulated pure-species enthalpies, the difference being equal to the enthalpy of mixing.
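A hedged sketch of the calorimetric bookkeeping described above (assuming the calorimeter's own heat capacity is negligible, and with purely illustrative readings) might look as follows.

```python
# Heat absorbed by the surrounding water: q = m·c·ΔT.  Dividing by the total
# moles mixed gives the molar enthalpy of mixing; the sign is negative when the
# water warms, i.e. when mixing is exothermic.

C_WATER = 4.184  # J/(g·K), specific heat capacity of liquid water

def molar_enthalpy_of_mixing(mass_water_g, delta_T_K, total_moles):
    q_water = mass_water_g * C_WATER * delta_T_K  # heat gained by the water bath
    return -q_water / total_moles                 # exothermic mixing -> ΔH_mix < 0

# Example: 500 g of water warms by 0.85 K when 2.0 mol of liquids are mixed.
print(round(molar_enthalpy_of_mixing(500.0, 0.85, 2.0)))  # -> -889 J/mol
```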
More complex models, such as the Flory–Huggins and UNIFAC models, allow prediction of enthalpies of mixing. Flory–Huggins is useful for calculating enthalpies of mixing of polymeric mixtures and considers the system from a lattice-multiplicity (statistical) perspective.
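As a sketch of the Flory–Huggins result for the heat of mixing of a polymer solution, $\Delta H_{\text{mix}} = RT\chi n_1 \varphi_2$, with $\chi$ the interaction parameter, $n_1$ the moles of solvent, and $\varphi_2$ the polymer volume fraction; the $\chi$ value below is a placeholder rather than a fitted constant.

```python
# Flory–Huggins enthalpy of mixing for a polymer solution:
#   ΔH_mix = R·T·χ·n1·φ2
# with χ the dimensionless interaction parameter, n1 the moles of solvent and
# φ2 the polymer volume fraction.  The parameter values are illustrative only.

R = 8.314  # J/(mol·K)

def flory_huggins_dH_mix(chi, n_solvent, phi_polymer, T):
    """Return ΔH_mix in joules for the whole system."""
    return R * T * chi * n_solvent * phi_polymer

print(flory_huggins_dH_mix(chi=0.4, n_solvent=1.0, phi_polymer=0.1, T=300.0))
# -> ≈ 99.8 J; a positive χ gives endothermic mixing
```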
Calculations of organic enthalpies of mixing can be made by modifying UNIFAC. [8] In such activity-coefficient models, the excess (mixing) enthalpy follows from the temperature dependence of the activity coefficients through

$H^{E} = -RT^{2} \sum_i x_i \left(\frac{\partial \ln \gamma_i}{\partial T}\right)_{P,x}$

Where:
$\gamma_i$ is the activity coefficient of component $i$,
$x_i$ is its mole fraction,
$T$ is the absolute temperature, and
$R$ is the gas constant.
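A hedged numerical sketch of this route is given below; since a full UNIFAC implementation requires group-contribution tables, a simple two-suffix Margules activity-coefficient model (with a hypothetical, temperature-independent parameter A) stands in for the UNIFAC activity coefficients.

```python
# Excess enthalpy from the temperature dependence of activity coefficients:
#   H^E = -R·T²·Σ x_i·(∂ ln γ_i / ∂T)
# A two-suffix Margules model is used as a stand-in for UNIFAC here.

R = 8.314  # J/(mol·K)

def ln_gammas_margules(x1, T, A=1000.0):
    """Two-suffix Margules ln γ1, ln γ2 (A in J/mol, temperature-independent)."""
    x2 = 1.0 - x1
    return (A / (R * T)) * x2**2, (A / (R * T)) * x1**2

def excess_enthalpy(x1, T, dT=0.01):
    """H^E in J/mol via a central finite difference of ln γ_i with respect to T."""
    x2 = 1.0 - x1
    g_hi = ln_gammas_margules(x1, T + dT)
    g_lo = ln_gammas_margules(x1, T - dT)
    dlng_dT = [(h - l) / (2.0 * dT) for h, l in zip(g_hi, g_lo)]
    return -R * T**2 * (x1 * dlng_dT[0] + x2 * dlng_dT[1])

# For this stand-in model H^E reduces analytically to A·x1·x2 (250 J/mol at x1 = 0.5).
print(round(excess_enthalpy(0.5, 298.15), 1))
```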
Prediction of the enthalpy of mixing is therefore quite involved and requires many system variables to be known, which explains why the enthalpy of mixing is typically determined experimentally.
The excess Gibbs free energy of mixing can be related to the enthalpy of mixing by the use of the Gibbs–Helmholtz equation:

$\left(\frac{\partial (\Delta G_{\text{mix}}^{E}/T)}{\partial T}\right)_{P,x} = -\frac{\Delta H_{\text{mix}}}{T^{2}}$

or equivalently

$\left(\frac{\partial (\Delta G_{\text{mix}}^{E}/T)}{\partial (1/T)}\right)_{P,x} = \Delta H_{\text{mix}}$
In these equations, the excess and total enthalpies of mixing are equal because the ideal enthalpy of mixing is zero. This is not true for the corresponding Gibbs free energies however.
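For completeness, the first form follows from standard thermodynamic identities (a routine derivation, reproduced here rather than taken from the cited sources):

$\frac{\partial}{\partial T}\left(\frac{\Delta G_{\text{mix}}^{E}}{T}\right)_{P,x} = \frac{1}{T}\left(\frac{\partial \Delta G_{\text{mix}}^{E}}{\partial T}\right)_{P,x} - \frac{\Delta G_{\text{mix}}^{E}}{T^{2}} = \frac{-\Delta S_{\text{mix}}^{E}}{T} - \frac{\Delta G_{\text{mix}}^{E}}{T^{2}} = -\frac{\Delta G_{\text{mix}}^{E} + T\,\Delta S_{\text{mix}}^{E}}{T^{2}} = -\frac{\Delta H_{\text{mix}}}{T^{2}}$

using $(\partial G/\partial T)_{P,x} = -S$ applied to the excess quantities, $H^{E} = G^{E} + TS^{E}$, and the equality of the excess and total enthalpies of mixing noted above.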
An ideal mixture is one in which the mixture's properties are the arithmetic mean (weighted by mole fraction) of those of the two pure substances. Among other important thermodynamic simplifications, this means that the enthalpy of mixing is zero: $\Delta H_{\text{mix}} = 0$. Any gas that follows the ideal gas law can be assumed to mix ideally, as can hydrocarbons and liquids with similar molecular interactions and properties. [2]
A regular solution or mixture has a non-zero enthalpy of mixing together with an ideal entropy of mixing. [9] [10] Under this assumption, $\Delta H_{\text{mix}}$ scales linearly with $x_1 x_2$, and is equivalent to the excess internal energy. [11]
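As a quick illustration of this composition dependence, with a hypothetical interaction parameter $\Omega$ so that $\Delta H_{\text{mix}} = \Omega x_1 x_2$:

```python
# Regular-solution sketch: ΔH_mix = Ω·x1·x2 is symmetric in composition and
# vanishes for either pure component.  Ω below is a made-up value.

OMEGA = 2000.0  # J/mol, hypothetical interaction parameter

for x1 in (0.0, 0.25, 0.5, 0.75, 1.0):
    dH = OMEGA * x1 * (1.0 - x1)
    print(f"x1 = {x1:4.2f}  ->  ΔH_mix = {dH:6.1f} J/mol")
```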
The enthalpy of mixing for a ternary mixture can be expressed in terms of the enthalpies of mixing of the corresponding binary mixtures:

$\Delta H_{\text{mix},123} = \sum_{i<j} (x_i + x_j)^{2}\, \Delta H_{\text{mix},ij}$

Where:
$x_i$ is the mole fraction of component $i$ in the ternary mixture, and
$\Delta H_{\text{mix},ij}$ is the enthalpy of mixing of the binary $i$–$j$ mixture.
This method requires that the interactions between two species are unaffected by the addition of the third species. $\Delta H_{\text{mix},ij}$ is then evaluated for a binary concentration ratio equal to the concentration ratio of species $i$ to $j$ in the ternary mixture ($x_i / x_j$). [12]
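The following hedged sketch illustrates one such binary-contribution estimate; a Kohler-type weighting is assumed here for concreteness, and the exact combining rule of the cited work may differ.

```python
# Ternary enthalpy of mixing assembled from binary contributions (Kohler-type):
#   ΔH_123 = Σ_{i<j} (x_i + x_j)² · ΔH_ij( x_i / (x_i + x_j) )
# Each binary ΔH_ij is a function of the first component's mole fraction in that
# pair; simple regular-solution forms with made-up Ω values serve as placeholders.

def ternary_dH_mix(x, binary_dH):
    """x = (x1, x2, x3); binary_dH maps a pair (i, j) to a function of x_i in the binary."""
    total = 0.0
    for (i, j), dH_ij in binary_dH.items():
        s = x[i] + x[j]
        if s > 0.0:
            total += s**2 * dH_ij(x[i] / s)  # binary evaluated at ratio x_i : x_j
    return total

binaries = {
    (0, 1): lambda xb: 1500.0 * xb * (1.0 - xb),
    (0, 2): lambda xb: -800.0 * xb * (1.0 - xb),
    (1, 2): lambda xb: 400.0 * xb * (1.0 - xb),
}
print(round(ternary_dH_mix((0.2, 0.3, 0.5), binaries), 1))  # ≈ 70.0 J/mol
```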
Intermolecular forces are the main contributors to changes in the enthalpy of a mixture. Stronger attractive forces between the mixed molecules, such as hydrogen-bonding, induced-dipole, and dipole–dipole interactions, result in a lower enthalpy of the mixture and a release of heat. [6] If strong interactions exist only between like molecules, such as the hydrogen bonds between water molecules in a water–hexane solution, the mixture will have a higher total enthalpy and absorb heat.
Enthalpy is the sum of a thermodynamic system's internal energy and the product of its pressure and volume. It is a state function in thermodynamics used in many measurements in chemical, biological, and physical systems at a constant external pressure, which is conveniently provided by the large ambient atmosphere. The pressure–volume term expresses the work that was done against constant external pressure to establish the system's physical dimensions, taking it from zero volume to its final volume, i.e. to make room for it by displacing its surroundings. The pressure–volume term is very small for solids and liquids at common conditions, and fairly small for gases. Therefore, enthalpy is a stand-in for energy in chemical systems; bond, lattice, solvation, and other chemical "energies" are actually enthalpy differences. As a state function, enthalpy depends only on the final configuration of internal energy, pressure, and volume, not on the path taken to achieve it.
In thermodynamics, the Gibbs free energy is a thermodynamic potential that can be used to calculate the maximum amount of work, other than pressure–volume work, that may be performed by a thermodynamically closed system at constant temperature and pressure. It also provides a necessary condition for processes such as chemical reactions that may occur under these conditions. The Gibbs free energy is expressed as $G = H - TS$, where $H$ is the enthalpy of the system, $T$ is the absolute temperature, and $S$ is the entropy of the system.
An ideal solution or ideal mixture is a solution that exhibits thermodynamic properties analogous to those of a mixture of ideal gases. The enthalpy of mixing is zero as is the volume change on mixing by definition; the closer to zero the enthalpy of mixing is, the more "ideal" the behavior of the solution becomes. The vapor pressures of the solvent and solute obey Raoult's law and Henry's law, respectively, and the activity coefficient is equal to one for each component.
Flory–Huggins solution theory is a lattice model of the thermodynamics of polymer solutions which takes account of the great dissimilarity in molecular sizes in adapting the usual expression for the entropy of mixing. The result is an equation for the Gibbs free energy change for mixing a polymer with a solvent. Although it makes simplifying assumptions, it generates useful results for interpreting experiments.
In thermodynamics, the entropy of mixing is the increase in the total entropy when several initially separate systems of different composition, each in a thermodynamic state of internal equilibrium, are mixed without chemical reaction by the thermodynamic operation of removal of impermeable partition(s) between them, followed by a time for establishment of a new thermodynamic state of internal equilibrium in the new unpartitioned closed system.
Thermodynamic databases contain information about thermodynamic properties for substances, the most important being enthalpy, entropy, and Gibbs free energy. Numerical values of these thermodynamic properties are collected as tables or are calculated from thermodynamic datafiles. Data is expressed as temperature-dependent values for one mole of substance at the standard pressure of 101.325 kPa, or 100 kPa. Both of these definitions for the standard condition for pressure are in use.
The Langmuir adsorption model explains adsorption by assuming an adsorbate behaves as an ideal gas at isothermal conditions. According to the model, adsorption and desorption are reversible processes. This model even explains the effect of pressure; i.e., at these conditions the adsorbate's partial pressure is related to its volume V adsorbed onto a solid adsorbent. The adsorbent is assumed to be an ideal solid surface composed of a series of distinct sites capable of binding the adsorbate. The adsorbate binding is treated as a chemical reaction between the adsorbate gaseous molecule and an empty sorption site S, yielding an adsorbed species with an associated equilibrium constant.
In chemical thermodynamics, excess properties are properties of mixtures which quantify the non-ideal behavior of real mixtures. They are defined as the difference between the value of the property in a real mixture and the value that would exist in an ideal solution under the same conditions. The most frequently used excess properties are the excess volume, excess enthalpy, and excess chemical potential. The excess volume, internal energy, and enthalpy are identical to the corresponding mixing properties; that is, $V^{E} = \Delta V_{\text{mix}}$, $U^{E} = \Delta U_{\text{mix}}$, and $H^{E} = \Delta H_{\text{mix}}$.