The Seebeck coefficient (also known as thermopower, [1] thermoelectric power, and thermoelectric sensitivity) of a material is a measure of the magnitude of an induced thermoelectric voltage in response to a temperature difference across that material, as induced by the Seebeck effect. [2] The SI unit of the Seebeck coefficient is volts per kelvin (V/K), [2] although it is more often given in microvolts per kelvin (μV/K).
The use of materials with a high Seebeck coefficient [3] is one of many important factors for the efficient behaviour of thermoelectric generators and thermoelectric coolers. More information about high-performance thermoelectric materials can be found in the Thermoelectric materials article. In thermocouples the Seebeck effect is used to measure temperatures, and for accuracy it is desirable to use materials with a Seebeck coefficient that is stable over time.
Physically, the magnitude and sign of the Seebeck coefficient can be approximately understood as being given by the entropy per unit charge carried by electrical currents in the material. It may be positive or negative. In conductors that can be understood in terms of independently moving, nearly-free charge carriers, the Seebeck coefficient is negative for negatively charged carriers (such as electrons), and positive for positively charged carriers (such as electron holes).
One way to define the Seebeck coefficient is the voltage built up when a small temperature gradient is applied to a material, and when the material has come to a steady state where the current density is zero everywhere. If the temperature difference ΔT between the two ends of a material is small, then the Seebeck coefficient of a material is defined as:

$S = -\frac{\Delta V}{\Delta T}$

where ΔV is the thermoelectric voltage seen at the terminals. (See below for more on the signs of ΔV and ΔT.)
Note that the voltage shift expressed by the Seebeck effect cannot be measured directly, since the measured voltage (by attaching a voltmeter) contains an additional voltage contribution, due to the temperature gradient and Seebeck effect in the measurement leads. The voltmeter voltage is always dependent on relative Seebeck coefficients among the various materials involved.
Most generally and technically, the Seebeck coefficient is defined in terms of the portion of electric current driven by temperature gradients, as in the vector differential equation

$\mathbf{J} = -\sigma \nabla V - \sigma S \nabla T$

where J is the current density, σ is the electrical conductivity, ∇V is the voltage gradient, and ∇T is the temperature gradient. The zero-current, steady-state special case described above has J = 0, which implies that the two electrical conductivity terms have cancelled out and so

$\nabla V = -S \nabla T$

The sign is made explicit in the following expression:

$S = -\frac{V_\text{left} - V_\text{right}}{T_\text{left} - T_\text{right}}$
Thus, if S is positive, the end with the higher temperature has the lower voltage, and vice versa. The voltage gradient in the material will point against the temperature gradient.
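For illustration, the following minimal Python sketch (with made-up numbers, not measured data) evaluates the sign-explicit definition above for a hypothetical sample whose left end is held hotter than its right end.

```python
# Minimal sketch of the defining relation S = -(V_left - V_right)/(T_left - T_right),
# using hypothetical values for a sample whose left end is the hot end.
T_left, T_right = 310.0, 300.0     # kelvin
V_left, V_right = -20e-6, 0.0      # volts: the hot end sits at the lower potential

S = -(V_left - V_right) / (T_left - T_right)
print(f"S = {S * 1e6:.1f} uV/K")   # 2.0 uV/K, positive, consistent with the sign rule above
```

The positive result reflects the rule just stated: for S > 0, the hotter end sits at the lower voltage.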
The Seebeck effect is generally dominated by the contribution from charge carrier diffusion (see below) which tends to push charge carriers towards the cold side of the material until a compensating voltage has built up. As a result, in p-type semiconductors (which have only positive mobile charges, electron holes), S is positive. Likewise, in n-type semiconductors (which have only negative mobile charges, electrons), S is negative. In most conductors, however, the charge carriers exhibit both hole-like and electron-like behaviour and the sign of S usually depends on which of them predominates.
According to the second Thomson relation (which holds for all non-magnetic materials in the absence of an externally applied magnetic field), the Seebeck coefficient is related to the Peltier coefficient Π by the exact relation

$S = \frac{\Pi}{T}$

where T is the thermodynamic temperature.
According to the first Thomson relation and under the same assumptions about magnetism, the Seebeck coefficient is related to the Thomson coefficient 𝒦 by

$S = \int_0^T \frac{\mathcal{K}(T')}{T'} \, dT'$

The constant of integration is such that S = 0 at absolute zero, as required by Nernst's theorem.
In practice the absolute Seebeck coefficient is difficult to measure directly, since the voltage output of a thermoelectric circuit, as measured by a voltmeter, only depends on differences of Seebeck coefficients. This is because electrodes attached to a voltmeter must be placed onto the material in order to measure the thermoelectric voltage. The temperature gradient then also typically induces a thermoelectric voltage across one leg of the measurement electrodes. Therefore, the measured Seebeck coefficient is a contribution from the Seebeck coefficient of the material of interest and the material of the measurement electrodes. This arrangement of two materials is usually called a thermocouple.
The measured Seebeck coefficient then includes contributions from both materials and can be written as:

$S_{AB} = S_B - S_A = \frac{\Delta V_B}{\Delta T} - \frac{\Delta V_A}{\Delta T}$
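As a hedged numerical illustration of this relation (all values below are hypothetical, not data for any particular thermocouple), the following Python sketch recovers the sample leg's absolute coefficient once the reference leg's absolute coefficient is known, assuming the open-circuit voltage equals S_AB·ΔT under the sign conventions above.

```python
# Hypothetical thermocouple: leg A is a reference lead with a known absolute Seebeck
# coefficient, leg B is the sample. All numbers are illustrative only.
S_A = -5.0e-6          # V/K, assumed known absolute coefficient of the reference leg
delta_T = 25.0         # K, temperature difference between the two junctions
V_measured = 300e-6    # V, open-circuit voltage, taken here to equal S_AB * delta_T

S_AB = V_measured / delta_T   # only the difference S_B - S_A is directly accessible
S_B = S_AB + S_A              # absolute coefficient of the sample leg
print(f"S_AB = {S_AB * 1e6:.1f} uV/K, S_B = {S_B * 1e6:.1f} uV/K")
```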
Although only relative Seebeck coefficients are important for externally measured voltages, the absolute Seebeck coefficient can be important for other effects where voltage is measured indirectly. Determination of the absolute Seebeck coefficient therefore requires more complicated techniques and is more difficult, but such measurements have been performed on standard materials. These measurements only had to be performed once for all time, and for all materials; for any other material, the absolute Seebeck coefficient can be obtained by performing a relative Seebeck coefficient measurement against a standard material.
A measurement of the Thomson coefficient 𝒦, which expresses the strength of the Thomson effect, can be used to yield the absolute Seebeck coefficient through the relation

$S(T) = \int_0^T \frac{\mathcal{K}(T')}{T'} \, dT'$

provided that 𝒦 is measured down to absolute zero. The reason this works is that S(T) is expected to decrease to zero as the temperature is brought to zero—a consequence of Nernst's theorem. Such a measurement based on the integration of 𝒦 was published in 1932, [4] though it relied on the interpolation of the Thomson coefficient in certain regions of temperature.
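A sketch of that integration in Python is shown below, under the assumption of a hypothetical, tabulated Thomson coefficient 𝒦(T); the linear form used here is invented purely for illustration, and real measured data would take its place.

```python
import numpy as np

# Hypothetical Thomson coefficient data, tabulated from just above 0 K upward.
T = np.linspace(1.0, 300.0, 600)   # kelvin (the contribution below 1 K is neglected here)
K = 1e-8 * T                       # V/K, invented linear form for illustration

# First Thomson relation: S(T) = integral from 0 to T of K(T')/T' dT'.
# The integrand stays finite as T' -> 0 because K -> 0 there (Nernst's theorem).
integrand = K / T
dS = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)   # trapezoidal slices
S = np.concatenate(([0.0], np.cumsum(dS)))                 # running integral, one value per T

print(f"S(300 K) = {S[-1] * 1e6:.2f} uV/K")
```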
Superconductors have zero Seebeck coefficient, as mentioned below. By making one of the wires in a thermocouple superconducting, it is possible to get a direct measurement of the absolute Seebeck coefficient of the other wire, since it alone determines the measured voltage from the entire thermocouple. A publication in 1958 used this technique to measure the absolute Seebeck coefficient of lead between 7.2 K and 18 K, thereby filling in an important gap in the previous 1932 experiment mentioned above. [5]
The combination of the superconductor-thermocouple technique up to 18 K, with the Thomson-coefficient-integration technique above 18 K, allowed determination of the absolute Seebeck coefficient of lead up to room temperature. By proxy, these measurements led to the determination of absolute Seebeck coefficients for all materials, even up to higher temperatures, by a combination of Thomson coefficient integrations and thermocouple circuits. [6]
The difficulty of these measurements, and the rarity of reproducing experiments, lends some degree of uncertainty to the absolute thermoelectric scale thus obtained. In particular, the 1932 measurements may have incorrectly measured the Thomson coefficient over the range 20 K to 50 K. Since nearly all subsequent publications relied on those measurements, this would mean that all of the commonly used values of absolute Seebeck coefficient (including those shown in the figures) are too low by about 0.3 μV/K, for all temperatures above 50 K. [7]
In the table below are Seebeck coefficients at room temperature for some common, nonexotic materials, measured relative to platinum. [8] The Seebeck coefficient of platinum itself is approximately −5 μV/K at room temperature, [9] and so the values listed below should be compensated accordingly. For example, the Seebeck coefficients of Cu, Ag, Au are 1.5 μV/K, and of Al −1.5 μV/K. The Seebeck coefficient of semiconductors depends strongly on doping, with generally positive values for p-type doping and negative values for n-type doping.
Material | Seebeck coefficient relative to platinum (μV/K) |
---|---|
Selenium | 900 |
Tellurium | 500 |
Silicon | 440 |
Germanium | 330 |
Antimony | 47 |
Nichrome | 25 |
Iron | 19 |
Molybdenum | 10 |
Cadmium, tungsten | 7.5 |
Gold, silver, copper | 6.5 |
Rhodium | 6.0 |
Tantalum | 4.5 |
Lead | 4.0 |
Aluminium | 3.5 |
Carbon | 3.0 |
Mercury | 0.6 |
Platinum | 0 (definition) |
Sodium | -2.0 |
Potassium | -9.0 |
Nickel | -15 |
Constantan | -35 |
Bismuth | -72 |
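As a small check on the compensation described above the table, the Python sketch below converts a few of the relative-to-platinum entries into approximate absolute values using S_Pt ≈ −5 μV/K; the result is only as reliable as that room-temperature estimate for platinum.

```python
# Convert table entries (relative to platinum) into approximate absolute values,
# using the approximate room-temperature value S_Pt = -5 uV/K quoted above.
S_PT_ABS = -5.0   # uV/K

relative_to_pt = {   # uV/K, values from the table above
    "copper": 6.5,
    "aluminium": 3.5,
    "nickel": -15.0,
    "constantan": -35.0,
}

for material, s_rel in relative_to_pt.items():
    # s_rel = S_material - S_Pt, so the absolute coefficient is s_rel + S_Pt.
    print(f"{material:>10}: {s_rel + S_PT_ABS:+.1f} uV/K (approx. absolute)")
```

For copper and aluminium this reproduces the 1.5 μV/K and −1.5 μV/K values quoted above the table.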
A material's temperature, crystal structure, and impurities influence the value of thermoelectric coefficients. The Seebeck effect can be attributed to two contributions: [10] charge-carrier diffusion and phonon drag.
On a fundamental level, an applied voltage difference refers to a difference in the thermodynamic chemical potential of charge carriers, and the direction of the current under a voltage difference is determined by the universal thermodynamic process in which (given equal temperatures) particles flow from high chemical potential to low chemical potential. In other words, the direction of the current in Ohm's law is determined via the thermodynamic arrow of time (the difference in chemical potential could be exploited to produce work, but is instead dissipated as heat which increases entropy). On the other hand, for the Seebeck effect not even the sign of the current can be predicted from thermodynamics, and so to understand the origin of the Seebeck coefficient it is necessary to understand the microscopic physics.
Charge carriers (such as thermally excited electrons) constantly diffuse around inside a conductive material. Due to thermal fluctuations, some of these charge carriers travel with a higher energy than average, and some with a lower energy. When no voltage differences or temperature differences are applied, the carrier diffusion perfectly balances out and so on average one sees no current: J = 0. A net current can be generated by applying a voltage difference (Ohm's law), or by applying a temperature difference (Seebeck effect). To understand the microscopic origin of the thermoelectric effect, it is useful to first describe the microscopic mechanism of the normal Ohm's law electrical conductance—to describe what determines the σ in J = −σ∇V. Microscopically, what is happening in Ohm's law is that higher energy levels have a higher concentration of carriers per state, on the side with higher chemical potential. For each interval of energy, the carriers tend to diffuse and spread into the area of the device where there are fewer carriers per state of that energy. As they move, however, they occasionally scatter dissipatively, which re-randomizes their energy according to the local temperature and chemical potential. This dissipation empties out the carriers from these higher energy states, allowing more to diffuse in. The combination of diffusion and dissipation favours an overall drift of the charge carriers towards the side of the material where they have a lower chemical potential. [11] : Ch.11
For the thermoelectric effect, now, consider the case of uniform voltage (uniform chemical potential) with a temperature gradient. In this case, at the hotter side of the material there is more variation in the energies of the charge carriers, compared to the colder side. This means that high energy levels have a higher carrier occupation per state on the hotter side, but also the hotter side has a lower occupation per state at lower energy levels. As before, the high-energy carriers diffuse away from the hot end, and produce entropy by drifting towards the cold end of the device. However, there is a competing process: at the same time low-energy carriers are drawn back towards the hot end of the device. Though these processes both generate entropy, they work against each other in terms of charge current, and so a net current only occurs if one of these drifts is stronger than the other. The net current is given by J = −σS∇T, where (as shown below) the thermoelectric coefficient σS depends literally on how conductive high-energy carriers are, compared to low-energy carriers. The distinction may be due to a difference in rate of scattering, a difference in speeds, a difference in density of states, or a combination of these effects.
The processes described above apply in materials where each charge carrier sees an essentially static environment so that its motion can be described independently from other carriers, and independent of other dynamics (such as phonons). In particular, in electronic materials with weak electron-electron interactions, weak electron-phonon interactions, etc. it can be shown in general that the linear response conductance is

$\sigma = \int c(E) \left( -\frac{df(E)}{dE} \right) dE$

and the linear response thermoelectric coefficient is

$\sigma S = \frac{k_\mathrm{B}}{-e} \int \frac{E - \mu}{k_\mathrm{B} T} \, c(E) \left( -\frac{df(E)}{dE} \right) dE$

where c(E) is the energy-dependent conductivity, and f(E) is the Fermi–Dirac distribution function. These equations are known as the Mott relations, of Sir Nevill Francis Mott. [12] The derivative

$-\frac{df(E)}{dE} = \frac{1}{4 k_\mathrm{B} T \cosh^2\left( \frac{E - \mu}{2 k_\mathrm{B} T} \right)}$

is a function peaked around the chemical potential (Fermi level) μ with a width of approximately 3.5 k_B T. The energy-dependent conductivity (a quantity that cannot actually be directly measured — one only measures σ) is calculated as c(E) = e² D(E) ν(E), where D(E) is the electron diffusion constant and ν(E) is the electronic density of states (in general, both are functions of energy).
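The Mott relations lend themselves to direct numerical evaluation. The Python sketch below assumes a simple, invented energy-dependent conductivity c(E) (linear in energy, with an arbitrary normalization and an arbitrary 5 eV chemical potential), weights it by −df/dE, and integrates to obtain σ and S; it is an illustration of the formulas, not a model of any real material.

```python
import numpy as np

kB = 1.380649e-23      # J/K
e  = 1.602176634e-19   # C
T  = 300.0             # K
mu = 5.0 * e           # chemical potential, an arbitrary metal-like value of 5 eV

# Assumed energy-dependent conductivity c(E); the linear form is illustrative only.
E  = np.linspace(mu - 20 * kB * T, mu + 20 * kB * T, 20001)
dE = E[1] - E[0]
c  = 1.0e7 * (E / mu)  # S/m, arbitrary normalization

# -df/dE for the Fermi-Dirac distribution, peaked at mu with width ~3.5 kB*T.
minus_df_dE = 1.0 / (4.0 * kB * T * np.cosh((E - mu) / (2.0 * kB * T)) ** 2)

sigma   = np.sum(c * minus_df_dE) * dE                                     # conductivity
sigma_S = (kB / -e) * np.sum(c * (E - mu) / (kB * T) * minus_df_dE) * dE   # Mott relation
S = sigma_S / sigma

print(f"sigma = {sigma:.2e} S/m, S = {S * 1e6:.2f} uV/K")   # about -1.5 uV/K here
```

The result agrees with the metallic Sommerfeld estimate discussed below, since the assumed c(E) varies slowly on the scale of k_B T.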
In materials with strong interactions, none of the above equations can be used since it is not possible to consider each charge carrier as a separate entity. The Wiedemann–Franz law can also be exactly derived using the non-interacting electron picture, and so in materials where the Wiedemann–Franz law fails (such as superconductors), the Mott relations also generally tend to fail. [13]
The formulae above can be simplified in a couple of important limiting cases:
In semimetals and metals, where transport only occurs near the Fermi level and c(E) changes slowly in the range E ≈ μ ± k_B T, one can perform a Sommerfeld expansion c(E) = c(μ) + c′(μ)(E − μ) + O[(E − μ)²], which leads to

$S_\text{metal} = \frac{\pi^2 k_\mathrm{B}^2 T}{3(-e)} \frac{c'(\mu)}{c(\mu)}, \qquad \sigma_\text{metal} = c(\mu)$
This expression is sometimes called "the Mott formula"; however, it is much less general than Mott's original formula expressed above.
In the free electron model with scattering, the value of c′(μ)/c(μ) is of order 1/(k_B T_F), where T_F is the Fermi temperature, and so a typical value of the Seebeck coefficient in the Fermi gas is

$S_\text{Fermi gas} \approx \frac{-\pi^2 k_\mathrm{B}}{3e} \frac{T}{T_\mathrm{F}}$

(the prefactor varies somewhat depending on details such as dimensionality and scattering). In highly conductive metals the Fermi temperatures are typically around 10⁴–10⁵ K, and so it is understandable why their absolute Seebeck coefficients are only of order 1–10 μV/K at room temperature. Note that whereas the free electron model predicts a negative Seebeck coefficient, real metals actually have complicated band structures and may exhibit positive Seebeck coefficients (examples: Cu, Ag, Au).
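That order-of-magnitude claim can be checked directly; the short Python sketch below evaluates the free-electron estimate for a representative (not material-specific) Fermi temperature.

```python
import math

kB_over_e = 8.617333262e-5   # V/K, Boltzmann constant divided by the elementary charge
T   = 300.0                  # K, room temperature
T_F = 5.0e4                  # K, representative Fermi temperature for a good metal

S = -(math.pi ** 2 / 3.0) * kB_over_e * (T / T_F)   # free-electron (Sommerfeld) estimate
print(f"S = {S * 1e6:.1f} uV/K")                    # about -1.7 uV/K, i.e. of order 1 uV/K
```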
The fraction c′(μ)/c(μ) in semimetals is sometimes calculated from the measured derivative of σ_metal with respect to some energy shift induced by field effect. This is not necessarily correct and the estimate of c′(μ)/c(μ) can be incorrect (by a factor of two or more), since the disorder potential depends on screening which also changes with field effect. [14]
In semiconductors at low levels of doping, transport only occurs far away from the Fermi level. At low doping in the conduction band (where E_F < E_c − k_B T, where E_c is the minimum energy of the conduction band edge), one has f(E) ≈ e^(−(E − μ)/(k_B T)), the Boltzmann limit of the Fermi–Dirac distribution. Approximating the conduction band levels' conductivity function as c(E) = A_c (E − E_c)^(a_c) for some constants A_c and a_c,

$S_\text{conduction band} \approx -\frac{k_\mathrm{B}}{e} \left[ \frac{E_c - E_\mathrm{F}}{k_\mathrm{B} T} + a_c + 1 \right]$

whereas in the valence band when E_F > E_v + k_B T and c(E) = A_v (E_v − E)^(a_v),

$S_\text{valence band} \approx \frac{k_\mathrm{B}}{e} \left[ \frac{E_\mathrm{F} - E_v}{k_\mathrm{B} T} + a_v + 1 \right]$
The values of a_c and a_v depend on material details; in bulk semiconductors these constants range between 1 and 3, the extremes corresponding to acoustic-mode lattice scattering and ionized-impurity scattering. [15]
In extrinsic (doped) semiconductors either the conduction or valence band will dominate transport, and so one of the expressions above will give the measured values. In general however the semiconductor may also be intrinsic, in which case the bands conduct in parallel, and so the measured values will be

$S = \frac{\sigma_\text{conduction band} S_\text{conduction band} + \sigma_\text{valence band} S_\text{valence band}}{\sigma_\text{conduction band} + \sigma_\text{valence band}}$
This results in a crossover behaviour between the two limits. The highest Seebeck coefficient is obtained when the semiconductor is lightly doped; however, a high Seebeck coefficient is not necessarily useful on its own. For thermoelectric power devices (coolers, generators) it is more important to maximize the thermoelectric power factor σS², [16] or the thermoelectric figure of merit, and the optimum generally occurs at high doping levels. [17]
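For illustration, the Python sketch below evaluates the conduction-band and valence-band expressions above and their conductivity-weighted average; every band parameter and conductivity in it is a made-up number chosen only to show the arithmetic.

```python
kB_over_e = 8.617333262e-5   # V/K
kT = 0.02585                 # k_B * T in eV at 300 K

# Hypothetical band edges, Fermi level (all in eV) and scattering exponents.
E_c, E_v, E_F = 0.56, -0.56, 0.30
a_c, a_v = 1.0, 1.0          # between 1 and 3 depending on the scattering mechanism

S_cond = -kB_over_e * ((E_c - E_F) / kT + a_c + 1)   # conduction-band term, negative
S_val  = +kB_over_e * ((E_F - E_v) / kT + a_v + 1)   # valence-band term, positive

# When both bands conduct in parallel, S is the conductivity-weighted average.
sigma_c, sigma_v = 50.0, 5.0    # S/m, arbitrary illustrative conductivities
S_total = (sigma_c * S_cond + sigma_v * S_val) / (sigma_c + sigma_v)

print(f"S_cond = {S_cond * 1e6:.0f} uV/K, S_val = {S_val * 1e6:.0f} uV/K, "
      f"S_total = {S_total * 1e6:.0f} uV/K")
```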
Phonons are not always in local thermal equilibrium; they move against the thermal gradient. They lose momentum by interacting with electrons (or other carriers) and imperfections in the crystal. If the phonon-electron interaction is predominant, the phonons will tend to push the electrons to one end of the material, hence losing momentum and contributing to the thermoelectric field. This contribution is most important in the temperature region where phonon-electron scattering is predominant. This happens for

$T \approx \tfrac{1}{5} \theta_\mathrm{D}$

where θ_D is the Debye temperature. At lower temperatures there are fewer phonons available for drag, and at higher temperatures they tend to lose momentum in phonon-phonon scattering instead of phonon-electron scattering. At lower temperatures, material boundaries also play an increasing role as the phonons can travel significant distances. [18] Practically speaking, phonon drag is an important effect in semiconductors near room temperature (even though well above θ_D/5), and is comparable in magnitude to the carrier-diffusion effect described in the previous section. [18]
This region of the thermopower-versus-temperature function is highly variable under a magnetic field.[citation needed]
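As a rough arithmetic illustration of the criterion above, assuming a hypothetical Debye temperature of 400 K:

```python
# Rough location of the phonon-drag peak, T ~ theta_D / 5, for a hypothetical
# Debye temperature (the value below is illustrative, not material-specific).
theta_D = 400.0   # K
print(f"Phonon-drag contribution strongest near {theta_D / 5.0:.0f} K")   # 80 K
```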
The Seebeck coefficient of a material corresponds thermodynamically to the amount of entropy "dragged along" by the flow of charge inside a material; it is in some sense the entropy per unit charge in the material. [19]