Negative temperature

SI temperature/coldness conversion scale: Temperatures on the Kelvin scale are shown in blue (Celsius scale in green, Fahrenheit scale in red), coldness values in gigabyte per nanojoule are shown in black. Infinite temperature (coldness zero) is shown at the top of the diagram; positive values of coldness/temperature are on the right-hand side, negative values on the left-hand side.

Certain systems can achieve negative thermodynamic temperature; that is, their temperature can be expressed as a negative quantity on the Kelvin or Rankine scales. This should be distinguished from temperatures expressed as negative numbers on non-thermodynamic Celsius or Fahrenheit scales, which are nevertheless higher than absolute zero.

The absolute temperature (Kelvin) scale can be understood loosely as a measure of average kinetic energy. Usually, system temperatures are positive. However, in particular isolated systems, the temperature defined in terms of Boltzmann's entropy can become negative.

The possibility of negative temperatures was first predicted by Lars Onsager in 1949. [1] Onsager was investigating 2D vortices confined within a finite area, and realized that since their positions are not independent degrees of freedom from their momenta, the resulting phase space must also be bounded by the finite area. Bounded phase space is the essential property that allows for negative temperatures, and can occur in both classical and quantum systems. As shown by Onsager, a system with bounded phase space necessarily has a peak in the entropy as energy is increased. For energies exceeding the value where the peak occurs, the entropy decreases as energy increases, and high-energy states necessarily have negative Boltzmann temperature.

A system with a truly negative temperature on the Kelvin scale is hotter than any system with a positive temperature. If a negative-temperature system and a positive-temperature system come in contact, heat will flow from the negative- to the positive-temperature system. [2] [3] A standard example of such a system is population inversion in laser physics.

Temperature is loosely interpreted as the average kinetic energy of the system's particles. The existence of negative temperature, let alone negative temperature representing "hotter" systems than positive temperature, would seem paradoxical in this interpretation. The paradox is resolved by considering the more rigorous definition of thermodynamic temperature as the tradeoff between internal energy and entropy contained in the system, with "coldness", the reciprocal of temperature, being the more fundamental quantity. Systems with a positive temperature will increase in entropy as one adds energy to the system, while systems with a negative temperature will decrease in entropy as one adds energy to the system. [4]

Thermodynamic systems with unbounded phase space cannot achieve negative temperatures: adding heat always increases their entropy. The possibility of a decrease in entropy as energy increases requires the system to "saturate" in entropy. This is only possible if the number of high-energy states is limited. For a system of ordinary (quantum or classical) particles such as atoms or dust, the number of high-energy states is unlimited (particle momenta can in principle be increased indefinitely). Some systems, however (see the examples below), have a maximum amount of energy that they can hold, and as they approach that maximum energy their entropy actually begins to decrease. [5] The limited range of states accessible to a system with negative temperature means that negative temperature is associated with emergent ordering of the system at high energies. For example, in Onsager's point-vortex analysis, negative temperature is associated with the emergence of large-scale clusters of vortices. [1] This spontaneous ordering in equilibrium statistical mechanics goes against common physical intuition that increased energy leads to increased disorder.

Definition of temperature

The definition of thermodynamic temperature T is a function of the change in the system's entropy S under reversible heat transfer Q_rev:

T = \frac{\delta Q_\mathrm{rev}}{dS}

Entropy being a state function, the integral of dS over any cyclical process is zero. For a system in which the entropy is purely a function of the system's energy E, the temperature can be defined as:

\frac{1}{T} = \frac{dS}{dE}

Equivalently, thermodynamic beta, or "coldness", is defined as

\beta = \frac{1}{k_\mathrm{B} T} = \frac{1}{k_\mathrm{B}} \frac{dS}{dE}

where k_B is the Boltzmann constant.

Note that in classical thermodynamics, S is defined in terms of temperature. Here the relation is reversed: S is the statistical entropy, a function of the possible microstates of the system, and temperature conveys information about the distribution of energy levels among the possible microstates. For systems with many degrees of freedom, the statistical and thermodynamic definitions of entropy are generally consistent with each other.
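As a concrete illustration of the statistical definition (a minimal numerical sketch, not taken from the cited sources; the toy entropy curve, array sizes, and function names below are invented for the example), temperature can be read off as the reciprocal slope of a tabulated S(E), and it comes out negative wherever entropy falls as energy rises:

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def beta_and_temperature(E, S):
    """Coldness beta = (1/k_B) dS/dE and temperature T = 1/(k_B * beta),
    estimated from a tabulated entropy curve S(E) by numerical differentiation."""
    dS_dE = np.gradient(S, E)            # numerical slope dS/dE
    beta = dS_dE / k_B                   # coldness, in 1/J
    with np.errstate(divide="ignore"):
        T = 1.0 / (k_B * beta)           # kelvin; negative wherever S decreases with E
    return beta, T

# Toy bounded system: entropy rises, peaks, then falls as energy increases.
E = np.linspace(-1e-21, 1e-21, 201)                  # joules
S = k_B * np.log(2.0) * (1.0 - (E / 1e-21) ** 2)     # invented peaked entropy curve
beta, T = beta_and_temperature(E, S)

print(T[50] > 0)    # below the entropy peak: positive temperature
print(T[150] < 0)   # above the entropy peak: negative temperature
```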

Some theorists have proposed using an alternative definition of entropy as a way to resolve perceived inconsistencies between statistical and thermodynamic entropy for small systems and systems where the number of states decreases with energy, and the temperatures derived from these entropies are different. [6] [7] It has been argued that the new definition would create other inconsistencies; [8] its proponents have argued that this is only apparent. [7]

Heat and molecular energy distribution

When the temperature is negative, higher energy states are more likely to be occupied than low energy ones.

Negative temperatures can exist only in a system with a limited number of energy states (see below). As the temperature of such a system is increased, particles move into higher and higher energy states, and the numbers of particles in the lower and higher energy states approach equality. (This is a consequence of the definition of temperature in statistical mechanics for systems with limited states.) By injecting energy into such a system in the right fashion, it is possible to create a system in which there are more particles in the higher energy states than in the lower ones. The system can then be characterised as having a negative temperature.
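A short numerical sketch makes this concrete (illustrative only; the energy splitting and temperatures below are arbitrary choices): with Boltzmann weights exp(−E_i/k_B T), a positive temperature favours the lower level, while a negative temperature of the same magnitude puts the larger share of particles in the upper level.

```python
import numpy as np

k_B = 1.380649e-23                       # Boltzmann constant, J/K
levels = np.array([0.0, 1.0e-21])        # two level energies in joules (arbitrary splitting)

def populations(T):
    """Fractional occupation of each level at temperature T (in kelvin), from Boltzmann weights."""
    w = np.exp(-levels / (k_B * T))
    return w / w.sum()

print(populations(+300.0))   # lower level more occupied  -> positive temperature
print(populations(-300.0))   # upper level more occupied  -> negative (hotter than +infinity)
```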

A substance with a negative temperature is not colder than absolute zero, but rather it is hotter than infinite temperature. As Kittel and Kroemer (p. 462) put it,

The temperature scale from cold to hot runs:

+0 K (−273.15 °C), …, +100 K (−173.15 °C), …, +300 K (+26.85 °C), …, +1000 K (+726.85 °C), …, +∞ K (+∞ °C), −∞ K (−∞ °C), …, −1000 K (−1273.15 °C), …, −300 K (−573.15 °C), …, −100 K (−373.15 °C), …, −0 K (−273.15 °C).

The corresponding inverse temperature scale, for the quantity β = 1/k_B T (where k_B is the Boltzmann constant), runs continuously from low energy to high as +∞, …, 0, …, −∞. Because it avoids the abrupt jump from +∞ to −∞, β is considered more natural than T, although a system can have multiple negative-temperature regions and thus exhibit discontinuities from −∞ to +∞.
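As a minimal numerical check of this ordering (finite values stand in for the ±0 K and ±∞ K endpoints of the quoted scale), the coldness β decreases monotonically along the cold-to-hot sequence, with no discontinuity where T jumps between +∞ and −∞:

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Cold-to-hot ordering from the quoted scale, with finite stand-ins for the 0 K and infinite endpoints.
T_cold_to_hot = np.array([1e-6, 100.0, 300.0, 1000.0, 1e12,
                          -1e12, -1000.0, -300.0, -100.0, -1e-6])   # kelvin
beta = 1.0 / (k_B * T_cold_to_hot)   # coldness, 1/J

# beta decreases strictly monotonically along the whole sequence: no jump where T flips sign.
print(np.all(np.diff(beta) < 0))     # True
```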

In many familiar physical systems, temperature is associated with the kinetic energy of atoms. Since there is no upper bound on the momentum of an atom, there is no upper bound on the number of energy states available when more energy is added, and therefore no way to reach a negative temperature. However, in statistical mechanics, temperature can correspond to degrees of freedom other than just kinetic energy (see below).

Temperature and disorder

The distribution of energy among the various translational, vibrational, rotational, electronic, and nuclear modes of a system determines the macroscopic temperature. In a "normal" system, thermal energy is constantly being exchanged between the various modes.

However, in some situations, it is possible to isolate one or more of the modes. In practice, the isolated modes still exchange energy with the other modes, but the time scale of this exchange is much slower than for the exchanges within the isolated modes. One example is the case of nuclear spins in a strong external magnetic field. In this case, energy flows fairly rapidly among the spin states of interacting atoms, but energy transfer between the nuclear spins and other modes is relatively slow. Since the energy flow is predominantly within the spin system, it makes sense to think of a spin temperature that is distinct from the temperature associated with the other modes.

A definition of temperature can be based on the relationship:

T = \frac{\delta q_\mathrm{rev}}{dS}

The relationship suggests that a positive temperature corresponds to the condition where entropy, S, increases as thermal energy, qrev, is added to the system. This is the "normal" condition in the macroscopic world, and is always the case for the translational, vibrational, rotational, and non-spin-related electronic and nuclear modes. The reason for this is that there are an infinite number of these types of modes, and adding more heat to the system increases the number of modes that are energetically accessible, and thus increases the entropy.

Examples

Noninteracting two-level particles

Entropy, thermodynamic beta, and temperature as a function of the energy for a system of N noninteracting two-level particles.

The simplest example, albeit a rather nonphysical one, is to consider a system of N particles, each of which can take an energy of either +ε or −ε but is otherwise noninteracting. This can be understood as a limit of the Ising model in which the interaction term becomes negligible. The total energy of the system is

E = \varepsilon \sum_{i=1}^{N} \sigma_i = \varepsilon j

where σ_i is the sign of the energy of the ith particle and j is the number of particles with positive energy minus the number of particles with negative energy. From elementary combinatorics, the total number of microstates with this amount of energy is a binomial coefficient:

\Omega_E = \binom{N}{(N+j)/2}

By the fundamental assumption of statistical mechanics, the entropy of this microcanonical ensemble is

S = k_\mathrm{B} \ln \Omega_E

We can solve for thermodynamic beta (β = 1/k_B T) by considering it as a central difference without taking the continuum limit:

\beta = \frac{1}{k_\mathrm{B}} \frac{S(E+\varepsilon) - S(E-\varepsilon)}{2\varepsilon} = \frac{1}{2\varepsilon} \ln\!\left(\frac{N-j+1}{N+j+1}\right)

hence the temperature

T(E) = \frac{2\varepsilon}{k_\mathrm{B}} \left[\ln\!\left(\frac{N-j+1}{N+j+1}\right)\right]^{-1}

which is positive when fewer than half the particles occupy the upper level (j < 0) and negative when more than half do (j > 0).
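The sign change can be checked numerically. The sketch below (an illustration of the expressions above, with N, ε, and units chosen arbitrarily) evaluates both the closed-form central difference and a direct finite difference of the exact entropy; β is positive below E = 0 and negative above it.

```python
import math

eps = 1.0   # level splitting in arbitrary energy units; k_B is set to 1
N = 100     # number of noninteracting two-level particles

def entropy(j):
    """Microcanonical entropy S/k_B = ln C(N, n_up), with n_up = (N + j)/2 and E = eps * j."""
    return math.log(math.comb(N, (N + j) // 2))

def beta_closed_form(j):
    """Coldness from the central-difference expression in the text."""
    return (1.0 / (2.0 * eps)) * math.log((N - j + 1) / (N + j + 1))

def beta_finite_difference(j):
    """Coldness from entropies at the neighbouring allowed energies E +/- 2*eps."""
    return (entropy(j + 2) - entropy(j - 2)) / (4.0 * eps)

for j in (-40, 0, 40):
    print(j, beta_closed_form(j), beta_finite_difference(j))
# beta is positive for j < 0, zero at j = 0, and negative for j > 0 (negative temperature).
```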

This entire proof assumes the microcanonical ensemble with energy fixed and temperature being the emergent property. In the canonical ensemble, the temperature is fixed and energy is the emergent property. This leads to the partition function, mean energy, and entropy (here ε_i denotes the energy of microstate i):

Z = \sum_i e^{-\varepsilon_i / k_\mathrm{B} T}, \qquad E = \frac{1}{Z} \sum_i \varepsilon_i\, e^{-\varepsilon_i / k_\mathrm{B} T}, \qquad S = \frac{E}{T} + k_\mathrm{B} \ln Z

Following the previous example, we choose a system with two levels and two particles. This leads to the four microstates with energies ε_1 = 0, ε_2 = 1, ε_3 = 1, and ε_4 = 2.

The resulting values for S, E, and Z all increase with T and never need to enter a negative temperature regime.
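To check that statement, the following sketch (illustrative only; the level spacing is set to 1 in units where k_B = 1) evaluates Z, E, and S for the four microstates ε = 0, 1, 1, 2 over a range of positive temperatures:

```python
import numpy as np

k_B = 1.0                                      # units with k_B = 1 and level spacing 1
energies = np.array([0.0, 1.0, 1.0, 2.0])      # microstate energies for two two-level particles

def canonical(T):
    """Partition function Z, mean energy E, and entropy S at a (positive) temperature T."""
    w = np.exp(-energies / (k_B * T))          # Boltzmann weights
    Z = w.sum()
    E = (energies * w).sum() / Z
    S = E / T + k_B * np.log(Z)
    return Z, E, S

Ts = np.linspace(0.1, 10.0, 100)
Z, E, S = np.array([canonical(T) for T in Ts]).T
print(np.all(np.diff(Z) > 0), np.all(np.diff(E) > 0), np.all(np.diff(S) > 0))   # True True True
```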

Nuclear spins

The previous example is approximately realized by a system of nuclear spins in an external magnetic field. [9] [10] This allows the experiment to be run as a variation of nuclear magnetic resonance spectroscopy. In the case of electronic and nuclear spin systems, there are only a finite number of modes available, often just two, corresponding to spin up and spin down. In the absence of a magnetic field, these spin states are degenerate, meaning that they correspond to the same energy. When an external magnetic field is applied, the energy levels are split, since those spin states that are aligned with the magnetic field will have a different energy from those that are anti-parallel to it.

In the absence of a magnetic field, such a two-spin system would have maximum entropy when half the atoms are in the spin-up state and half are in the spin-down state, and so one would expect to find the system with close to an equal distribution of spins. Upon application of a magnetic field, some of the atoms will tend to align so as to minimize the energy of the system, thus slightly more atoms should be in the lower-energy state (for the purposes of this example we will assume the spin-down state is the lower-energy state). It is possible to add energy to the spin system using radio frequency techniques. [11] This causes atoms to flip from spin-down to spin-up.

Since we started with over half the atoms in the spin-down state, this initially drives the system towards a 50/50 mixture, so the entropy is increasing, corresponding to a positive temperature. However, at some point, more than half of the spins are in the spin-up position. [12] In this case, adding additional energy reduces the entropy, since it moves the system further from a 50/50 mixture. This reduction in entropy with the addition of energy corresponds to a negative temperature. [13] In NMR spectroscopy, this corresponds to pulses with a pulse width of over 180° (for a given spin). While relaxation is fast in solids, it can take several seconds in solutions and even longer in gases and in ultracold systems; several hours were reported for silver and rhodium at picokelvin temperatures. [13] It is still important to understand that the temperature is negative only with respect to nuclear spins. Other degrees of freedom, such as molecular vibrational, electronic and electron spin levels are at a positive temperature, so the object still has positive sensible heat. Relaxation actually happens by exchange of energy between the nuclear spin states and other states (e.g. through the nuclear Overhauser effect with other spins).
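A spin temperature can be assigned directly from the two level populations. In the sketch below (a hypothetical two-level spin system with an invented Zeeman splitting, not taken from the cited experiments), the Boltzmann ratio n_up/n_down = exp(−ΔE/k_B T) is inverted to give T, which becomes negative as soon as the upper level holds more than half the spins.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
delta_E = 1.0e-25         # Zeeman splitting in joules (invented illustrative value)

def spin_temperature(n_up, n_down):
    """Spin temperature inferred from populations via n_up/n_down = exp(-delta_E / (k_B * T))."""
    return -delta_E / (k_B * math.log(n_up / n_down))

print(spin_temperature(0.45, 0.55))   # lower level favoured -> positive spin temperature
print(spin_temperature(0.55, 0.45))   # population inverted  -> negative spin temperature
```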

Lasers

This phenomenon can also be observed in many lasing systems, wherein a large fraction of the system's atoms (for chemical and gas lasers) or electrons (in semiconductor lasers) are in excited states. This is referred to as a population inversion.

The Hamiltonian for a single mode of a luminescent radiation field at frequency ν is

H = (h\nu - \mu)\, a^\dagger a

where a^† and a are the creation and annihilation operators of the mode and μ is the chemical potential.

The density operator in the grand canonical ensemble is

\rho = \frac{e^{-\beta H}}{\operatorname{Tr}\left(e^{-\beta H}\right)}

For the system to have a ground state, the trace to converge, and the density operator to be generally meaningful, βH must be positive semidefinite; indeed, the trace \operatorname{Tr}(e^{-\beta H}) = \sum_n e^{-\beta (h\nu - \mu) n} is a geometric series that converges only when β(hν − μ) > 0. So if hν < μ, then H is negative semidefinite, and β must itself be negative, implying a negative temperature. [14]
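A quick numerical illustration of the convergence condition (not drawn from the cited paper; the numbers are arbitrary): the weight of the n-photon state is exp(−βn(hν − μ)), so the partition sum behaves as a geometric series that converges only when β(hν − μ) > 0, and with hν < μ this forces β < 0.

```python
import math

def partial_partition_sum(beta, hnu, mu, n_max=100):
    """Partial sum of Tr(e^{-beta*H}) = sum_n exp(-beta * (hnu - mu) * n) over photon numbers n."""
    x = math.exp(-beta * (hnu - mu))       # ratio of the geometric series
    return sum(x ** n for n in range(n_max + 1))

hnu, mu = 1.0, 2.0                         # arbitrary units with hnu < mu
print(partial_partition_sum(beta=-1.0, hnu=hnu, mu=mu))   # ~1.58: series converges for beta < 0
print(partial_partition_sum(beta=+1.0, hnu=hnu, mu=mu))   # ~4e43 and growing with n_max: diverges for beta > 0
```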

Motional degrees of freedom

Negative temperatures have also been achieved in motional degrees of freedom. Using an optical lattice, upper bounds were placed on the kinetic energy, interaction energy and potential energy of cold potassium-39 atoms. This was done by tuning the interactions of the atoms from repulsive to attractive using a Feshbach resonance and changing the overall harmonic potential from trapping to anti-trapping, thus transforming the Bose-Hubbard Hamiltonian from Ĥ → −Ĥ. Performing this transformation adiabatically while keeping the atoms in the Mott insulator regime, it is possible to go from a low entropy positive temperature state to a low entropy negative temperature state. In the negative temperature state, the atoms macroscopically occupy the maximum momentum state of the lattice. The negative temperature ensembles equilibrated and showed long lifetimes in an anti-trapping harmonic potential. [15]

Two-dimensional vortex motion

The two-dimensional systems of vortices confined to a finite area can form thermal equilibrium states at negative temperature, [16] [17] and indeed negative temperature states were first predicted by Onsager in his analysis of classical point vortices. [18] Onsager's prediction was confirmed experimentally for a system of quantum vortices in a Bose-Einstein condensate in 2019. [19] [20]


References

  1. Onsager, L. (1949). "Statistical Hydrodynamics". Il Nuovo Cimento. 6 (2): 279–287. Bibcode:1949NCim....6S.279O. doi:10.1007/BF02780991. ISSN 1827-6121. S2CID 186224016.
  2. Ramsey, Norman (1956-07-01). "Thermodynamics and Statistical Mechanics at Negative Absolute Temperatures". Physical Review. 103 (1): 20–28. Bibcode:1956PhRv..103...20R. doi:10.1103/PhysRev.103.20.
  3. Tremblay, André-Marie (1975-11-18). "Comment on: Negative Kelvin temperatures: some anomalies and a speculation" (PDF). American Journal of Physics. 44 (10): 994–995. Bibcode:1976AmJPh..44..994T. doi:10.1119/1.10248.
  4. Atkins, Peter W. (2010-03-25). The Laws of Thermodynamics: A Very Short Introduction. Oxford University Press. pp. 10–14. ISBN   978-0-19-957219-9. OCLC   467748903.
  5. Atkins, Peter W. (2010-03-25). The Laws of Thermodynamics: A Very Short Introduction. Oxford University Press. pp. 89–95. ISBN   978-0-19-957219-9. OCLC   467748903.
  6. Dunkel, Jorn; Hilbert, Stefan (2013). "Consistent thermostatistics forbids negative absolute temperatures". Nature Physics. 10 (1): 67. arXiv: 1304.2066 . Bibcode:2014NatPh..10...67D. doi:10.1038/nphys2815. S2CID   16757018.
  7. Hanggi, Peter; Hilbert, Stefan; Dunkel, Jorn (2016). "Meaning of temperature in different thermostatistical ensembles". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 374 (2064): 20150039. arXiv:1507.05713. Bibcode:2016RSPTA.37450039H. doi:10.1098/rsta.2015.0039. PMID 26903095. S2CID 39161351.
  8. Frenkel, Daan; Warren, Patrick B. (2015-02-01). "Gibbs, Boltzmann, and negative temperatures". American Journal of Physics. 83 (2): 163–170. arXiv: 1403.4299 . Bibcode:2015AmJPh..83..163F. doi:10.1119/1.4895828. ISSN   0002-9505. S2CID   119179342.
  9. Purcell, E. M.; Pound, R. V. (1951-01-15). "A Nuclear Spin System at Negative Temperature". Physical Review. 81 (2): 279–280. Bibcode:1951PhRv...81..279P. doi:10.1103/PhysRev.81.279.
  10. Varga, Peter (1998). "Minimax games, spin glasses, and the polynomial-time hierarchy of complexity classes". Physical Review E. 57 (6): 6487–6492. arXiv: cond-mat/9604030 . Bibcode:1998PhRvE..57.6487V. CiteSeerX   10.1.1.306.470 . doi:10.1103/PhysRevE.57.6487. S2CID   10964509.
  11. Ramsey, Norman F. (1998). Spectroscopy with coherent radiation: selected papers of Norman F. Ramsey with commentary. World Scientific series in 20th century physics, v. 21. Singapore; River Edge, N.J.: World Scientific. p. 417. ISBN   9789810232504. OCLC   38753008.
  12. Levitt, Malcolm H. (2008). Spin Dynamics: Basics of Nuclear Magnetic Resonance. West Sussex, England: John Wiley & Sons Ltd. p. 273. ISBN   978-0-470-51117-6.
  13. 1 2 "Positive and negative picokelvin temperatures".
  14. Hsu, W.; Barakat, R. (1992). "Statistics and thermodynamics of luminescent radiation". Physical Review B . 46 (11): 6760–6767. Bibcode:1992PhRvB..46.6760H. doi:10.1103/PhysRevB.46.6760. PMID   10002377.
  15. Braun, S.; Ronzheimer, J. P.; Schreiber, M.; Hodgman, S. S.; Rom, T.; Bloch, I.; Schneider, U. (2013). "Negative Absolute Temperature for Motional Degrees of Freedom". Science. 339 (6115): 52–55. arXiv: 1211.0545 . Bibcode:2013Sci...339...52B. doi:10.1126/science.1227831. PMID   23288533. S2CID   8207974.
  16. Montgomery, D. C. (1972). "Two-dimensional vortex motion and "negative temperatures"". Physics Letters A . 39 (1): 7–8. Bibcode:1972PhLA...39....7M. doi:10.1016/0375-9601(72)90302-7.
  17. Edwards, S. F.; Taylor, J. B. (1974). "Negative Temperature States of Two-Dimensional Plasmas and Vortex Fluids". Proceedings of the Royal Society of London A . 336 (1606): 257–271. Bibcode:1974RSPSA.336..257E. doi:10.1098/rspa.1974.0018. JSTOR   78450. S2CID   120771020.
  18. Onsager, L. (1949-03-01). "Statistical hydrodynamics". Il Nuovo Cimento (1943-1954). 6 (2): 279–287. Bibcode:1949NCim....6S.279O. doi:10.1007/BF02780991. ISSN   1827-6121. S2CID   186224016 . Retrieved 2019-11-17.
  19. Gauthier, G.; Reeves, M. T.; Yu, X.; Bradley, A. S.; Baker, M. A.; Bell, T. A.; Rubinsztein-Dunlop, H.; Davis, M. J.; Neely, T. W. (2019). "Giant vortex clusters in a two-dimensional quantum fluid". Science. 364 (6447): 1264–1267. arXiv: 1801.06951 . Bibcode:2019Sci...364.1264G. doi:10.1126/science.aat5718. PMID   31249054. S2CID   195750381.
  20. Johnstone, S. P.; Groszek, A. J.; Starkey, P. T.; Billinton, C. J.; Simula, T. P.; Helmerson, K. (2019). "Evolution of large-scale flow from turbulence in a two-dimensional superfluid". Science. 365 (6447): 1267–1271. arXiv: 1801.06952 . Bibcode:2019Sci...364.1267J. doi:10.1126/science.aat5793. PMID   31249055. S2CID   4948239.

Further reading