The eigenstate thermalization hypothesis (or ETH) is a set of ideas which purports to explain when and why an isolated quantum mechanical system can be accurately described using equilibrium statistical mechanics. In particular, it is devoted to understanding how systems which are initially prepared in far-from-equilibrium states can evolve in time to a state which appears to be in thermal equilibrium. The phrase "eigenstate thermalization" was first coined by Mark Srednicki in 1994, [1] after similar ideas had been introduced by Josh Deutsch in 1991. [2] The principal philosophy underlying the eigenstate thermalization hypothesis is that instead of explaining the ergodicity of a thermodynamic system through the mechanism of dynamical chaos, as is done in classical mechanics, one should instead examine the properties of matrix elements of observable quantities in individual energy eigenstates of the system.
In statistical mechanics, the microcanonical ensemble is a particular statistical ensemble which is used to make predictions about the outcomes of experiments performed on isolated systems that are believed to be in equilibrium with an exactly known energy. The microcanonical ensemble is based upon the assumption that, when such an equilibrated system is probed, it is equally likely to be found in any of the microscopic states with the same total energy. [3] With this assumption, [footnote 1] the ensemble average of an observable quantity is found by averaging the value of that observable over all microstates with the correct total energy: [3]

$$\langle A\rangle_{\text{mc}} \;=\; \frac{1}{\mathcal{N}}\sum_{i=1}^{\mathcal{N}} A_i ,$$

where the sum runs over the $\mathcal{N}$ microstates with the prescribed total energy and $A_i$ denotes the value of the observable $A$ in the $i$-th such microstate.
Importantly, this quantity is independent of everything about the initial state except for its energy.
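As a minimal illustration of such an average (not taken from the cited sources), the following Python sketch enumerates the microstates of a small Ising-like spin chain, keeps only those whose energy lies in a narrow window, and averages an observable (the magnetization per site) over that set. The energy function, window width, and chain length are arbitrary choices made for illustration.

    import itertools
    import numpy as np

    def energy(spins, J=1.0):
        """Nearest-neighbour Ising energy of a spin configuration (open chain)."""
        s = np.array(spins)
        return -J * np.sum(s[:-1] * s[1:])

    def microcanonical_average(L=11, E0=0.0, dE=0.5):
        """Average the magnetization over all microstates with energy in [E0-dE, E0+dE]."""
        values = []
        for spins in itertools.product([-1, 1], repeat=L):
            if abs(energy(spins) - E0) <= dE:
                values.append(np.mean(spins))   # observable: magnetization per site
        if not values:
            return float("nan"), 0
        return np.mean(values), len(values)

    avg, n_states = microcanonical_average()
    print(f"microcanonical <m> = {avg:.4f} over {n_states} microstates")

Note that the result depends only on the chosen energy window, in line with the statement above that the microcanonical average is independent of all other details of the preparation.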
The assumptions of ergodicity are well-motivated in classical mechanics as a result of dynamical chaos, since a chaotic system will in general spend equal time in equal areas of its phase space. [3] If we prepare an isolated, chaotic, classical system in some region of its phase space, then as the system is allowed to evolve in time, it will sample its entire phase space, subject only to a small number of conservation laws (such as conservation of total energy). If one can justify the claim that a given physical system is ergodic, then this mechanism will provide an explanation for why statistical mechanics is successful in making accurate predictions. For example, the hard sphere gas has been rigorously proven to be ergodic. [3]
This argument cannot be straightforwardly extended to quantum systems, even ones that are analogous to chaotic classical systems, because time evolution of a quantum system does not uniformly sample all vectors in Hilbert space with a given energy. [footnote 2] Given the state at time zero in a basis of energy eigenstates

$$|\psi(0)\rangle \;=\; \sum_{\alpha} c_{\alpha}\,|E_{\alpha}\rangle ,$$

the expectation value of any observable $\hat{A}$ is

$$\langle \hat{A}(t)\rangle \;=\; \langle\psi(t)|\hat{A}|\psi(t)\rangle \;=\; \sum_{\alpha,\beta} c_{\alpha}^{*} c_{\beta}\, e^{-i\left(E_{\beta}-E_{\alpha}\right)t/\hbar}\, \langle E_{\alpha}|\hat{A}|E_{\beta}\rangle .$$

Even if the energies $E_{\alpha}$ are incommensurate, so that this expectation value is given for long times by

$$\overline{\langle \hat{A}(t)\rangle} \;=\; \sum_{\alpha} |c_{\alpha}|^{2}\, \langle E_{\alpha}|\hat{A}|E_{\alpha}\rangle ,$$

the expectation value permanently retains knowledge of the initial state in the form of the coefficients $c_{\alpha}$.
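The dephasing argument above is easy to reproduce numerically. The following Python sketch (a toy illustration, not code from the cited references) draws a small random Hermitian Hamiltonian and a random Hermitian observable, evolves a random initial state, and compares the running time average of $\langle\hat{A}(t)\rangle$ with the dephased sum $\sum_{\alpha}|c_{\alpha}|^{2}A_{\alpha\alpha}$. The Hilbert-space dimension and the time grid are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    D = 200                                        # Hilbert-space dimension (arbitrary)

    def random_hermitian(d):
        M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        return (M + M.conj().T) / 2

    H = random_hermitian(D)                        # toy Hamiltonian
    A = random_hermitian(D)                        # toy observable
    E, V = np.linalg.eigh(H)                       # energy eigenbasis
    A_eig = V.conj().T @ A @ V                     # matrix elements <E_a|A|E_b>

    psi0 = rng.normal(size=D) + 1j * rng.normal(size=D)
    psi0 /= np.linalg.norm(psi0)
    c = V.conj().T @ psi0                          # coefficients c_a in the energy basis

    times = np.linspace(0.0, 200.0, 2000)          # units with hbar = 1
    expect = [np.real(np.vdot(c * np.exp(-1j * E * t),
                              A_eig @ (c * np.exp(-1j * E * t))))
              for t in times]

    dephased_value = np.real(np.sum(np.abs(c) ** 2 * np.diag(A_eig)))
    print("late-time average of <A(t)>:", np.mean(expect[len(expect) // 2:]))
    print("dephased (diagonal) value:  ", dephased_value)

The two printed numbers agree up to finite-time fluctuations, while the dephased value itself still depends on the initial coefficients $c_{\alpha}$, exactly as stated above.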
In principle it is thus an open question as to whether an isolated quantum mechanical system, prepared in an arbitrary initial state, will approach a state which resembles thermal equilibrium, in which a handful of observables are adequate to make successful predictions about the system. However, a variety of experiments in cold atomic gases have indeed observed thermal relaxation in systems which are, to a very good approximation, completely isolated from their environment, and for a wide class of initial states. [4] [5] The task of explaining this experimentally observed applicability of equilibrium statistical mechanics to isolated quantum systems is the primary goal of the eigenstate thermalization hypothesis.
Suppose that we are studying an isolated, quantum mechanical many-body system. In this context, "isolated" refers to the fact that the system has no (or at least negligible) interactions with the environment external to it. If the Hamiltonian of the system is denoted $\hat{H}$, then a complete set of basis states for the system is given in terms of the eigenstates of the Hamiltonian,

$$\hat{H}\,|E_{\alpha}\rangle \;=\; E_{\alpha}\,|E_{\alpha}\rangle ,$$

where $|E_{\alpha}\rangle$ is the eigenstate of the Hamiltonian with eigenvalue $E_{\alpha}$. We will refer to these states simply as "energy eigenstates." For simplicity, we will assume that the system has no degeneracy in its energy eigenvalues, and that it is finite in extent, so that the energy eigenvalues form a discrete, non-degenerate spectrum (this is not an unreasonable assumption, since any "real" laboratory system will tend to have sufficient disorder and strong enough interactions as to eliminate almost all degeneracy from the system, and of course will be finite in size [6] ). This allows us to label the energy eigenstates in order of increasing energy eigenvalue. Additionally, consider some other quantum-mechanical observable $\hat{A}$, which we wish to make thermal predictions about. The matrix elements of this operator, as expressed in a basis of energy eigenstates, will be denoted by

$$A_{\alpha\beta} \;\equiv\; \langle E_{\alpha}|\,\hat{A}\,|E_{\beta}\rangle .$$
We now imagine that we prepare our system in an initial state for which the expectation value of $\hat{A}$ is far from its value predicted in a microcanonical ensemble appropriate to the energy scale in question (we assume that our initial state is some superposition of energy eigenstates which are all sufficiently "close" in energy). The eigenstate thermalization hypothesis says that for an arbitrary initial state, the expectation value of $\hat{A}$ will ultimately evolve in time to its value predicted by a microcanonical ensemble, and thereafter will exhibit only small fluctuations around that value, provided that the following two conditions are met: [4]

1. The diagonal matrix elements $A_{\alpha\alpha}$ vary smoothly as a function of the energy, with the difference between neighboring values, $A_{\alpha+1,\alpha+1}-A_{\alpha\alpha}$, becoming exponentially small in the system size.
2. The off-diagonal matrix elements $A_{\alpha\beta}$, with $\alpha\neq\beta$, are much smaller than the diagonal matrix elements, and in particular are themselves exponentially small in the system size.
These conditions can be written as

$$A_{\alpha\beta} \;=\; \bar{A}(\bar{E})\,\delta_{\alpha\beta} \;+\; \frac{f\!\left(\bar{E},\omega\right)}{\sqrt{\mathcal{D}}}\, R_{\alpha\beta}, \qquad \bar{E} \equiv \frac{E_{\alpha}+E_{\beta}}{2}, \quad \omega \equiv E_{\alpha}-E_{\beta},$$

where $\bar{A}(\bar{E})$ and $f(\bar{E},\omega)$ are smooth functions of energy, $\mathcal{D}$ is the many-body Hilbert space dimension, and $R_{\alpha\beta}$ is a random variable with zero mean and unit variance. Conversely, if a quantum many-body system satisfies the ETH, the matrix representation of any local operator in the energy eigenbasis is expected to follow the above ansatz.
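To make the ansatz concrete, here is a minimal Python sketch that builds a matrix of the ETH form: a slowly varying diagonal plus random off-diagonal elements suppressed by $1/\sqrt{\mathcal{D}}$. The particular smooth functions A_bar and f below, and the Gaussian spectrum, are hypothetical placeholders chosen only for illustration, not taken from the literature.

    import numpy as np

    rng = np.random.default_rng(1)

    def eth_matrix(energies, A_bar, f):
        """Build an operator in the energy eigenbasis following the ETH ansatz."""
        D = len(energies)
        Ebar = (energies[:, None] + energies[None, :]) / 2   # mean energy (E_a + E_b)/2
        omega = energies[:, None] - energies[None, :]         # energy difference E_a - E_b
        R = rng.normal(size=(D, D))
        R = (R + R.T) / np.sqrt(2)                            # zero-mean, unit-variance, symmetric
        A = f(Ebar, omega) / np.sqrt(D) * R                   # suppressed random off-diagonals
        np.fill_diagonal(A, A_bar(energies))                  # smooth diagonal part
        return A

    # hypothetical smooth functions, chosen only for illustration
    A_bar = lambda E: np.tanh(E)
    f = lambda Ebar, omega: np.exp(-np.abs(omega))

    energies = np.sort(rng.normal(size=500))
    A = eth_matrix(energies, A_bar, f)
    off = A - np.diag(np.diag(A))
    print("typical off-diagonal magnitude:", np.mean(np.abs(off)))

Increasing the dimension passed to eth_matrix makes the typical off-diagonal magnitude shrink as $1/\sqrt{\mathcal{D}}$, which is the key scaling assumed in the ansatz.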
We can define a long-time average of the expectation value of the operator $\hat{A}$ according to the expression

$$\bar{A} \;\equiv\; \lim_{\tau\to\infty}\frac{1}{\tau}\int_{0}^{\tau} dt\,\langle\hat{A}(t)\rangle .$$

If we use the explicit expression for the time evolution of this expectation value, we can write

$$\bar{A} \;=\; \lim_{\tau\to\infty}\frac{1}{\tau}\int_{0}^{\tau} dt \sum_{\alpha,\beta} c_{\alpha}^{*} c_{\beta}\, e^{-i\left(E_{\beta}-E_{\alpha}\right)t/\hbar}\, A_{\alpha\beta} .$$

The integration in this expression can be performed explicitly, and the result is

$$\bar{A} \;=\; \sum_{\alpha} |c_{\alpha}|^{2}\, A_{\alpha\alpha} \;+\; \lim_{\tau\to\infty}\sum_{\alpha\neq\beta} c_{\alpha}^{*} c_{\beta}\, A_{\alpha\beta}\, \frac{e^{-i\left(E_{\beta}-E_{\alpha}\right)\tau/\hbar}-1}{-i\left(E_{\beta}-E_{\alpha}\right)\tau/\hbar} .$$

Each of the terms in the second sum will become smaller as the limit is taken to infinity. Assuming that the phase coherence between the different exponential terms in the second sum does not ever become large enough to rival this decay, the second sum will go to zero, and we find that the long-time average of the expectation value is given by [6]

$$\bar{A} \;=\; \sum_{\alpha} |c_{\alpha}|^{2}\, A_{\alpha\alpha} .$$
This prediction for the time-average of the observable is referred to as its predicted value in the diagonal ensemble. [7] The most important aspect of the diagonal ensemble is that it depends explicitly on the initial state of the system, and so would appear to retain all of the information regarding the preparation of the system. In contrast, the predicted value in the microcanonical ensemble is given by the equally-weighted average over all energy eigenstates within some energy window centered around the mean energy of the system [5]

$$\langle\hat{A}\rangle_{\text{mc}} \;=\; \frac{1}{\mathcal{N}}\,{\sum_{\alpha}}'\, A_{\alpha\alpha} ,$$

where $\mathcal{N}$ is the number of states in the appropriate energy window, and the prime on the sum indices indicates that the summation is restricted to this appropriate microcanonical window. This prediction makes absolutely no reference to the initial state of the system, unlike the diagonal ensemble. Because of this, it is not clear why the microcanonical ensemble should provide such an accurate description of the long-time averages of observables in such a wide variety of physical systems.

However, suppose that the matrix elements $A_{\alpha\alpha}$ are effectively constant over the relevant energy window, with fluctuations that are sufficiently small. If this is true, this one constant value, $A$, can be effectively pulled out of the sum, and the prediction of the diagonal ensemble is simply equal to this value,

$$\bar{A} \;=\; \sum_{\alpha} |c_{\alpha}|^{2}\, A_{\alpha\alpha} \;\approx\; A \sum_{\alpha} |c_{\alpha}|^{2} \;=\; A ,$$

where we have assumed that the initial state is normalized appropriately. Likewise, the prediction of the microcanonical ensemble becomes

$$\langle\hat{A}\rangle_{\text{mc}} \;=\; \frac{1}{\mathcal{N}}\,{\sum_{\alpha}}'\, A_{\alpha\alpha} \;\approx\; \frac{1}{\mathcal{N}}\,\mathcal{N}\,A \;=\; A .$$
The two ensembles are therefore in agreement.
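This agreement can be checked by exact diagonalization of a small non-integrable model. The Python sketch below is a toy check: the mixed-field Ising chain, its couplings, the choice of observable, and the width of the energy window are standard but arbitrary choices for illustration, not the specific models of the cited studies. It builds a narrow superposition of eigenstates around a mid-spectrum energy and compares the diagonal-ensemble and microcanonical predictions.

    import numpy as np
    from functools import reduce

    rng = np.random.default_rng(2)

    # Pauli matrices and an embedding helper
    I2 = np.eye(2)
    sx = np.array([[0., 1.], [1., 0.]])
    sz = np.array([[1., 0.], [0., -1.]])

    def op_on_site(op, site, L):
        """Embed a single-site operator at position `site` of an L-site chain."""
        ops = [I2] * L
        ops[site] = op
        return reduce(np.kron, ops)

    # mixed-field Ising chain, a commonly used non-integrable toy model (couplings arbitrary)
    L, J, hx, hz = 8, 1.0, 0.9, 0.8
    H = sum(J * op_on_site(sz, i, L) @ op_on_site(sz, i + 1, L) for i in range(L - 1))
    H = H + sum(hx * op_on_site(sx, i, L) + hz * op_on_site(sz, i, L) for i in range(L))

    E, V = np.linalg.eigh(H)
    A = op_on_site(sz, L // 2, L)                            # observable: sigma^z on the middle site
    A_diag = np.einsum('ia,ij,ja->a', V.conj(), A, V).real   # diagonal matrix elements A_aa

    # initial state: a narrow superposition of eigenstates around a mid-spectrum energy E0
    E0 = np.median(E)
    sigma = 0.05 * (E.max() - E.min())
    c = np.exp(-(E - E0) ** 2 / (2 * sigma ** 2)) * np.exp(2j * np.pi * rng.random(len(E)))
    c /= np.linalg.norm(c)

    diagonal = np.sum(np.abs(c) ** 2 * A_diag)               # diagonal-ensemble prediction
    microcanonical = A_diag[np.abs(E - E0) < sigma].mean()   # microcanonical window average
    print("diagonal ensemble:      ", diagonal)
    print("microcanonical ensemble:", microcanonical)

For initial states concentrated away from the spectral edges the two numbers should be close whenever the diagonal matrix elements vary smoothly with energy, which is precisely the ETH condition discussed next.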
This constancy of the values of $A_{\alpha\alpha}$ over small energy windows is the primary idea underlying the eigenstate thermalization hypothesis. Notice that in particular, it states that the expectation value of $\hat{A}$ in a single energy eigenstate is equal to the value predicted by a microcanonical ensemble constructed at that energy scale. This constitutes a foundation for quantum statistical mechanics which is radically different from the one built upon the notions of dynamical ergodicity. [1]
Several numerical studies of small lattice systems appear to tentatively confirm the predictions of the eigenstate thermalization hypothesis in interacting systems which would be expected to thermalize. [5] Likewise, systems which are integrable tend not to obey the eigenstate thermalization hypothesis. [5]
Some analytical results can also be obtained if one makes certain assumptions about the nature of highly excited energy eigenstates. The original 1994 paper on the ETH by Mark Srednicki studied, in particular, the example of a quantum hard sphere gas in an insulated box. This is a system which is known to exhibit chaos classically. [1] For states of sufficiently high energy, Berry's conjecture states that energy eigenfunctions in this many-body system of hard sphere particles will appear to behave as superpositions of plane waves, with the plane waves entering the superposition with random phases and Gaussian-distributed amplitudes [1] (the precise notion of this random superposition is clarified in the paper). Under this assumption, one can show that, up to corrections which are negligibly small in the thermodynamic limit, the momentum distribution function for each individual, distinguishable particle is equal to the Maxwell–Boltzmann distribution [1]

$$f(\vec{p}) \;=\; \left(\frac{1}{2\pi m k T}\right)^{3/2} \exp\!\left(-\frac{|\vec{p}|^{2}}{2 m k T}\right),$$

where $\vec{p}$ is the particle's momentum, m is the mass of the particles, k is the Boltzmann constant, and the "temperature" $T$ is related to the energy $E_{\alpha}$ of the eigenstate according to the usual equation of state for an ideal gas,

$$E_{\alpha} \;=\; \frac{3}{2}\, N k T ,$$
where N is the number of particles in the gas. This result is a specific manifestation of the ETH, in that it results in a prediction for the value of an observable in one energy eigenstate which is in agreement with the prediction derived from a microcanonical (or canonical) ensemble. Note that no averaging over initial states whatsoever has been performed, nor has anything resembling the H-theorem been invoked. Additionally, one can also derive the appropriate Bose–Einstein or Fermi–Dirac distributions, if one imposes the appropriate commutation relations for the particles comprising the gas. [1]
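As a small numerical illustration of this correspondence (the particle mass, particle number, and eigenstate energy below are arbitrary values chosen for the example), the following Python snippet assigns a temperature to a given eigenstate energy via $E = \tfrac{3}{2}NkT$ and evaluates the resulting Maxwell–Boltzmann momentum distribution.

    import numpy as np

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    m = 6.6e-27                 # particle mass in kg (roughly a helium atom; arbitrary choice)
    N = 1.0e5                   # number of particles (arbitrary)
    E_alpha = 5.0e-16           # energy of the eigenstate in joules (arbitrary)

    # effective temperature assigned to the eigenstate via E = (3/2) N k T
    T = 2.0 * E_alpha / (3.0 * N * k_B)

    def maxwell_boltzmann(p):
        """Single-particle momentum distribution f(p) at temperature T."""
        norm = (2.0 * np.pi * m * k_B * T) ** (-1.5)
        return norm * np.exp(-np.dot(p, p) / (2.0 * m * k_B * T))

    print(f"effective temperature: {T:.1f} K")
    print("f(p = 0):", maxwell_boltzmann(np.zeros(3)))

The point of the exercise is that the temperature is fixed entirely by the energy of the single eigenstate, with no averaging over initial conditions.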
Currently, it is not well understood how high the energy of an eigenstate of the hard sphere gas must be in order for it to obey the ETH. [1] A rough criterion is that the average thermal wavelength of each particle be sufficiently small compared to the radius of the hard sphere particles, so that the system can probe the features which result in chaos classically (namely, the fact that the particles have a finite size [1] ). However, it is conceivable that this condition may be relaxed, and perhaps in the thermodynamic limit, energy eigenstates of arbitrarily low energies will satisfy the ETH (aside from the ground state itself, which is required to have certain special properties, for example, the lack of any nodes [1] ).
Three alternative explanations for the thermalization of isolated quantum systems are often proposed:
The condition that the ETH imposes on the diagonal elements of an observable is responsible for the equality of the predictions of the diagonal and microcanonical ensembles. [6] However, the equality of these long-time averages does not guarantee that the fluctuations in time around this average will be small. That is, the equality of the long-time averages does not ensure that the expectation value of $\hat{A}$ will settle down to this long-time average value, and then stay there for most times.
In order to deduce the conditions necessary for the observable's expectation value to exhibit small temporal fluctuations around its time-average, we study the mean squared amplitude of the temporal fluctuations, defined as [6]

$$\overline{\left(A_{t}-\bar{A}\right)^{2}} \;\equiv\; \lim_{\tau\to\infty}\frac{1}{\tau}\int_{0}^{\tau} dt\,\left(A_{t}-\bar{A}\right)^{2},$$

where $A_{t}$ is a shorthand notation for the expectation value of $\hat{A}$ at time t. This expression can be computed explicitly, and one finds that [6]

$$\overline{\left(A_{t}-\bar{A}\right)^{2}} \;=\; \sum_{\alpha\neq\beta} |c_{\alpha}|^{2}\, |c_{\beta}|^{2}\, |A_{\alpha\beta}|^{2} .$$
Temporal fluctuations about the long-time average will be small so long as the off-diagonal elements satisfy the conditions imposed on them by the ETH, namely that they become exponentially small in the system size. [6] [5] Notice that this condition allows for the possibility of isolated resurgence times, in which the phases align coherently in order to produce large fluctuations away from the long-time average. [4] The amount of time the system spends far away from the long-time average is guaranteed to be small so long as the above mean squared amplitude is sufficiently small. [6] [4] If a system possesses a dynamical symmetry, however, it will periodically oscillate around the long-time average. [9]
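The fluctuation formula above is straightforward to verify numerically. The Python snippet below reuses the same kind of random-matrix toy model as earlier (an illustrative assumption, not a model from the references) and compares a direct time average of $(A_{t}-\bar{A})^{2}$ with the off-diagonal sum $\sum_{\alpha\neq\beta}|c_{\alpha}|^{2}|c_{\beta}|^{2}|A_{\alpha\beta}|^{2}$.

    import numpy as np

    rng = np.random.default_rng(3)
    D = 150                                             # toy Hilbert-space dimension

    M = rng.normal(size=(D, D)) + 1j * rng.normal(size=(D, D))
    H = (M + M.conj().T) / 2                            # toy Hamiltonian
    B = rng.normal(size=(D, D)) + 1j * rng.normal(size=(D, D))
    A = (B + B.conj().T) / 2                            # toy observable

    E, V = np.linalg.eigh(H)
    A_eig = V.conj().T @ A @ V                          # A in the energy eigenbasis

    c = rng.normal(size=D) + 1j * rng.normal(size=D)    # random initial state
    c /= np.linalg.norm(c)

    A_bar = np.real(np.sum(np.abs(c) ** 2 * np.diag(A_eig)))   # long-time (diagonal) average

    # direct time average of (A_t - A_bar)^2 over a long but finite window (hbar = 1)
    times = np.linspace(0.0, 500.0, 5000)
    fluct = []
    for t in times:
        ct = c * np.exp(-1j * E * t)
        A_t = np.real(np.vdot(ct, A_eig @ ct))
        fluct.append((A_t - A_bar) ** 2)

    # off-diagonal prediction: sum over a != b of |c_a|^2 |c_b|^2 |A_ab|^2
    w = np.abs(c) ** 2
    off = np.abs(A_eig) ** 2
    np.fill_diagonal(off, 0.0)
    prediction = w @ off @ w

    print("time-averaged fluctuation:", np.mean(fluct))
    print("off-diagonal prediction:  ", prediction)

Because the off-diagonal elements of a physical few-body observable are suppressed exponentially in the system size under the ETH, both quantities shrink rapidly as the system grows.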
The expectation value of a quantum mechanical observable represents the average value which would be measured after performing repeated measurements on an ensemble of identically prepared quantum states. Therefore, while we have been examining this expectation value as the principal object of interest, it is not clear to what extent this represents physically relevant quantities. As a result of quantum fluctuations, the expectation value of an observable is not typically what will be measured during one experiment on an isolated system. However, it has been shown that for an observable satisfying the ETH, quantum fluctuations in its expectation value will typically be of the same order of magnitude as the thermal fluctuations which would be predicted in a traditional microcanonical ensemble. [6] [5] This lends further credence to the idea that the ETH is the underlying mechanism responsible for the thermalization of isolated quantum systems.
Currently, there is no known analytical derivation of the eigenstate thermalization hypothesis for general interacting systems. [5] However, it has been verified to be true for a wide variety of interacting systems using numerical exact diagonalization techniques, to within the uncertainty of these methods. [4] [5] It has also been proven to be true in certain special cases in the semi-classical limit, where the validity of the ETH rests on the validity of Shnirelman's theorem, which states that in a system which is classically chaotic, the expectation value of an operator in an energy eigenstate is equal to its classical, microcanonical average at the appropriate energy. [10] Whether or not it can be shown to be true more generally in interacting quantum systems remains an open question. It is also known to explicitly fail in certain integrable systems, in which the presence of a large number of constants of motion prevent thermalization. [4]
It is also important to note that the ETH makes statements about specific observables on a case-by-case basis; it does not make any claims about whether every observable in a system will obey ETH. In fact, this certainly cannot be true. Given a basis of energy eigenstates, one can always explicitly construct an operator which violates the ETH, simply by writing down the operator as a matrix in this basis whose elements explicitly do not obey the conditions imposed by the ETH. Conversely, it is always trivially possible to find operators which do satisfy ETH, by writing down a matrix whose elements are specifically chosen to obey ETH. In light of this, one may be led to believe that the ETH is somewhat trivial in its usefulness. However, the important consideration to bear in mind is that these operators thus constructed may not have any physical relevance. While one can construct these matrices, it is not clear that they correspond to observables which could be realistically measured in an experiment, or bear any resemblance to physically interesting quantities. An arbitrary Hermitian operator on the Hilbert space of the system need not correspond to something which is a physically measurable observable. [11]
Typically, the ETH is postulated to hold for "few-body operators," [4] observables which involve only a small number of particles. Examples of this would include the occupation of a given momentum in a gas of particles, [4] [5] or the occupation of a particular site in a lattice system of particles. [5] Notice that while the ETH is typically applied to "simple" few-body operators such as these, [4] these observables need not be local in space; [5] the momentum number operator in the above example does not represent a local quantity. [5]
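As an illustration of what such a few-body but spatially non-local observable can look like, the Python sketch below builds the occupation operator of a single momentum mode for hard-core bosons on a short chain, using the spin-1/2 representation. This is a hypothetical construction for illustration (the chain length and momentum are arbitrary), not code from the references; the operator contains only one-body terms $b_i^{\dagger}b_j$, yet it couples every pair of sites.

    import numpy as np
    from functools import reduce

    I2 = np.eye(2)
    sp = np.array([[0., 1.], [0., 0.]])      # sigma^+ plays the role of b^dagger for hard-core bosons
    sm = sp.T                                 # sigma^- plays the role of b

    def op_on_site(op, site, L):
        """Embed a single-site operator at position `site` of an L-site chain."""
        ops = [I2] * L
        ops[site] = op
        return reduce(np.kron, ops)

    def momentum_occupation(k, L):
        """n_k = (1/L) * sum over i, j of exp(i k (i - j)) b_i^dagger b_j."""
        n_k = np.zeros((2 ** L, 2 ** L), dtype=complex)
        for i in range(L):
            for j in range(L):
                n_k += np.exp(1j * k * (i - j)) * op_on_site(sp, i, L) @ op_on_site(sm, j, L)
        return n_k / L

    L = 6
    n_k = momentum_occupation(2 * np.pi / L, L)
    print("Hermitian:", np.allclose(n_k, n_k.conj().T))

Although every site appears in the sum, the operator is still "few-body" in the sense used above: it is built entirely from one-body terms, which is the class of observables for which the ETH is usually formulated.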
There has also been considerable interest in the case where isolated, non-integrable quantum systems fail to thermalize, despite the predictions of conventional statistical mechanics. Disordered systems which exhibit many-body localization are candidates for this type of behavior, with the possibility of excited energy eigenstates whose thermodynamic properties more closely resemble those of ground states. [12] [13] It remains an open question as to whether a completely isolated, non-integrable system without static disorder can ever fail to thermalize. One intriguing possibility is the realization of "Quantum Disentangled Liquids." [14] It is also an open question whether all eigenstates must obey the ETH in a thermalizing system.
The eigenstate thermalization hypothesis is closely connected to the quantum nature of chaos (see quantum chaos). Furthermore, since a classically chaotic system is also ergodic, almost all of its trajectories eventually explore uniformly the entire accessible phase space, which would imply that the eigenstates of the quantum chaotic system fill the quantum phase space evenly (up to random fluctuations) in the semiclassical limit $\hbar \to 0$. In particular, there is a quantum ergodicity theorem showing that the expectation value of an operator converges to the corresponding microcanonical classical average as $\hbar \to 0$. However, the quantum ergodicity theorem leaves open the possibility of non-ergodic states such as quantum scars. In addition to the conventional scarring, [15] [16] [17] [18] there are two other types of quantum scarring, which further illustrate the weak-ergodicity breaking in quantum chaotic systems: perturbation-induced [19] [20] [21] [22] [23] and many-body quantum scars. [24] Since the former arise as a combined effect of special nearly-degenerate unperturbed states and the localized nature of the perturbation (potential bumps), [19] [23] the scarring can slow down the thermalization process in disordered quantum dots and wells, which is further illustrated by the fact that these quantum scars can be used to propagate quantum wave packets in a disordered nanostructure with high fidelity. [20] On the other hand, the latter form of scarring has been speculated [24] [25] to be the culprit behind the unexpectedly slow thermalization of cold atoms observed experimentally. [26]