The Leggett–Garg inequality, named for Anthony James Leggett and Anupam Garg, is a mathematical inequality fulfilled by all macrorealistic physical theories. Here, macrorealism (macroscopic realism) is a classical worldview defined by the conjunction of two postulates:
1. Macroscopic realism: a macroscopic object that has two or more macroscopically distinct states available to it is at any given time in a definite one of those states.
2. Noninvasive measurability: it is possible in principle to determine which of these states the system is in without any effect on the state itself or on the subsequent system dynamics.
In quantum mechanics, the Leggett–Garg inequality is violated, meaning that the time evolution of a system cannot be understood classically. The situation is similar to the violation of Bell's inequalities in Bell test experiments, which plays an important role in understanding the nature of the Einstein–Podolsky–Rosen paradox; there, quantum entanglement plays the central role.
The simplest form of the Leggett–Garg inequality derives from examining a system that has only two possible states, with corresponding measurement values Q = ±1. The key here is that we have measurements at two different times, and one or more times between the first and last measurement. The simplest example is a system measured at three successive times t1 < t2 < t3. Now suppose, for instance, that there is a perfect correlation C13 = 1 between times t1 and t3. That is to say, for N realisations of the experiment, the temporal correlation reads

C13 = (1/N) Σ_r Q_r(t1) Q_r(t3) = 1.
We look at this case in some detail. What can be said about what happens at time t2? It is possible that Q(t1) = Q(t2) = Q(t3), so that if the value of Q at t1 is +1, then it is also +1 at both t2 and t3. It is also quite possible that Q(t2) = −Q(t1) and Q(t3) = −Q(t2), so that the value of Q is flipped twice and therefore has the same value at t3 as it had at t1: Q(t1) and Q(t2) can be anti-correlated as long as Q(t2) and Q(t3) are anti-correlated as well. Yet another possibility is that there is no correlation between Q(t1) and Q(t2): although it is known that if Q = +1 at t1 it must also be +1 at t3, the value at t2 may just as well be determined by the toss of a coin. Defining the two-time correlator Cij = (1/N) Σ_r Q_r(ti) Q_r(tj), these three cases give C12 = 1, C12 = −1, and C12 = 0, respectively.
All that was for perfect (100%) correlation between times t1 and t3. In fact, for any value of the correlation between these times, C12 + C23 − C13 ≤ 1. To see this, we note that

C12 + C23 − C13 = (1/N) Σ_r [ Q_r(t1) Q_r(t2) + Q_r(t2) Q_r(t3) − Q_r(t1) Q_r(t3) ].
It is easily seen that for every realisation r the term in the square brackets must be less than or equal to unity (with each Q_r(ti) = ±1 it equals either 1 or −3), so that the average is also less than or equal to unity. If we have four distinct times rather than three, we have −2 ≤ C12 + C23 + C34 − C14 ≤ 2, and so on. These are the Leggett–Garg inequalities. They express the relation between the temporal correlation of the first and last measurements and the correlations between successive times in going from the start to the end.
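The derivation above can be checked numerically. The following sketch is a minimal illustration (not taken from the original paper): it first enumerates all classical histories Q(t1), Q(t2), Q(t3) = ±1 to confirm the bound C12 + C23 − C13 ≤ 1, and then evaluates the same combination for an idealized precessing spin-1/2 measured projectively in the sigma_z basis, for which the two-time correlators take the standard form C_ij = cos(ω(tj − ti)).

```python
import numpy as np

# Classical check: for any fixed history Q(t1), Q(t2), Q(t3) = ±1, the
# combination Q1*Q2 + Q2*Q3 - Q1*Q3 is at most 1 (it is either 1 or -3),
# so averaging over realisations gives K3 = C12 + C23 - C13 <= 1.
histories = [(q1, q2, q3) for q1 in (+1, -1)
                          for q2 in (+1, -1)
                          for q3 in (+1, -1)]
classical_max = max(q1*q2 + q2*q3 - q1*q3 for (q1, q2, q3) in histories)
print(classical_max)  # 1

# Quantum comparison (illustrative model): a spin-1/2 precessing at angular
# frequency omega, measured projectively in the sigma_z basis at equally
# spaced times, has two-time correlators C_ij = cos(omega*(tj - ti)). With
# spacing tau and theta = omega*tau this gives
#   K3(theta) = 2*cos(theta) - cos(2*theta),
# whose maximum 1.5 (at theta = pi/3) violates the macrorealist bound of 1.
theta = np.linspace(0.0, np.pi, 10001)
K3 = 2*np.cos(theta) - np.cos(2*theta)
print(K3.max())  # ≈ 1.5
```

The gap between the classical maximum of 1 and the quantum value of 3/2 is what experiments aim to resolve.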
In the derivations above, it has been assumed that the quantity Q, representing the state of the system, always has a definite value (macrorealism per se) and that its measurement at a certain time does not change this value nor its subsequent evolution (noninvasive measurability). A violation of the Leggett–Garg inequality implies that at least one of these two assumptions fails.
One of the first proposed experiments for demonstrating a violation of macroscopic realism employs superconducting quantum interference devices. There, using Josephson junctions, one should be able to prepare macroscopic superpositions of left- and right-rotating macroscopically large electronic currents in a superconducting ring. Under sufficient suppression of decoherence, a violation of the Leggett–Garg inequality should then be demonstrable. However, some criticism has been raised concerning the nature of indistinguishable electrons in a Fermi sea.
A criticism of some other proposed experiments on the Leggett–Garg inequality is that they do not really show a violation of macrorealism because they are essentially about measuring spins of individual particles. In 2015, Robens et al. demonstrated an experimental violation of the Leggett–Garg inequality with a massive particle, using superpositions of positions instead of spin. To date, the caesium atoms employed in their experiment remain the largest quantum objects used to experimentally test the Leggett–Garg inequality.
The experiments of Robens et al. as well as Knee et al., using ideal negative measurements, also avoid a second criticism (referred to as the "clumsiness loophole") that has been directed at previous experiments whose measurement protocols could be interpreted as invasive, thereby conflicting with postulate 2.
Several other experimental violations have been reported, including a 2016 violation obtained with neutrinos using the MINOS dataset.
Brukner and Kofler have also demonstrated that quantum violations can be found for arbitrarily large macroscopic systems. As an alternative to quantum decoherence, they propose a solution of the quantum-to-classical transition in terms of coarse-grained quantum measurements, under which a violation of the Leggett–Garg inequality can usually no longer be seen.
Experiments proposed by Mermin and by Braunstein and Mann would be better for testing macroscopic realism, but the authors warn that the experiments may be complex enough to admit unforeseen loopholes in the analysis. A detailed discussion of the subject can be found in the review by Emary et al.
The four-term Leggett–Garg inequality is similar in form to the CHSH inequality. Moreover, equalities were proposed by Jaeger et al.
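The formal parallel with CHSH extends to the size of the quantum violation. In an idealized precessing-spin model (an illustrative assumption in which projective sigma_z measurements yield two-time correlators C_ij = cos(ω(tj − ti)) at equally spaced times), the four-term combination reaches 2√2, the same value as the Tsirelson bound of the CHSH inequality:

```python
import numpy as np

# Illustrative model: a precessing spin-1/2 measured projectively in sigma_z
# has two-time correlators C_ij = cos(omega*(tj - ti)). With four equally
# spaced times and theta = omega*tau, the four-term combination is
#   K4 = C12 + C23 + C34 - C14 = 3*cos(theta) - cos(3*theta),
# which peaks at 2*sqrt(2) ≈ 2.83 (at theta = pi/4): above the macrorealist
# bound of 2 and equal to the Tsirelson bound of the CHSH inequality.
theta = np.linspace(0.0, np.pi, 100001)
K4 = 3*np.cos(theta) - np.cos(3*theta)
print(K4.max())            # ≈ 2.828
print(theta[K4.argmax()])  # ≈ 0.785 (pi/4)
```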
The Einstein–Podolsky–Rosen paradox is a thought experiment proposed by physicists Albert Einstein, Boris Podolsky and Nathan Rosen (EPR), with which they argued that the description of physical reality provided by quantum mechanics was incomplete. In a 1935 paper titled "Can Quantum-Mechanical Description of Physical Reality be Considered Complete?", they argued for the existence of "elements of reality" that were not part of quantum theory, and speculated that it should be possible to construct a theory containing them. Resolutions of the paradox have important implications for the interpretation of quantum mechanics.
Quantum entanglement is a physical phenomenon that occurs when a pair or group of particles is generated, interacts, or shares spatial proximity in such a way that the quantum state of each particle of the pair or group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics lacking in classical mechanics.
Bell's theorem proves that quantum physics is incompatible with local hidden variable theories. It was introduced by physicist John Stewart Bell in a 1964 paper titled "On the Einstein Podolsky Rosen Paradox", referring to a 1935 thought experiment that Albert Einstein, Boris Podolsky and Nathan Rosen used to argue that quantum physics is an "incomplete" theory. By 1935, it was already recognized that the predictions of quantum physics are probabilistic. Einstein, Podolsky and Rosen presented a scenario that, in their view, indicated that quantum particles, like electrons and photons, must carry physical properties or attributes not included in quantum theory, and the uncertainties in quantum theory's predictions are due to ignorance of these properties, later termed "hidden variables". Their scenario involves a pair of widely separated physical objects, prepared in such a way that the quantum state of the pair is entangled.
The quantum Zeno effect is a feature of quantum-mechanical systems allowing a particle's time evolution to be arrested by measuring it frequently enough with respect to some chosen measurement setting.
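The freezing of the dynamics can be sketched numerically. The model below is a hypothetical two-level (Rabi) system with illustrative parameters, not drawn from any particular experiment: between measurements the state rotates freely, so each of the N intervals contributes a survival factor cos²(ΩT/(2N)).

```python
import numpy as np

# Minimal sketch of the quantum Zeno effect in a hypothetical two-level
# (Rabi) model. Omega and T are illustrative parameters, chosen so that an
# unmeasured system flips completely out of its initial state.
def survival(N, Omega=1.0, T=np.pi):
    """Probability of still finding the system in its initial state after
    total time T, given N equally spaced projective measurements."""
    # Each of the N free-evolution intervals contributes cos^2(Omega*T/(2N)).
    return float(np.cos(Omega * T / (2 * N)) ** (2 * N))

for N in (1, 10, 100, 1000):
    print(N, survival(N))
# The survival probability approaches 1 as measurements become more frequent.
```

With a single measurement at the end the system is always found flipped; with a thousand intermediate measurements it is found in its initial state over 99% of the time.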
In Bell test experiments, there may be problems of experimental design or set-up that affect the validity of the experimental findings. These problems are often referred to as "loopholes". See the article on Bell's theorem for the theoretical background to these experimental efforts. The purpose of the experiment is to test whether nature is best described using a local hidden variable theory or by the quantum entanglement theory of quantum mechanics.
A Bell test experiment or Bell's inequality experiment, also simply a Bell test, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. The experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. To date, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.
A Tsirelson bound is an upper limit to quantum mechanical correlations between distant events. Given that quantum mechanics is non-local, i.e., that quantum mechanical correlations violate Bell inequalities, a natural question to ask is "how non-local can quantum mechanics be?", or, more precisely, by how much can the Bell inequality be violated. The answer is precisely the Tsirelson bound for the particular Bell inequality in question. In general, this bound is lower than what would be possible without signalling faster than light, and much research has been dedicated to the question of why this is the case.
Quantum metrology is the study of making high-resolution and highly sensitive measurements of physical parameters using quantum theory to describe the physical systems, particularly exploiting quantum entanglement and quantum squeezing. This field promises to develop measurement techniques that give better precision than the same measurement performed in a classical framework. Together with quantum hypothesis testing, it represents an important theoretical model at the basis of quantum sensing.
In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not admit an interpretation in terms of a local realistic theory. Quantum nonlocality has been experimentally verified under different physical assumptions. Any physical theory that aims at superseding or replacing quantum theory should account for such experiments and therefore must also be nonlocal in this sense; quantum nonlocality is a property of the universe that is independent of our description of nature.
Objective-collapse theories, also known as models of spontaneous wave function collapse or dynamical reduction models, were formulated as a response to the measurement problem in quantum mechanics, to explain why and how quantum measurements always give definite outcomes, not a superposition of them as predicted by the Schrödinger equation, and more generally how the classical world emerges from quantum theory. The fundamental idea is that the unitary evolution of the wave function describing the state of a quantum system is approximate. It works well for microscopic systems, but progressively loses its validity when the mass / complexity of the system increases.
In quantum information and quantum computing, a cluster state is a type of highly entangled state of multiple qubits. Cluster states are generated in lattices of qubits with Ising-type interactions. A cluster C is a connected subset of a d-dimensional lattice, and a cluster state is a pure state of the qubits located on C. They differ from other types of entangled states, such as GHZ states or W states, in that it is more difficult to eliminate quantum entanglement in the case of cluster states. Another way of thinking of cluster states is as a particular instance of graph states, where the underlying graph is a connected subset of a d-dimensional lattice. Cluster states are especially useful in the context of the one-way quantum computer.
The Leggett inequalities, named for Anthony James Leggett, who derived them, are a related pair of mathematical expressions concerning the correlations of properties of entangled particles. They are fulfilled by a large class of physical theories based on particular non-local and realistic assumptions, that may be considered to be plausible or intuitive according to common physical reasoning.
High-precision experiments could reveal small previously unseen differences between the behavior of matter and antimatter. This prospect is appealing to physicists because it may show that nature is not Lorentz symmetric.
Anupam Garg is a professor in the department of Physics & Astronomy at Northwestern University, Illinois. He received his Ph.D. in 1983 from Cornell University. In 2012 he became a Fellow of the American Physical Society (APS) thanks to his work on molecular magnetism and macroscopic quantum phenomena.
Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other observables which are simultaneously measured. More formally, the measurement result of a quantum observable is dependent upon which other commuting observables are within the same measurement set.
Quantum illumination is a paradigm for target detection that employs quantum entanglement between a signal electromagnetic mode and an idler electromagnetic mode, as well as joint measurement of these modes. The signal mode is propagated toward a region of space, and it is either lost or reflected, depending on whether a target is absent or present, respectively. In principle, quantum illumination can be beneficial even if the original entanglement is completely destroyed by a lossy and noisy environment.
The Lieb-Robinson bound is a theoretical upper limit on the speed at which information can propagate in non-relativistic quantum systems. It demonstrates that information cannot travel instantaneously in quantum theory, even when the relativity limits of the speed of light are ignored. The existence of such a finite speed was discovered mathematically by Elliott Lieb and Derek William Robinson in 1972. It turns the locality properties of physical systems into the existence of, and an upper bound for, this speed. The bound is now known as the Lieb-Robinson bound and the speed is known as the Lieb-Robinson velocity. This velocity is always finite but not universal, depending on the details of the system under consideration. For finite-range, e.g. nearest-neighbor, interactions, this velocity is a constant independent of the distance travelled. In long-range interacting systems, this velocity remains finite, but it can increase with the distance travelled.
Quantum thermodynamics is the study of the relations between two independent physical theories: thermodynamics and quantum mechanics. The two independent theories address the physical phenomena of light and matter. In 1905 Albert Einstein argued that the requirement of consistency between thermodynamics and electromagnetism leads to the conclusion that light is quantized, obtaining the relation E = hν. This paper marked the dawn of quantum theory. In a few decades quantum theory became established with an independent set of rules. Currently quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. It differs from quantum statistical mechanics in the emphasis on dynamical processes out of equilibrium. In addition there is a quest for the theory to be relevant for a single individual quantum system.
Spin squeezing is a quantum process that decreases the variance of one of the angular momentum components in an ensemble of particles with a spin. The quantum states obtained are called spin squeezed states. Such states can be used for quantum metrology, as they can provide a better precision for estimating a rotation angle than classical interferometers.