Leggett–Garg inequality

The Leggett–Garg inequality, [1] named for Anthony James Leggett and Anupam Garg, is a mathematical inequality fulfilled by all macrorealistic physical theories. Here, macrorealism (macroscopic realism) is a classical worldview defined by the conjunction of two postulates: [1]

  1. Macrorealism per se: "A macroscopic object, which has available to it two or more macroscopically distinct states, is at any given time in a definite one of those states."
  2. Noninvasive measurability: "It is possible in principle to determine which of these states the system is in without any effect on the state itself, or on the subsequent system dynamics."

In quantum mechanics

In quantum mechanics, the Leggett–Garg inequality is violated, meaning that the time evolution of a system cannot be understood classically. The situation is similar to the violation of Bell's inequalities in Bell test experiments, which play an important role in understanding the nature of the Einstein–Podolsky–Rosen paradox; there, quantum entanglement plays the central role, whereas the Leggett–Garg inequality concerns correlations between measurements made on a single system at different times.

Two-state example

The simplest form of the Leggett–Garg inequality derives from examining a system that has only two possible states. These states have corresponding measurement values $Q = \pm 1$. The key here is that we have measurements at two different times, and one or more times between the first and last measurement. The simplest example is where the system is measured at three successive times $t_1 < t_2 < t_3$. Now suppose, for instance, that there is a perfect correlation of 1 between times $t_1$ and $t_3$. That is to say, that for $N$ realisations of the experiment, the temporal correlation reads

$$C_{13} = \frac{1}{N}\sum_{r=1}^{N} Q_r(t_1)\, Q_r(t_3) = 1.$$

We look at this case in some detail. What can be said about what happens at time $t_2$? Well, it is possible that $Q(t_2) = Q(t_1) = Q(t_3)$, so that if the value of $Q$ at $t_1$ is $+1$, then it is also $+1$ for both times $t_2$ and $t_3$. It is also quite possible that $Q(t_2) = -Q(t_1)$, so that the value of $Q$ at $t_1$ is flipped twice, and so has the same value at $t_3$ as it did at $t_1$. So, we can have both $Q(t_1)$ and $Q(t_2)$ anti-correlated as long as we have $Q(t_2)$ and $Q(t_3)$ anti-correlated. Yet another possibility is that there is no correlation between $Q(t_1)$ and $Q(t_2)$. That is, we could have $C_{12} = 0$. So, although it is known that if $Q = +1$ at $t_1$ it must also be $+1$ at $t_3$, the value at $t_2$ may as well be determined by the toss of a coin. We define $C_{ij}$ as $\langle Q(t_i)\, Q(t_j) \rangle$. In these three cases, we have $C_{12} = 1$, $-1$, and $0$, respectively.
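The coin-toss case is easy to simulate. Below is a minimal sketch (our own illustration, not from the original paper; the scenario and variable names are assumptions) in which $Q(t_3)$ copies $Q(t_1)$ exactly while $Q(t_2)$ is an independent coin toss:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of realisations of the experiment

# Q(t1) is a random sign, Q(t3) copies it exactly (perfect correlation),
# and Q(t2) is an independent coin toss.
q1 = rng.choice([-1, 1], size=N)
q2 = rng.choice([-1, 1], size=N)
q3 = q1.copy()

C12 = (q1 * q2).mean()
C23 = (q2 * q3).mean()
C13 = (q1 * q3).mean()
print(C12, C23, C13)  # approximately 0, approximately 0, exactly 1
```

As expected, $C_{12}$ and $C_{23}$ vanish up to statistical noise while $C_{13} = 1$.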

All that was for 100% correlation between times $t_1$ and $t_3$. In fact, for any correlation between these times $C_{12} + C_{23} - C_{13} \le 1$. To see this, we note that

$$C_{12} + C_{23} - C_{13} = \frac{1}{N}\sum_{r=1}^{N} \left( Q_r(t_1)\, Q_r(t_2) + Q_r(t_2)\, Q_r(t_3) - Q_r(t_1)\, Q_r(t_3) \right).$$

It is easily seen that for every realisation $r$, the term in the parentheses must be less than or equal to unity, so that the result for the average is also less than (or equal to) unity. If we have four distinct times rather than three, we have $C_{12} + C_{23} + C_{34} - C_{14} \le 2$, and so on. These are the Leggett–Garg inequalities. They express the relation between the temporal correlation $\langle Q(t_1)\, Q(t_n) \rangle$ of the first and last measurements and the correlations between successive times in going from the start to the end.
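Since each $Q_r(t_i)$ is $\pm 1$, the bound can also be verified by brute force. A minimal sketch (our own illustration, not part of the original derivation) enumerates all sign assignments for three and four times:

```python
from itertools import product

# Macrorealism per se: each realisation assigns definite values
# q1, q2, q3 in {-1, +1} to the three measurement times.
worst3 = max(q1 * q2 + q2 * q3 - q1 * q3
             for q1, q2, q3 in product([-1, 1], repeat=3))
print(worst3)  # 1: no assignment exceeds the three-time bound

# Four-time analogue of the term inside the average.
worst4 = max(q1 * q2 + q2 * q3 + q3 * q4 - q1 * q4
             for q1, q2, q3, q4 in product([-1, 1], repeat=4))
print(worst4)  # 2: matches the four-time bound
```

Because every realisation contributes at most 1 (respectively 2), the ensemble average cannot exceed that value, which is exactly the Leggett–Garg bound.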

In the derivations above, it has been assumed that the quantity Q, representing the state of the system, always has a definite value (macrorealism per se) and that its measurement at a certain time does not change this value nor its subsequent evolution (noninvasive measurability). A violation of the Leggett–Garg inequality implies that at least one of these two assumptions fails.
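For contrast, consider an idealized two-level system, initially in the maximally mixed state, that precesses under the Hamiltonian $H = (\omega/2)\,\sigma_x$ while $Q = \sigma_z$ is measured projectively at equally spaced times. For this standard textbook case (see, e.g., the review by Emary et al. [15]) the two-time correlator is $C(\tau) = \cos\omega\tau$, and $K = C_{12} + C_{23} - C_{13} = 2\cos\omega\tau - \cos 2\omega\tau$ reaches $3/2$ at $\omega\tau = \pi/3$. The sketch below (our own illustration, with assumed parameter names) computes this numerically:

```python
import numpy as np

# Idealized qubit: Q = sigma_z measured projectively, free evolution
# U = exp(-i * (omega/2) * sigma_x * tau) between measurements,
# maximally mixed initial state. (Illustrative sketch only.)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
P = {+1: (np.eye(2) + sz) / 2, -1: (np.eye(2) - sz) / 2}

def correlator(theta):
    """Two-time correlator C for rotation angle theta = omega * tau."""
    U = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sx
    rho = np.eye(2) / 2  # maximally mixed initial state
    c = 0.0
    for a in (+1, -1):
        post = P[a] @ rho @ P[a]          # unnormalised post-measurement state
        evolved = U @ post @ U.conj().T   # evolve to the later time
        for b in (+1, -1):
            c += a * b * np.trace(P[b] @ evolved).real
    return c

theta = np.pi / 3  # omega * tau giving the maximal quantum violation
K = 2 * correlator(theta) - correlator(2 * theta)
print(K)  # 1.5 > 1: the Leggett-Garg bound is violated
```

Note that $C_{13}$ here refers to runs in which the system evolves undisturbed from $t_1$ to $t_3$; quantum mechanically, inserting the intermediate measurement would change the statistics, which is precisely how the conjunction of the two macrorealist postulates fails.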

Experimental violations

One of the first experiments proposed for demonstrating a violation of macroscopic realism employs superconducting quantum interference devices. There, using Josephson junctions, one should be able to prepare macroscopic superpositions of left- and right-rotating macroscopically large electronic currents in a superconducting ring. Under sufficient suppression of decoherence, one should be able to demonstrate a violation of the Leggett–Garg inequality. [2] However, some criticism has been raised concerning the nature of indistinguishable electrons in a Fermi sea. [3] [4]

A criticism of some other proposed Leggett–Garg experiments is that they do not really show a violation of macrorealism because they essentially measure the spins of individual particles. [5] In 2015, Robens et al. [6] demonstrated an experimental violation of the Leggett–Garg inequality with a massive particle, using superpositions of positions instead of spin. The cesium atoms employed in their experiment remain the largest quantum objects used to experimentally test the Leggett–Garg inequality. [7]

The experiments of Robens et al. [6] as well as Knee et al., [8] using ideal negative measurements, also avoid a second criticism (referred to as the "clumsiness loophole" [9] ) that has been directed at previous experiments whose measurement protocols could be interpreted as invasive, thereby conflicting with postulate 2.

Several other experimental violations have been reported, including one in 2016 with neutrinos, using the MINOS dataset. [10]

Brukner and Kofler have also demonstrated that quantum violations can be found for arbitrarily large macroscopic systems. As an alternative to quantum decoherence, they propose to explain the quantum-to-classical transition in terms of coarse-grained quantum measurements, under which a violation of the Leggett–Garg inequality can usually no longer be seen. [11] [12]

Experiments proposed by Mermin [13] and by Braunstein and Mann [14] would be better for testing macroscopic realism, but it has been cautioned that such experiments may be complex enough to admit unforeseen loopholes in the analysis. A detailed discussion of the subject can be found in the review by Emary et al. [15]

The four-term Leggett–Garg inequality can be seen to be similar to the CHSH inequality. Moreover, Bell-type equalities were proposed by Jaeger et al. [16]


References

  1. Leggett, A. J.; Garg, Anupam (1985). "Quantum mechanics versus macroscopic realism: Is the flux there when nobody looks?". Physical Review Letters. 54 (9): 857–860. Bibcode:1985PhRvL..54..857L. doi:10.1103/physrevlett.54.857. PMID 10031639.
  2. Leggett, A. J. (2002). "Testing the limits of quantum mechanics: motivation, state of play, prospects". Journal of Physics: Condensed Matter. 14 (15): R415–R451. doi:10.1088/0953-8984/14/15/201.
  3. Wilde, Mark M.; Mizel, Ari (2012). "Addressing the Clumsiness Loophole in a Leggett-Garg Test of Macrorealism". Foundations of Physics. 42 (2): 256–265. arXiv:1001.1777. Bibcode:2012FoPh...42..256W. doi:10.1007/s10701-011-9598-4.
  4. Palacios-Laloy, A. (2010). Superconducting qubit in a resonator: test of the Leggett-Garg inequality and single-shot readout (PhD thesis).
  5. Auletta, Gennaro; Parisi, Giorgio (2001). Foundations and Interpretation of Quantum Mechanics. World Scientific. ISBN 978-981-02-4614-3.
  6. Robens, Carsten; Alt, Wolfgang; Meschede, Dieter; Emary, Clive; Alberti, Andrea (2015). "Ideal Negative Measurements in Quantum Walks Disprove Theories Based on Classical Trajectories". Physical Review X. 5 (1): 011003. Bibcode:2015PhRvX...5a1003R. doi:10.1103/physrevx.5.011003.
  7. Knee, George C. (2015). "Viewpoint: Do Quantum Superpositions Have a Size Limit?". Physics. 8 (6). doi:10.1103/Physics.8.6.
  8. Knee, George C.; Simmons, Stephanie; Gauger, Erik M.; Morton, John J. L.; Riemann, Helge; et al. (2012). "Violation of a Leggett–Garg inequality with ideal non-invasive measurements". Nature Communications. 3 (1): 606. arXiv:1104.0238. Bibcode:2012NatCo...3..606K. doi:10.1038/ncomms1614. PMC 3272582. PMID 22215081.
  9. Wilde, Mark M.; Mizel, Ari (2011). "Addressing the Clumsiness Loophole in a Leggett-Garg Test of Macrorealism". Foundations of Physics. 42 (2): 256–265. arXiv:1001.1777. doi:10.1007/s10701-011-9598-4.
  10. Formaggio, J. A.; Kaiser, D. I.; Murskyj, M. M.; Weiss, T. E. (2016). "Violation of the Leggett-Garg Inequality in Neutrino Oscillations". Physical Review Letters. 117 (5): 050402. arXiv:1602.00041. Bibcode:2016PhRvL.117e0402F. doi:10.1103/physrevlett.117.050402. PMID 27517759.
  11. Kofler, Johannes; Brukner, Časlav (2007). "Classical World Arising out of Quantum Physics under the Restriction of Coarse-Grained Measurements". Physical Review Letters. 99 (18): 180403. arXiv:quant-ph/0609079. Bibcode:2007PhRvL..99r0403K. doi:10.1103/physrevlett.99.180403. PMID 17995385.
  12. Kofler, Johannes; Brukner, Časlav (2008). "Conditions for Quantum Violation of Macroscopic Realism". Physical Review Letters. 101 (9): 090403. arXiv:0706.0668. Bibcode:2008PhRvL.101i0403K. doi:10.1103/physrevlett.101.090403. PMID 18851590.
  13. Mermin, N. David (1990). "Extreme quantum entanglement in a superposition of macroscopically distinct states". Physical Review Letters. 65 (15): 1838–1840. Bibcode:1990PhRvL..65.1838M. doi:10.1103/physrevlett.65.1838. PMID 10042377.
  14. Braunstein, Samuel L.; Mann, A. (1993). "Noise in Mermin's n-particle Bell inequality". Physical Review A. 47 (4): R2427–R2430. Bibcode:1993PhRvA..47.2427B. doi:10.1103/physreva.47.r2427. PMID 9909338.
  15. Emary, Clive; Lambert, Neill; Nori, Franco (2014). "Leggett–Garg inequalities". Reports on Progress in Physics. 77 (1): 016001. arXiv:1304.5133. Bibcode:2014RPPh...77a6001E. doi:10.1088/0034-4885/77/1/016001.
  16. Jaeger, Gregg; Viger, Chris; Sarkar, Sahotra (1996). "Bell-type equalities for SQUIDs on the assumptions of macroscopic realism and non-invasive measurability". Physics Letters A. 210 (1–2): 5–10. Bibcode:1996PhLA..210....5J. doi:10.1016/0375-9601(95)00821-7.