# Quantum foundations

Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, to reformulate it, and even to propose generalizations of it. In contrast to other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical motivation. While they lead to correct experimental predictions, they do not come with a mental picture of the world in which they fit.

There exist different approaches to resolve this conceptual gap:

• First, one can contrast quantum physics with classical physics: by identifying scenarios, such as Bell experiments, where quantum theory radically deviates from classical predictions, one hopes to gain physical insight into the structure of quantum physics.
• Second, one can attempt to find a re-derivation of the quantum formalism in terms of operational axioms.
• Third, one can search for a full correspondence between the mathematical elements of the quantum framework and physical phenomena: any such correspondence is called an interpretation.
• Fourth, one can renounce quantum theory altogether and propose a different model of the world.

Research in quantum foundations is structured along these lines.

## Non-classical features of quantum theory

### Quantum nonlocality

Two or more separate parties conducting measurements over a quantum state can observe correlations which cannot be explained with any local hidden variable theory. [1] [2] Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate, [3] [4] but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics enforces on the correlations observed in a Bell experiment or more complex causal scenarios. [5] This research program has so far provided a generalization of Bell’s theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence. [6]
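The canonical example is the Clauser–Horne–Shimony–Holt (CHSH) inequality: local hidden variable theories bound the CHSH combination of correlators by 2, while measurements on the singlet state reach 2√2 (Tsirelson's bound). A minimal sketch, using the textbook singlet correlation E(α, β) = −cos(α − β) and one standard choice of measurement angles:

```python
from math import cos, pi, sqrt, isclose

def E(alpha: float, beta: float) -> float:
    """Correlator of spin measurements at angles alpha (Alice) and beta (Bob)
    on the two-qubit singlet state."""
    return -cos(alpha - beta)

# One choice of angles that saturates Tsirelson's bound
a, a_prime = 0.0, pi / 2
b, b_prime = pi / 4, 3 * pi / 4

# CHSH combination: any local hidden variable model satisfies |S| <= 2
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

assert abs(S) > 2                        # the Bell-CHSH inequality is violated
assert isclose(abs(S), 2 * sqrt(2))      # ... up to the quantum maximum 2*sqrt(2)
```

This only reproduces the quantum prediction for the singlet correlations; the content of Bell's theorem [1] is that no local assignment of predetermined outcomes can match it.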

### Quantum contextuality

Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured (namely, on which other observables are being measured as well). The original definition of measurement contextuality can be extended to state preparations and even general physical transformations. [7]
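A standard state-independent illustration is the Peres–Mermin magic square: a 3×3 array of two-qubit observables in which the observables in each row and each column commute, every row multiplies to +I, and the columns multiply to +I, +I and −I. No context-independent assignment of values ±1 to the nine observables can reproduce all six product constraints. The following sketch, in plain Python with no external libraries, verifies the algebraic identities behind the argument:

```python
# Peres-Mermin magic square: algebraic core of a state-independent
# contextuality proof. Matrices are nested lists of (complex) numbers.

def kron(A, B):
    """Kronecker product of two square matrices."""
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m] for j in range(n * m)]
            for i in range(n * m)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I2 = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

# Entry (r, c) of the square is a two-qubit Pauli observable.
square = [
    [kron(X, I2), kron(I2, X), kron(X, X)],
    [kron(I2, Z), kron(Z, I2), kron(Z, Z)],
    [kron(X, Z),  kron(Z, X),  kron(Y, Y)],
]

def row_product(r):
    return matmul(matmul(square[r][0], square[r][1]), square[r][2])

def col_product(c):
    return matmul(matmul(square[0][c], square[1][c]), square[2][c])

I4 = kron(I2, I2)
minus_I4 = [[-x for x in row] for row in I4]

for r in range(3):
    assert row_product(r) == I4      # every row multiplies to +I
assert col_product(0) == I4
assert col_product(1) == I4
assert col_product(2) == minus_I4    # the last column multiplies to -I
```

Any noncontextual value assignment would force the product of all nine values to be +1 (row-wise) and −1 (column-wise) simultaneously, which is the contradiction.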

### Epistemic models for the quantum wave-function

A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability that an event occurs is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a “real” property of the system under consideration.

There is an ongoing debate on whether the wave-function represents the epistemic state of a yet-to-be-discovered ontic variable or is, on the contrary, a fundamental entity. [8] Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above. [9] Note that, in QBism [10] and Copenhagen-type [11] views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one’s expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.

## Axiomatic reconstructions

Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations is therefore to find alternative formulations of quantum theory which rely on physically compelling principles. Those efforts come in two flavors, depending on the desired level of description of the theory: the so-called Generalized Probabilistic Theories approach and the black-box approach.

### The framework of Generalized Probabilistic Theories

Generalized Probabilistic Theories (GPTs) are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory’s most remarkable features, such as entanglement or teleportation. [12] Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory. [13]

L. Hardy introduced the concept of GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles. [13] Although Hardy’s work was very influential (see the follow-ups below), one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one. [14] The work of Dakic and Brukner eliminated this “axiom of simplicity” and provided a reconstruction of quantum theory based on three physical principles. [14] This was followed by the more rigorous reconstruction of Masanes and Müller. [15]

Axioms common to these three reconstructions are:

• The subspace axiom: systems which can store the same amount of information are physically equivalent.
• Local tomography: to characterize the state of a composite system it is enough to conduct measurements at each part.
• Reversibility: for any two extremal states [i.e., states which are not statistical mixtures of other states], there exists a reversible physical transformation that maps one into the other.

An alternative GPT reconstruction proposed by Chiribella et al. [16] [17] around the same time is instead based on the

• Purification axiom: for any state $S_A$ of a physical system $A$ there exist a bipartite physical system $AB$ and an extremal state (or purification) $T_{AB}$ such that $S_A$ is the restriction of $T_{AB}$ to system $A$. In addition, any two such purifications $T_{AB}, T'_{AB}$ of $S_A$ can be mapped into one another via a reversible physical transformation on system $B$.
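Within standard quantum theory the axiom is easily illustrated: a mixed qubit state ρ = diag(p, 1−p) is purified by the entangled state |ψ⟩ = √p |00⟩ + √(1−p) |11⟩, whose restriction (partial trace over the second system) recovers ρ. A minimal numerical sketch:

```python
from math import sqrt, isclose

p = 0.3
# Purification of rho = diag(p, 1-p):  |psi> = sqrt(p)|00> + sqrt(1-p)|11>,
# amplitudes listed in the basis |00>, |01>, |10>, |11>.
psi = [sqrt(p), 0.0, 0.0, sqrt(1 - p)]

# Restriction to system A (partial trace over B):
# rho_A[i][j] = sum_k psi_{ik} * conj(psi_{jk}); amplitudes here are real.
rho_A = [[sum(psi[2 * i + k] * psi[2 * j + k] for k in range(2))
          for j in range(2)] for i in range(2)]

assert isclose(rho_A[0][0], p)        # diagonal of rho recovered
assert isclose(rho_A[1][1], 1 - p)
assert isclose(rho_A[0][1], 0.0)      # off-diagonal terms vanish
```

The nontrivial content of the axiom is that *every* state of *every* system in the theory admits such a purification, unique up to a reversible transformation on the purifying system.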

The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model. [18]

Against the success of the GPT approach, one can object that all such works recover only finite-dimensional quantum theory. In addition, none of the previous axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.

### The framework of black boxes

In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input (the type of experiment) and obtains an output (the outcome of the experiment). Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.

From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible. [19] [20] [21] The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.

The physical principles proposed so far include no-signalling, [21] Non-Trivial Communication Complexity, [22] No-Advantage for Nonlocal Computation, [23] Information Causality, [24] Macroscopic Locality, [25] and Local Orthogonality. [26] All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: this means that they can be falsified under the assumption that we can decide whether two or more events are space-like separated. The drawback of the device-independent approach is that, even when taken together, all the aforementioned physical principles do not suffice to single out the set of quantum correlations. [27] In other words: all such reconstructions are partial.
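The paradigmatic supra-quantum example is the Popescu–Rohrlich (PR) box, [21] a bipartite black box that respects no-signalling yet reaches the algebraic maximum of the CHSH expression, 4, above the quantum maximum of 2√2. A minimal sketch of its correlations:

```python
from itertools import product

# PR box: P(a, b | x, y) = 1/2 if a XOR b == x AND y, and 0 otherwise,
# with inputs x, y and outputs a, b all in {0, 1}.
def P(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def E(x, y):
    """Correlator <A_x B_y>, mapping outcomes to +/-1 via (-1)**a, (-1)**b."""
    return sum((-1) ** (a + b) * P(a, b, x, y)
               for a, b in product(range(2), repeat=2))

# No-signalling: Alice's marginal is independent of Bob's input (and vice versa).
for a, x in product(range(2), repeat=2):
    assert sum(P(a, b, x, 0) for b in range(2)) == \
           sum(P(a, b, x, 1) for b in range(2))

# CHSH value: the algebraic maximum, beyond Tsirelson's bound 2*sqrt(2) ~ 2.83
S = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
assert S == 4.0
```

Ruling out this box, and the many weaker supra-quantum boxes between 2√2 and 4, is precisely what the principles listed above attempt (so far only partially) to achieve.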

## Interpretations of quantum theory

An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.

## Extensions of quantum theory

In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.

### Collapse models

Collapse models posit the existence of natural processes which periodically localize the wave-function. [28] Such theories provide an explanation for the nonexistence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.

### Quantum Measure Theory

In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unitary rays and Hermitian operators, but through a single matrix-like object, the decoherence functional. [29] The entries of the decoherence functional determine the feasibility of experimentally discriminating between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations. [30]

### Acausal quantum processes

The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any higher-order map from quantum instruments (namely, measurement processes) to probabilities should also be physically realizable. [31] Any such map is termed a process matrix. As shown by Oreshkov et al., [31] some process matrices describe situations where the notion of global causality breaks down.

The starting point of this claim is the following thought experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels through which a quantum system periodically enters and leaves. While a system is inside a room, Alice and Bob can interact with it in any way; in particular, they can measure some of its properties.

Since Alice and Bob’s interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system before, after, or at the same time as Bob, or with any convex combination of these three situations. [31] Such processes are called acausal.

## References

1. Bell, J. S. (1964). "On the Einstein Podolsky Rosen Paradox". Physics Physique Физика. 1 (3): 195–200.
2. Mermin, N. David (1993). "Hidden Variables and the Two Theorems of John Bell". Reviews of Modern Physics. 65 (3): 803–815.
3. Werner, R. F. (2014). "Comment on 'What Bell did'". Journal of Physics A. 47: 424011. doi:10.1088/1751-8113/47/42/424011.
4. Żukowski, M.; Brukner, Č. (2014). "Quantum non-locality—it ain't necessarily so...". Journal of Physics A. 47: 424009. doi:10.1088/1751-8113/47/42/424009.
5. Fritz, T. (2012). "Beyond Bell's Theorem: Correlation Scenarios". New Journal of Physics. 14: 103001.
6. Bancal, Jean-Daniel; Pironio, Stefano; Acín, Antonio; Liang, Yeong-Cherng; Scarani, Valerio; Gisin, Nicolas (2012). "Quantum nonlocality based on finite-speed causal influences leads to superluminal signaling". Nature Physics. 8: 867.
7. Spekkens, R. W. (2005). "Contextuality for preparations, transformations, and unsharp measurements". Physical Review A. 71 (5): 052108. doi:10.1103/PhysRevA.71.052108.
8. Harrigan, N.; Spekkens, R. W. (2010). "Einstein, Incompleteness, and the Epistemic View of Quantum States". Foundations of Physics. 40 (2): 125–157. doi:10.1007/s10701-009-9347-0.
9. Pusey, M. F.; Barrett, J.; Rudolph, T. (2012). "On the reality of the quantum state". Nature Physics. 8 (6): 475–478. doi:10.1038/nphys2309.
10. Fuchs, C. A. (2010). "QBism, the Perimeter of Quantum Bayesianism".
11. Schlosshauer, M.; Kofler, J.; Zeilinger, A. (2013). "A snapshot of foundational attitudes toward quantum mechanics". Studies in History and Philosophy of Science Part B. 44 (3): 222–230. doi:10.1016/j.shpsb.2013.04.004.
12. Barnum, H.; Barrett, J.; Leifer, M.; Wilce, A. (2012). "Teleportation in General Probabilistic Theories". In S. Abramsky and M. Mislove (eds.). AMS Proceedings of Symposia in Applied Mathematics. American Mathematical Society, Providence.
13. Hardy, L. (2001). "Quantum Theory From Five Reasonable Axioms".
14. Dakic, B.; Brukner, Č. (2011). "Quantum Theory and Beyond: Is Entanglement Special?". In H. Halvorson (ed.). Deep Beauty: Understanding the Quantum World through Mathematical Innovation. Cambridge University Press. pp. 365–392.
15. Masanes, L.; Müller, M. (2011). "A derivation of quantum theory from physical requirements". New Journal of Physics. 13: 063001.
16. Chiribella, G.; D'Ariano, G. M.; Perinotti, P. (2011). "Informational derivation of Quantum Theory". Physical Review A. 84: 012311.
17. D'Ariano, G. M.; Chiribella, G.; Perinotti, P. (2017). Quantum Theory from First Principles: An Informational Approach. Cambridge University Press. ISBN 9781107338340.
18. Appleby, M.; Fuchs, C. A.; Stacey, B. C.; Zhu, H. (2017). "Introducing the Qplex: a novel arena for quantum theory". European Physical Journal D. 71: 197. doi:10.1140/epjd/e2017-80024-y.
19. Rastall, Peter (1985). "Locality, Bell's theorem, and quantum mechanics". Foundations of Physics. 15 (9): 963–972. doi:10.1007/bf00739036.
20. Khalfin, L. A.; Tsirelson, B. S. (1985). "Quantum and quasi-classical analogs of Bell inequalities". In Lahti et al. (eds.). Symposium on the Foundations of Modern Physics. World Scientific. pp. 441–460.
21. Popescu, S.; Rohrlich, D. (1994). "Nonlocality as an axiom". Foundations of Physics. 24 (3): 379–385. doi:10.1007/BF02058098.
22. Brassard, G.; Buhrman, H.; Linden, N.; Methot, A. A.; Tapp, A.; Unger, F. (2006). "Limit on Nonlocality in Any World in Which Communication Complexity Is Not Trivial". Physical Review Letters. 96: 250401. doi:10.1103/PhysRevLett.96.250401.
23. Linden, N.; Popescu, S.; Short, A. J.; Winter, A. (2007). "Quantum Nonlocality and Beyond: Limits from Nonlocal Computation". Physical Review Letters. 99 (18): 180502. doi:10.1103/PhysRevLett.99.180502.
24. Pawlowski, M.; Paterek, T.; Kaszlikowski, D.; Scarani, V.; Winter, A.; Zukowski, M. (2009). "Information Causality as a Physical Principle". Nature. 461 (7267): 1101–1104. doi:10.1038/nature08400. PMID 19847260.
25. Navascués, M.; Wunderlich, H. (2009). "A Glance Beyond the Quantum Model". Proceedings of the Royal Society A. 466 (2115): 881–890.
26. Fritz, T.; Sainz, A. B.; Augusiak, R.; Brask, J. B.; Chaves, R.; Leverrier, A.; Acín, A. (2013). "Local orthogonality as a multipartite principle for quantum correlations". Nature Communications. 4: 2263. doi:10.1038/ncomms3263. PMID 23948952.
27. Navascués, M.; Guryanova, Y.; Hoban, M. J.; Acín, A. (2015). "Almost Quantum Correlations". Nature Communications. 6: 6288. doi:10.1038/ncomms7288. PMID 25697645.
28. Ghirardi, G. C.; Rimini, A.; Weber, T. (1986). "Unified dynamics for microscopic and macroscopic systems". Physical Review D. 34: 470. doi:10.1103/PhysRevD.34.470.
29. Sorkin, R. D. (1994). "Quantum Mechanics as Quantum Measure Theory". Modern Physics Letters A. 9: 3119–3128. doi:10.1142/S021773239400294X.
30. Dowker, F.; Henson, J.; Wallden, P. (2014). "A histories perspective on characterizing quantum non-locality". New Journal of Physics. 16.
31. Oreshkov, O.; Costa, F.; Brukner, Č. (2012). "Quantum correlations with no causal order". Nature Communications. 3: 1092.