Generalized probabilistic theory

A generalized probabilistic theory (GPT) is a general framework to describe the operational features of arbitrary physical theories. A GPT must specify what kind of physical systems one can find in the lab, as well as rules to compute the outcome statistics of any experiment involving labeled preparations, transformations and measurements. The framework of GPTs has been used to define hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement [1] [2] or teleportation. [3] Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory. [4] [5] [6] [7]

History

The mathematical formalism of GPTs has been developed since the 1950s and 1960s by many authors, and rediscovered independently several times. The earliest ideas are due to Segal [8] and Mackey, [9] although the first comprehensive and mathematically rigorous treatment can be traced back to the work of Ludwig, Dähn, and Stolz, all three based at the University of Marburg. [10] [11] [12] [13] [14] [15] While the formalism in these earlier works is less similar to the modern one, already in the early 1970s the ideas of the Marburg school had matured and the notation had developed towards the modern usage, thanks also to the independent contribution of Davies and Lewis. [16] [17] The books by Ludwig and the proceedings of a conference held in Marburg in 1973 offer a comprehensive account of these early developments. [18] [4] The term "generalized probabilistic theory" itself was coined by Jonathan Barrett in 2007, [19] based on the version of the framework introduced by Lucien Hardy. [5]

Note that some authors use the term operational probabilistic theory (OPT). [6] [20] OPTs are an alternative way to define hypothetical non-quantum physical theories, based on the language of category theory, in which one specifies the axioms that observations should satisfy.

Definition

A GPT is specified by a number of mathematical structures, namely:

- a collection of system types, together with a rule describing how systems compose;
- for each system type, a state space describing the possible preparations;
- a set of measurement outcomes (effects), which assign outcome probabilities to states;
- a set of physical operations (transformations), which map states to states.

It can be argued that if one can prepare a state ω₁ and a different state ω₂, then one can also toss a (possibly biased) coin which lands on one side with probability p and on the other with probability 1 − p, and prepare either ω₁ or ω₂ depending on the side the coin lands on. The resulting state is a statistical mixture of the states ω₁ and ω₂, and in GPTs such statistical mixtures are described by convex combinations, in this case pω₁ + (1 − p)ω₂. For this reason all state spaces are assumed to be convex sets. Following a similar reasoning, one can argue that the set of measurement outcomes and the set of physical operations must be convex as well.
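The coin-toss argument above can be sketched in code. This is a minimal illustration, not part of any standard library: it uses classical probability vectors as the state space, and the helper names (`mix`, `is_state`) are hypothetical.

```python
# Sketch: a statistical mixture of two GPT states, here modeled as
# classical probability vectors. Convexity means the mixture is again
# a valid state of the theory.

def mix(p, omega1, omega2):
    """Convex combination p*omega1 + (1 - p)*omega2 of two states."""
    return [p * a + (1 - p) * b for a, b in zip(omega1, omega2)]

def is_state(omega, tol=1e-12):
    """A classical state is a probability vector: nonnegative, sums to 1."""
    return all(x >= -tol for x in omega) and abs(sum(omega) - 1) < tol

omega1 = [1.0, 0.0, 0.0]   # deterministic preparation of outcome 0
omega2 = [0.0, 0.5, 0.5]   # uniform mixture over outcomes 1 and 2

mixture = mix(0.3, omega1, omega2)   # coin lands on "omega1" with prob 0.3
assert is_state(mixture)             # convexity: the mixture is a state
```

The same check works for any convex state space once `is_state` is replaced by the membership test of that space (e.g. positive semidefiniteness and unit trace for density matrices).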

Additionally, it is always assumed that measurement outcomes and physical operations are affine maps, i.e. that if T is a physical transformation, then we must have

T(pω₁ + (1 − p)ω₂) = pT(ω₁) + (1 − p)T(ω₂),

and similarly for measurement outcomes. This follows from the argument that we should obtain the same statistics whether we first prepare a statistical mixture and then apply the physical operation, or we prepare the corresponding statistical mixture of the outputs of the physical operation.
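The affinity condition can be verified numerically in a toy model. Here, as a sketch under the assumption of classical states, the transformation T is a column-stochastic matrix; the names are illustrative only.

```python
# Sketch: a physical operation T acts affinely on statistical mixtures,
# i.e. T(p*w1 + (1-p)*w2) == p*T(w1) + (1-p)*T(w2).

def apply(T, omega):
    """Apply the stochastic matrix T (list of rows) to the state omega."""
    return [sum(T[i][j] * omega[j] for j in range(len(omega)))
            for i in range(len(T))]

T = [[0.9, 0.2],
     [0.1, 0.8]]            # column-stochastic: each column sums to 1

omega1, omega2, p = [1.0, 0.0], [0.0, 1.0], 0.25
mixture = [p * a + (1 - p) * b for a, b in zip(omega1, omega2)]

lhs = apply(T, mixture)                      # transform the mixture
rhs = [p * a + (1 - p) * b                   # mix the transformed states
       for a, b in zip(apply(T, omega1), apply(T, omega2))]
assert all(abs(x - y) < 1e-12 for x, y in zip(lhs, rhs))
```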

Note that physical operations form a subset of all affine maps which transform states into states, as we must require that a physical operation yields a valid state even when it is applied to part of a larger system. (The notion of "part" is subtle: it is specified by describing how different system types compose and how the global parameters of the composite system are affected by local operations.)

For practical reasons it is often assumed that a general GPT is embedded in a finite-dimensional vector space, although infinite-dimensional formulations exist. [21] [22]

Classical, quantum, and beyond

Classical theory is a GPT where states correspond to probability distributions, and both measurements and physical operations are stochastic maps. One can see that in this case all state spaces are simplices.

Standard quantum information theory is a GPT where system types are described by a natural number d which corresponds to the complex Hilbert space dimension. States of a system of Hilbert space dimension d are described by the normalized positive semidefinite d × d matrices, i.e., by the density matrices. Measurements are identified with positive operator-valued measures (POVMs), and the physical operations are completely positive maps. Systems compose via the tensor product of the underlying complex Hilbert spaces.
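The quantum case can be made concrete for a single qubit. The following is a minimal sketch (all helper names are illustrative): states are 2×2 density matrices, a measurement is a POVM, and the probability of outcome i is Tr(ρ·Eᵢ), the Born rule.

```python
# Sketch of the quantum GPT for a qubit: states are density matrices,
# measurements are POVMs, and outcome probabilities follow the Born rule.

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

def outcome_probabilities(rho, povm):
    """Born rule: probability of outcome i is Tr(rho * E_i)."""
    return [trace(mat_mul(rho, E)) for E in povm]

rho = [[0.5, 0.5],
       [0.5, 0.5]]              # the |+><+| state as a density matrix

povm = ([[1, 0], [0, 0]],       # E_0 = |0><0|
        [[0, 0], [0, 1]])       # E_1 = |1><1|; E_0 + E_1 = identity

probs = outcome_probabilities(rho, povm)
assert abs(sum(probs) - 1) < 1e-12   # probabilities are normalized
```

For the |+⟩ state measured in the computational basis, both outcomes occur with probability 1/2, as expected.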

Real quantum theory is the GPT obtained from standard quantum information theory by restricting the theory to real Hilbert spaces. It does not satisfy the axiom of local tomography. [23]

The framework of GPTs has provided examples of consistent physical theories which cannot be embedded in quantum theory and which indeed exhibit markedly non-quantum features. One of the first examples was Box-world, the theory with maximal non-local correlations. [19] Other examples are theories with third-order interference [24] and the family of GPTs known as generalized bits. [25]
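Box-world's maximal non-locality can be illustrated with the Popescu–Rohrlich (PR) box, the extremal bipartite correlation of that theory. The sketch below (helper names are illustrative) shows that the PR box reaches the algebraic maximum of the CHSH expression, 4, whereas quantum theory is bounded by 2√2 and local hidden-variable theories by 2.

```python
# Sketch: the Popescu-Rohrlich (PR) box, whose outputs satisfy
# a XOR b = x AND y, reaches CHSH = 4, the algebraic maximum.

def pr_box(x, y):
    """Joint distribution P(a, b | x, y) of the PR box."""
    return {(a, b): (0.5 if (a ^ b) == (x & y) else 0.0)
            for a in (0, 1) for b in (0, 1)}

def correlator(x, y):
    """Expectation of (-1)^(a XOR b) for settings (x, y)."""
    P = pr_box(x, y)
    return sum(P[a, b] * (1 if a == b else -1) for a, b in P)

chsh = (correlator(0, 0) + correlator(0, 1)
        + correlator(1, 0) - correlator(1, 1))
assert chsh == 4.0   # beyond Tsirelson's quantum bound of 2*sqrt(2)
```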

Many features that were considered purely quantum are actually present in all non-classical GPTs. These include the impossibility of universal broadcasting, i.e., the no-cloning theorem; [26] the existence of incompatible measurements; [22] [27] and the existence of entangled states or entangled measurements. [1] [2]


References

  1. Aubrun, Guillaume; Lami, Ludovico; Palazuelos, Carlos; Plávala, Martin (2021). "Entangleability of cones". Geometric and Functional Analysis. 31 (2): 181–205. arXiv:1911.09663. doi:10.1007/s00039-021-00565-5.
  2. Aubrun, Guillaume; Lami, Ludovico; Palazuelos, Carlos; Plávala, Martin (2022). "Entanglement and Superposition Are Equivalent Concepts in Any Physical Theory". Physical Review Letters. 128 (16): 160402. arXiv:2109.04446. doi:10.1103/PhysRevLett.128.160402. PMID 35522482.
  3. Barnum, H.; Barrett, J.; Leifer, M.; Wilce, A. (2012). "Teleportation in general probabilistic theories". Proceedings of Symposia in Applied Mathematics. 71: 25–48.
  4. Ludwig, Günther (2012). An Axiomatic Basis for Quantum Mechanics: Volume 1, Derivation of Hilbert Space Structure. Springer Science & Business Media. ISBN 978-3-642-70029-3.
  5. Hardy, L. (2001). "Quantum Theory From Five Reasonable Axioms". arXiv:quant-ph/0101012.
  6. Chiribella, Giulio; D’Ariano, Giacomo Mauro; Perinotti, Paolo (2011). "Informational derivation of quantum theory". Physical Review A. 84 (1): 012311. arXiv:1011.6451. doi:10.1103/PhysRevA.84.012311.
  7. van de Wetering, John (2019). "An effect-theoretic reconstruction of quantum theory". Compositionality. 1: 1. arXiv:1801.05798. doi:10.32408/compositionality-1-1.
  8. Segal, I. E. (1947). "Postulates for General Quantum Mechanics". Annals of Mathematics. 48 (4): 930–948. doi:10.2307/1969387. JSTOR 1969387.
  9. Mackey, George W. (1960). Lecture Notes on the Mathematical Foundations of Quantum Mechanics. Cambridge: Harvard University.
  10. Ludwig, Günther (1964). "Versuch einer axiomatischen Grundlegung der Quantenmechanik und allgemeinerer physikalischer Theorien". Zeitschrift für Physik. 181 (3): 233–260. doi:10.1007/BF01418533.
  11. Ludwig, Günther (1967). "Attempt of an axiomatic foundation of quantum mechanics and more general theories, II". Communications in Mathematical Physics. 4 (5): 331–348. doi:10.1007/BF01653647.
  12. Ludwig, Günther (1968). "Attempt of an axiomatic foundation of quantum mechanics and more general theories. III". Communications in Mathematical Physics. 9 (1): 1–12. doi:10.1007/BF01654027.
  13. Dähn, Günter (1968). "Attempt of an axiomatic foundation of quantum mechanics and more general theories. IV". Communications in Mathematical Physics. 9 (3): 192–211. doi:10.1007/BF01645686.
  14. Stolz, Peter (1969). "Attempt of an axiomatic foundation of quantum mechanics and more general theories V". Communications in Mathematical Physics. 11 (4): 303–313. doi:10.1007/BF01645851.
  15. Stolz, Peter (1971). "Attempt of an axiomatic foundation of quantum mechanics and more general theories VI". Communications in Mathematical Physics. 23 (2): 117–126. doi:10.1007/BF01877753.
  16. Ludwig, Günther (1972). "An improved formulation of some theorems and axioms in the axiomatic foundation of the Hilbert space structure of quantum mechanics". Communications in Mathematical Physics. 26 (1): 78–86. doi:10.1007/BF01877548.
  17. Davies, E. B.; Lewis, J. T. (1970). "An operational approach to quantum probability". Communications in Mathematical Physics. 17 (3): 239–260. doi:10.1007/BF01647093.
  18. Hartkämper, A.; Neumann, H., eds. (1974). Foundations of Quantum Mechanics and Ordered Linear Spaces: Advanced Study Institute Marburg 1973. Berlin, Heidelberg: Springer-Verlag. ISBN 978-3-540-38650-6.
  19. Barrett, J. (2007). "Information processing in generalized probabilistic theories". Physical Review A. 75 (3): 032304. arXiv:quant-ph/0508211. doi:10.1103/PhysRevA.75.032304.
  20. Chiribella, Giulio; D’Ariano, Giacomo Mauro; Perinotti, Paolo (2010). "Probabilistic theories with purification". Physical Review A. 81 (6): 062348. arXiv:0908.1583. doi:10.1103/PhysRevA.81.062348.
  21. Nuida, Koji; Kimura, Gen; Miyadera, Takayuki (2010). "Optimal observables for minimum-error state discrimination in general probabilistic theories". Journal of Mathematical Physics. 51 (9): 093505. arXiv:0906.5419. doi:10.1063/1.3479008.
  22. Kuramochi, Yui (2020). "Compatibility of any pair of 2-outcome measurements characterizes the Choquet simplex". Positivity. 24 (5): 1479–1486. arXiv:1912.00563. doi:10.1007/s11117-020-00742-0.
  23. Chiribella, Giulio; D’Ariano, Giacomo Mauro; Perinotti, Paolo (2010). "Probabilistic theories with purification". Physical Review A. 81 (6): 062348. arXiv:0908.1583. doi:10.1103/PhysRevA.81.062348.
  24. Dakić, B.; Paterek, T.; Brukner, C. (2014). "Density cubes and higher-order interference theories". New Journal of Physics. 16 (2): 023028. arXiv:1308.2822. doi:10.1088/1367-2630/16/2/023028.
  25. Pawłowski, M.; Winter, A. (2012). ""Hyperbits": The information quasiparticles". Physical Review A. 85 (2): 022331. arXiv:1106.2409. doi:10.1103/PhysRevA.85.022331.
  26. Barnum, Howard; Barrett, Jonathan; Leifer, Matthew; Wilce, Alexander (2007). "Generalized No-Broadcasting Theorem". Physical Review Letters. 99 (24): 240501. arXiv:0707.0620. doi:10.1103/PhysRevLett.99.240501. PMID 18233430.
  27. Plávala, Martin (2016). "All measurements in a probabilistic theory are compatible if and only if the state space is a simplex". Physical Review A. 94 (4): 042108. arXiv:1608.05614. doi:10.1103/PhysRevA.94.042108.