Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other (compatible) observables which are simultaneously measured (the measurement context). More formally, the measurement result (assumed pre-existing) of a quantum observable is dependent upon which other commuting observables are within the same measurement set.
Contextuality was first demonstrated to be a feature of quantum phenomenology by the Bell–Kochen–Specker theorem. [1] [2] The study of contextuality has developed into a major topic of interest in quantum foundations as the phenomenon crystallises certain non-classical and counter-intuitive aspects of quantum theory. A number of powerful mathematical frameworks have been developed to study and better understand contextuality, from the perspective of sheaf theory, [3] graph theory, [4] hypergraphs, [5] algebraic topology, [6] and probabilistic couplings. [7]
Nonlocality, in the sense of Bell's theorem, may be viewed as a special case of the more general phenomenon of contextuality, in which measurement contexts contain measurements that are distributed over spacelike separated regions. This follows from Fine's theorem. [8] [3]
Quantum contextuality has been identified as a source of quantum computational speedups and quantum advantage in quantum computing. [9] [10] [11] [12] Contemporary research has increasingly focused on exploring its utility as a computational resource.
The need for contextuality was discussed informally in 1935 by Grete Hermann, [13] but it was more than 30 years later when Simon B. Kochen and Ernst Specker, and separately John Bell, constructed proofs that any realistic hidden-variable theory able to explain the phenomenology of quantum mechanics is contextual for systems of Hilbert space dimension three and greater. The Kochen–Specker theorem proves that realistic noncontextual hidden-variable theories cannot reproduce the empirical predictions of quantum mechanics. [14] Such a theory would suppose that every quantum observable has a definite pre-existing value, and that this value is independent of which other compatible observables are measured alongside it (the measurement context).
In addition, Kochen and Specker constructed an explicitly noncontextual hidden-variable model for the two-dimensional qubit case in their paper on the subject, [1] thereby completing the characterisation of the dimensionality of quantum systems that can demonstrate contextual behaviour. Bell's proof invoked a weaker version of Gleason's theorem, reinterpreting the theorem to show that quantum contextuality exists only in Hilbert space dimension greater than two. [2]
The sheaf-theoretic, or Abramsky–Brandenburger, approach to contextuality initiated by Samson Abramsky and Adam Brandenburger is theory-independent and can be applied beyond quantum theory to any situation in which empirical data arises in contexts. As well as being used to study forms of contextuality arising in quantum theory and other physical theories, it has also been used to study formally equivalent phenomena in logic, [15] relational databases, [16] natural language processing, [17] and constraint satisfaction. [18]
In essence, contextuality arises when empirical data is locally consistent but globally inconsistent.
This framework gives rise in a natural way to a qualitative hierarchy of contextuality: (probabilistic) contextuality, which may be witnessed in the measurement statistics, e.g. through the violation of an inequality; logical contextuality, which may be witnessed in the "possibilistic" information about which joint outcomes are possible and which are impossible; and strong contextuality, for which no global assignment of outcomes is consistent even with the possible joint outcomes.
Each level in this hierarchy strictly includes the next. An important subclass of strong contextuality is all-versus-nothing contextuality, [15] a representative example of which is the Greenberger–Horne–Zeilinger proof of nonlocality.
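The following sketch (illustrative Python, not drawn from the cited works) makes the "locally consistent but globally inconsistent" idea concrete for the Greenberger–Horne–Zeilinger argument: the GHZ state satisfies ⟨XXX⟩ = +1 and ⟨XYY⟩ = ⟨YXY⟩ = ⟨YYX⟩ = −1, yet a brute-force search shows that no global assignment of ±1 values to the six local observables reproduces all four constraints, which is exactly the all-versus-nothing form of strong contextuality.

```python
# Brute-force check that the GHZ constraints admit no noncontextual value assignment.
from itertools import product

constraints = [  # (observables measured jointly, required product of their values)
    (("X1", "X2", "X3"), +1),
    (("X1", "Y2", "Y3"), -1),
    (("Y1", "X2", "Y3"), -1),
    (("Y1", "Y2", "X3"), -1),
]
labels = ["X1", "Y1", "X2", "Y2", "X3", "Y3"]

def has_global_assignment():
    for values in product((-1, +1), repeat=len(labels)):
        v = dict(zip(labels, values))
        if all(v[a] * v[b] * v[c] == target for (a, b, c), target in constraints):
            return True
    return False

print(has_global_assignment())  # -> False: the GHZ statistics are strongly contextual
```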
Adán Cabello, Simone Severini, and Andreas Winter introduced a general graph-theoretic framework for studying contextuality in different physical theories. [19] Within this framework experimental scenarios are described by graphs, and certain invariants of these graphs were shown to have particular physical significance. One way in which contextuality may be witnessed in measurement statistics is through the violation of noncontextuality inequalities (also known as generalized Bell inequalities). With respect to certain appropriately normalised inequalities, the independence number, Lovász number, and fractional packing number of the graph of an experimental scenario provide tight upper bounds on the degree to which classical theories, quantum theory, and generalised probabilistic theories, respectively, may exhibit contextuality in an experiment of that kind. A more refined framework based on hypergraphs rather than graphs is also used. [5]
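As a small illustration (assumed for exposition, not taken from the cited papers), the exclusivity graph of the KCBS scenario is the 5-cycle C5, for which the three invariants can be evaluated directly: the independence number α = 2 bounds classical models, the Lovász number ϑ = √5 bounds quantum models (using Lovász's closed form for odd cycles), and the fractional packing number α* = 5/2 bounds general probabilistic theories.

```python
# The three CSW graph invariants for the KCBS exclusivity graph (the 5-cycle).
from itertools import combinations
from math import cos, pi

n = 5
edges = {frozenset((i, (i + 1) % n)) for i in range(n)}   # the 5-cycle C5

def independence_number():
    """Largest set of vertices containing no edge."""
    for size in range(n, 0, -1):
        for subset in combinations(range(n), size):
            if not any(frozenset(pair) in edges for pair in combinations(subset, 2)):
                return size
    return 0

alpha = independence_number()                   # classical bound: 2
theta = n * cos(pi / n) / (1 + cos(pi / n))     # Lovász number of an odd cycle: sqrt(5)
alpha_star = n / 2                              # fractional packing number of C5: 5/2
print(alpha, round(theta, 3), alpha_star)       # -> 2 2.236 2.5
```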
In the Contextuality-by-Default (CbD) approach, [20] [21] [22] developed by Ehtibar Dzhafarov, Janne Kujala, and colleagues, (non)contextuality is treated as a property of any system of random variables, defined as a set in which each random variable $R_q^c$ is labeled by its content $q$ (the property it measures) and its context $c$ (the set of recorded circumstances under which it is recorded, including but not limited to which other random variables it is recorded together with); $R_q^c$ stands for "$q$ is measured in $c$". The variables within a context are jointly distributed, but variables from different contexts are stochastically unrelated, defined on different sample spaces. A (probabilistic) coupling of the system $\mathcal{R}$ is defined as a system $S$ in which all variables are jointly distributed and, in any context $c$, the subsystems $S^c$ and $R^c$ are identically distributed. The system is considered noncontextual if it has a coupling $S$ such that the probabilities $\Pr\!\left[S_q^c = S_q^{c'}\right]$ are the maximal possible for all contexts $c, c'$ and contents $q$ such that $R_q^c$ and $R_q^{c'}$ both exist. If such a coupling does not exist, the system is contextual.

For the important class of cyclic systems $\mathcal{C}_n$ of dichotomous ($\pm 1$) random variables ($n > 2$), it has been shown [23] [24] that such a system is noncontextual if and only if

$s_{\text{odd}}\!\left(\left\langle R_1^1 R_2^1 \right\rangle, \ldots, \left\langle R_n^n R_1^n \right\rangle\right) \le n - 2 + \Delta(\mathcal{C}_n),$

where

$\Delta(\mathcal{C}_n) = \sum_{i=1}^{n} \left| \left\langle R_i^i \right\rangle - \left\langle R_i^{i \ominus 1} \right\rangle \right|$ (with $\ominus$ denoting cyclic shift of the context index)

and

$s_{\text{odd}}(x_1, \ldots, x_n) = \max \sum_{i=1}^{n} \lambda_i x_i,$

with the maximum taken over all $\lambda_i = \pm 1$ whose product is $-1$. If $R_q^c$ and $R_q^{c'}$, measuring the same content in different contexts, are always identically distributed, the system is called consistently connected (satisfying the "no-disturbance" or "no-signaling" principle). Except for certain logical issues, [7] [21] in this case CbD specializes to traditional treatments of contextuality in quantum physics. In particular, for consistently connected cyclic systems ($\Delta(\mathcal{C}_n) = 0$) the noncontextuality criterion above reduces to

$s_{\text{odd}}\!\left(\left\langle R_1^1 R_2^1 \right\rangle, \ldots, \left\langle R_n^n R_1^n \right\rangle\right) \le n - 2,$

which includes the Bell/CHSH inequality ($n = 4$), the KCBS inequality ($n = 5$), and other famous inequalities. [25] That nonlocality is a special case of contextuality follows in CbD from the fact that being jointly distributed for random variables is equivalent to being measurable functions of one and the same random variable (this generalizes Arthur Fine's analysis of Bell's theorem). CbD essentially coincides with the probabilistic part of Abramsky's sheaf-theoretic approach if the system is strongly consistently connected, which means that the joint distributions of $\left(R_{q_1}^c, \ldots, R_{q_k}^c\right)$ and $\left(R_{q_1}^{c'}, \ldots, R_{q_k}^{c'}\right)$ coincide whenever the contents $q_1, \ldots, q_k$ are measured in both contexts $c$ and $c'$. However, unlike most approaches to contextuality, CbD allows for inconsistent connectedness, with $R_q^c$ and $R_q^{c'}$ differently distributed. This makes CbD applicable to physics experiments in which the no-disturbance condition is violated, [24] [26] as well as to human behavior where this condition is violated as a rule. [27] In particular, Víctor Cervantes, Ehtibar Dzhafarov, and colleagues have demonstrated that random variables describing certain paradigms of simple decision making form contextual systems, [28] [29] [30] whereas many other decision-making systems are noncontextual once their inconsistent connectedness is properly taken into account. [27]
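A minimal sketch (illustrative Python, with the notation above assumed) of the cyclic-system criterion: given the within-context product expectations, and the expectations of each content in its two contexts, the system is noncontextual exactly when the signed sum with an odd number of minus signs stays below n − 2 + Δ.

```python
# CbD noncontextuality check for a cyclic system of rank n with +/-1 variables.
from itertools import product
from math import prod, sqrt

def s_odd(xs):
    """Max of sum(lambda_i * x_i) over sign patterns lambda with product -1."""
    return max(sum(s * x for s, x in zip(signs, xs))
               for signs in product((-1, 1), repeat=len(xs))
               if prod(signs) == -1)

def is_noncontextual_cyclic(pair_corrs, marg_same_context, marg_prev_context):
    """
    pair_corrs[i]        = <R_i^i R_{i+1}^i>  (product expectation in context i)
    marg_same_context[i] = <R_i^i>            (content i measured in context i)
    marg_prev_context[i] = <R_i^{i-1}>        (same content in the preceding context)
    """
    n = len(pair_corrs)
    delta = sum(abs(a - b) for a, b in zip(marg_same_context, marg_prev_context))
    return s_odd(pair_corrs) <= n - 2 + delta

# Example (hypothetical numbers): a consistently connected CHSH-type system (n = 4)
# with correlations at the Tsirelson bound, 2*sqrt(2) > 2, is contextual.
corrs = [1 / sqrt(2)] * 3 + [-1 / sqrt(2)]
print(is_noncontextual_cyclic(corrs, [0.0] * 4, [0.0] * 4))  # -> False (contextual)
```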
An extended notion of contextuality due to Robert Spekkens applies to preparations and transformations as well as to measurements, within a general framework of operational physical theories. [31] With respect to measurements, it removes the assumption of determinism of value assignments that is present in standard definitions of contextuality. This breaks the interpretation of nonlocality as a special case of contextuality, and does not treat irreducible randomness as nonclassical. Nevertheless, it recovers the usual notion of contextuality when outcome determinism is imposed.
Spekkens' contextuality can be motivated using Leibniz's law of the identity of indiscernibles. The law applied to physical systems in this framework mirrors the extended definition of noncontextuality. This was further explored by Simmons et al., [32] who demonstrated that other notions of contextuality could also be motivated by Leibnizian principles, and could be thought of as tools enabling ontological conclusions from operational statistics.
Given a pure quantum state $|\psi\rangle$, Born's rule tells that the probability to obtain another state $|\phi\rangle$ in a measurement is $|\langle \phi | \psi \rangle|^2$. However, such a number does not define a full probability distribution, i.e. values over a set of mutually exclusive events summing up to 1. In order to obtain such a set one needs to specify a context, that is, a complete set of commuting operators (CSCO), or equivalently a set of $N$ orthogonal projectors $P_i = |\phi_i\rangle\langle\phi_i|$ that sum to the identity, where $N$ is the dimension of the Hilbert space. Then one has $\sum_{i=1}^{N} |\langle \phi_i | \psi \rangle|^2 = 1$ as expected. In that sense, one can tell that a state vector $|\psi\rangle$ alone is predictively incomplete, as long as a context has not been specified. [33] The actual physical state, now defined by $|\psi\rangle$ within a specified context, has been called a modality by Auffèves and Grangier. [34] [35]
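The following numpy sketch (illustrative only) shows this completion of the Born rule numerically: the individual probability $|\langle\phi|\psi\rangle|^2$ is just one number, but once a context — an orthonormal basis whose projectors sum to the identity — is fixed, the resulting probabilities form a full distribution summing to 1.

```python
# Born-rule probabilities over a full context (orthonormal basis) sum to 1.
import numpy as np

rng = np.random.default_rng(0)
N = 3                                              # Hilbert-space dimension

# A random pure state |psi>
psi = rng.normal(size=N) + 1j * rng.normal(size=N)
psi /= np.linalg.norm(psi)

# A context: a random orthonormal basis {|phi_i>}, i.e. projectors P_i = |phi_i><phi_i|
basis, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))

probs = np.abs(basis.conj().T @ psi) ** 2          # p_i = |<phi_i|psi>|^2
print(probs.sum())                                 # -> 1.0 (up to rounding)
```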
Since it is clear that $|\psi\rangle$ alone does not define a modality, what is its status? If $N \ge 3$, one sees easily that $|\psi\rangle$ is associated with an equivalence class of modalities, belonging to different contexts, but connected between themselves with certainty, even if the different CSCO observables do not commute. This equivalence class is called an extravalence class, and the associated transfer of certainty between contexts is called extracontextuality. As a simple example, the usual singlet state for two spins 1/2 can be found in the (non-commuting) CSCOs associated with the measurement of the total spin (with $S = 0$), or with a Bell measurement, and it actually appears in infinitely many different CSCOs, but obviously not in all possible ones. [36]
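A small numpy sketch (illustrative, not from the cited papers) of the singlet example: the singlet is a joint eigenvector of the total-spin CSCO ($S^2$ and $S_z$, with $S = 0$, $m = 0$) and is simultaneously the Bell-basis state $|\Psi^-\rangle$ of the Bell-measurement CSCO, even though the two CSCOs do not commute.

```python
# The two-spin singlet belongs to two non-commuting CSCOs.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

def total(op):                                     # sigma_a(1) + sigma_a(2)
    return np.kron(op, I2) + np.kron(I2, op)

S2 = sum(total(op) @ total(op) for op in (sx, sy, sz)) / 4   # total spin squared
Sz = total(sz) / 2                                           # total z-component
XX, ZZ = np.kron(sx, sx), np.kron(sz, sz)          # commuting pair defining the Bell basis

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)

print(np.allclose(S2 @ singlet, 0 * singlet),      # S = 0
      np.allclose(Sz @ singlet, 0 * singlet))      # m = 0: eigenvector of the total-spin CSCO
print(np.allclose(XX @ singlet, -singlet),
      np.allclose(ZZ @ singlet, -singlet))         # also a Bell-basis vector, |Psi^->
print(np.allclose(Sz @ XX, XX @ Sz))               # -> False: the two CSCOs do not commute
```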
The concepts of extravalence and extracontextuality are very useful for spelling out the role of contextuality in quantum mechanics, which is neither non-contextual (as classical physics would be) nor fully contextual, since modalities belonging to incompatible (non-commuting) contexts may be connected with certainty. Taking extracontextuality as a postulate, the fact that certainty can be transferred between contexts, and is then associated with a given projector, is the very basis of the hypotheses of Gleason's theorem, and thus of Born's rule. [37] [38] Also, associating a state vector with an extravalence class clarifies its status as a mathematical tool for calculating probabilities connecting modalities, which correspond to the actual observed physical events or results. This point of view can be used throughout quantum mechanics.
A form of contextuality that may be present in the dynamics of a quantum system was introduced by Shane Mansfield and Elham Kashefi, and has been shown to relate to computational quantum advantages. [39] As a notion of contextuality that applies to transformations it is inequivalent to that of Spekkens. Examples explored to date rely on additional memory constraints which have a more computational than foundational motivation. Contextuality may be traded off against Landauer erasure to obtain equivalent advantages. [40]
The Kochen–Specker theorem proves that quantum mechanics is incompatible with realistic noncontextual hidden variable models. On the other hand Bell's theorem proves that quantum mechanics is incompatible with factorisable hidden variable models in an experiment in which measurements are performed at distinct spacelike separated locations. Arthur Fine showed that in the experimental scenario in which the famous CHSH inequalities and proof of nonlocality apply, a factorisable hidden variable model exists if and only if a noncontextual hidden variable model exists. [8] This equivalence was proven to hold more generally in any experimental scenario by Samson Abramsky and Adam Brandenburger. [3] It is for this reason that we may consider nonlocality to be a special case of contextuality.
A number of methods exist for quantifying contextuality. One approach is by measuring the degree to which some particular noncontextuality inequality is violated, e.g. the KCBS inequality, the Yu–Oh inequality, [41] or some Bell inequality. A more general measure of contextuality is the contextual fraction. [11]
Given a set of measurement statistics e, consisting of a probability distribution over joint outcomes for each measurement context, we may consider factoring e as a convex combination of a noncontextual part eNC and some remainder e',

e = λ eNC + (1 − λ) e', with λ ∈ [0, 1].
The maximum value of λ over all such decompositions is the noncontextual fraction of e denoted NCF(e), while the remainder CF(e)=(1-NCF(e)) is the contextual fraction of e. The idea is that we look for a noncontextual explanation for the highest possible fraction of the data, and what is left over is the irreducibly contextual part. Indeed, for any such decomposition that maximises λ the leftover e' is known to be strongly contextual. This measure of contextuality takes values in the interval [0,1], where 0 corresponds to noncontextuality and 1 corresponds to strong contextuality. The contextual fraction may be computed using linear programming.
It has also been proved that CF(e) is an upper bound on the extent to which e violates any normalised noncontextuality inequality. [11] Here normalisation means that violations are expressed as fractions of the algebraic maximum violation of the inequality. Moreover, the dual linear program to that which maximises λ computes a noncontextual inequality for which this violation is attained. In this sense the contextual fraction is a more neutral measure of contextuality, since it optimises over all possible noncontextual inequalities rather than checking the statistics against one inequality in particular.
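A minimal sketch (illustrative Python; the scenario and function names are assumptions for exposition, not code from the cited works) of the primal linear program for the (2,2,2) CHSH-type scenario: the noncontextual fraction is the largest total weight of a sub-probability mixture of deterministic global assignments lying componentwise below the empirical data, and the contextual fraction is its complement. Running it on the Popescu–Rohrlich box returns 1, reflecting strong contextuality, while any noncontextual data set returns 0.

```python
# Contextual fraction via linear programming for a 2-party, 2-setting, 2-outcome scenario.
import itertools
import numpy as np
from scipy.optimize import linprog

contexts = [(a, b) for a in range(2) for b in range(2)]   # Alice setting a, Bob setting b
outcomes = [(x, y) for x in range(2) for y in range(2)]   # joint outcomes (x, y)

# Global deterministic assignments g = (outcome of a0, a1, b0, b1)
global_assignments = list(itertools.product(range(2), repeat=4))

def incidence_matrix():
    """One row per (context, joint outcome), one column per global assignment."""
    rows = []
    for (a, b) in contexts:
        for (x, y) in outcomes:
            rows.append([1.0 if (g[a] == x and g[2 + b] == y) else 0.0
                         for g in global_assignments])
    return np.array(rows)

def contextual_fraction(e):
    """e: dict mapping each context (a, b) to a length-4 vector of P(x, y)."""
    b_ub = np.concatenate([e[c] for c in contexts])
    M = incidence_matrix()
    # NCF(e) = max total weight of a sub-probability mixture of deterministic
    # global assignments lying (componentwise) below e.
    res = linprog(c=-np.ones(M.shape[1]), A_ub=M, b_ub=b_ub,
                  bounds=[(0, None)] * M.shape[1], method="highs")
    return 1.0 - (-res.fun)

def pr_box():
    """Popescu-Rohrlich box: x XOR y = a AND b, uniform otherwise."""
    return {(a, b): np.array([0.5 if (x ^ y) == (a & b) else 0.0
                              for (x, y) in outcomes])
            for (a, b) in contexts}

print(round(contextual_fraction(pr_box()), 3))   # -> 1.0 (strongly contextual data)
```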
Several measures of the degree of contextuality in contextual systems have been proposed within the CbD framework, [22] but only one of them, denoted CNT2, has been shown to naturally extend into a measure of noncontextuality in noncontextual systems, NCNT2. This is important, because at least in the non-physical applications of CbD contextuality and noncontextuality are of equal interest. Both CNT2 and NCNT2 are defined as the $L_1$-distance between a probability vector representing a system and the surface of the noncontextuality polytope representing all possible noncontextual systems with the same single-variable marginals. For cyclic systems of dichotomous random variables, closed-form expressions have been derived [42] for CNT2 (when the system is contextual, i.e. when the criterion above is violated) and for NCNT2 (when it is noncontextual); the expression for NCNT2 also involves the $L_1$-distance from the probability vector to the surface of the box circumscribing the noncontextuality polytope. More generally, NCNT2 and CNT2 are computed by means of linear programming. [22] The same is true for other CbD-based measures of contextuality. One of them, denoted CNT3, uses the notion of a quasi-coupling, which differs from a coupling in that the probabilities in the joint distribution of its values are replaced with arbitrary reals (allowed to be negative but summing to 1). The class of quasi-couplings maximizing the probabilities is always nonempty, and the minimal total variation of the signed measure in this class is a natural measure of contextuality. [43]
Recently, quantum contextuality has been investigated as a source of quantum advantage and computational speedups in quantum computing.
Magic state distillation is a scheme for quantum computing in which quantum circuits constructed only of Clifford operators, which by themselves are fault-tolerant but efficiently classically simulable, are injected with certain "magic" states that promote the computational power to universal fault-tolerant quantum computing. [44] In 2014, Mark Howard et al. showed that contextuality characterizes magic states for qudits of odd prime dimension and for qubits with real wavefunctions. [45] Extensions to the qubit case have been investigated by Juani Bermejo-Vega et al. [41] This line of research builds on earlier work by Ernesto Galvão, [40] which showed that Wigner function negativity is necessary for a state to be "magic"; it later emerged that Wigner negativity and contextuality are in a sense equivalent notions of nonclassicality. [46]
Measurement-based quantum computation (MBQC) is a model for quantum computing in which a classical control computer interacts with a quantum system by specifying measurements to be performed and receiving measurement outcomes in return. The measurement statistics for the quantum system may or may not exhibit contextuality. A variety of results have shown that the presence of contextuality enhances the computational power of an MBQC.
In particular, researchers have considered an artificial situation in which the power of the classical control computer is restricted to only being able to compute linear Boolean functions, i.e. to solve problems in the Parity L complexity class ⊕L. For interactions with multi-qubit quantum systems a natural assumption is that each step of the interaction consists of a binary choice of measurement which in turn returns a binary outcome. An MBQC of this restricted kind is known as an l2-MBQC. [47]
In 2009, Janet Anders and Dan Browne showed that two specific examples of nonlocality and contextuality were sufficient to compute a non-linear function. This in turn could be used to boost computational power to that of a universal classical computer, i.e. to solve problems in the complexity class P. [48] This is sometimes referred to as measurement-based classical computation. [49] The specific examples made use of the Greenberger–Horne–Zeilinger nonlocality proof and the supra-quantum Popescu–Rohrlich box.
In 2013, Robert Raussendorf showed more generally that access to strongly contextual measurement statistics is necessary and sufficient for an l2-MBQC to compute a non-linear function. He also showed that to compute non-linear Boolean functions with sufficiently high probability requires contextuality. [47]
A further generalization and refinement of these results due to Samson Abramsky, Rui Soares Barbosa and Shane Mansfield appeared in 2017, proving a precise quantifiable relationship between the probability of successfully computing any given non-linear function and the degree of contextuality present in the l2-MBQC as measured by the contextual fraction. [11] Specifically,

$1 - p_s \;\ge\; \mathrm{NCF}(e)\,\nu(f) \;=\; \left(1 - \mathrm{CF}(e)\right)\nu(f),$

where $p_s$, $\mathrm{CF}(e)$, and $\nu(f)$ are the probability of success, the contextual fraction of the measurement statistics $e$, and a measure of the non-linearity of the function $f$ to be computed, respectively.
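As an illustration (a sketch under the assumption that $\nu(f)$ is the minimum fraction of inputs on which $f$ differs from an affine, i.e. XOR-plus-constant, function; the helper name nu is hypothetical), the bound can be evaluated for the two-bit AND function used in the Anders–Browne construction: $\nu(\mathrm{AND}) = 1/4$, so fully noncontextual statistics limit the success probability to 3/4, while strongly contextual statistics (NCF = 0) leave room for deterministic success.

```python
# Distance from linearity nu(f) and the resulting success bound 1 - p_s >= NCF(e) * nu(f).
from itertools import product

def nu(f, n_bits):
    """Minimum fraction of inputs on which f differs from an affine (XOR-plus-constant) function."""
    inputs = list(product((0, 1), repeat=n_bits))
    best = 1.0
    for coeffs in product((0, 1), repeat=n_bits):
        for const in (0, 1):
            dist = sum(f(xs) != (sum(c * x for c, x in zip(coeffs, xs)) + const) % 2
                       for xs in inputs) / len(inputs)
            best = min(best, dist)
    return best

nu_and = nu(lambda xs: xs[0] & xs[1], 2)           # distance of AND from linearity: 0.25
for ncf in (1.0, 0.0):                             # noncontextual data vs strongly contextual data
    print(f"NCF(e) = {ncf}: success probability at most {1 - ncf * nu_and}")
# NCF(e) = 1 gives the classical limit p_s <= 0.75; NCF(e) = 0 (e.g. GHZ statistics)
# leaves room for deterministic success, as in the Anders-Browne construction.
```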
The Einstein–Podolsky–Rosen (EPR) paradox is a thought experiment proposed by physicists Albert Einstein, Boris Podolsky and Nathan Rosen which argues that the description of physical reality provided by quantum mechanics is incomplete. In a 1935 paper titled "Can Quantum-Mechanical Description of Physical Reality be Considered Complete?", they argued for the existence of "elements of reality" that were not part of quantum theory, and speculated that it should be possible to construct a theory containing these hidden variables. Resolutions of the paradox have important implications for the interpretation of quantum mechanics.
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are supposed properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
In physics, the principle of locality states that an object is influenced directly only by its immediate surroundings. A theory that includes the principle of locality is said to be a "local theory". This is an alternative to the concept of instantaneous, or "non-local" action at a distance. Locality evolved out of the field theories of classical physics. The idea is that for a cause at one point to have an effect at another point, something in the space between those points must mediate the action. To exert an influence, something, such as a wave or particle, must travel through the space between the two points, carrying the influence.
A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. Named for John Stewart Bell, the experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. The test empirically evaluates the implications of Bell's theorem. As of 2015, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.
In the interpretation of quantum mechanics, a local hidden-variable theory is a hidden-variable theory that satisfies the principle of locality. These models attempt to account for the probabilistic features of quantum mechanics via the mechanism of underlying, but inaccessible variables, with the additional requirement that distant events be statistically independent.
A Tsirelson bound is an upper limit to quantum mechanical correlations between distant events. Given that quantum mechanics violates Bell inequalities, a natural question to ask is how large can the violation be. The answer is precisely the Tsirelson bound for the particular Bell inequality in question. In general, this bound is lower than the bound that would be obtained if more general theories, only constrained by "no-signalling", were considered, and much research has been dedicated to the question of why this is the case.
Quantum tomography or quantum state tomography is the process by which a quantum state is reconstructed using measurements on an ensemble of identical quantum states. The source of these states may be any device or system which prepares quantum states either consistently into quantum pure states or otherwise into general mixed states. To be able to uniquely identify the state, the measurements must be tomographically complete. That is, the measured operators must form an operator basis on the Hilbert space of the system, providing all the information about the state. Such a set of observations is sometimes called a quorum. The term tomography was first used in the quantum physics literature in a 1993 paper introducing experimental optical homodyne tomography.
In quantum mechanics, the Kochen–Specker (KS) theorem, also known as the Bell–KS theorem, is a "no-go" theorem proved by John S. Bell in 1966 and by Simon B. Kochen and Ernst Specker in 1967. It places certain constraints on the permissible types of hidden-variable theories, which try to explain the predictions of quantum mechanics in a context-independent way. The version of the theorem proved by Kochen and Specker also gave an explicit example for this constraint in terms of a finite number of state vectors.
In quantum computing, a graph state is a special type of multi-qubit state that can be represented by a graph. Each qubit is represented by a vertex of the graph, and there is an edge between every interacting pair of qubits. In particular, they are a convenient way of representing certain types of entangled states.
In mathematical physics, Gleason's theorem shows that the rule one uses to calculate probabilities in quantum physics, the Born rule, can be derived from the usual mathematical representation of measurements in quantum physics together with the assumption of non-contextuality. Andrew M. Gleason first proved the theorem in 1957, answering a question posed by George W. Mackey, an accomplishment that was historically significant for the role it played in showing that wide classes of hidden-variable theories are inconsistent with quantum physics. Multiple variations have been proven in the years since. Gleason's theorem is of particular importance for the field of quantum logic and its attempt to find a minimal set of mathematical axioms for quantum theory.
In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not allow an interpretation with local realism. Quantum nonlocality has been experimentally verified under a variety of physical assumptions.
Quantum nondemolition (QND) measurement is a special type of measurement of a quantum system in which the uncertainty of the measured observable does not increase from its measured value during the subsequent normal evolution of the system. This necessarily requires that the measurement process preserves the physical integrity of the measured system, and moreover places requirements on the relationship between the measured observable and the self-Hamiltonian of the system. In a sense, QND measurements are the "most classical" and least disturbing type of measurement in quantum mechanics.
In quantum information and quantum computing, a cluster state is a type of highly entangled state of multiple qubits. Cluster states are generated in lattices of qubits with Ising type interactions. A cluster C is a connected subset of a d-dimensional lattice, and a cluster state is a pure state of the qubits located on C. They are different from other types of entangled states such as GHZ states or W states in that it is more difficult to eliminate quantum entanglement in the case of cluster states. Another way of thinking of cluster states is as a particular instance of graph states, where the underlying graph is a connected subset of a d-dimensional lattice. Cluster states are especially useful in the context of the one-way quantum computer.
In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system. It includes correlations that are due to quantum physical effects but do not necessarily involve quantum entanglement.
The noisy-storage model refers to a cryptographic model employed in quantum cryptography. It assumes that the quantum memory device of an attacker (adversary) trying to break the protocol is imperfect (noisy). The main goal of this model is to enable the secure implementation of two-party cryptographic primitives, such as bit commitment, oblivious transfer and secure identification.
Boson sampling is a restricted model of non-universal quantum computation introduced by Scott Aaronson and Alex Arkhipov after the original work of Lidror Troyansky and Naftali Tishby, that explored possible usage of boson scattering to evaluate expectation values of permanents of matrices. The model consists of sampling from the probability distribution of identical bosons scattered by a linear interferometer. Although the problem is well defined for any bosonic particles, its photonic version is currently considered as the most promising platform for a scalable implementation of a boson sampling device, which makes it a non-universal approach to linear optical quantum computing. Moreover, while not universal, the boson sampling scheme is strongly believed to implement computing tasks which are hard to implement with classical computers by using far fewer physical resources than a full linear-optical quantum computing setup. This advantage makes it an ideal candidate for demonstrating the power of quantum computation in the near term.
Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, reformulate it and even propose new generalizations thereof. Contrary to other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition. While they lead to the right experimental predictions, they do not come with a mental picture of the world where they fit.
The quantum Fisher information is a central quantity in quantum metrology and is the quantum analogue of the classical Fisher information. It is one of the central quantities used to qualify the utility of an input state, especially in Mach–Zehnder interferometer-based phase or parameter estimation. It has been shown that the quantum Fisher information can also be a sensitive probe of a quantum phase transition. The quantum Fisher information of a state $\varrho$ with respect to the observable $A$ is defined as

$F_Q[\varrho, A] = 2 \sum_{k,l} \frac{(\lambda_k - \lambda_l)^2}{\lambda_k + \lambda_l} \left| \langle k | A | l \rangle \right|^2,$

where $\lambda_k$ and $|k\rangle$ are the eigenvalues and eigenvectors of $\varrho$, and the sum runs over all $k, l$ such that $\lambda_k + \lambda_l > 0$.
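A minimal numpy sketch (illustrative; the function name qfi and the example state are assumptions) evaluating this definition directly from the eigendecomposition of $\varrho$; for a pure state it reproduces the familiar value $4\,\mathrm{Var}(A)$, and mixing reduces it.

```python
# Quantum Fisher information from the eigen-decomposition of rho.
import numpy as np

def qfi(rho, A, tol=1e-12):
    """F_Q[rho, A] = 2 * sum_{k,l} (lam_k - lam_l)^2 / (lam_k + lam_l) * |<k|A|l>|^2."""
    lam, vecs = np.linalg.eigh(rho)
    A_kl = vecs.conj().T @ A @ vecs                       # matrix elements <k|A|l>
    F = 0.0
    for k in range(len(lam)):
        for l in range(len(lam)):
            if lam[k] + lam[l] > tol:                     # restrict to lam_k + lam_l > 0
                F += 2 * (lam[k] - lam[l]) ** 2 / (lam[k] + lam[l]) * abs(A_kl[k, l]) ** 2
    return F

sz = np.array([[1, 0], [0, -1]], dtype=complex)
plus = np.array([[1, 1], [1, 1]], dtype=complex) / 2      # |+><+|
rho = 0.9 * plus + 0.1 * np.eye(2) / 2                    # slightly depolarized |+> state
print(qfi(rho, sz))   # -> 3.24, below the pure-state value 4 * Var(sigma_z) = 4
```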
Incompatibility of quantum measurements is a crucial concept of quantum information, addressing whether two or more quantum measurements can be performed on a quantum system simultaneously. It highlights the unique and non-classical behavior of quantum systems. This concept is fundamental to the nature of quantum mechanics and has practical applications in various quantum information processing tasks like quantum key distribution and quantum metrology.