In quantum mechanics, quantum correlation is the expected value of the product of the alternative outcomes. In other words, it is the expected change in physical characteristics as one quantum system passes through an interaction site. In John Bell's 1964 paper that inspired the Bell test, it was assumed that the outcomes A and B could each take only one of two values, -1 or +1. It followed that the product, too, could only be -1 or +1, so that the average value of the product would be

E = (N++ - N+- - N-+ + N--) / Ntotal,
where, for example, N++ is the number of simultaneous instances ("coincidences") of the outcome +1 on both sides of the experiment.
However, in actual experiments, detectors are not perfect and produce many null outcomes. The correlation can still be estimated from the coincidence counts, since null outcomes do not contribute to the average, but in practice, instead of dividing by Ntotal, it is customary to divide by

N = N++ + N-- + N+- + N-+,

the total number of observed coincidences. The legitimacy of this method relies on the assumption that the observed coincidences constitute a fair sample of the emitted pairs.
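As a minimal numerical sketch (the counts below are hypothetical, chosen only for illustration), the estimate can be computed directly from the four coincidence counters:

```python
# Estimate the quantum correlation E from coincidence counts.
# The counts below are hypothetical, for illustration only.
counts = {("+", "+"): 42, ("-", "-"): 40, ("+", "-"): 9, ("-", "+"): 9}

# N = N++ + N-- + N+- + N-+, the total number of observed coincidences.
n_coincidences = sum(counts.values())

# Each coincidence contributes the product of the two outcomes (+1 or -1).
numerator = (counts[("+", "+")] + counts[("-", "-")]
             - counts[("+", "-")] - counts[("-", "+")])

# Dividing by the observed coincidences rests on the fair-sampling assumption.
E = numerator / n_coincidences
print(E)  # 0.64
```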
Following local realist assumptions as in Bell's paper, the estimated quantum correlation converges, after a sufficient number of trials, to

E(a, b) = ∫ A(a, λ) B(b, λ) ρ(λ) dλ,

where a and b are detector settings, λ is the hidden variable, drawn from a distribution ρ(λ), and A(a, λ) and B(b, λ) are the outcomes (-1 or +1) on each side.
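Bell's local-realist correlation, an average of the product of the two outcomes over the distribution ρ(λ), can be approximated by Monte Carlo sampling. The sketch below uses a toy hidden-variable model (not one from Bell's paper): λ is a uniform angle, and each side outputs the sign of the cosine of the angle between its setting and λ.

```python
import numpy as np

def lhv_correlation(a, b, n=200_000, seed=0):
    """Monte Carlo estimate of the local-realist correlation
    E(a, b) = average of A(a, lam) * B(b, lam) over rho(lam),
    for a toy model: lam uniform on [0, 2*pi), outcomes +1 or -1."""
    rng = np.random.default_rng(seed)
    lam = rng.uniform(0.0, 2.0 * np.pi, size=n)
    A = np.sign(np.cos(a - lam))    # outcome on side A
    B = -np.sign(np.cos(b - lam))   # outcome on side B
    return float(np.mean(A * B))

print(lhv_correlation(0.0, 0.0))         # close to -1: perfect anticorrelation
print(lhv_correlation(0.0, np.pi / 2))   # close to 0
```

For this toy model the correlation varies linearly with the angle between the settings, E(a, b) = -1 + 2|a - b|/π, whereas quantum mechanics predicts -cos(a - b) for the singlet state; that discrepancy is what Bell-type inequalities exploit.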
The quantum correlation is the key statistic in the CHSH inequality and some of the other Bell inequalities, tests that open the way for experimental discrimination between quantum mechanics and local realism or local hidden-variable theory.
Quantum correlations give rise to various phenomena, including interference of particles separated in time. [1] [2]
Quantum entanglement is the physical phenomenon that occurs when a group of particles are generated, interact, or share spatial proximity in a way such that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
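A small numerical sketch (NumPy) of why an entangled particle admits no independent state description: for the Bell state (|00> + |11>)/√2, tracing out either particle leaves the other in the maximally mixed state, which carries no information about the pure state of the pair.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2) on two qubits.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)  # density matrix of the pair

# Partial trace over the second qubit: reshape to (2, 2, 2, 2), where
# axes (0, 2) index qubit A and axes (1, 3) index qubit B, then sum
# over B's diagonal.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(rho_A)  # [[0.5, 0.0], [0.0, 0.5]]: the maximally mixed state
```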
The de Broglie–Bohm theory, also known as the pilot wave theory, Bohmian mechanics, Bohm's interpretation, and the causal interpretation, is an interpretation of quantum mechanics. In addition to the wavefunction, it also postulates an actual configuration of particles exists even when unobserved. The evolution over time of the configuration of all particles is defined by a guiding equation. The evolution of the wave function over time is given by the Schrödinger equation. The theory is named after Louis de Broglie (1892–1987) and David Bohm (1917–1992).
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories. The "local" in this case refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields can only occur at speeds no greater than the speed of light. "Hidden variables" are hypothetical properties possessed by quantum particles, properties that are undetectable but still affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
In quantum mechanics, a density matrix is a matrix that describes the quantum state of a physical system. It allows for the calculation of the probabilities of the outcomes of any measurement performed upon this system, using the Born rule. It is a generalization of the more usual state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed states. Mixed states arise in quantum mechanics in two different situations: first when the preparation of the system is not fully known, and thus one must deal with a statistical ensemble of possible preparations, and second when one wants to describe a physical system which is entangled with another, as its state cannot be described by a pure state.
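As a sketch of the Born rule applied to density matrices, p(outcome) = Tr(ρP) with P the projector onto the outcome: the pure superposition |+> and the equal classical mixture of |0> and |1> give identical computational-basis statistics, but the purity Tr(ρ²) tells them apart.

```python
import numpy as np

P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |0>
P1 = np.array([[0, 0], [0, 1]], dtype=complex)  # projector onto |1>

# Pure state |+> = (|0> + |1>)/sqrt(2), written as a density matrix.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())

# Mixed state: an equal classical mixture of |0> and |1>.
rho_mixed = 0.5 * P0 + 0.5 * P1

# Born rule: both states assign probability 1/2 to each basis outcome.
for rho in (rho_pure, rho_mixed):
    print(np.trace(rho @ P0).real, np.trace(rho @ P1).real)

# Purity Tr(rho^2): 1 for a pure state, 1/2 for this mixture.
print(np.trace(rho_pure @ rho_pure).real,
      np.trace(rho_mixed @ rho_mixed).real)
```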
In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden-variable theories. Experimental verification of violation of the inequalities is seen as experimental confirmation that nature cannot be described by local hidden-variable theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in a much-cited paper published in 1969. They derived the CHSH inequality, which, as with John Bell's original inequality, is a constraint on the statistics of "coincidences" in a Bell test which is necessarily true if there exist underlying local hidden variables. This constraint can, on the other hand, be infringed by quantum mechanics.
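A short sketch of the arithmetic behind the violation, using the quantum-mechanical singlet prediction E(a, b) = -cos(a - b); the angles below are the standard choice that maximizes the quantum value:

```python
import math

def E(a, b):
    """Quantum-mechanical correlation of the singlet state for settings a, b."""
    return -math.cos(a - b)

def chsh(a, ap, b, bp):
    """CHSH combination S = E(a, b) - E(a, b') + E(a', b) + E(a', b')."""
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

# Any local hidden-variable theory requires |S| <= 2; quantum mechanics
# reaches |S| = 2*sqrt(2) at the settings below.
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4
S = chsh(a, ap, b, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828, violating the local bound of 2
```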
Greenberger–Horne–Zeilinger experiments, or GHZ experiments, are a class of physics experiments that may be used to generate starkly contrasting predictions from local hidden-variable theory and quantum mechanical theory, and permit immediate comparison with actual experimental results. A GHZ experiment is similar to a test of Bell's inequality, except using three or more entangled particles, rather than two. With specific settings of GHZ experiments, it is possible to demonstrate absolute contradictions between the predictions of local hidden-variable theory and those of quantum mechanics, whereas tests of Bell's inequality only demonstrate contradictions of a statistical nature. The results of actual GHZ experiments agree with the predictions of quantum mechanics.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. The predictions that quantum physics makes are in general probabilistic. The mathematical tools for making predictions about what measurement outcomes may occur were developed during the 20th century and make use of linear algebra and functional analysis.
A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. The experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. To date, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.
In the interpretation of quantum mechanics, a local hidden-variable theory is a hidden-variable theory that satisfies the condition of being consistent with local realism. This includes all types of the theory that attempt to account for the probabilistic features of quantum mechanics by the mechanism of underlying inaccessible variables, with the additional requirement from local realism that distant events be independent, ruling out instantaneous interactions between separate events.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

S = -Tr(ρ ln ρ),

where Tr denotes the trace.
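Numerically, S(ρ) = -Tr(ρ ln ρ) reduces to -Σ λi ln λi over the eigenvalues λi of ρ, with 0 ln 0 taken as 0. A small sketch:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho;
    zero eigenvalues contribute nothing (0 * ln 0 -> 0)."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop (numerically) zero eigenvalues
    return float(-np.sum(eigvals * np.log(eigvals)))

# Maximally mixed qubit: S = ln 2.
print(von_neumann_entropy(np.eye(2) / 2))  # ~0.6931

# Pure state: S = 0.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(pure))
```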
In quantum mechanics, the Kochen–Specker (KS) theorem, also known as the Bell–Kochen–Specker theorem, is a "no-go" theorem proved by John S. Bell in 1966 and by Simon B. Kochen and Ernst Specker in 1967. It places certain constraints on the permissible types of hidden-variable theories, which try to explain the predictions of quantum mechanics in a context-independent way. The version of the theorem proved by Kochen and Specker also gave an explicit example for this constraint in terms of a finite number of state vectors.
The Born rule is a key postulate of quantum mechanics which gives the probability that a measurement of a quantum system will yield a given result. In its simplest form, it states that the probability density of finding a system in a given state, when measured, is proportional to the square of the amplitude of the system's wavefunction at that state. It was formulated by German physicist Max Born in 1926.
Squashed entanglement, also called CMI entanglement, is an information-theoretic measure of quantum entanglement for a bipartite quantum system. If ϱ_AB is the density matrix of a system (A, B) composed of two subsystems A and B, then the CMI entanglement E_CMI of system (A, B) is defined by

E_CMI(ϱ_AB) = inf { (1/2) I(A; B | Λ) },

where the infimum runs over all extensions ϱ_ABΛ of ϱ_AB (density matrices satisfying Tr_Λ ϱ_ABΛ = ϱ_AB), and I(A; B | Λ) is the quantum conditional mutual information.
In mathematical physics, Gleason's theorem shows that the rule one uses to calculate probabilities in quantum physics, the Born rule, can be derived from the usual mathematical representation of measurements in quantum physics together with the assumption of non-contextuality. Andrew M. Gleason first proved the theorem in 1957, answering a question posed by George W. Mackey, an accomplishment that was historically significant for the role it played in showing that wide classes of hidden-variable theories are inconsistent with quantum physics. Multiple variations have been proven in the years since. Gleason's theorem is of particular importance for the field of quantum logic and its attempt to find a minimal set of mathematical axioms for quantum theory.
In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not admit an interpretation in terms of a local realistic theory. Quantum nonlocality has been experimentally verified under different physical assumptions. Any physical theory that aims at superseding or replacing quantum theory should account for such experiments and therefore cannot fulfill local realism; quantum nonlocality is a property of the universe that is independent of our description of nature.
In quantum mechanics, superdeterminism is a loophole in Bell's theorem. By postulating that all systems being measured are correlated with the choices of which measurements to make on them, the assumptions of the theorem are no longer fulfilled. A hidden-variable theory which is superdeterministic can hence fulfill Bell's notion of local causality and still violate the inequalities derived from Bell's theorem. This makes it possible to construct a local hidden-variable theory that reproduces the predictions of quantum mechanics, for which a few toy models have been proposed. The term superdeterminism is misleading. Superdeterministic models are deterministic in the usual sense. But in addition to being deterministic, they also postulate correlations between the state that is measured and the measurement setting.
The Koopman–von Neumann mechanics is a description of classical mechanics in terms of Hilbert space, introduced by Bernard Koopman and John von Neumann in 1931 and 1932, respectively.
Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other (compatible) observables which are simultaneously measured. More formally, the measurement result of a quantum observable is dependent upon which other commuting observables are within the same measurement set.
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways. The min-entropy is never greater than the ordinary or Shannon entropy, and that in turn is never greater than the Hartley or max-entropy, defined as the logarithm of the number of outcomes with nonzero probability.
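The ordering described above can be checked directly; the distribution below is an illustrative example:

```python
import math

def shannon_entropy(p):
    """Ordinary (Shannon) entropy, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """Negative log of the probability of the most likely outcome."""
    return -math.log2(max(p))

def hartley_entropy(p):
    """Log of the number of outcomes with nonzero probability (max-entropy)."""
    return math.log2(sum(1 for x in p if x > 0))

# An example nonuniform distribution (illustrative values).
p = [0.5, 0.25, 0.125, 0.125]
print(min_entropy(p), shannon_entropy(p), hartley_entropy(p))
# 1.0 1.75 2.0: min-entropy <= Shannon entropy <= Hartley entropy

# For a uniform distribution all three coincide (here, 2 bits).
uniform = [0.25] * 4
print(min_entropy(uniform), shannon_entropy(uniform), hartley_entropy(uniform))
```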
In physics, in the area of quantum information theory and quantum computation, quantum steering is a special kind of nonlocal correlation, which is intermediate between Bell nonlocality and quantum entanglement. A state exhibiting Bell nonlocality must also exhibit quantum steering, and a state exhibiting quantum steering must also exhibit quantum entanglement. But for mixed quantum states, there exist examples which lie between these different quantum correlation sets. The notion was initially proposed by Schrödinger, and later made popular by Howard M. Wiseman, S. J. Jones, and A. C. Doherty.