Aspect's experiment was the first quantum mechanics experiment to demonstrate the violation of Bell's inequalities with photons using distant detectors. Its 1982 result provided further validation of quantum entanglement and put the principle of locality to a direct test. It also offered an experimental answer to Albert Einstein, Boris Podolsky, and Nathan Rosen's paradox, which had been proposed about fifty years earlier.
It was the first experiment to remove the locality loophole: it was able to modify the angle of the polarizers while the photons were in flight, faster than light could travel from one polarizer to the other, removing the possibility of communication between the detectors.
The experiment was led by French physicist Alain Aspect at the Institut d'optique théorique et appliquée in Orsay between 1980 and 1982. Its importance was immediately recognized by the scientific community. Although the methodology carried out by Aspect presents a potential flaw, the detection loophole, his result is considered decisive and led to numerous other experiments (the so-called Bell tests) which confirmed Aspect's original experiment. [1]
For his work on this topic, Aspect was awarded part of the 2022 Nobel Prize in Physics. [2]
The Einstein–Podolsky–Rosen (EPR) paradox is a thought experiment proposed by physicists Albert Einstein, Boris Podolsky and Nathan Rosen which argues that the description of physical reality provided by quantum mechanics is incomplete. [3] In the 1935 EPR paper titled "Can Quantum-Mechanical Description of Physical Reality be Considered Complete?", they argued for the existence of "elements of reality" that were not part of quantum theory, and speculated that it should be possible to construct a theory containing these hidden variables. Resolutions of the paradox have important implications for the interpretation of quantum mechanics.
The thought experiment involves a pair of particles prepared in what would later become known as an entangled state. Einstein, Podolsky, and Rosen pointed out that, in this state, if the position of the first particle were measured, the result of measuring the position of the second particle could be predicted. If instead the momentum of the first particle were measured, then the result of measuring the momentum of the second particle could be predicted. They argued that no action taken on the first particle could instantaneously affect the other, since this would involve information being transmitted faster than light, which is forbidden by the theory of relativity. They invoked a principle, later known as the "EPR criterion of reality", positing that: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity." From this, they inferred that the second particle must have a definite value of both position and of momentum prior to either quantity being measured. But quantum mechanics considers these two observables incompatible and thus does not associate simultaneous values for both to any system. Einstein, Podolsky, and Rosen therefore concluded that quantum theory does not provide a complete description of reality. [4]
In 1964, Northern Irish physicist John Stewart Bell carried the analysis of quantum entanglement much further. [5] He deduced that if measurements are performed independently on the two separated particles of an entangled pair, then the assumption that the outcomes depend upon hidden variables within each half implies a mathematical constraint on how the outcomes of the two measurements are correlated. This constraint would later be named the Bell inequalities. Bell then showed that quantum physics predicts correlations that violate this inequality. Consequently, the only way that hidden variables could explain the predictions of quantum physics is if they are "nonlocal", which is to say that somehow the two particles are able to influence one another instantaneously, no matter how widely separated they become. [6] [7]
In 1969, John Clauser and Michael Horne, along with Horne's doctoral student Abner Shimony and Francis Pipkin's doctoral student Richard Holt, came up with the CHSH inequality, a reformulation of Bell's inequality that could be better tested in experiments. [8]
The first rudimentary experiment designed to test Bell's theorem was performed in 1972 by Clauser and Stuart Freedman at the University of California, Berkeley. [9] In 1973, at Harvard University, Pipkin and Holt's experiment suggested the opposite conclusion, with results that did not violate the Bell inequalities. [8] Edward S. Fry and Randall C. Thompson of Texas A&M University reattempted the experiment in 1973 and agreed with Clauser. [8] These experiments were only a limited test, because the choice of detector settings was made before the photons had left the source. [8]
Advised by John Bell, Alain Aspect worked to develop an experiment to remove this limitation. [8]
Alain Aspect completed his doctoral thesis in 1971, working on holography, and then went abroad to teach at the École Normale in Cameroon. He returned to France in 1974 and joined the Institut d'optique in Orsay to work on his habilitation thesis. Physicist Christian Imbert handed him various papers by Bell, and Aspect worked for five years on the construction and preliminary tests of his experiment. [8] He published his first experimental results in 1981, and completed his habilitation in 1983 with the final results of his experiment. [8] The referees included André Maréchal and Christian Imbert from the Institut d'optique, Franck Laloë, Bernard d'Espagnat, Claude Cohen-Tannoudji, and John Bell. [8]
The illustration above represents the principle scheme from which John Bell demonstrated his inequalities: a source S of entangled photons simultaneously emits two photons whose polarization is prepared so that the joint state vector of the pair is:

|Ψ⟩ = (1/√2) (|x,x⟩ + |y,y⟩)
This formula simply means that the photons are in a superposed state: a linear combination, with equal probability, of both photons vertically polarized and both photons horizontally polarized. These two photons are then measured using two polarizers P1 and P2, each with a configurable measuring angle, α and β respectively. The result of each polarizer's measurement can be (+) or (−) according to whether the measured polarization is parallel or perpendicular to the polarizer's angle of measurement.
One noteworthy aspect is that the polarizers imagined for this ideal experiment give a measurable result in both the (−) and (+) situations. Not all real polarizers can do this: some detect the (+) situation, for example, but are unable to detect anything in the (−) situation (the photon never leaves the polarizer). Early experiments used this latter sort of polarizer. Alain Aspect's polarizers proved better able to detect both scenarios, and therefore came much closer to the ideal experiment.
Given the apparatus and the initial state of polarization given to the photons, quantum mechanics is able to predict the probabilities of measuring (+,+), (−,−), (+,−) and (−,+) on the polarizers (P1,P2), oriented at the angles (α,β). As a reminder, quantum mechanics predicts for this state:

P(+,+) = P(−,−) = (1/2) cos²(α − β)
P(+,−) = P(−,+) = (1/2) sin²(α − β)
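These joint probabilities, P(+,+) = P(−,−) = ½cos²(α − β) and P(+,−) = P(−,+) = ½sin²(α − β), can be checked with a short numerical sketch (plain Python, written for illustration here, not taken from Aspect's analysis), which projects the entangled state onto the two polarizer outcomes:

```python
import math

def polarizer_vec(theta):
    """Unit vector of the polarization transmitted by a polarizer at angle theta."""
    return (math.cos(theta), math.sin(theta))

def joint_prob(alpha, beta, out1, out2):
    """Probability of outcomes (out1, out2), each +1 or -1, for the entangled
    state |Psi> = (|x,x> + |y,y>)/sqrt(2).  A (-) outcome corresponds to the
    orthogonal channel, i.e. the polarizer angle shifted by 90 degrees."""
    if out1 < 0:
        alpha += math.pi / 2
    if out2 < 0:
        beta += math.pi / 2
    a, b = polarizer_vec(alpha), polarizer_vec(beta)
    # Projection amplitude: <a, b|Psi> = (a_x*b_x + a_y*b_y) / sqrt(2)
    amp = (a[0] * b[0] + a[1] * b[1]) / math.sqrt(2)
    return amp ** 2

# P(+,+) at alpha - beta = 30 degrees: (1/2) * cos^2(30 deg) = 0.375
print(round(joint_prob(0.0, math.radians(30), +1, +1), 4))  # 0.375
```

Summing the four outcome probabilities at any pair of angles returns 1, as it must for a probability distribution.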
The quantity of interest is a correlation function given by [10]

S = E(α,β) − E(α,β′) + E(α′,β) + E(α′,β′)

with

E(α,β) = P(+,+) + P(−,−) − P(+,−) − P(−,+) = cos 2(α − β)

where (α′,β′) are a second set of angles. According to the CHSH inequality,

−2 ≤ S ≤ 2,

a type of Bell inequality. However, quantum mechanics predicts a maximal violation of this inequality, S = 2√2, for |α−β| = |α′−β| = |α′−β′| = 22.5° and |α−β′| = 67.5°.
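The maximal violation can be verified directly. The sketch below (an illustration, not Aspect's code) assumes the quantum-mechanical correlation E(α,β) = cos 2(α − β) for this state and evaluates the CHSH quantity at the optimal settings:

```python
import math

def E(alpha, beta):
    """Correlation coefficient for the entangled state:
    E = P(+,+) + P(-,-) - P(+,-) - P(-,+) = cos(2*(alpha - beta))."""
    return math.cos(2 * (alpha - beta))

def S(a, b, a2, b2):
    """CHSH quantity; local hidden-variable theories require -2 <= S <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Settings with |a-b| = |a'-b| = |a'-b'| = 22.5 deg and |a-b'| = 67.5 deg
a, b, a2, b2 = map(math.radians, (0.0, 22.5, 45.0, 67.5))
print(round(S(a, b, a2, b2), 6))  # 2.828427, i.e. 2*sqrt(2) > 2
```

Any local hidden-variable model must keep S within [−2, 2]; the quantum value 2√2 ≈ 2.83 exceeds that bound.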
In 1975, since a decisive experiment based on the violation of Bell's inequalities and confirming the reality of quantum entanglement was still missing, Alain Aspect proposed, in an article, an experiment meticulous enough to be irrefutable. [11] [12]
Alain Aspect specified his experiment so that it would be as decisive as possible. Namely:
Alain Aspect carried out a three-round series of increasingly complex experiments from 1980 to 1982. The first round reproduced the experimental tests of Clauser, Holt, and Fry. In the second round he added two-channel polarizers, which improved the efficiency of the detections. He carried out these two rounds with the help of research engineer Gérard Roger and of physicist Philippe Grangier, an undergraduate student at the time. [8]
The third round of experiments took place in 1982 and was carried out in collaboration with Roger and physicist Jean Dalibard, a young student at the time. [8] This last round, the closest to the initial specifications, is the one described here.
The first experiments testing Bell's inequalities had low-intensity photon sources and required a week of continuous operation to complete. One of Aspect's first improvements was to use a photon source several orders of magnitude more efficient. This source allowed a detection rate of 100 photons per second, shortening the experiment to 100 seconds.
The source used is a calcium radiative cascade, excited with a krypton laser.
One of the main points of this experiment was to make sure that the correlation between the measurements P1 and P2 had not been the result of "classical" effects, especially experimental artefacts.
As an example, when P1 and P2 are prepared with fixed angles α and β, it could be suspected that this setup generates parasitic correlations through current or ground loops, or some other effects. Since both polarizers belong to the same setup, they could influence one another through the various circuits of the experimental device and generate correlations upon measurement.
One can then imagine that the fixed orientation of the polarizers impacts, one way or another, the state with which the photon pair is emitted. In such a case, the correlations between the measurement results could be explained by local hidden variables within the photons, set upon their emission. Alain Aspect had mentioned these observations to John Bell himself.[ citation needed ]
One way of ruling out these kinds of effects is to determine the (α,β) orientation of the polarizers at the last moment—after the photons have been emitted, and before their detection—and to keep them far enough from each other to prevent any signal from reaching any one of them.
This method assures that the orientation of the polarizers during the emission has no bearing on the result (since the orientation is yet undetermined during emission). It also assures that the polarizers do not influence each other, being too distant from one another.
As a consequence, Aspect's experimental set-up has polarizers P1 and P2 set 6 metres from the source, and 12 metres from one another. With this setup, only 20 nanoseconds elapse between the emission of the photons and their detection. During this extremely short period of time, the experimenter has to decide on the polarizers' orientation and then set it.
Since it is physically impossible to modify a polarizer's orientation within such a time span, two polarizers—one for each side—were used and pre-oriented in different directions. A high-frequency switching device randomly routed the photons towards one polarizer or the other. Overall, the setup was equivalent to a single polarizer with a randomly tilting polarization angle.
Since it was also not possible to have the emitted photons trigger the switching, the switches operated periodically every 10 nanoseconds (asynchronously with the photons' emission), ensuring the device would switch at least once between the emission of a photon and its detection.
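As a back-of-envelope check of this timing argument (distances and switching period as quoted above; the speed of light rounded to 3×10⁸ m/s):

```python
# Nominal figures from the text; c rounded for this back-of-envelope check.
c = 3.0e8                 # speed of light, m/s
d_source = 6.0            # source-to-polarizer distance, metres
d_polarizers = 12.0       # distance between the two polarizers, metres
switch_period = 10e-9     # switching period of each channel: 10 ns

flight_time = d_source / c        # photon emission -> detection
signal_time = d_polarizers / c    # fastest possible signal between polarizers

print(flight_time, signal_time)   # 2e-08 4e-08, i.e. 20 ns and 40 ns
# The setting switches at least once while the photon is in flight (10 ns < 20 ns),
# and no light-speed signal (40 ns) can reach the far polarizer before detection.
assert switch_period < flight_time < signal_time
```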
Another important characteristic of the 1982 experiment was the use of two-channel polarizers, which allowed a measurable result in both the (+) and (−) situations. The polarizers used before Aspect's experiment could detect situation (+), but not situation (−). These single-channel polarizers had two major inconveniences:
The two-channel polarizers Aspect used in his experiment avoided these two inconveniences and allowed him to use Bell's formulas directly to calculate the inequalities.
Technically, the polarizers he used were polarizing cubes which transmitted one polarization and reflected the other, emulating a Stern–Gerlach device.
Bell's inequalities establish a theoretical curve of the number of correlations (++ or −−) between the two detectors in relation to the relative angle of the detectors (α − β). The shape of the curve is characteristic of the violation of Bell's inequalities. The match between the measured data and this curve establishes, quantitatively and qualitatively, that Bell's inequalities have been violated.
All three of Aspect's experiments unambiguously confirmed the violation, as predicted by quantum mechanics, thus undermining Einstein's local realistic outlook on quantum mechanics and local hidden-variable scenarios. Moreover, the violation occurred in the exact way predicted by quantum mechanics, with a statistical agreement of up to 242 standard deviations. [13]
Given the technical quality of the experiment, the scrupulous avoidance of experimental artefacts, and the quasi-perfect statistical agreement, this experiment convinced the scientific community at large that quantum physics violates Bell's inequalities.
After the results, some physicists legitimately tried to look for flaws in Aspect's experiment and to find out how to improve it to resist criticism.
Some theoretical objections can be raised against the setup:
The ideal experiment, which would rule out any imaginable possibility of induced correlations, should:
The conditions of the experiment also suffered from a detection loophole. [1]
After 1982, physicists began to look for applications of entanglement, which led to the development of quantum computing and quantum cryptography. [8]
For his work on this topic, Aspect received several awards including the 2010 Wolf Prize in Physics and the 2022 Nobel Prize in Physics, both shared with John Clauser and Anton Zeilinger for their Bell tests. [2] [14]
The loopholes mentioned could only be closed from 1998 onwards. In the meantime, Aspect's experiment was reproduced, and the violation of Bell's inequalities was systematically confirmed, with a statistical certainty of up to 100 standard deviations.
Other experiments were conducted to test the violations of Bell's inequalities with observables other than polarization, in order to approach the original spirit of the EPR paradox, in which Einstein imagined measuring two conjugate variables (such as position and momentum) on an EPR pair. An experiment used the conjugate variables time and energy and, once again, confirmed quantum mechanics. [15]
In 1998, the Geneva experiment tested the correlation between two detectors set 30 kilometres apart using the Swiss optical fibre telecommunication network. [16] The distance gave more time to switch the angles of the polarizers, so it was possible to use completely random switching. Additionally, the two distant polarizers were entirely independent. The measurements were recorded on each side and compared after the experiment by dating each measurement with an atomic clock. The violation of Bell's inequalities was once again verified, under strict and practically ideal conditions. Where Aspect's experiment implied that a hypothetical coordination signal would have to travel at twice the speed of light c, Geneva's implied 10 million times c.[ citation needed ]
An experiment on trapped-ion entanglement took place at the National Institute of Standards and Technology (NIST) in 2000, using a very efficient correlation-based detection method. [17] The detection reliability proved sufficient for the experiment to violate Bell's inequalities overall, even though not all of the detected correlations violated them.
In 2001, Antoine Suarez's team, which included Nicolas Gisin who had participated in the Geneva experiment, reproduced the experiment using mirrors or detectors in motion, allowing them to reverse the order of events across frames of reference, in accordance with special relativity (this inversion is only possible for events without any causal relationship). The speeds are chosen so that, when one photon is reflected by or crosses the semi-transparent mirror, the other photon has already crossed or been reflected, from the point of view of the frame of reference attached to the mirror. This is an "after-after" configuration, in which sound waves play the role of the semi-transparent mirrors.
In 2015, the first three significant-loophole-free Bell tests were published within three months by independent groups at Delft University of Technology, the University of Vienna, and NIST. All three tests simultaneously addressed the detection loophole, the locality loophole, and the memory loophole. [8]
Prior to the Aspect experiments, Bell's theorem was mostly a niche topic. The publications by Aspect and collaborators prompted wider discussion of the subject. [18]
The fact that nature is found to violate Bell's inequality implies that one or more of the assumptions underlying that inequality must not hold true. Different interpretations of quantum mechanics provide different views on which assumptions ought to be rejected. [19] [20] [21] Copenhagen-type interpretations generally take the violation of Bell inequalities as grounds to reject the assumption often called counterfactual definiteness. [22] [23] [24] This is also the route taken by interpretations that descend from the Copenhagen tradition, such as consistent histories (often advertised as "Copenhagen done right"), [25] as well as QBism. [26] In contrast, the versions of the many-worlds interpretation all violate an implicit assumption by Bell that measurements have a single outcome. [27] Unlike all of these, the Bohmian or "pilot wave" interpretation abandons the assumption of locality: instantaneous communication can exist at the level of the hidden variables, but it cannot be used to send signals. [28]
Quantum mechanics is a fundamental theory that describes the behavior of nature at and below the scale of atoms. It is the foundation of all quantum physics, which includes quantum chemistry, quantum field theory, quantum technology, and quantum information science.
Quantum teleportation is a technique for transferring quantum information from a sender at one location to a receiver some distance away. While teleportation is commonly portrayed in science fiction as a means to transfer physical objects from one location to the next, quantum teleportation only transfers quantum information. The sender does not have to know the particular quantum state being transferred. Moreover, the location of the recipient can be unknown, but to complete the quantum teleportation, classical information needs to be sent from sender to receiver. Because classical information needs to be sent, quantum teleportation cannot occur faster than the speed of light.
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are supposed properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
A timeline of atomic and subatomic physics, including particle physics.
In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden-variable theories. Experimental verification of the inequality being violated is seen as confirmation that nature cannot be described by such theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in a much-cited paper published in 1969. They derived the CHSH inequality, which, as with John Stewart Bell's original inequality, is a constraint—on the statistical occurrence of "coincidences" in a Bell test—which is necessarily true if an underlying local hidden-variable theory exists. In practice, the inequality is routinely violated by modern experiments in quantum mechanics.
John Stewart Bell FRS was a physicist from Northern Ireland and the originator of Bell's theorem, an important theorem in quantum physics regarding hidden-variable theories.
In physics, a hidden-variable theory is a deterministic physical model which seeks to explain the probabilistic nature of quantum mechanics by introducing additional variables.
In physics, specifically in quantum mechanics, a coherent state is the specific quantum state of the quantum harmonic oscillator, often described as a state that has dynamics most closely resembling the oscillatory behavior of a classical harmonic oscillator. It was the first example of quantum dynamics when Erwin Schrödinger derived it in 1926, while searching for solutions of the Schrödinger equation that satisfy the correspondence principle. The quantum harmonic oscillator arises in the quantum theory of a wide range of physical systems. For instance, a coherent state describes the oscillating motion of a particle confined in a quadratic potential well. The coherent state describes a state in a system for which the ground-state wavepacket is displaced from the origin of the system. This state can be related to classical solutions by a particle oscillating with an amplitude equivalent to the displacement.
In physics, the principle of locality states that an object is influenced directly only by its immediate surroundings. A theory that includes the principle of locality is said to be a "local theory". This is an alternative to the concept of instantaneous, or "non-local" action at a distance. Locality evolved out of the field theories of classical physics. The idea is that for a cause at one point to have an effect at another point, something in the space between those points must mediate the action. To exert an influence, something, such as a wave or particle, must travel through the space between the two points, carrying the influence.
A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. Named for John Stewart Bell, the experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables to explain the behavior of particles like photons and electrons. The test empirically evaluates the implications of Bell's theorem. As of 2015, all Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems behave.
In the interpretation of quantum mechanics, a local hidden-variable theory is a hidden-variable theory that satisfies the principle of locality. These models attempt to account for the probabilistic features of quantum mechanics via the mechanism of underlying, but inaccessible variables, with the additional requirement that distant events be statistically independent.
The Bohr–Einstein debates were a series of public disputes about quantum mechanics between Albert Einstein and Niels Bohr. Their debates are remembered because of their importance to the philosophy of science, insofar as the disagreements—and the outcome of Bohr's version of quantum mechanics becoming the prevalent view—form the root of the modern understanding of physics. Most of Bohr's version of the events held in the Solvay Conference in 1927 and other places was first written by Bohr decades later in an article titled, "Discussions with Einstein on Epistemological Problems in Atomic Physics". Based on the article, the philosophical issue of the debate was whether Bohr's Copenhagen interpretation of quantum mechanics, which centered on his belief of complementarity, was valid in explaining nature. Despite their differences of opinion and the succeeding discoveries that helped solidify quantum mechanics, Bohr and Einstein maintained a mutual admiration that was to last the rest of their lives.
Alain Aspect is a French physicist noted for his experimental work on quantum entanglement.
Quantum mechanics is the study of matter and its interactions with energy on the scale of atomic and subatomic particles. By contrast, classical physics explains matter and energy only on a scale familiar to human experience, including the behavior of astronomical bodies such as the moon. Classical physics is still used in much of modern science and technology. However, towards the end of the 19th century, scientists discovered phenomena in both the large (macro) and the small (micro) worlds that classical physics could not explain. The desire to resolve inconsistencies between observed phenomena and classical theory led to a revolution in physics, a shift in the original scientific paradigm: the development of quantum mechanics.
In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not allow an interpretation with local realism. Quantum nonlocality has been experimentally verified under a variety of physical assumptions.
In quantum mechanics, superdeterminism is a loophole in Bell's theorem. By postulating that all systems being measured are correlated with the choices of which measurements to make on them, the assumptions of the theorem are no longer fulfilled. A hidden variables theory which is superdeterministic can thus fulfill Bell's notion of local causality and still violate the inequalities derived from Bell's theorem. This makes it possible to construct a local hidden-variable theory that reproduces the predictions of quantum mechanics, for which a few toy models have been proposed. In addition to being deterministic, superdeterministic models also postulate correlations between the state that is measured and the measurement setting.
The Leggett–Garg inequality, named for Anthony James Leggett and Anupam Garg, is a mathematical inequality fulfilled by all macrorealistic physical theories. Here, macrorealism is a classical worldview defined by the conjunction of two postulates:
Quantum secret sharing (QSS) is a quantum cryptographic scheme for secure communication that extends beyond simple quantum key distribution. It modifies the classical secret sharing (CSS) scheme by using quantum information and the no-cloning theorem to attain the ultimate security for communications.