Superdeterminism

In quantum mechanics, superdeterminism is a loophole in Bell's theorem. It postulates that the systems being measured are correlated with the choices of which measurements to make on them, so that the assumptions of the theorem are no longer fulfilled. A superdeterministic hidden-variables theory can therefore satisfy Bell's notion of local causality and still violate the inequalities derived from Bell's theorem. [1] This makes it possible to construct a local hidden-variable theory that reproduces the predictions of quantum mechanics, and a few toy models of this kind have been proposed. [2] [3] [4] In addition to being deterministic, superdeterministic models postulate correlations between the state that is measured and the measurement setting.

Overview

Bell's theorem assumes that the measurements performed at each detector can be chosen independently of each other and of the hidden variables that determine the measurement outcome. This assumption is often referred to as measurement independence or statistical independence. In a superdeterministic theory it does not hold: the hidden variables are necessarily correlated with the measurement settings. Since the choice of measurements and the hidden variable are predetermined, the results at one detector can depend on which measurement is performed at the other without any need for information to travel faster than the speed of light. The assumption of statistical independence is sometimes referred to as the free choice or free will assumption, since its negation implies that human experimentalists are not free to choose which measurement to perform.
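This mechanism can be made concrete with a small simulation. The sketch below is illustrative only (not one of the published models cited in this article; the function name, settings, and sampling scheme are ours): the hidden variable λ is drawn conditional on both detector settings, so statistical independence fails by construction, yet each outcome is a local, deterministic function of λ alone. The model reproduces the singlet-state correlation E(a, b) = −cos(a − b) and so violates the Bell-CHSH bound |S| ≤ 2 that constrains local models obeying statistical independence.

```python
import math
import random

def chsh_superdeterministic(trials=100_000, rng=random.Random(0)):
    # Standard CHSH settings (radians); the singlet-state quantum
    # prediction for the correlation is E(a, b) = -cos(a - b).
    a_settings = [0.0, math.pi / 2]
    b_settings = [math.pi / 4, 3 * math.pi / 4]
    E = {}
    for i, a in enumerate(a_settings):
        for j, b in enumerate(b_settings):
            total = 0
            for _ in range(trials):
                # Superdeterministic step: the hidden variable lam is drawn
                # *conditional on both settings* (a, b), violating statistical
                # independence. It encodes both outcomes, with the singlet
                # probability P(A = B) = sin^2((a - b) / 2).
                same = rng.random() < math.sin((a - b) / 2) ** 2
                s = 1 if rng.random() < 0.5 else -1
                lam = (s, s) if same else (s, -s)
                # Locally, each outcome is a deterministic function of lam
                # alone; no signal passes between the detectors.
                A, B = lam
                total += A * B
            E[(i, j)] = total / trials
    # CHSH combination; local models with statistical independence obey |S| <= 2.
    return E[(0, 0)] - E[(0, 1)] + E[(1, 0)] + E[(1, 1)]

S = chsh_superdeterministic()
print(abs(S))  # ≈ 2 * sqrt(2) ≈ 2.83, exceeding the Bell-CHSH bound of 2
```

The model is "conspiratorial" in exactly the sense discussed below: the distribution of λ must depend on which measurements will later be chosen, which is what makes the construction trivial to write down but, to most physicists, implausible as physics.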

It is possible to test restricted versions of superdeterminism that posit that the correlations between the hidden variables and the choice of measurement have been established in the recent past. [5] In general, though, superdeterminism is fundamentally untestable, as the correlations can be postulated to exist since the Big Bang, making the loophole impossible to eliminate. [6]

A hypothetical depiction of superdeterminism in which photons from the distant galaxies Sb and Sc are used to control the orientation of the polarization detectors α and β just prior to the arrival of entangled photons Alice and Bob.

In the 1980s, John Stewart Bell discussed superdeterminism in a BBC interview: [7] [8]

There is a way to escape the inference of superluminal speeds and spooky action at a distance. But it involves absolute determinism in the universe, the complete absence of free will. Suppose the world is super-deterministic, with not just inanimate nature running on behind-the-scenes clockwork, but with our behavior, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined, including the "decision" by the experimenter to carry out one set of measurements rather than another, the difficulty disappears. There is no need for a faster than light signal to tell particle A what measurement has been carried out on particle B, because the universe, including particle A, already "knows" what that measurement, and its outcome, will be.

Although he acknowledged the loophole, he also argued that it was implausible. Even if the measurements performed are chosen by deterministic random number generators, the choices can be assumed to be "effectively free for the purpose at hand," because the machine's choice is altered by a large number of very small effects. It is unlikely for the hidden variable to be sensitive to all of the same small influences that the random number generator was. [9]

Nobel Prize in Physics winner Gerard 't Hooft discussed this loophole with John Bell in the early 1980s:

I raised the question: Suppose that also Alice's and Bob's decisions have to be seen as not coming out of free will, but being determined by everything in the theory. John said, well, you know, that I have to exclude. If it's possible, then what I said doesn't apply. I said, Alice and Bob are making a decision out of a cause. A cause lies in their past and has to be included in the picture. [10]

According to the physicist Anton Zeilinger, if superdeterminism is true, some of its implications would bring into question the value of science itself by destroying falsifiability:

[W]e always implicitly assume the freedom of the experimentalist... This fundamental assumption is essential to doing science. If this were not true, then, I suggest, it would make no sense at all to ask nature questions in an experiment, since then nature could determine what our questions are, and that could guide our questions such that we arrive at a false picture of nature. [11]

Physicists Sabine Hossenfelder and Tim Palmer have argued that superdeterminism "is a promising approach not only to solve the measurement problem, but also to understand the apparent non-locality of quantum physics". [12]

Howard M. Wiseman and Eric Cavalcanti argue that any hypothetical superdeterministic theory "would be about as plausible, and appealing, as belief in ubiquitous alien mind-control". [13]

Examples

The first superdeterministic hidden-variables model was put forward by Carl H. Brans in 1988. [2] Other models were proposed in 2010 by Michael Hall, [3] and in 2022 by Donadi and Hossenfelder. [4] Gerard 't Hooft has referred to his cellular automaton model of quantum mechanics as superdeterministic, [14] though it has remained unclear whether it fulfills the definition.

Some authors consider retrocausality in quantum mechanics to be an example of superdeterminism, whereas other authors treat the two cases as distinct. [15] No agreed-upon definition for distinguishing them exists.

See also

- Aspect's experiment
- Bell test
- Bell's theorem
- Counterfactual definiteness
- Einstein–Podolsky–Rosen paradox
- Free will theorem
- Gleason's theorem
- Greenberger–Horne–Zeilinger state
- Hidden-variable theory
- John Stewart Bell
- Kochen–Specker theorem
- Leggett inequalities
- Local hidden-variable theory
- Measurement in quantum mechanics
- Philosophy of physics
- Principle of locality
- Quantum entanglement
- Quantum foundations
- Quantum nonlocality
- Quantum Theory: Concepts and Methods

References

  1. Larsson, Jan-Åke (2014). "Loopholes in Bell inequality tests of local realism". Journal of Physics A: Mathematical and Theoretical. 47 (42): 16. arXiv:1407.0363. Bibcode:2014JPhA...47P4003L. doi:10.1088/1751-8113/47/42/424003. S2CID 40332044.
  2. Brans, Carl H. (1988-02-01). "Bell's theorem does not eliminate fully causal hidden variables". International Journal of Theoretical Physics. 27 (2): 219–226. doi:10.1007/BF00670750. ISSN 1572-9575. S2CID 121627152.
  3. Hall, Michael J. W. (2010-12-16). "Local Deterministic Model of Singlet State Correlations Based on Relaxing Measurement Independence". Physical Review Letters. 105 (25): 250404. doi:10.1103/PhysRevLett.105.250404. hdl:10072/42810. PMID 21231566. S2CID 45436471.
  4. Donadi, Sandro; Hossenfelder, Sabine (2022-08-19). "Toy model for local and deterministic wave-function collapse". Physical Review A. 106 (2): 022212. arXiv:2010.01327. doi:10.1103/PhysRevA.106.022212. S2CID 237260229.
  5. Scheidl, Thomas; Ursin, Rupert; Kofler, Johannes; Ramelow, Sven; Ma, Xiao-Song; Herbst, Thomas; Ratschbacher, Lothar; Fedrizzi, Alessandro; Langford, Nathan K.; Jennewein, Thomas; Zeilinger, Anton; et al. (2010). "Violation of local realism with freedom of choice". Proc. Natl. Acad. Sci. 107 (46): 19708–19713. arXiv:0811.3129. Bibcode:2010PNAS..10719708S. doi:10.1073/pnas.1002780107. PMC 2993398. PMID 21041665.
  6. Wolchover, Natalie. "The Universe Is as Spooky as Einstein Thought". The Atlantic. Retrieved 2017-02-20.
  7. BBC Radio interview with Paul Davies, 1985.
  8. The quotation is an adaptation from the edited transcript of the radio interview with John Bell of 1985. See The Ghost in the Atom: A Discussion of the Mysteries of Quantum Physics, by Paul C. W. Davies and Julian R. Brown, 1986/1993, pp. 45–46.
  9. J. S. Bell, "Free variables and local causality", Epistemological Letters, Feb. 1977. Reprinted as Chapter 12 of J. S. Bell, Speakable and Unspeakable in Quantum Mechanics (Cambridge University Press, 1987).
  10. Musser, George (7 October 2013). "Does Some Deeper Level of Physics Underlie Quantum Mechanics? An Interview with Nobelist Gerard 't Hooft".
  11. A. Zeilinger, Dance of the Photons, Farrar, Straus and Giroux, New York, 2010, p. 266. Abner Shimony, Michael Horne and John Clauser made a similar comment in replying to John Bell in their discussions in the Epistemological Letters: "In any scientific experiment in which two or more variables are supposed to be randomly selected, one can always conjecture that some factor in the overlap of the backward light cones has controlled the presumably random choices. But, we maintain, skepticism of this sort will essentially dismiss all results of scientific experimentation. Unless we proceed under the assumption that hidden conspiracies of this sort do not occur, we have abandoned in advance the whole enterprise of discovering the laws of nature by experimentation." (Shimony A, Horne M A and Clauser J F, "Comment on the theory of local beables", Epistemological Letters, 13 1 (1976), as quoted in Jan-Åke Larsson, "Loopholes in Bell inequality tests of local realism", J. Phys. A: Math. Theor. 47 (2014))
  12. Hossenfelder, Sabine; Palmer, Tim (2020). "Rethinking Superdeterminism". Frontiers in Physics. 8: 139. arXiv:1912.06462. Bibcode:2020FrP.....8..139P. doi:10.3389/fphy.2020.00139. ISSN 2296-424X.
  13. Wiseman, Howard; Cavalcanti, Eric (2016). "Causarum Investigatio and the Two Bell's Theorems of John Bell". In R. Bertlmann; A. Zeilinger (eds.). Quantum [Un]Speakables II. The Frontiers Collection. Springer. pp. 119–142. arXiv:1503.06413. doi:10.1007/978-3-319-38987-5_6. ISBN 978-3-319-38985-1.
  14. 't Hooft, Gerard (2016). The Cellular Automaton Interpretation of Quantum Mechanics. Fundamental Theories of Physics. Vol. 185. doi:10.1007/978-3-319-41285-6. ISBN 978-3-319-41284-9. S2CID 7779840.
  15. Wharton, K. B.; Argaman, N. (2020-05-18). "Colloquium: Bell's theorem and locally mediated reformulations of quantum mechanics". Reviews of Modern Physics. 92 (2): 021002. arXiv:1906.04313. doi:10.1103/RevModPhys.92.021002. S2CID 184486977.