Robert Spekkens | |
---|---|
Nationality | Canadian |
Alma mater | McGill University, University of Toronto |
Known for | Spekkens toy model; quantum contextuality |
Awards | Birkhoff–von Neumann Prize of the International Quantum Structures Association (2008); FQXi essay contest "Questioning the Foundations: Which of Our Assumptions Are Wrong?" (2012) |
Scientific career | |
Fields | Physics (quantum foundations, quantum information) |
Institutions | Perimeter Institute for Theoretical Physics; University of Cambridge; University of Waterloo; Griffith University |
Robert W. Spekkens is a Canadian theoretical quantum physicist working in the fields of quantum foundations and quantum information. [1] [2] [3]
He is known for his work on the epistemic view of quantum states (in particular the Spekkens toy model), [4] [5] [6] quantum contextuality, [7] [8] [9] quantum resource theories, [10] [11] and quantum causality. [12] [13]
He co-edited the book Quantum Theory: Informational Foundations and Foils. [14]
Spekkens is a faculty member and the leader of the quantum causal inference initiative at Perimeter Institute for Theoretical Physics. [15] [16] He regularly teaches the course on quantum foundations in the Perimeter Scholars International master's program. [17] [18]
He is an adjunct faculty member in the Department of Physics of the University of Waterloo [19] and an adjunct research fellow in the Centre for Quantum Dynamics of Griffith University in Brisbane, Australia. [15]
The Spekkens toy model is a conceptually simple toy hidden-variable theory introduced by Robert Spekkens in 2004 to argue in favour of the epistemic view of quantum mechanics. The model is based on a foundational principle: "If one has maximal knowledge, then for every system, at every time, the amount of knowledge one possesses about the ontic state of the system at that time must equal the amount of knowledge one lacks." This is called the "knowledge balance principle". Within the bounds of this model, many phenomena typically associated with strictly quantum-mechanical effects are present. These include entanglement, noncommutativity of measurements, teleportation, interference, the no-cloning and no-broadcasting theorems, and unsharp measurements. The toy model cannot, however, reproduce quantum nonlocality and quantum contextuality, as it is a local and non-contextual hidden-variable theory.
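The single "toy bit" of the model can be sketched in a few lines of code. In the illustrative Python sketch below (the function and variable names are our own, not from Spekkens's paper), the ontic state space is {1, 2, 3, 4}, a pure epistemic state is a two-element subset of it, and a measurement partitions the ontic states into pairs; re-randomizing the ontic state within the observed pair enforces the knowledge balance principle and reproduces measurement disturbance and noncommutativity.

```python
import random

# Illustrative simulation of a single "toy bit" in the Spekkens model.
# Ontic state space: {1, 2, 3, 4}. A pure epistemic state is a pair of
# ontic states: one knows the pair, but not which member the system
# occupies (the knowledge balance principle).
MEASUREMENTS = {
    # The three canonical two-outcome measurements, each a partition of
    # the ontic states into pairs (analogues of Z, X and Y).
    "Z": [frozenset({1, 2}), frozenset({3, 4})],
    "X": [frozenset({1, 3}), frozenset({2, 4})],
    "Y": [frozenset({1, 4}), frozenset({2, 3})],
}

def measure(ontic, name):
    """Return (outcome, new ontic state) for one measurement.

    The outcome is the pair containing the ontic state; to preserve the
    knowledge balance, the ontic state is then re-randomized within that
    pair, which is what disturbs subsequent measurements.
    """
    outcome = next(pair for pair in MEASUREMENTS[name] if ontic in pair)
    return outcome, random.choice(sorted(outcome))

def run(prep, sequence, trials=10_000):
    """Tally outcomes of the final measurement in `sequence`, starting
    from the pure epistemic state `prep` (a pair of ontic states)."""
    counts = {}
    for _ in range(trials):
        ontic = random.choice(sorted(prep))
        for name in sequence:
            outcome, ontic = measure(ontic, name)
        counts[outcome] = counts.get(outcome, 0) + 1
    return counts

random.seed(0)
prep = frozenset({1, 2})       # analogue of preparing the state |0>
print(run(prep, ["Z"]))        # Z alone is deterministic: always {1, 2}
print(run(prep, ["X", "Z"]))   # an intervening X randomizes Z to ~50/50
```

Measuring Z directly on the preparation {1, 2} always yields the outcome {1, 2}, but inserting an X measurement first scrambles the ontic state across the pairs, so the subsequent Z outcome becomes an even coin flip, mimicking the quantum disturbance of sequential incompatible measurements.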
Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other (compatible) observables which are simultaneously measured. More formally, the measurement result of a quantum observable is dependent upon which other commuting observables are within the same measurement set.
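A standard concrete witness of contextuality is the Peres–Mermin magic square: nine observables arranged in a 3×3 grid such that, in quantum mechanics, the product of each row is +I, the product of the first two columns is +I, and the product of the third column is −I. The brute-force sketch below (an illustration we supply here, not a reference implementation from the literature) checks that no noncontextual assignment of fixed ±1 values to the nine observables can satisfy all six product constraints at once.

```python
from itertools import product

def satisfying_assignments():
    """Enumerate all ±1 assignments to the 3x3 Peres-Mermin grid that
    satisfy the quantum product constraints: every row multiplies to +1,
    the first two columns to +1, and the third column to -1."""
    hits = []
    for vals in product([1, -1], repeat=9):
        g = [vals[0:3], vals[3:6], vals[6:9]]
        rows_ok = all(g[r][0] * g[r][1] * g[r][2] == 1 for r in range(3))
        cols_ok = (g[0][0] * g[1][0] * g[2][0] == 1 and
                   g[0][1] * g[1][1] * g[2][1] == 1 and
                   g[0][2] * g[1][2] * g[2][2] == -1)
        if rows_ok and cols_ok:
            hits.append(vals)
    return hits

print(len(satisfying_assignments()))  # 0: no noncontextual assignment exists
```

The contradiction is visible by hand as well: multiplying all three row constraints gives the product of all nine values as +1, while multiplying the three column constraints gives that same product as −1, so the search over all 2⁹ assignments necessarily comes up empty.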