In physics and the philosophy of physics, quantum Bayesianism is a collection of related approaches to the interpretation of quantum mechanics, the most prominent of which is QBism (pronounced "cubism"). QBism is an interpretation that takes an agent's actions and experiences as the central concerns of the theory. QBism deals with common questions in the interpretation of quantum theory about the nature of wavefunction superposition, quantum measurement, and entanglement. [1] [2] According to QBism, many, but not all, aspects of the quantum formalism are subjective in nature. For example, in this interpretation, a quantum state is not an element of reality—instead, it represents the degrees of belief an agent has about the possible outcomes of measurements. For this reason, some philosophers of science have deemed QBism a form of anti-realism. [3] [4] The originators of the interpretation disagree with this characterization, proposing instead that the theory more properly aligns with a kind of realism they call "participatory realism", wherein reality consists of more than can be captured by any putative third-person account of it. [5] [6]
This interpretation is distinguished by its use of a subjective Bayesian account of probabilities to understand the quantum mechanical Born rule as a normative addition to good decision-making. Rooted in the prior work of Carlton Caves, Christopher Fuchs, and Rüdiger Schack during the early 2000s, QBism itself is primarily associated with Fuchs and Schack and has more recently been adopted by David Mermin. [7] QBism draws from the fields of quantum information and Bayesian probability and aims to eliminate the interpretational conundrums that have beset quantum theory. The QBist interpretation is historically derivative of the views of the various physicists that are often grouped together as "the" Copenhagen interpretation, [8] [9] but is itself distinct from them. [9] [10] Theodor Hänsch has characterized QBism as sharpening those older views and making them more consistent. [11]
More generally, any work that uses a Bayesian or personalist (a.k.a. "subjective") treatment of the probabilities that appear in quantum theory is also sometimes called quantum Bayesian. QBism, in particular, has been referred to as "the radical Bayesian interpretation". [12]
In addition to presenting an interpretation of the existing mathematical structure of quantum theory, some QBists have advocated a research program of reconstructing quantum theory from basic physical principles whose QBist character is manifest. The ultimate goal of this research is to identify what aspects of the ontology of the physical world make quantum theory a good tool for agents to use. [13] However, the QBist interpretation itself, as described in § Core positions, does not depend on any particular reconstruction.
E. T. Jaynes, a promoter of the use of Bayesian probability in statistical physics, once suggested that quantum theory is "[a] peculiar mixture describing in part realities of Nature, in part incomplete human information about Nature—all scrambled up by Heisenberg and Bohr into an omelette that nobody has seen how to unscramble". [15] QBism developed out of efforts to separate these parts using the tools of quantum information theory and personalist Bayesian probability theory.
There are many interpretations of probability theory. Broadly speaking, these interpretations fall into one of three categories: those which assert that a probability is an objective property of reality (the propensity school), those which assert that probability is an objective property of the measuring process (frequentists), and those which assert that a probability is a cognitive construct which an agent may use to quantify their ignorance or degree of belief in a proposition (Bayesians). QBism begins by asserting that all probabilities, even those appearing in quantum theory, are most properly viewed as members of the last category. Specifically, QBism adopts a personalist Bayesian interpretation along the lines of Italian mathematician Bruno de Finetti [16] and English philosopher Frank Ramsey. [17] [18]
According to QBists, the advantages of adopting this view of probability are twofold. First, for QBists the role of quantum states, such as the wavefunctions of particles, is to efficiently encode probabilities; so quantum states are ultimately degrees of belief themselves. (If one considers any single measurement that is a minimal, informationally complete positive operator-valued measure (POVM), this is especially clear: A quantum state is mathematically equivalent to a single probability distribution, the distribution over the possible outcomes of that measurement. [19] ) Regarding quantum states as degrees of belief implies that the event of a quantum state changing when a measurement occurs—the "collapse of the wave function"—is simply the agent updating her beliefs in response to a new experience. [13] Second, it suggests that quantum mechanics can be thought of as a local theory, because the Einstein–Podolsky–Rosen (EPR) criterion of reality can be rejected. The EPR criterion states: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity." [20] Arguments that quantum mechanics should be considered a nonlocal theory depend upon this principle, but to a QBist, it is invalid, because a personalist Bayesian considers all probabilities, even those equal to unity, to be degrees of belief. [21] [22] Therefore, while many interpretations of quantum theory conclude that quantum mechanics is a nonlocal theory, QBists do not. [23]
Christopher Fuchs introduced the term "QBism" and outlined the interpretation in more or less its present form in 2010, [24] carrying further and demanding consistency of ideas broached earlier, notably in publications from 2002. [25] [26] Several subsequent works have expanded and elaborated upon these foundations, notably a Reviews of Modern Physics article by Fuchs and Schack; [19] an American Journal of Physics article by Fuchs, Mermin, and Schack; [23] and Enrico Fermi Summer School [27] lecture notes by Fuchs and Stacey. [22]
Prior to the 2010 article, the term "quantum Bayesianism" was used to describe the developments which have since led to QBism in its present form. However, as noted above, QBism subscribes to a particular kind of Bayesianism which does not suit everyone who might apply Bayesian reasoning to quantum theory (see, for example, § Other uses of Bayesian probability in quantum physics below). Consequently, Fuchs chose to call the interpretation "QBism", pronounced "cubism", preserving the Bayesian spirit via the CamelCase in the first two letters, but distancing it from Bayesianism more broadly. As this neologism is a homophone of Cubism the art movement, it has motivated conceptual comparisons between the two, [28] and media coverage of QBism has been illustrated with art by Picasso [7] and Gris. [29] However, QBism itself was not influenced or motivated by Cubism and has no lineage to a potential connection between Cubist art and Bohr's views on quantum theory. [30]
According to QBism, quantum theory is a tool which an agent may use to help manage their expectations, more like probability theory than a conventional physical theory. [13] Quantum theory, QBism claims, is fundamentally a guide for decision making which has been shaped by some aspects of physical reality. Chief among the tenets of QBism are that quantum states represent an agent's personal degrees of belief, that measurement outcomes are personal experiences of the agent who elicits them, and that the Born rule is a normative guide to decision-making rather than a description of physical law. [31]
Reactions to the QBist interpretation have ranged from enthusiastic [13] [28] to strongly negative. [32] Some who have criticized QBism claim that it fails to meet the goal of resolving paradoxes in quantum theory. Bacciagaluppi argues that QBism's treatment of measurement outcomes does not ultimately resolve the issue of nonlocality, [33] and Jaeger finds QBism's supposition that the interpretation of probability is key for the resolution to be unnatural and unconvincing. [12] Norsen [34] has accused QBism of solipsism, and Wallace [35] identifies QBism as an instance of instrumentalism; QBists have argued insistently that these characterizations are misunderstandings, and that QBism is neither solipsist nor instrumentalist. [17] [36] A critical article by Nauenberg [32] in the American Journal of Physics prompted a reply by Fuchs, Mermin, and Schack. [37]
Some assert that there may be inconsistencies; for example, Stairs argues that when a probability assignment equals one, it cannot be a degree of belief as QBists say. [38] Further, while also raising concerns about the treatment of probability-one assignments, Timpson suggests that QBism may result in a reduction of explanatory power as compared to other interpretations. [1] Fuchs and Schack replied to these concerns in a later article. [39] Mermin advocated QBism in a 2012 Physics Today article, [2] which prompted considerable discussion. Several further critiques of QBism which arose in response to Mermin's article, and Mermin's replies to these comments, may be found in the Physics Today readers' forum. [40] [41] Section 2 of the Stanford Encyclopedia of Philosophy entry on QBism also contains a summary of objections to the interpretation, and some replies. [42] Others are opposed to QBism on more general philosophical grounds; for example, Mohrhoff criticizes QBism from the standpoint of Kantian philosophy. [43]
Certain authors find QBism internally self-consistent, but do not subscribe to the interpretation. [44] For example, Marchildon finds QBism well-defined in a way that, to him, many-worlds interpretations are not, but he ultimately prefers a Bohmian interpretation. [45] Similarly, Schlosshauer and Claringbold state that QBism is a consistent interpretation of quantum mechanics, but do not offer a verdict on whether it should be preferred. [46] In addition, some agree with most, but perhaps not all, of the core tenets of QBism; Barnum's position, [47] as well as Appleby's, [48] are examples.
Popularized or semi-popularized media coverage of QBism has appeared in New Scientist, [49] Scientific American, [50] Nature, [51] Science News, [52] the FQXi Community, [53] the Frankfurter Allgemeine Zeitung, [29] Quanta Magazine, [16] Aeon, [54] Discover, [55] Nautilus Quarterly, [56] and Big Think. [57] In 2018, two popular-science books about the interpretation of quantum mechanics, Ball's Beyond Weird and Ananthaswamy's Through Two Doors at Once, devoted sections to QBism. [58] [59] Furthermore, Harvard University Press published a popularized treatment of the subject, QBism: The Future of Quantum Physics, in 2016. [13]
The philosophy literature has also discussed QBism from the viewpoints of structural realism and of phenomenology. [60] [61] [62] Ballentine argues that "the initial assumption of QBism is not valid" because the inferential probability of Bayesian theory used by QBism is not applicable to quantum mechanics. [63]
The views of many physicists (Bohr, Heisenberg, Rosenfeld, von Weizsäcker, Peres, etc.) are often grouped together as the "Copenhagen interpretation" of quantum mechanics. Several authors have deprecated this terminology, claiming that it is historically misleading and obscures differences between physicists that are as important as their similarities. [14] [64] QBism shares many characteristics in common with the ideas often labeled as "the Copenhagen interpretation", but the differences are important; to conflate them or to regard QBism as a minor modification of the points of view of Bohr or Heisenberg, for instance, would be a substantial misrepresentation. [10] [31]
QBism takes probabilities to be personal judgments of the individual agent who is using quantum mechanics. This contrasts with older Copenhagen-type views, which hold that probabilities are given by quantum states that are in turn fixed by objective facts about preparation procedures. [13] [65] QBism considers a measurement to be any action that an agent takes to elicit a response from the world and the outcome of that measurement to be the experience the world's response induces back on that agent. As a consequence, communication between agents is the only means by which different agents can attempt to compare their internal experiences. Most variants of the Copenhagen interpretation, however, hold that the outcomes of experiments are agent-independent pieces of reality for anyone to access. [10] QBism claims that these points on which it differs from previous Copenhagen-type interpretations resolve the obscurities that many critics have found in the latter, by changing the role that quantum theory plays (even though QBism does not yet provide a specific underlying ontology). Specifically, QBism posits that quantum theory is a normative tool which an agent may use to better navigate reality, rather than a set of mechanics governing it. [22] [42]
Approaches to quantum theory, like QBism, [66] which treat quantum states as expressions of information, knowledge, belief, or expectation are called "epistemic" interpretations. [6] These approaches differ from each other in what they consider quantum states to be information or expectations "about", as well as in the technical features of the mathematics they employ. Furthermore, not all authors who advocate views of this type propose an answer to the question of what the information represented in quantum states concerns. In the words of the paper that introduced the Spekkens Toy Model:
if a quantum state is a state of knowledge, and it is not knowledge of local and noncontextual hidden variables, then what is it knowledge about? We do not at present have a good answer to this question. We shall therefore remain completely agnostic about the nature of the reality to which the knowledge represented by quantum states pertains. This is not to say that the question is not important. Rather, we see the epistemic approach as an unfinished project, and this question as the central obstacle to its completion. Nonetheless, we argue that even in the absence of an answer to this question, a case can be made for the epistemic view. The key is that one can hope to identify phenomena that are characteristic of states of incomplete knowledge regardless of what this knowledge is about. [67]
Leifer and Spekkens propose a way of treating quantum probabilities as Bayesian probabilities, thereby considering quantum states as epistemic, which they state is "closely aligned in its philosophical starting point" with QBism. [68] However, they remain deliberately agnostic about what physical properties or entities quantum states are information (or beliefs) about, as opposed to QBism, which offers an answer to that question. [68] Another approach, advocated by Bub and Pitowsky, argues that quantum states are information about propositions within event spaces that form non-Boolean lattices. [69] On occasion, the proposals of Bub and Pitowsky are also called "quantum Bayesianism". [70]
Zeilinger and Brukner have also proposed an interpretation of quantum mechanics in which "information" is a fundamental concept, and in which quantum states are epistemic quantities. [71] Unlike QBism, the Brukner–Zeilinger interpretation treats some probabilities as objectively fixed. In the Brukner–Zeilinger interpretation, a quantum state represents the information that a hypothetical observer in possession of all possible data would have. Put another way, a quantum state belongs in their interpretation to an optimally informed agent, whereas in QBism, any agent can formulate a state to encode her own expectations. [72] Despite this difference, in Cabello's classification, the proposals of Zeilinger and Brukner are also designated as "participatory realism", as QBism and the Copenhagen-type interpretations are. [6]
Bayesian, or epistemic, interpretations of quantum probabilities were proposed in the early 1990s by Baez and Youssef. [73] [74]
R. F. Streater argued that "[t]he first quantum Bayesian was von Neumann", basing that claim on von Neumann's textbook The Mathematical Foundations of Quantum Mechanics. [75] Blake Stacey disagrees, arguing that the views expressed in that book on the nature of quantum states and the interpretation of probability are not compatible with QBism, or indeed, with any position that might be called quantum Bayesianism. [14]
Comparisons have also been made between QBism and the relational quantum mechanics (RQM) espoused by Carlo Rovelli and others. [76] [77] In both QBism and RQM, quantum states are not intrinsic properties of physical systems. [78] Both QBism and RQM deny the existence of an absolute, universal wavefunction. Furthermore, both QBism and RQM insist that quantum mechanics is a fundamentally local theory. [23] [79] In addition, Rovelli, like several QBist authors, advocates reconstructing quantum theory from physical principles in order to bring clarity to the subject of quantum foundations. [80] (The QBist approaches to doing so are different from Rovelli's, and are described below.) One important distinction between the two interpretations is their philosophy of probability: RQM does not adopt the Ramsey–de Finetti school of personalist Bayesianism. [6] [17] Moreover, RQM does not insist that a measurement outcome is necessarily an agent's experience. [17]
QBism should be distinguished from other applications of Bayesian inference in quantum physics, and from quantum analogues of Bayesian inference. [19] [73] For example, some in the field of computer science have introduced a kind of quantum Bayesian network, which they argue could have applications in "medical diagnosis, monitoring of processes, and genetics". [81] [82] Bayesian inference has also been applied in quantum theory for updating probability densities over quantum states, [83] and MaxEnt methods have been used in similar ways. [73] [84] Bayesian methods for quantum state and process tomography are an active area of research. [85]
Conceptual concerns about the interpretation of quantum mechanics and the meaning of probability have motivated technical work. A quantum version of the de Finetti theorem, introduced by Caves, Fuchs, and Schack (independently reproving a result found using different means by Størmer [86] ) to provide a Bayesian understanding of the idea of an "unknown quantum state", [87] [88] has found application elsewhere, in topics like quantum key distribution [89] and entanglement detection. [90]
Adherents of several interpretations of quantum mechanics, QBism included, have been motivated to reconstruct quantum theory. The goal of these research efforts has been to identify a new set of axioms or postulates from which the mathematical structure of quantum theory can be derived, in the hope that with such a reformulation, the features of nature which made quantum theory the way it is might be more easily identified. [51] [91] Although the core tenets of QBism do not demand such a reconstruction, some QBists—Fuchs, [26] in particular—have argued that the task should be pursued.
One topic prominent in the reconstruction effort is the set of mathematical structures known as symmetric, informationally-complete, positive operator-valued measures (SIC-POVMs). QBist foundational research stimulated interest in these structures, which now have applications in quantum theory outside of foundational studies [92] and in pure mathematics. [93]
The most extensively explored QBist reformulation of quantum theory involves the use of SIC-POVMs to rewrite quantum states (either pure or mixed) as a set of probabilities defined over the outcomes of a "Bureau of Standards" measurement. [94] [95] That is, if one expresses a density matrix as a probability distribution over the outcomes of a SIC-POVM experiment, one can reproduce all the statistical predictions implied by the density matrix from the SIC-POVM probabilities instead. [96] The Born rule then takes the role of relating one valid probability distribution to another, rather than of deriving probabilities from something apparently more fundamental. Fuchs, Schack, and others have taken to calling this restatement of the Born rule the urgleichung, from the German for "primal equation" (see Ur- prefix), because of the central role it plays in their reconstruction of quantum theory. [19] [97] [98]
The following discussion presumes some familiarity with the mathematics of quantum information theory, and in particular, the modeling of measurement procedures by POVMs. Consider a quantum system to which is associated a $d$-dimensional Hilbert space. If a set of $d^2$ rank-1 projectors $\hat{\Pi}_i$ satisfying
$$\operatorname{Tr} \hat{\Pi}_i \hat{\Pi}_j = \frac{d\delta_{ij} + 1}{d+1}$$
exists, then one may form a SIC-POVM $\{\hat{H}_i\}$ with elements $\hat{H}_i = \frac{1}{d}\hat{\Pi}_i$. An arbitrary quantum state $\hat{\rho}$ may be written as a linear combination of the SIC projectors,
$$\hat{\rho} = \sum_{i=1}^{d^2} \left[ (d+1) P(H_i) - \frac{1}{d} \right] \hat{\Pi}_i,$$
where $P(H_i) = \operatorname{Tr} \hat{\rho} \hat{H}_i$ is the Born rule probability for obtaining SIC measurement outcome $H_i$ implied by the state assignment $\hat{\rho}$. We follow the convention that operators have hats while experiences (that is, measurement outcomes) do not. Now consider an arbitrary quantum measurement, denoted by the POVM $\{\hat{D}_j\}$. The urgleichung is the expression obtained from forming the Born rule probabilities, $Q(D_j)$, for the outcomes of this quantum measurement:
$$Q(D_j) = \sum_{i=1}^{d^2} \left[ (d+1) P(H_i) - \frac{1}{d} \right] P(D_j \mid H_i),$$
where $P(D_j \mid H_i) = \operatorname{Tr} \hat{\Pi}_i \hat{D}_j$ is the Born rule probability for obtaining outcome $D_j$ implied by the state assignment $\hat{\Pi}_i$. The term $P(D_j \mid H_i)$ may be understood to be a conditional probability in a cascaded measurement scenario: Imagine that an agent plans to perform two measurements, first the SIC measurement and then the $\{\hat{D}_j\}$ measurement. After obtaining an outcome from the SIC measurement, the agent will update her state assignment to a new quantum state $\hat{\rho}'$ before performing the second measurement. If she uses the Lüders rule [99] for state update and obtains outcome $H_i$ from the SIC measurement, then $\hat{\rho}' = \hat{\Pi}_i$. Thus the probability for obtaining outcome $D_j$ for the second measurement conditioned on obtaining outcome $H_i$ for the SIC measurement is $P(D_j \mid H_i)$.
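The SIC expansion of a density matrix can be checked numerically for a qubit (d = 2), where the projectors onto the four tetrahedral Bloch vectors form a SIC. The following sketch uses NumPy; the state `rho` is an arbitrary illustrative choice, not drawn from the literature.

```python
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# The four tetrahedral Bloch vectors yield a qubit SIC: d^2 = 4 rank-1
# projectors with pairwise overlap Tr(Pi_i Pi_j) = 1/3 for i != j.
bloch = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
Pi = [(I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2 for r in bloch]
d = 2
H = [P / d for P in Pi]  # SIC-POVM elements; they sum to the identity

rho = np.array([[0.7, 0.2 - 0.1j], [0.2 + 0.1j, 0.3]])  # arbitrary valid state
p = np.array([np.trace(rho @ E).real for E in H])  # P(H_i) via the Born rule

# Reconstruct the density matrix from the SIC probabilities alone.
rho_rec = sum(((d + 1) * pi - 1 / d) * proj for pi, proj in zip(p, Pi))
print(np.allclose(rho_rec, rho))  # prints True
```

The reconstruction succeeds because a SIC measurement is informationally complete: the four probabilities carry exactly the same information as the density matrix itself.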
Note that the urgleichung is structurally very similar to the law of total probability, which is the expression
$$P(D_j) = \sum_{i=1}^{d^2} P(H_i) \, P(D_j \mid H_i).$$
They functionally differ only by a dimension-dependent affine transformation of the SIC probability vector. As QBism says that quantum theory is an empirically-motivated normative addition to probability theory, Fuchs and others find the appearance of a structure in quantum theory analogous to one in probability theory to be an indication that a reformulation featuring the urgleichung prominently may help to reveal the properties of nature which made quantum theory so successful. [19] [22]
The urgleichung does not replace the law of total probability. Rather, the urgleichung and the law of total probability apply in different scenarios because $Q(D_j)$ and $P(D_j)$ refer to different situations. $P(D_j)$ is the probability that an agent assigns for obtaining outcome $D_j$ on her second of two planned measurements, that is, for obtaining $D_j$ after first making the SIC measurement and obtaining one of the $H_i$ outcomes. $Q(D_j)$, on the other hand, is the probability an agent assigns for obtaining outcome $D_j$ when she does not plan to first make the SIC measurement. The law of total probability is a consequence of coherence within the operational context of performing the two measurements as described. The urgleichung, in contrast, is a relation between different contexts which finds its justification in the predictive success of quantum physics.
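The distinction between the two expressions can be made concrete numerically. In this sketch (with an arbitrary illustrative state and a computational-basis measurement), the urgleichung reproduces the Born rule probabilities for measuring the system directly, while the law of total probability describes the cascaded scenario and gives different numbers:

```python
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Qubit SIC from the tetrahedral Bloch vectors (d = 2).
bloch = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
Pi = [(I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2 for r in bloch]
d = 2
H = [P / d for P in Pi]

rho = np.array([[0.7, 0.2 - 0.1j], [0.2 + 0.1j, 0.3]])  # arbitrary state
D = [np.diag([1, 0]).astype(complex),   # computational-basis POVM {D_j}
     np.diag([0, 1]).astype(complex)]

p = np.array([np.trace(rho @ E).real for E in H])                   # P(H_i)
cond = np.array([[np.trace(P @ Dj).real for P in Pi] for Dj in D])  # P(D_j|H_i)

# Urgleichung: Q(D_j) computed from the SIC probabilities ...
q = cond @ ((d + 1) * p - 1 / d)
# ... agrees with the Born rule applied to rho directly.
q_born = np.array([np.trace(rho @ Dj).real for Dj in D])
print(np.allclose(q, q_born))  # prints True

# Law of total probability: P(D_j) for the cascaded scenario
# (SIC measurement first, then {D_j}) gives different numbers.
p_cascade = cond @ p
print(np.allclose(p_cascade, q))  # prints False
```

The gap between `q` and `p_cascade` reflects the disturbance the intervening SIC measurement inflicts on the system, which is exactly what the affine correction in the urgleichung accounts for.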
The SIC representation of quantum states also provides a reformulation of quantum dynamics. Consider a quantum state $\hat{\rho}$ with SIC representation $P(H_i)$. The time evolution of this state is found by applying a unitary operator $\hat{U}$ to form the new state $\hat{U} \hat{\rho} \hat{U}^\dagger$, which has the SIC representation
$$P(H_i(t)) = \operatorname{Tr} \left( \hat{U} \hat{\rho} \hat{U}^\dagger \hat{H}_i \right) = \operatorname{Tr} \left( \hat{\rho} \, \hat{U}^\dagger \hat{H}_i \hat{U} \right).$$
The second equality is written in the Heisenberg picture of quantum dynamics, with respect to which the time evolution of a quantum system is captured by the probabilities associated with a rotated SIC measurement $\{\hat{U}^\dagger \hat{H}_i \hat{U}\}$ of the original quantum state $\hat{\rho}$. Then the Schrödinger equation is completely captured in the urgleichung for this measurement:
$$Q(D_j) = \sum_{i=1}^{d^2} \left[ (d+1) P(H_i(t)) - \frac{1}{d} \right] P(D_j \mid H_i).$$
In these terms, the Schrödinger equation is an instance of the Born rule applied to the passing of time; an agent uses it to relate how she will gamble on informationally complete measurements potentially performed at different times.
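This dynamical statement can likewise be checked numerically. The sketch below evolves a qubit state under a unitary (a rotation about the x-axis, chosen arbitrarily for illustration), confirms that the Schrödinger and Heisenberg pictures give the same evolved SIC probabilities, and verifies that the urgleichung with those evolved probabilities reproduces the Born rule for a later measurement:

```python
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Qubit SIC from the tetrahedral Bloch vectors (d = 2).
bloch = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
Pi = [(I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2 for r in bloch]
d = 2
H = [P / d for P in Pi]

rho = np.array([[0.7, 0.2 - 0.1j], [0.2 + 0.1j, 0.3]])  # arbitrary state
theta = 0.8
U = np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * sx  # rotation about x

# Schroedinger picture: evolve the state, then measure the SIC.
rho_t = U @ rho @ U.conj().T
p_t = np.array([np.trace(rho_t @ E).real for E in H])

# Heisenberg picture: measure the rotated SIC on the original state.
p_t_heis = np.array([np.trace(rho @ (U.conj().T @ E @ U)).real for E in H])
print(np.allclose(p_t, p_t_heis))  # prints True

# The urgleichung with the evolved SIC probabilities reproduces the Born
# rule for any later measurement, here the computational basis:
D = [np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)]
cond = np.array([[np.trace(P @ Dj).real for P in Pi] for Dj in D])
q_t = cond @ ((d + 1) * p_t - 1 / d)
q_born = np.array([np.trace(rho_t @ Dj).real for Dj in D])
print(np.allclose(q_t, q_born))  # prints True
```

In this representation, unitary dynamics is nothing but a time-dependent reshuffling of SIC probabilities, which is the sense in which the Schrödinger equation becomes an instance of the urgleichung.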
Those QBists who find this approach promising are pursuing a complete reconstruction of quantum theory featuring the urgleichung as the key postulate. [97] (The urgleichung has also been discussed in the context of category theory. [100] ) Comparisons between this approach and others not associated with QBism (or indeed with any particular interpretation) can be found in a book chapter by Fuchs and Stacey [101] and an article by Appleby et al. [97] As of 2017, alternative QBist reconstruction efforts are in the beginning stages. [102]
The Copenhagen interpretation is a collection of views about the meaning of quantum mechanics, stemming from the work of Niels Bohr, Werner Heisenberg, Max Born, and others. While "Copenhagen" refers to the Danish city, the term "Copenhagen interpretation" was apparently coined by Heisenberg during the 1950s to refer to ideas developed in the 1925–1927 period, glossing over his disagreements with Bohr. Consequently, there is no definitive historical statement of what the interpretation entails.
The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wave function collapse. This implies that all possible outcomes of quantum measurements are physically realized in some "world". The evolution of reality as a whole in MWI is rigidly deterministic and local. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1970s.
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
The de Broglie–Bohm theory is an interpretation of quantum mechanics which postulates that, in addition to the wavefunction, an actual configuration of particles exists, even when unobserved. The evolution over time of the configuration of all particles is defined by a guiding equation. The evolution of the wave function over time is given by the Schrödinger equation. The theory is named after Louis de Broglie (1892–1987) and David Bohm (1917–1992).
An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum mechanics might correspond to experienced reality. Quantum mechanics has held up to rigorous and extremely precise tests in an extraordinarily broad range of experiments. However, there exist a number of contending schools of thought over their interpretation. These views on interpretation differ on such fundamental questions as whether quantum mechanics is deterministic or stochastic, local or non-local, which elements of quantum mechanics can be considered real, and what the nature of measurement is, among other matters.
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are supposed properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
Wigner's friend is a thought experiment in theoretical quantum physics, first published by the Hungarian-American physicist Eugene Wigner in 1961, and further developed by David Deutsch in 1985. The scenario involves an indirect observation of a quantum measurement: An observer observes another observer who performs a quantum measurement on a physical system. The two observers then formulate a statement about the physical system's state after the measurement according to the laws of quantum theory. In the Copenhagen interpretation, the resulting statements of the two observers contradict each other. This reflects a seeming incompatibility of two laws in the Copenhagen interpretation: the deterministic and continuous time evolution of the state of a closed system and the nondeterministic, discontinuous collapse of the state of a system upon measurement. Wigner's friend is therefore directly linked to the measurement problem in quantum mechanics with its famous Schrödinger's cat paradox.
In quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.
Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning out of attempts to extend the understanding of quantum mechanics, the theory has developed in several directions and experimental studies have confirmed some of the key issues. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
The quantum Zeno effect is a feature of quantum-mechanical systems allowing a particle's time evolution to be slowed down by measuring it frequently enough with respect to some chosen measurement setting.
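The slowdown from frequent measurement can be illustrated with a standard textbook calculation (a sketch, not part of the article above): for a two-level system rotated by a total angle θ, splitting the evolution into n projective measurements gives a survival probability of cos²(θ/n) per step, so the chance of remaining in the initial state is cos(θ/n)^(2n), which approaches 1 as n grows.

```python
import math

def survival_probability(total_angle, n_measurements):
    """Probability that a two-level system is still found in its initial
    state after n projective measurements, assuming the unitary rotation
    by total_angle is spread evenly between the measurements."""
    per_step = total_angle / n_measurements
    return math.cos(per_step) ** (2 * n_measurements)

# A single final measurement after a quarter-turn finds the system has
# fully left its initial state; frequent measurement pins it in place.
print(survival_probability(math.pi / 2, 1))    # rare measurement
print(survival_probability(math.pi / 2, 100))  # frequent measurement
```

The numbers illustrate the effect: with one measurement the survival probability is essentially zero, while with a hundred intermediate measurements it exceeds 0.97.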
In quantum mechanics, the measurement problem is the problem of definite outcomes: quantum systems evolve into superpositions, yet quantum measurements always yield a single definite result.
In the interpretation of quantum mechanics, a local hidden-variable theory is a hidden-variable theory that satisfies the principle of locality. These models attempt to account for the probabilistic features of quantum mechanics via the mechanism of underlying, but inaccessible variables, with the additional requirement that distant events be statistically independent.
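The tension between local hidden variables and quantum predictions can be made concrete with the standard CHSH inequality (a sketch using textbook results, not claims from the article above): for a spin-singlet pair, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between measurements along angles a and b, and suitable angle choices push the CHSH combination to 2√2, beyond the bound of 2 that any local hidden-variable model must obey.

```python
import math

def singlet_correlation(a, b):
    """Quantum-mechanical correlation E(a, b) for spin measurements
    along angles a and b on a spin-singlet pair (standard result)."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    Local hidden-variable theories require |S| <= 2."""
    return (singlet_correlation(a, b) - singlet_correlation(a, b2)
            + singlet_correlation(a2, b) + singlet_correlation(a2, b2))

# Standard angle choices that maximize the quantum value:
S = chsh(0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the local bound of 2
```

Experiments measuring these correlations find the quantum value, which is why the unpredictability cannot be attributed to local hidden variables.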
The Born rule is a postulate of quantum mechanics that gives the probability that a measurement of a quantum system will yield a given result. In one commonly used application, it states that the probability density for finding a particle at a given position is proportional to the squared modulus of the system's wavefunction at that position. It was formulated and published by German physicist Max Born in July 1926.
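For a discrete set of outcomes, the Born rule reduces to simple arithmetic: each probability is the squared modulus of the corresponding amplitude, divided by the total. A minimal sketch with hypothetical amplitudes:

```python
# Hypothetical amplitudes for finding a particle at three discrete
# positions; any complex numbers work once the state is normalized.
amplitudes = [1 + 1j, 2 + 0j, 0 + 1j]

# Born rule: probability = |amplitude|^2 / (sum of all |amplitude|^2)
norm_sq = sum(abs(a) ** 2 for a in amplitudes)          # 2 + 4 + 1 = 7
probabilities = [abs(a) ** 2 / norm_sq for a in amplitudes]

print(probabilities)        # [2/7, 4/7, 1/7] up to rounding
print(sum(probabilities))   # sums to 1, as probabilities must
```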
In mathematical physics, Gleason's theorem shows that the rule one uses to calculate probabilities in quantum physics, the Born rule, can be derived from the usual mathematical representation of measurements in quantum physics together with the assumption of non-contextuality. Andrew M. Gleason first proved the theorem in 1957, answering a question posed by George W. Mackey, an accomplishment that was historically significant for the role it played in showing that wide classes of hidden-variable theories are inconsistent with quantum physics. Multiple variations have been proven in the years since. Gleason's theorem is of particular importance for the field of quantum logic and its attempt to find a minimal set of mathematical axioms for quantum theory.
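In symbols, the standard statement of the theorem (for Hilbert spaces of dimension at least three) is that any suitably additive probability assignment on projection operators must take the Born-rule form:

```latex
% Gleason's theorem (dim H >= 3): any function f on projection operators
% with f(P) >= 0 and \sum_i f(P_i) = 1 for every set of mutually
% orthogonal projections P_i summing to the identity must arise from
% some density operator rho via the Born rule:
\[
  f(P) = \operatorname{Tr}(\rho P),
  \qquad \rho \ge 0, \quad \operatorname{Tr}\rho = 1 .
\]
```

The non-contextuality assumption enters because f(P) is required to depend only on the projection P itself, not on which other projections are measured alongside it.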
Nathaniel David Mermin is a solid-state physicist at Cornell University best known for the eponymous Hohenberg–Mermin–Wagner theorem, his application of the term "boojum" to superfluidity, his textbook with Neil Ashcroft on solid-state physics, and for contributions to the foundations of quantum mechanics and quantum information science.
There is a diversity of views that propose interpretations of quantum mechanics, and they vary in how widely physicists accept or reject them. An interpretation of quantum mechanics is a conceptual scheme that proposes to relate the mathematical formalism to the physical phenomena of interest. Some interpretations, independently of their intrinsic value, remain less known today, or are simply less debated by the scientific community, for various reasons.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. As explained in this entry, however, the historical origins of the theory and its name are complicated.
Quantum foundations is a discipline of science that seeks to understand the most counterintuitive aspects of quantum theory, reformulate it, and even propose new generalizations of it. In contrast with other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition. While they lead to correct experimental predictions, they do not come with a mental picture of the world into which they fit.
Quantum Theory: Concepts and Methods is a 1993 quantum physics textbook by Israeli physicist Asher Peres. Well-regarded among the physics community, it is known for unconventional choices of topics to include.