In quantum mechanics, the consistent histories interpretation, or simply "consistent quantum theory", [1] generalizes the complementarity aspect of the conventional Copenhagen interpretation. The approach is sometimes called decoherent histories, [2] although in other work the term decoherent histories refers to a more specialized formulation. [1]
First proposed by Robert Griffiths in 1984, [3] [4] this interpretation of quantum mechanics is based on a consistency criterion that then allows probabilities to be assigned to various alternative histories of a system such that the probabilities for each history obey the rules of classical probability while being consistent with the Schrödinger equation. In contrast to some interpretations of quantum mechanics, the framework does not include "wavefunction collapse" as a relevant description of any physical process, and emphasizes that measurement theory is not a fundamental ingredient of quantum mechanics. Consistent histories allows predictions related to the state of the universe needed for quantum cosmology. [5]
The interpretation rests on three assumptions:
1. States in Hilbert space describe physical systems.
2. Quantum-mechanical predictions are probabilistic, not deterministic.
3. A physical system admits multiple incompatible but equally valid descriptions (frameworks).
The third assumption generalizes complementarity, and it is this assumption that separates consistent histories from other interpretations of quantum theory. [1]
A homogeneous history $H_i$ (here $i$ labels different histories) is a sequence of propositions $P_{i,1}, P_{i,2}, \ldots, P_{i,n_i}$ specified at different moments of time $t_{i,1}, t_{i,2}, \ldots, t_{i,n_i}$ (here $j$ labels the times). We write this as:

$$H_i = (P_{i,1}, P_{i,2}, \ldots, P_{i,n_i})$$

and read it as "the proposition $P_{i,1}$ is true at time $t_{i,1}$ and then the proposition $P_{i,2}$ is true at time $t_{i,2}$ and then $\ldots$". The times $t_{i,1} < t_{i,2} < \cdots < t_{i,n_i}$ are strictly ordered and called the temporal support of the history.
Inhomogeneous histories are multiple-time propositions which cannot be represented by a homogeneous history. An example is the logical OR of two homogeneous histories: $H_i \lor H_j$.
These propositions can correspond to any set of questions that include all possibilities. Examples might be the three propositions meaning "the electron went through the left slit", "the electron went through the right slit" and "the electron didn't go through either slit". One of the aims of the approach is to show that classical questions such as "where are my keys?" are consistent. In this case one might use a large number of propositions each one specifying the location of the keys in some small region of space.
Each single-time proposition $P_{i,j}$ can be represented by a projection operator $\hat{P}_{i,j}$ acting on the system's Hilbert space (we use "hats" to denote operators). It is then useful to represent homogeneous histories by the time-ordered product of their single-time projection operators. This is the history projection operator (HPO) formalism developed by Christopher Isham, which naturally encodes the logical structure of the history propositions.
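As a rough numerical illustration (the basis states and labels below are arbitrary choices, not drawn from the formalism's literature), a single-time proposition can be encoded as a projection operator on a small Hilbert space; a minimal NumPy sketch:

```python
import numpy as np

def projector(state):
    """Projection operator |psi><psi| onto a normalized state vector."""
    state = np.asarray(state, dtype=complex)
    state = state / np.linalg.norm(state)
    return np.outer(state, state.conj())

# Toy 2-dimensional Hilbert space: |left> and |right> stand in for
# "the electron went through the left/right slit".
P_left = projector([1.0, 0.0])
P_right = projector([0.0, 1.0])

# Projectors are idempotent and Hermitian, and this pair is exhaustive:
assert np.allclose(P_left @ P_left, P_left)
assert np.allclose(P_left, P_left.conj().T)
assert np.allclose(P_left + P_right, np.eye(2))
```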
An important construction in the consistent histories approach is the class operator for a homogeneous history $H_i$:

$$\hat{C}_{H_i} := T \prod_{j=1}^{n_i} \hat{P}_{i,j}(t_{i,j})$$

The symbol $T$ indicates that the factors in the product are ordered chronologically according to their values of $t_{i,j}$: the "past" operators with smaller values of $t$ appear on the right side, and the "future" operators with greater values of $t$ appear on the left side. This definition can be extended to inhomogeneous histories as well.
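As an illustrative sketch only (the Hamiltonian, times, and projectors below are arbitrary choices, and units with ħ = 1 are assumed), the class operator can be computed numerically as a time-ordered product of Heisenberg-picture projectors:

```python
import numpy as np
from scipy.linalg import expm

def heisenberg(P, H, t):
    """Heisenberg-picture projector P(t) = e^{iHt} P e^{-iHt}, with hbar = 1."""
    U = expm(-1j * H * t)
    return U.conj().T @ P @ U

def class_operator(projectors, times, H):
    """Time-ordered product of Heisenberg-picture projectors:
    earlier times end up on the right, later times on the left."""
    C = np.eye(H.shape[0], dtype=complex)
    for P, t in sorted(zip(projectors, times), key=lambda pair: pair[1]):
        C = heisenberg(P, H, t) @ C
    return C

# Example: a spin evolving under H = sigma_x, with "spin up" asserted at t = 0 and t = 1.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
P_up = np.diag([1.0, 0.0]).astype(complex)
C = class_operator([P_up, P_up], [0.0, 1.0], sigma_x)
```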
Central to the consistent histories approach is the notion of consistency. A set of histories $\{H_i\}$ is consistent (or strongly consistent) if

$$\operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^{\dagger}_{H_j}\right) = 0$$

for all $i \neq j$. Here $\rho$ represents the initial density matrix, and the operators are expressed in the Heisenberg picture.
The set of histories is weakly consistent if

$$\operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^{\dagger}_{H_j}\right) \approx 0$$

for all $i \neq j$.
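Under the same toy assumptions as above, the consistency condition can be checked numerically by evaluating $\operatorname{Tr}(\hat{C}_{H_i}\, \rho\, \hat{C}^{\dagger}_{H_j})$ for every pair of histories in a family; a minimal sketch (the function names and tolerance are illustrative):

```python
import numpy as np

def decoherence_functional(class_ops, rho):
    """Matrix D[i, j] = Tr(C_i rho C_j^dagger) over a family of class operators."""
    n = len(class_ops)
    D = np.empty((n, n), dtype=complex)
    for i, Ci in enumerate(class_ops):
        for j, Cj in enumerate(class_ops):
            D[i, j] = np.trace(Ci @ rho @ Cj.conj().T)
    return D

def is_strongly_consistent(class_ops, rho, tol=1e-10):
    """Strong consistency: every off-diagonal entry Tr(C_i rho C_j^dagger), i != j, vanishes."""
    D = decoherence_functional(class_ops, rho)
    return np.allclose(D - np.diag(np.diag(D)), 0.0, atol=tol)
```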
If a set of histories is consistent, then probabilities can be assigned to them in a consistent way. We postulate that the probability of history $H_i$ is simply

$$\operatorname{Pr}(H_i) = \operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^{\dagger}_{H_i}\right)$$

which obeys the axioms of probability if the histories $H_i$ come from the same (strongly) consistent set.
As an example, this means the probability of "$H_i$ OR $H_j$" equals the probability of "$H_i$" plus the probability of "$H_j$" minus the probability of "$H_i$ AND $H_j$", and so forth.
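Continuing the same toy setting (all states and weights below are arbitrary), the probability postulate and its additivity over a consistent, exhaustive set of single-time histories can be checked directly:

```python
import numpy as np

def history_probability(C, rho):
    """Pr(H) = Tr(C rho C^dagger) for a history with class operator C."""
    return float(np.real(np.trace(C @ rho @ C.conj().T)))

# Two mutually exclusive, exhaustive single-time histories ("left slit" / "right slit");
# their class operators are just the projectors themselves.
P_left = np.diag([1.0, 0.0]).astype(complex)
P_right = np.diag([0.0, 1.0]).astype(complex)
rho = np.diag([0.7, 0.3]).astype(complex)   # an arbitrary initial mixed state

p_left = history_probability(P_left, rho)
p_right = history_probability(P_right, rho)
# Because the set is consistent and exhaustive, the probabilities are additive and sum to 1.
assert np.isclose(p_left + p_right, 1.0)
```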
The interpretation based on consistent histories is used in combination with the insights about quantum decoherence. Quantum decoherence implies that irreversible macroscopic phenomena (hence, all classical measurements) render histories automatically consistent, which allows one to recover classical reasoning and "common sense" when applied to the outcomes of these measurements. More precise analysis of decoherence allows (in principle) a quantitative calculation of the boundary between the classical domain and the quantum domain. According to Roland Omnès, [6]
[the] history approach, although it was initially independent of the Copenhagen approach, is in some sense a more elaborate version of it. It has, of course, the advantage of being more precise, of including classical physics, and of providing an explicit logical framework for indisputable proofs. But, when the Copenhagen interpretation is completed by the modern results about correspondence and decoherence, it essentially amounts to the same physics.
[... There are] three main differences:
1. The logical equivalence between an empirical datum, which is a macroscopic phenomenon, and the result of a measurement, which is a quantum property, becomes clearer in the new approach, whereas it remained mostly tacit and questionable in the Copenhagen formulation.
2. There are two apparently distinct notions of probability in the new approach. One is abstract and directed toward logic, whereas the other is empirical and expresses the randomness of measurements. We need to understand their relation and why they coincide with the empirical notion entering into the Copenhagen rules.
3. The main difference lies in the meaning of the reduction rule for 'wave packet collapse'. In the new approach, the rule is valid but no specific effect on the measured object can be held responsible for it. Decoherence in the measuring device is enough.
In order to obtain a complete theory, the formal rules above must be supplemented with a particular Hilbert space and rules that govern dynamics, for example a Hamiltonian.
In the opinion of others [7] this still does not make a complete theory as no predictions are possible about which set of consistent histories will actually occur. In other words, the rules of consistent histories, the Hilbert space, and the Hamiltonian must be supplemented by a set selection rule. However, Robert B. Griffiths holds the opinion that asking the question of which set of histories will "actually occur" is a misinterpretation of the theory; [8] histories are a tool for description of reality, not separate alternate realities.
Proponents of this consistent histories interpretation—such as Murray Gell-Mann, James Hartle, Roland Omnès and Robert B. Griffiths—argue that their interpretation clarifies the fundamental disadvantages of the old Copenhagen interpretation, and can be used as a complete interpretational framework for quantum mechanics.
In Quantum Philosophy, [9] Roland Omnès provides a less mathematical way of understanding this same formalism.
The consistent histories approach can be interpreted as a way of understanding which sets of classical questions can be consistently asked of a single quantum system, and which sets of questions are fundamentally inconsistent, and thus meaningless when asked together. It thus becomes possible to demonstrate formally why it is that the questions which Einstein, Podolsky and Rosen assumed could be asked together, of a single quantum system, simply cannot be asked together. On the other hand, it also becomes possible to demonstrate that classical, logical reasoning often does apply, even to quantum experiments – but we can now be mathematically exact about the limits of classical logic.
The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wave function collapse. This implies that all possible outcomes of quantum measurements are physically realized in different "worlds". The evolution of reality as a whole in MWI is rigidly deterministic and local. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1970s.
The mathematical formulations of quantum mechanics are those mathematical formalisms that permit a rigorous description of quantum mechanics. This mathematical formalism uses mainly a part of functional analysis, especially Hilbert spaces, which are a kind of linear space. These formalisms are distinguished from those developed for physics theories prior to the early 1900s by their use of abstract mathematical structures, such as infinite-dimensional Hilbert spaces and operators on these spaces. In brief, values of physical observables such as energy and momentum were no longer considered as values of functions on phase space, but as eigenvalues; more precisely, as spectral values of linear operators in Hilbert space.
Quantum mechanics is a fundamental theory that describes the behavior of nature at and below the scale of atoms. It is the foundation of all quantum physics, which includes quantum chemistry, quantum field theory, quantum technology, and quantum information science.
An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum mechanics might correspond to experienced reality. Quantum mechanics has held up to rigorous and extremely precise tests in an extraordinarily broad range of experiments. However, there exist a number of contending schools of thought over its interpretation. These views on interpretation differ on such fundamental questions as whether quantum mechanics is deterministic or stochastic, local or non-local, which elements of quantum mechanics can be considered real, and what the nature of measurement is, among other matters.
The Schrödinger equation is a partial differential equation that governs the wave function of a non-relativistic quantum-mechanical system. Its discovery was a significant landmark in the development of quantum mechanics. It is named after Erwin Schrödinger, who postulated the equation in 1925 and published it in 1926, forming the basis for the work that resulted in his Nobel Prize in Physics in 1933.
In quantum mechanics, a density matrix is a matrix that describes an ensemble of physical systems as quantum states. It allows for the calculation of the probabilities of the outcomes of any measurements performed upon the systems of the ensemble using the Born rule. It is a generalization of the more usual state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed ensembles. Mixed ensembles arise in quantum mechanics in two different situations: when the preparation of the ensemble yields a statistical mixture of pure states, and when one wants to describe a subsystem that is entangled with another system without specifying their combined state.
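As a rough numerical illustration (the states and weights below are arbitrary choices), a density matrix can be built as a statistical mixture of pure states, and Born-rule probabilities for a projective measurement obtained as $\operatorname{Tr}(\rho P_k)$; a minimal NumPy sketch:

```python
import numpy as np

def pure(state):
    """Density matrix |psi><psi| of a normalized pure state."""
    state = np.asarray(state, dtype=complex)
    state = state / np.linalg.norm(state)
    return np.outer(state, state.conj())

# A mixed ensemble: 60% prepared in |0>, 40% in the superposition (|0> + |1>)/sqrt(2).
rho = 0.6 * pure([1, 0]) + 0.4 * pure([1, 1])

# Born rule for a projective measurement in the {|0>, |1>} basis: p(k) = Tr(rho P_k).
P0, P1 = pure([1, 0]), pure([0, 1])
p0 = np.real(np.trace(rho @ P0))
p1 = np.real(np.trace(rho @ P1))
assert np.isclose(p0 + p1, 1.0)
```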
In quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.
Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning out of attempts to extend the understanding of quantum mechanics, the theory has developed in several directions and experimental studies have confirmed some of the key issues. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
The path integral formulation is a description in quantum mechanics that generalizes the stationary action principle of classical mechanics. It replaces the classical notion of a single, unique classical trajectory for a system with a sum, or functional integral, over an infinity of quantum-mechanically possible trajectories to compute a quantum amplitude.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
In the mathematical study of logic and the physical analysis of quantum foundations, quantum logic is a set of rules for manipulation of propositions inspired by the structure of quantum theory. The formal system takes as its starting point an observation of Garrett Birkhoff and John von Neumann, that the structure of experimental tests in classical mechanics forms a Boolean algebra, but the structure of experimental tests in quantum mechanics forms a much more complicated structure.
In quantum mechanics, the measurement problem is the problem of definite outcomes: quantum systems have superpositions but quantum measurements only give one definite result.
Roland Omnès was a French physicist and author of several books that aimed to give non-scientists the information required to understand quantum mechanics.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

$$S = -\operatorname{tr}(\rho \ln \rho)$$

where $\operatorname{tr}$ denotes the trace.
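A minimal sketch of this formula in NumPy (the example states are arbitrary; the entropy is computed from the eigenvalues of ρ, in nats):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho ln rho), computed from the eigenvalues of rho (in nats)."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # 0 * ln(0) is taken to be 0
    return float(-np.sum(eigvals * np.log(eigvals)))

# A pure state has zero entropy; the maximally mixed qubit state has entropy ln 2.
assert np.isclose(von_neumann_entropy(np.diag([1.0, 0.0])), 0.0)
assert np.isclose(von_neumann_entropy(np.eye(2) / 2), np.log(2))
```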
The Born rule is a postulate of quantum mechanics that gives the probability that a measurement of a quantum system will yield a given result. In one commonly used application, it states that the probability density for finding a particle at a given position is proportional to the square of the amplitude of the system's wavefunction at that position. It was formulated and published by German physicist Max Born in July 1926.
This is a glossary for the terminology applied in the foundations of quantum mechanics and quantum metaphysics, collectively called quantum philosophy, a subfield of philosophy of physics.
This is a glossary for the terminology often encountered in undergraduate quantum mechanics courses.
In physics and the philosophy of physics, quantum Bayesianism is a collection of related approaches to the interpretation of quantum mechanics, the most prominent of which is QBism. QBism is an interpretation that takes an agent's actions and experiences as the central concerns of the theory. QBism deals with common questions in the interpretation of quantum theory about the nature of wavefunction superposition, quantum measurement, and entanglement. According to QBism, many, but not all, aspects of the quantum formalism are subjective in nature. For example, in this interpretation, a quantum state is not an element of reality—instead, it represents the degrees of belief an agent has about the possible outcomes of measurements. For this reason, some philosophers of science have deemed QBism a form of anti-realism. The originators of the interpretation disagree with this characterization, proposing instead that the theory more properly aligns with a kind of realism they call "participatory realism", wherein reality consists of more than can be captured by any putative third-person account of it.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. As explained in this entry, however, the historical origins of the theory and its name are complicated.
In theoretical physics, the problem of time is a conceptual conflict between quantum mechanics and general relativity. Quantum mechanics regards the flow of time as universal and absolute, whereas general relativity regards the flow of time as malleable and relative. This problem raises the question of what time really is in a physical sense and whether it is truly a real, distinct phenomenon. It also involves the related question of why time seems to flow in a single direction, despite the fact that no known physical laws at the microscopic level seem to require a single direction.