Consistent histories

In quantum mechanics, the consistent histories interpretation, also known simply as "consistent quantum theory", [1] generalizes the complementarity aspect of the conventional Copenhagen interpretation. The approach is sometimes called decoherent histories, [2] although in other work "decoherent histories" refers to a more specialized variant. [1]

First proposed by Robert Griffiths in 1984, [3] [4] this interpretation of quantum mechanics is based on a consistency criterion that allows probabilities to be assigned to various alternative histories of a system, such that the probabilities for each history obey the rules of classical probability while remaining consistent with the Schrödinger equation. In contrast to some interpretations of quantum mechanics, the framework does not include "wavefunction collapse" as a relevant description of any physical process, and emphasizes that measurement theory is not a fundamental ingredient of quantum mechanics. Consistent histories allows the kind of predictions about the state of the universe as a whole that are needed for quantum cosmology. [5]

Key assumptions

The interpretation rests on three assumptions:

  1. states in Hilbert space describe physical objects,
  2. quantum predictions are not deterministic, and
  3. physical systems have no single unique description.

The third assumption generalizes complementarity, and it is this assumption that separates consistent histories from other interpretations of quantum theory. [1]

Formalism

Histories

A homogeneous history $H_i$ (here $i$ labels different histories) is a sequence of propositions $P_{i,j}$ specified at different moments of time $t_{i,j}$ (here $j$ labels the times). We write this as:

$$H_i = (P_{i,1}, P_{i,2}, \ldots, P_{i,n_i})$$

and read it as "the proposition $P_{i,1}$ is true at time $t_{i,1}$ and then the proposition $P_{i,2}$ is true at time $t_{i,2}$ and then $\ldots$". The times $t_{i,1} < t_{i,2} < \cdots < t_{i,n_i}$ are strictly ordered and called the temporal support of the history.

Inhomogeneous histories are multiple-time propositions which cannot be represented by a homogeneous history. An example is the logical OR of two homogeneous histories: $H_i \lor H_j$.

These propositions can correspond to any set of questions that together cover all possibilities. Examples might be the three propositions meaning "the electron went through the left slit", "the electron went through the right slit" and "the electron didn't go through either slit". One of the aims of the approach is to show that classical questions such as "where are my keys?" are consistent. In this case one might use a large number of propositions, each specifying the location of the keys in some small region of space.

Each single-time proposition $P_{i,j}$ can be represented by a projection operator $\hat{P}_{i,j}$ acting on the system's Hilbert space (we use "hats" to denote operators). It is then useful to represent homogeneous histories by the time-ordered tensor product of their single-time projection operators:

$$\hat{H}_i = \hat{P}_{i,1} \otimes \hat{P}_{i,2} \otimes \cdots \otimes \hat{P}_{i,n_i}$$

This is the history projection operator (HPO) formalism developed by Christopher Isham, and it naturally encodes the logical structure of the history propositions.
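To make the projector representation concrete, here is a minimal sketch in Python with NumPy of an exhaustive, mutually exclusive set of single-time propositions; the slit labels, variable names, and the 3-dimensional Hilbert space are illustrative assumptions, not part of the formalism:

```python
import numpy as np

# Three mutually exclusive propositions about which slit the electron took,
# represented as orthogonal projectors on an (assumed) 3-dimensional space.
P_left    = np.diag([1, 0, 0]).astype(complex)   # "went through the left slit"
P_right   = np.diag([0, 1, 0]).astype(complex)   # "went through the right slit"
P_neither = np.diag([0, 0, 1]).astype(complex)   # "went through neither slit"

for P in (P_left, P_right, P_neither):
    assert np.allclose(P @ P, P)        # idempotent: P^2 = P
    assert np.allclose(P.conj().T, P)   # Hermitian: P = P dagger

# The set covers all possibilities: the projectors resolve the identity.
assert np.allclose(P_left + P_right + P_neither, np.eye(3))
```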

Consistency

An important construction in the consistent histories approach is the class operator for a homogeneous history:

$$\hat{C}_{H_i} := T \prod_{j=1}^{n_i} \hat{P}_{i,j}(t_{i,j})$$

The symbol $T$ indicates that the factors in the product are ordered chronologically according to their values of $t_{i,j}$: the "past" operators with smaller values of $t$ appear on the right side, and the "future" operators with greater values of $t$ appear on the left side. This definition can be extended to inhomogeneous histories as well.
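As a hedged numerical sketch of this definition, assuming finite-dimensional matrices and $\hbar = 1$ (the helper names `heisenberg_projector` and `class_operator` are inventions for illustration):

```python
import numpy as np
from scipy.linalg import expm

def heisenberg_projector(P, H, t, hbar=1.0):
    """Heisenberg-picture projector P(t) = U(t)^dag P U(t), U(t) = exp(-iHt/hbar)."""
    U = expm(-1j * H * t / hbar)
    return U.conj().T @ P @ U

def class_operator(projectors, times, H):
    """Chronologically ordered product of Heisenberg-picture projectors:
    factors with larger t ("future") accumulate on the left, factors with
    smaller t ("past") end up on the right."""
    C = np.eye(projectors[0].shape[0], dtype=complex)
    for P, t in sorted(zip(projectors, times), key=lambda pair: pair[1]):
        C = heisenberg_projector(P, H, t) @ C  # each later factor multiplies on the left
    return C
```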

Central to the consistent histories approach is the notion of consistency. A set of histories $\{H_i\}$ is consistent (or strongly consistent) if

$$\operatorname{Tr}\left(\hat{C}_{H_i} \rho \hat{C}^\dagger_{H_j}\right) = 0$$

for all $i \neq j$. Here $\rho$ represents the initial density matrix, and the operators are expressed in the Heisenberg picture.

The set of histories is weakly consistent if

$$\operatorname{Re}\operatorname{Tr}\left(\hat{C}_{H_i} \rho \hat{C}^\dagger_{H_j}\right) = 0$$

for all $i \neq j$.
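For small systems both conditions can be checked directly. The sketch below (again with illustrative names; the quantity $\operatorname{Tr}(\hat{C}_{H_i} \rho \hat{C}^\dagger_{H_j})$ is often called the decoherence functional) tests a list of class operators against either condition:

```python
import numpy as np

def decoherence_functional(C_i, C_j, rho):
    """D(i, j) = Tr(C_i rho C_j^dag) for two class operators and the initial density matrix."""
    return np.trace(C_i @ rho @ C_j.conj().T)

def is_consistent(class_ops, rho, weak=False, tol=1e-10):
    """Strong consistency: D(i, j) = 0 for all i != j.
    Weak consistency: only the real part of D(i, j) must vanish."""
    for i, C_i in enumerate(class_ops):
        for j, C_j in enumerate(class_ops):
            if i == j:
                continue
            D = decoherence_functional(C_i, C_j, rho)
            if abs(D.real if weak else D) > tol:
                return False
    return True
```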

Probabilities

If a set of histories is consistent then probabilities can be assigned to them in a consistent way. We postulate that the probability of history $H_i$ is simply

$$\operatorname{Pr}(H_i) = \operatorname{Tr}\left(\hat{C}_{H_i} \rho \hat{C}^\dagger_{H_i}\right)$$

which obeys the axioms of probability if the histories $H_i$ come from the same (strongly) consistent set.

As an example, this means the probability of "$H_i$ OR $H_j$" equals the probability of "$H_i$" plus the probability of "$H_j$" minus the probability of "$H_i$ AND $H_j$", and so forth.
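In code, the probability rule is a one-liner under the same finite-dimensional assumptions as the sketches above:

```python
import numpy as np

def history_probability(C, rho):
    """Pr(H) = Tr(C rho C^dag); real and non-negative for any class operator C.
    For an exhaustive, strongly consistent set these values sum to 1 and
    obey the classical probability rules described above."""
    return np.trace(C @ rho @ C.conj().T).real
```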

Interpretation

The interpretation based on consistent histories is used in combination with insights about quantum decoherence. Quantum decoherence implies that irreversible macroscopic phenomena (hence, all classical measurements) render histories automatically consistent, which allows one to recover classical reasoning and "common sense" when applied to the outcomes of these measurements. More precise analysis of decoherence allows (in principle) a quantitative calculation of the boundary between the classical domain and the quantum domain. According to Roland Omnès, [6]

[the] history approach, although it was initially independent of the Copenhagen approach, is in some sense a more elaborate version of it. It has, of course, the advantage of being more precise, of including classical physics, and of providing an explicit logical framework for indisputable proofs. But, when the Copenhagen interpretation is completed by the modern results about correspondence and decoherence, it essentially amounts to the same physics.

[... There are] three main differences:

1. The logical equivalence between an empirical datum, which is a macroscopic phenomenon, and the result of a measurement, which is a quantum property, becomes clearer in the new approach, whereas it remained mostly tacit and questionable in the Copenhagen formulation.

2. There are two apparently distinct notions of probability in the new approach. One is abstract and directed toward logic, whereas the other is empirical and expresses the randomness of measurements. We need to understand their relation and why they coincide with the empirical notion entering into the Copenhagen rules.

3. The main difference lies in the meaning of the reduction rule for 'wave packet collapse'. In the new approach, the rule is valid but no specific effect on the measured object can be held responsible for it. Decoherence in the measuring device is enough.

In order to obtain a complete theory, the formal rules above must be supplemented with a particular Hilbert space and rules that govern dynamics, for example a Hamiltonian.
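As an end-to-end illustration of such a supplement (reusing the illustrative helpers sketched earlier; the qubit, the Hamiltonian $H = \sigma_x$, the initial state, and the time $T = 0.7$ are assumptions chosen only for this example):

```python
import numpy as np

# A single qubit with Hamiltonian sigma_x, initial state |0><0|, and the
# four two-time histories "sigma_z proposition at t = 0, then at t = T".
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
P_up   = np.diag([1, 0]).astype(complex)   # proposition "spin up along z"
P_down = np.diag([0, 1]).astype(complex)   # proposition "spin down along z"
rho = P_up                                 # initial density matrix |0><0|
T = 0.7

class_ops = [class_operator([Pa, Pb], [0.0, T], sigma_x)
             for Pa in (P_up, P_down) for Pb in (P_up, P_down)]

assert is_consistent(class_ops, rho)       # this set of four histories is consistent
probs = [history_probability(C, rho) for C in class_ops]
print(probs, sum(probs))                   # non-negative probabilities summing to 1
```

Because the assumed initial state answers the first question with certainty, the off-diagonal terms of the decoherence functional vanish, and the surviving probabilities reproduce the Born-rule values $|\langle b|U(T)|0\rangle|^2$.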

In the opinion of others [7] this still does not make a complete theory as no predictions are possible about which set of consistent histories will actually occur. In other words, the rules of consistent histories, the Hilbert space, and the Hamiltonian must be supplemented by a set selection rule. However, Robert B. Griffiths holds the opinion that asking the question of which set of histories will "actually occur" is a misinterpretation of the theory; [8] histories are a tool for description of reality, not separate alternate realities.

Proponents of this consistent histories interpretation—such as Murray Gell-Mann, James Hartle, Roland Omnès and Robert B. Griffiths—argue that their interpretation clarifies the fundamental disadvantages of the old Copenhagen interpretation, and can be used as a complete interpretational framework for quantum mechanics.

In Quantum Philosophy, [9] Roland Omnès provides a less mathematical way of understanding this same formalism.

The consistent histories approach can be interpreted as a way of understanding which sets of classical questions can be consistently asked of a single quantum system, and which sets of questions are fundamentally inconsistent, and thus meaningless when asked together. It thus becomes possible to demonstrate formally why it is that the questions which Einstein, Podolsky and Rosen assumed could be asked together, of a single quantum system, simply cannot be asked together. On the other hand, it also becomes possible to demonstrate that classical, logical reasoning often does apply, even to quantum experiments – but we can now be mathematically exact about the limits of classical logic.


References

  1. Hohenberg, P. C. (2010-10-05). "Colloquium: An introduction to consistent quantum theory". Reviews of Modern Physics. 82 (4): 2835–2844. arXiv:0909.2359. doi:10.1103/RevModPhys.82.2835. ISSN 0034-6861.
  2. Griffiths, Robert B. "The Consistent Histories Approach to Quantum Mechanics". Stanford Encyclopedia of Philosophy. Stanford University. Retrieved 2016-10-22.
  3. Griffiths, Robert B. (1984). "Consistent histories and the interpretation of quantum mechanics". Journal of Statistical Physics. 36 (1–2): 219–272. Bibcode:1984JSP....36..219G. doi:10.1007/bf01015734. ISSN 0022-4715. S2CID 119871795.
  4. Griffiths, Robert B. (2003). Consistent Quantum Theory (first paperback ed.). Cambridge: Cambridge University Press. ISBN 978-0-521-53929-6.
  5. Dowker, Fay; Kent, Adrian (1995-10-23). "Properties of Consistent Histories". Physical Review Letters. 75 (17): 3038–3041. arXiv:gr-qc/9409037. Bibcode:1995PhRvL..75.3038D. doi:10.1103/physrevlett.75.3038. ISSN 0031-9007. PMID 10059479. S2CID 17359542.
  6. Omnès, Roland (1999). Understanding Quantum Mechanics. Princeton University Press. pp. 179, 257. ISBN 978-0-691-00435-8. LCCN 98042442.
  7. Kent, Adrian; McElwaine, Jim (1997-03-01). "Quantum prediction algorithms". Physical Review A. 55 (3): 1703–1720. arXiv:gr-qc/9610028. Bibcode:1997PhRvA..55.1703K. doi:10.1103/physreva.55.1703. ISSN 1050-2947. S2CID 17821433.
  8. Griffiths, R. B. (2003). Consistent Quantum Theory. Cambridge University Press.
  9. Omnès, Roland (1999). Quantum Philosophy. Princeton University Press. See part III, especially Chapter IX.