Consistent histories

In quantum mechanics, the consistent histories [1] (also referred to as decoherent histories) [2] approach is intended to give a modern interpretation of quantum mechanics, generalising the conventional Copenhagen interpretation and providing a natural interpretation of quantum cosmology. [3] This interpretation of quantum mechanics is based on a consistency criterion that then allows probabilities to be assigned to various alternative histories of a system such that the probabilities for each history obey the rules of classical probability while being consistent with the Schrödinger equation. In contrast to some interpretations of quantum mechanics, particularly the Copenhagen interpretation, the framework does not include "wavefunction collapse" as a relevant description of any physical process, and emphasizes that measurement theory is not a fundamental ingredient of quantum mechanics.

Histories

A homogeneous history $H_i$ (here $i$ labels different histories) is a sequence of propositions $P_{i,j}$ specified at different moments of time $t_{i,j}$ (here $j$ labels the times). We write this as:

$$H_i = (P_{i,1}, P_{i,2}, \ldots, P_{i,n_i})$$

and read it as "the proposition $P_{i,1}$ is true at time $t_{i,1}$ and then the proposition $P_{i,2}$ is true at time $t_{i,2}$ and then $\ldots$". The times $t_{i,1} < t_{i,2} < \cdots < t_{i,n_i}$ are strictly ordered and called the temporal support of the history.

Inhomogeneous histories are multiple-time propositions which cannot be represented by a homogeneous history. An example is the logical OR of two homogeneous histories: $H_i \lor H_j$.

These propositions can correspond to any set of questions that includes all possibilities. Examples might be the three propositions meaning "the electron went through the left slit", "the electron went through the right slit" and "the electron didn't go through either slit". One of the aims of the theory is to show that classical questions such as "where are my keys?" are consistent. In this case one might use a large number of propositions, each specifying the location of the keys in some small region of space.

Each single-time proposition $P_{i,j}$ can be represented by a projection operator $\hat{P}_{i,j}$ acting on the system's Hilbert space (we use "hats" to denote operators). It is then useful to represent homogeneous histories by the time-ordered product of their single-time projection operators. This is the history projection operator (HPO) formalism developed by Christopher Isham and naturally encodes the logical structure of the history propositions.
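As a concrete toy illustration (a minimal numerical sketch, not drawn from the references: the qubit, the basis labels, and the NumPy encoding below are assumptions chosen for simplicity), single-time propositions such as "the spin is up along z" become projection matrices, and an exhaustive, mutually exclusive set of them resolves the identity:

```python
import numpy as np

# Orthonormal qubit basis states, e.g. "spin up" and "spin down" along z.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

def proj(v):
    """Projector |v><v| for the proposition 'the system is in state v'."""
    return np.outer(v, v.conj())

P0, P1 = proj(ket0), proj(ket1)

# An exhaustive set of propositions sums to the identity ...
assert np.allclose(P0 + P1, np.eye(2))
# ... and mutually exclusive propositions are orthogonal.
assert np.allclose(P0 @ P1, np.zeros((2, 2)))
```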

Consistency

An important construction in the consistent histories approach is the class operator for a homogeneous history $H_i$:

$$\hat{C}_{H_i} := T \prod_{j=1}^{n_i} \hat{P}_{i,j}(t_{i,j})$$

The symbol $T$ indicates that the factors in the product are ordered chronologically according to their values of $t_{i,j}$: the "past" operators with smaller values of $t$ appear on the right side, and the "future" operators with greater values of $t$ appear on the left side. This definition can be extended to inhomogeneous histories as well.
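Here $\hat{P}_{i,j}(t_{i,j})$ is the single-time projector written in the Heisenberg picture. For completeness (this is the standard Heisenberg-picture identity, stated for a time-independent Hamiltonian $\hat{H}$, not a formula specific to the consistent-histories literature):

$$\hat{P}_{i,j}(t) = e^{i\hat{H}t/\hbar}\, \hat{P}_{i,j}\, e^{-i\hat{H}t/\hbar}$$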

Central to the consistent histories approach is the notion of consistency. A set of histories $\{ H_i \}$ is consistent (or strongly consistent) if

$$\operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^\dagger_{H_j}\right) = 0$$

for all $i \neq j$. Here $\rho$ represents the initial density matrix, and the operators are expressed in the Heisenberg picture.

The set of histories is weakly consistent if

$$\operatorname{Re}\operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^\dagger_{H_j}\right) = 0$$

for all $i \neq j$.
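To make the condition concrete, here is a small numerical check (an illustrative sketch only; the qubit, the two-time histories, and the trivial Hamiltonian $\hat{H} = 0$ are assumptions chosen so that Heisenberg- and Schrödinger-picture projectors coincide). It evaluates the decoherence functional $D_{ij} = \operatorname{Tr}(\hat{C}_{H_i}\, \rho\, \hat{C}^\dagger_{H_j})$ for two families of histories of a qubit prepared in $|0\rangle$: asking the same z-basis question at both times gives a consistent family, while asking the x-basis question first and the z-basis question second does not.

```python
import numpy as np

# Toy model: one qubit, trivial Hamiltonian (H = 0), so Heisenberg-picture
# projectors coincide with the Schrodinger-picture ones at every time.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)
ketp = (ket0 + ket1) / np.sqrt(2)   # |+>, the x-basis "up" state
ketm = (ket0 - ket1) / np.sqrt(2)   # |->, the x-basis "down" state

def proj(v):
    """Projector |v><v| for the proposition 'the system is in state v'."""
    return np.outer(v, v.conj())

P0, P1, Pp, Pm = proj(ket0), proj(ket1), proj(ketp), proj(ketm)
rho = proj(ket0)                    # initial density matrix |0><0|

def class_operator(history):
    """Time-ordered product of projectors; `history` lists them from
    earliest to latest, so later projectors end up on the left."""
    C = np.eye(2, dtype=complex)
    for P in history:
        C = P @ C
    return C

def decoherence_functional(histories, rho):
    """D[i, j] = Tr(C_i rho C_j^dagger); vanishing off-diagonal entries
    signal a (strongly) consistent set."""
    Cs = [class_operator(h) for h in histories]
    return np.array([[np.trace(Ci @ rho @ Cj.conj().T) for Cj in Cs]
                     for Ci in Cs])

# Family 1: ask the z-basis question at both times.
zz = [(P0, P0), (P0, P1), (P1, P0), (P1, P1)]
# Family 2: ask the x-basis question first, the z-basis question second.
xz = [(Pp, P0), (Pp, P1), (Pm, P0), (Pm, P1)]

for name, family in [("z then z", zz), ("x then z", xz)]:
    D = decoherence_functional(family, rho)
    off_diagonal = D - np.diag(np.diag(D))
    print(name, "-> consistent:", np.allclose(off_diagonal, 0.0))
# Prints: "z then z -> consistent: True", "x then z -> consistent: False"
```

In the mixed-basis family the off-diagonal entry pairing "$|+\rangle$ then $|0\rangle$" with "$|-\rangle$ then $|0\rangle$" comes out to $1/4$: a surviving interference term, which is precisely what forbids assigning classical probabilities to that family.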

Probabilities

If a set of histories is consistent, then probabilities can be assigned to them in a consistent way. We postulate that the probability of history $H_i$ is simply

$$\Pr(H_i) = \operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^\dagger_{H_i}\right)$$

which obeys the axioms of probability if the histories $H_i$ come from the same (strongly) consistent set.

As an example, this means the probability of "$H_i$ OR $H_j$" equals the probability of "$H_i$" plus the probability of "$H_j$" minus the probability of "$H_i$ AND $H_j$", and so forth.
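This additivity can be checked directly from the definitions above. For two disjoint histories $H_i$ and $H_j$ in the same consistent set, the class operator of their OR is the sum $\hat{C}_{H_i \lor H_j} = \hat{C}_{H_i} + \hat{C}_{H_j}$, so

$$
\begin{aligned}
\Pr(H_i \lor H_j) &= \operatorname{Tr}\left[\left(\hat{C}_{H_i}+\hat{C}_{H_j}\right) \rho \left(\hat{C}_{H_i}+\hat{C}_{H_j}\right)^\dagger\right] \\
&= \Pr(H_i) + \Pr(H_j) + 2\operatorname{Re}\operatorname{Tr}\left(\hat{C}_{H_i}\, \rho\, \hat{C}^\dagger_{H_j}\right) \\
&= \Pr(H_i) + \Pr(H_j),
\end{aligned}
$$

where the cross term vanishes by the consistency condition (note that the weak condition $\operatorname{Re}\operatorname{Tr} = 0$ already suffices). This vanishing cross term is exactly the interference term that spoils classical additivity for inconsistent families.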

Interpretation

The interpretation based on consistent histories is used in combination with insights about quantum decoherence. Quantum decoherence implies that irreversible macroscopic phenomena (hence, all classical measurements) render histories automatically consistent, which allows one to recover classical reasoning and "common sense" when applied to the outcomes of these measurements. More precise analysis of decoherence allows (in principle) a quantitative calculation of the boundary between the classical domain and the quantum domain. According to Roland Omnès, [4]

[the] history approach, although it was initially independent of the Copenhagen approach, is in some sense a more elaborate version of it. It has, of course, the advantage of being more precise, of including classical physics, and of providing an explicit logical framework for indisputable proofs. But, when the Copenhagen interpretation is completed by the modern results about correspondence and decoherence, it essentially amounts to the same physics.

[... There are] three main differences:

1. The logical equivalence between an empirical datum, which is a macroscopic phenomenon, and the result of a measurement, which is a quantum property, becomes clearer in the new approach, whereas it remained mostly tacit and questionable in the Copenhagen formulation.

2. There are two apparently distinct notions of probability in the new approach. One is abstract and directed toward logic, whereas the other is empirical and expresses the randomness of measurements. We need to understand their relation and why they coincide with the empirical notion entering into the Copenhagen rules.

3. The main difference lies in the meaning of the reduction rule for 'wave packet collapse'. In the new approach, the rule is valid but no specific effect on the measured object can be held responsible for it. Decoherence in the measuring device is enough.

In order to obtain a complete theory, the formal rules above must be supplemented with a particular Hilbert space and rules that govern dynamics, for example a Hamiltonian.

In the opinion of others [5] this still does not make a complete theory, as no predictions are possible about which set of consistent histories will actually occur. That is, the rules of consistent histories, the Hilbert space, and the Hamiltonian must be supplemented by a set selection rule. However, Robert B. Griffiths holds the opinion that asking which set of histories will "actually occur" is a misinterpretation of the theory; [6] histories are a tool for description of reality, not separate alternate realities.

Proponents of this consistent histories interpretation—such as Murray Gell-Mann, James Hartle, Roland Omnès and Robert B. Griffiths—argue that their interpretation clarifies the fundamental disadvantages of the old Copenhagen interpretation, and can be used as a complete interpretational framework for quantum mechanics.

In Quantum Philosophy, [7] Roland Omnès provides a less mathematical way of understanding this same formalism.

The consistent histories approach can be interpreted as a way of understanding which properties of a quantum system can be treated in a single framework, and which properties must be treated in different frameworks and would produce meaningless results if combined as if they belonged to a single framework. It thus becomes possible to demonstrate formally why the properties that J. S. Bell assumed could be combined cannot in fact be combined. On the other hand, it also becomes possible to demonstrate that classical, logical reasoning does apply, even to quantum experiments, and we can now be mathematically exact about how such reasoning applies.

See also

Many-worlds interpretation
Mathematical formulation of quantum mechanics
Uncertainty principle
De Broglie–Bohm theory
Interpretations of quantum mechanics
Density matrix
Wave function collapse
Quantum decoherence
Quantum indeterminacy
Measurement in quantum mechanics
Quantum logic
Quantum operation
Quantum statistical mechanics
Partial trace
Wigner quasiprobability distribution
Von Neumann entropy
History Projection Operator (HPO) formalism
Glossary of quantum philosophy
Quantum Bayesianism (QBism)

References

  1. Griffiths, Robert B. (1984). "Consistent histories and the interpretation of quantum mechanics". Journal of Statistical Physics. 36 (1–2): 219–272. doi:10.1007/BF01015734. ISSN 0022-4715.
  2. Griffiths, Robert B. "The Consistent Histories Approach to Quantum Mechanics". Stanford Encyclopedia of Philosophy. Stanford University. Retrieved 2016-10-22.
  3. Dowker, Fay; Kent, Adrian (1995-10-23). "Properties of Consistent Histories". Physical Review Letters. 75 (17): 3038–3041. arXiv:gr-qc/9409037. doi:10.1103/PhysRevLett.75.3038. ISSN 0031-9007.
  4. Omnès, Roland (1999). Understanding Quantum Mechanics. Princeton University Press. pp. 179, 257. ISBN 978-0-691-00435-8. LCCN 98042442.
  5. Kent, Adrian; McElwaine, Jim (1997-03-01). "Quantum prediction algorithms". Physical Review A. 55 (3): 1703–1720. arXiv:gr-qc/9610028. doi:10.1103/PhysRevA.55.1703. ISSN 1050-2947.
  6. Griffiths, R. B. (2003). Consistent Quantum Theory. Cambridge University Press.
  7. Omnès, Roland (1999). Quantum Philosophy. Princeton University Press. See part III, especially Chapter IX.