Quantum indeterminacy is the apparent necessary incompleteness in the description of a physical system that has become one of the characteristics of the standard description of quantum physics. Prior to quantum physics, it was thought that a physical system had a determinate state that uniquely determined all the values of its measurable properties, and conversely that the values of its measurable properties uniquely determined the state.
Quantum indeterminacy can be quantitatively characterized by a probability distribution on the set of outcomes of measurements of an observable. The distribution is uniquely determined by the system state, and moreover quantum mechanics provides a recipe for calculating this probability distribution.
Indeterminacy in measurement was not an innovation of quantum mechanics, since it had been established early on by experimentalists that errors in measurement may lead to indeterminate outcomes. By the latter half of the 18th century, measurement errors were well understood, and it was known that they could either be reduced by better equipment or accounted for by statistical error models. In quantum mechanics, however, indeterminacy is of a much more fundamental nature, having nothing to do with errors or disturbance.
An adequate account of quantum indeterminacy requires a theory of measurement. Many theories have been proposed since the beginning of quantum mechanics and quantum measurement continues to be an active research area in both theoretical and experimental physics. [1] Possibly the first systematic attempt at a mathematical theory was developed by John von Neumann. The kinds of measurements he investigated are now called projective measurements. That theory was based in turn on the theory of projection-valued measures for self-adjoint operators that had been recently developed (by von Neumann and independently by Marshall Stone) and the Hilbert space formulation of quantum mechanics (attributed by von Neumann to Paul Dirac).
In this formulation, the state of a physical system corresponds to a vector of length 1 in a Hilbert space H over the complex numbers. An observable is represented by a self-adjoint (i.e. Hermitian) operator A on H. If H is finite-dimensional, then by the spectral theorem A has an orthonormal basis of eigenvectors. If the system is in state ψ, then immediately after measurement the system will occupy a state that is an eigenvector e of A, and the observed value λ will be the corresponding eigenvalue in the equation Ae = λe. It is immediate from this that measurement in general will be non-deterministic. Quantum mechanics, moreover, gives a recipe for computing a probability distribution Pr on the possible outcomes given that the initial system state is ψ. The probability is Pr(λ) = ⟨E(λ)ψ, ψ⟩ = ‖E(λ)ψ‖², where E(λ) is the projection onto the space of eigenvectors of A with eigenvalue λ.
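As a concrete illustration of this recipe (a minimal NumPy sketch, not part of the original article; the observable A and state psi below are arbitrary choices for demonstration), the probabilities can be computed by diagonalizing A and summing ‖E(λ)ψ‖² over the eigenvectors belonging to each eigenvalue:

```python
# A minimal sketch of the projective-measurement recipe described above, using NumPy.
# The observable A and the state psi below are arbitrary illustrative choices.
import numpy as np

def measurement_distribution(A, psi):
    """Return {eigenvalue: probability} for measuring observable A in state psi."""
    psi = psi / np.linalg.norm(psi)          # states are unit vectors
    eigvals, eigvecs = np.linalg.eigh(A)     # spectral theorem for a self-adjoint A
    probs = {}
    for lam, v in zip(np.round(eigvals, 12), eigvecs.T):
        # Pr(lam) = ||E(lam) psi||^2, accumulated one eigenvector at a time
        probs[lam] = probs.get(lam, 0.0) + abs(np.vdot(v, psi)) ** 2
    return probs

A = np.array([[1.0, 1.0],
              [1.0, -1.0]])                  # an arbitrary 2x2 self-adjoint operator
psi = np.array([1.0, 0.0])                   # an arbitrary state
print(measurement_distribution(A, psi))      # the probabilities sum to 1
```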
In this example, we consider a single spin-1/2 particle (such as an electron), taking into account only the spin degree of freedom. The corresponding Hilbert space is the two-dimensional complex Hilbert space C2, with each quantum state corresponding to a unit vector in C2 (unique up to phase). In this case, the state space can be geometrically represented as the surface of a sphere (the Bloch sphere).
The Pauli spin matrices σ1 = (0 1; 1 0), σ2 = (0 −i; i 0), σ3 = (1 0; 0 −1) (rows separated by semicolons) are self-adjoint and correspond to spin measurements along the 3 coordinate axes.
The Pauli matrices all have the eigenvalues +1, −1.
Thus in the state ψ = (1, 1)/√2 (the +1 eigenvector of σ1), σ1 has the determinate value +1, while measurement of σ3 can produce either +1 or −1, each with probability 1/2. In fact, there is no state in which measurements of both σ1 and σ3 have determinate values.
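This assertion can be checked numerically. The following sketch (illustrative only, not from the article; it uses NumPy) computes the outcome probabilities for σ1 and σ3 in the state ψ = (1, 1)/√2:

```python
# Numerical check of the spin-1/2 example above: in psi = (1, 1)/sqrt(2),
# sigma_1 is determinately +1 while sigma_3 is +1 or -1 with probability 1/2 each.
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

for name, op in [("sigma1", sigma1), ("sigma3", sigma3)]:
    eigvals, eigvecs = np.linalg.eigh(op)
    probs = {int(round(l)): abs(np.vdot(v, psi)) ** 2 for l, v in zip(eigvals, eigvecs.T)}
    print(name, probs)
# sigma1 -> {-1: ~0.0, 1: 1.0};  sigma3 -> {-1: 0.5, 1: 0.5}
```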
There are various questions that can be asked about the above indeterminacy assertion:
1. Can the apparent indeterminacy be construed as in fact deterministic, but dependent upon quantities not modelled in the current theory, which would therefore be incomplete? More precisely, are there hidden variables that could account for the statistical indeterminacy in a completely classical way?
2. Can the indeterminacy be understood as a disturbance of the system being measured?
Von Neumann formulated question 1) and provided an argument for why the answer had to be no, if one accepted the formalism he was proposing. However, according to Bell, von Neumann's formal proof did not justify his informal conclusion. [2] A definitive but partial negative answer to 1) has been established by experiment: because Bell's inequalities are violated, any such hidden variable(s) cannot be local (see Bell test experiments).
The answer to 2) depends on how disturbance is understood, particularly since measurement entails disturbance (however note that this is the observer effect, which is distinct from the uncertainty principle). Still, in the most natural interpretation the answer is also no. To see this, consider two sequences of measurements: (A) that measures exclusively σ1 and (B) that measures only σ3 of a spin system in the state ψ. The measurement outcomes of (A) are all +1, while the statistical distribution of the measurements (B) is still divided between +1, −1 with equal probability.
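A small simulation (illustrative only; the sampling procedure, random seed, and sample size are my own choices, not the article's) makes the contrast between the two sequences concrete: sequence (A) returns +1 every time, while sequence (B) splits evenly between +1 and −1:

```python
# Illustrative simulation of the two measurement sequences; each run uses a freshly
# prepared copy of psi = (1, 1)/sqrt(2).
import numpy as np

rng = np.random.default_rng(0)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)

def sample(op, psi, n):
    eigvals, eigvecs = np.linalg.eigh(op)
    probs = [abs(np.vdot(v, psi)) ** 2 for v in eigvecs.T]
    return rng.choice(np.round(eigvals), size=n, p=probs)

print("sequence (A), sigma1:", sample(sigma1, psi, 10))   # all +1
print("sequence (B), sigma3:", sample(sigma3, psi, 10))   # roughly half +1, half -1
```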
Quantum indeterminacy can also be illustrated in terms of a particle with a definitely measured momentum, for which there must be a fundamental limit to how precisely its location can be specified. This quantum uncertainty principle can be expressed in terms of other variables; for example, a particle with a definitely measured energy has a fundamental limit to how precisely one can specify how long it will have that energy. The magnitude involved in quantum uncertainty is on the order of the Planck constant (6.62607015×10⁻³⁴ J⋅Hz⁻¹ [3]).
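As a rough numerical illustration (the momentum uncertainty used below is an arbitrary assumed value, not a figure from the article), the position-momentum relation Δx·Δp ≥ ħ/2 fixes the smallest position uncertainty compatible with a given momentum uncertainty:

```python
# Rough numerical illustration of the position-momentum uncertainty relation
# Delta_x * Delta_p >= hbar / 2. The momentum uncertainty below is an assumed,
# purely illustrative value.
import math

h = 6.62607015e-34            # Planck constant, J*s (exact by SI definition)
hbar = h / (2 * math.pi)      # reduced Planck constant

delta_p = 1e-24               # assumed momentum uncertainty, kg*m/s
min_delta_x = hbar / (2 * delta_p)
print(f"minimum position uncertainty: {min_delta_x:.3e} m")   # about 5.3e-11 m
```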
Quantum indeterminacy is the assertion that the state of a system does not determine a unique collection of values for all its measurable properties. Indeed, according to the Kochen–Specker theorem, in the quantum mechanical formalism it is impossible that, for a given quantum state, each one of these measurable properties (observables) has a determinate (sharp) value. The values of an observable will be obtained non-deterministically in accordance with a probability distribution that is uniquely determined by the system state. Note that the state is destroyed by measurement, so when we refer to a collection of values, each measured value in this collection must be obtained using a freshly prepared state.
This indeterminacy might be regarded as a kind of essential incompleteness in our description of a physical system. Notice, however, that the indeterminacy as stated above applies only to the values of measurements, not to the quantum state. For example, in the spin-1/2 example discussed above, the system can be prepared in the state ψ by using measurement of σ1 as a filter that retains only those particles for which σ1 yields +1. By the (so-called) von Neumann postulates, immediately after the measurement the system is assuredly in the state ψ.
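The following sketch (illustrative, not the article's; the incoming state phi is an arbitrary choice) shows this filtering step as a projection followed by renormalization, after which σ1 is certain to yield +1:

```python
# Sketch of the von Neumann projection postulate used as a "filter": keeping only the
# +1 outcomes of a sigma_1 measurement leaves the system in psi = (1, 1)/sqrt(2).
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # +1 eigenvector of sigma_1
E_plus = np.outer(plus, plus.conj())                  # projection onto that eigenspace

phi = np.array([0.8, 0.6j], dtype=complex)            # arbitrary incoming state
post = E_plus @ phi
post = post / np.linalg.norm(post)                    # state right after the filter
print(np.vdot(post, sigma1 @ post).real)              # 1.0: sigma_1 is now determinate
```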
However, Albert Einstein believed that the quantum state cannot be a complete description of a physical system and, it is commonly thought, never came to terms with quantum mechanics. In fact, Einstein, Boris Podolsky, and Nathan Rosen showed that if quantum mechanics is correct, then the classical view of how the real world works (at least after special relativity) is no longer tenable. This view included the following two ideas: first, that a measurable property of a physical system whose value can be predicted with certainty is an element of (local) reality (this was the terminology used by EPR); and second, that effects of local actions have a finite propagation speed.
This failure of the classical view was one of the conclusions of the EPR thought experiment, in which two remotely located observers, now commonly referred to as Alice and Bob, perform independent measurements of spin on a pair of electrons, prepared at a source in a special state called a spin singlet state. It was a conclusion of EPR, using the formal apparatus of quantum theory, that once Alice measured spin in the x direction, Bob's measurement in the x direction was determined with certainty, whereas immediately before Alice's measurement Bob's outcome was only statistically determined. From this it follows either that the value of spin in the x direction is not an element of reality or that the effect of Alice's measurement has infinite speed of propagation.
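To make this correlation concrete, the following sketch (my own illustration, not from the article) computes the joint outcome probabilities for x-direction spin measurements by Alice and Bob on the singlet state; only the anticorrelated outcomes occur:

```python
# Joint outcome probabilities for x-direction measurements on the spin singlet:
# each local result is +1/-1 with probability 1/2, but the pair is perfectly anticorrelated.
import numpy as np

up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

x_plus = np.array([1, 1], dtype=complex) / np.sqrt(2)    # +1 eigenvector of sigma_x
x_minus = np.array([1, -1], dtype=complex) / np.sqrt(2)  # -1 eigenvector of sigma_x

for a, a_name in [(x_plus, "+1"), (x_minus, "-1")]:
    for b, b_name in [(x_plus, "+1"), (x_minus, "-1")]:
        p = abs(np.vdot(np.kron(a, b), singlet)) ** 2
        print(f"P(Alice={a_name}, Bob={b_name}) = {p:.2f}")
# Only the anticorrelated outcomes (+1,-1) and (-1,+1) have nonzero probability (1/2 each).
```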
We have described indeterminacy for a quantum system that is in a pure state. Mixed states are a more general kind of state obtained by a statistical mixture of pure states. For mixed states, the "quantum recipe" for determining the probability distribution of a measurement is as follows:
Let A be an observable of a quantum mechanical system. A is given by a densely defined self-adjoint operator on H. The spectral measure of A is the projection-valued measure E_A defined by the condition E_A(U) = 1_U(A) for every Borel subset U of R, where 1_U denotes the indicator function of U (applied to A via the Borel functional calculus). Given a mixed state S, described by a density operator, we introduce the distribution of A under S as follows:
D_A(U) = Tr(E_A(U) S).
This is a probability measure defined on the Borel subsets of R that is the probability distribution obtained by measuring A in S.
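In finite dimensions this recipe reduces to a trace against the density operator. The following sketch (illustrative only; the maximally mixed example state is my own choice) computes D_A for each eigenvalue of A:

```python
# Minimal sketch of the mixed-state recipe D_A(U) = Tr(E_A(U) S) in finite dimensions.
import numpy as np

def mixed_state_distribution(A, S):
    """Return {eigenvalue: probability} for measuring observable A in mixed state S."""
    eigvals, eigvecs = np.linalg.eigh(A)
    probs = {}
    for lam, v in zip(np.round(eigvals, 12), eigvecs.T):
        E = np.outer(v, v.conj())                         # rank-one piece of E_A({lam})
        probs[lam] = probs.get(lam, 0.0) + np.trace(E @ S).real
    return probs

S = np.eye(2, dtype=complex) / 2                          # maximally mixed spin-1/2 state
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)
print(mixed_state_distribution(sigma3, S))                # {-1.0: 0.5, 1.0: 0.5}
```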
Quantum indeterminacy is often understood as information (or the lack of it) whose existence we infer as occurring in individual quantum systems prior to measurement. Quantum randomness is the statistical manifestation of that indeterminacy, witnessed in the results of experiments repeated many times. However, the relationship between quantum indeterminacy and randomness is subtle and can be considered differently. [4]
In classical physics, experiments of chance, such as coin-tossing and dice-throwing, are deterministic in the sense that perfect knowledge of the initial conditions would render outcomes perfectly predictable. The 'randomness' stems from ignorance of physical information in the initial toss or throw. In diametrical contrast, in the case of quantum physics, the theorems of Kochen and Specker, [5] the inequalities of John Bell, [6] and the experimental evidence of Alain Aspect [7] [8] all indicate that quantum randomness does not stem from any such physical information.
In 2008, Tomasz Paterek et al. provided an explanation in terms of mathematical information. They proved that quantum randomness is, exclusively, the output of measurement experiments whose input settings introduce logical independence into quantum systems. [9] [10]
Logical independence is a well-known phenomenon in mathematical logic. It refers to the null logical connectivity that exists between mathematical propositions (in the same language) that neither prove nor disprove one another. [11]
In the work of Paterek et al., the researchers demonstrate a link connecting quantum randomness and logical independence in a formal system of Boolean propositions. In experiments measuring photon polarisation, Paterek et al. report statistics correlating predictable outcomes with logically dependent mathematical propositions, and random outcomes with propositions that are logically independent. [12] [13]
In 2020, Steve Faulkner reported on work following up on the findings of Tomasz Paterek et al., showing what logical independence in the Paterek Boolean propositions means in the domain of matrix mechanics proper. He showed how indeterminacy's indefiniteness arises in evolved density operators representing mixed states, where measurement processes encounter irreversible 'lost history' and an ingression of ambiguity. [14]
The mathematical formulations of quantum mechanics are those mathematical formalisms that permit a rigorous description of quantum mechanics. This mathematical formalism uses mainly a part of functional analysis, especially Hilbert spaces, which are a kind of linear space. These formalisms are distinguished from those of physical theories developed before the early 1900s by the use of abstract mathematical structures, such as infinite-dimensional Hilbert spaces and operators on these spaces. In brief, values of physical observables such as energy and momentum were no longer considered as values of functions on phase space, but as eigenvalues; more precisely, as spectral values of linear operators in Hilbert space.
Quantum mechanics is a fundamental theory that describes the behavior of nature at and below the scale of atoms. It is the foundation of all quantum physics, which includes quantum chemistry, quantum field theory, quantum technology, and quantum information science.
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are putative properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
In quantum mechanics, a density matrix is a matrix that describes an ensemble of physical systems as quantum states. It allows for the calculation of the probabilities of the outcomes of any measurements performed upon the systems of the ensemble using the Born rule. It is a generalization of the more usual state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed ensembles. Mixed ensembles arise in quantum mechanics in two different situations: first, when the preparation of the systems leads to a statistical mixture of different pure states, and second, when one wants to describe a physical system that is entangled with another without describing their combined state.
In quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation, and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.
In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden-variable theories. Experimental verification of the inequality being violated is seen as confirmation that nature cannot be described by such theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in a much-cited paper published in 1969. They derived the CHSH inequality, which, as with John Stewart Bell's original inequality, is a constraint—on the statistical occurrence of "coincidences" in a Bell test—which is necessarily true if an underlying local hidden-variable theory exists. In practice, the inequality is routinely violated by modern experiments in quantum mechanics.
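A short calculation (not from the article; it uses the standard textbook singlet correlation E(a, b) = −cos(a − b) and the usual choice of measurement angles, in place of an explicit state-and-operator computation) shows the quantum CHSH value exceeding the local hidden-variable bound of 2:

```python
# Illustrative CHSH calculation with the standard singlet correlation and angles.
import numpy as np

def correlation(theta_a, theta_b):
    # Singlet-state correlation between spin measurements at angles theta_a and theta_b
    return -np.cos(theta_a - theta_b)

a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4
S = (correlation(a, b) - correlation(a, b_prime)
     + correlation(a_prime, b) + correlation(a_prime, b_prime))
print(abs(S))   # about 2.828 = 2*sqrt(2), above the local hidden-variable bound of 2
```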
In physics, an observable is a physical property or physical quantity that can be measured. In classical mechanics, an observable is a real-valued "function" on the set of all possible system states, e.g., position and momentum. In quantum mechanics, an observable is an operator, or gauge, where the property of the quantum state can be determined by some sequence of operations. For example, these operations might involve submitting the system to various electromagnetic fields and eventually reading a value.
In quantum mechanics, a probability amplitude is a complex number used for describing the behaviour of systems. The square of the modulus of this quantity represents a probability density.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
Quantum statistical mechanics is statistical mechanics applied to quantum mechanical systems. In quantum mechanics a statistical ensemble is described by a density operator S, which is a non-negative, self-adjoint, trace-class operator of trace 1 on the Hilbert space H describing the quantum system. This can be shown under various mathematical formalisms for quantum mechanics.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is S = −Tr(ρ ln ρ), where Tr denotes the trace.
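A brief numerical check of this formula (illustrative only; the two example states are my own choices) compares a pure state, with entropy 0, to the maximally mixed qubit state, with entropy ln 2:

```python
# Small numerical check of S(rho) = -Tr(rho ln rho), via the eigenvalues of rho.
import numpy as np

def von_neumann_entropy(rho):
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]        # 0 * ln 0 is taken to be 0
    return float(-np.sum(eigvals * np.log(eigvals)))

pure = np.array([[1, 0], [0, 0]], dtype=complex)   # pure state: entropy 0
mixed = np.eye(2, dtype=complex) / 2               # maximally mixed qubit: entropy ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```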
The Born rule is a postulate of quantum mechanics that gives the probability that a measurement of a quantum system will yield a given result. In its simplest form, it states that the probability density of finding a system in a given state, when measured, is proportional to the square of the amplitude of the system's wavefunction at that state. It was formulated and published by German physicist Max Born in July 1926.
In mathematical physics, Gleason's theorem shows that the rule one uses to calculate probabilities in quantum physics, the Born rule, can be derived from the usual mathematical representation of measurements in quantum physics together with the assumption of non-contextuality. Andrew M. Gleason first proved the theorem in 1957, answering a question posed by George W. Mackey, an accomplishment that was historically significant for the role it played in showing that wide classes of hidden-variable theories are inconsistent with quantum physics. Multiple variations have been proven in the years since. Gleason's theorem is of particular importance for the field of quantum logic and its attempt to find a minimal set of mathematical axioms for quantum theory.
In theoretical physics, quantum nonlocality refers to the phenomenon by which the measurement statistics of a multipartite quantum system do not allow an interpretation with local realism. Quantum nonlocality has been experimentally verified under a variety of physical assumptions. Any physical theory that aims at superseding or replacing quantum theory should account for such experiments and therefore cannot fulfill local realism; quantum nonlocality is a property of the universe that is independent of our description of nature.
In quantum mechanics, the expectation value is the probabilistic expected value of the result (measurement) of an experiment. It can be thought of as an average of all the possible outcomes of a measurement as weighted by their likelihood, and as such it is not the most probable value of a measurement; indeed the expectation value may have zero probability of occurring. It is a fundamental concept in all areas of quantum physics.
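A one-line illustration of this point (my own example, not the article's): for the state (1, 1)/√2 the expectation value of σ3 is 0, a value that no individual σ3 measurement can ever return:

```python
# The expectation value of sigma_3 in (1, 1)/sqrt(2) is 0, yet every individual
# measurement outcome is +1 or -1, so the expectation value itself never occurs.
import numpy as np

sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.vdot(psi, sigma3 @ psi).real)   # 0.0
```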
Spin is an intrinsic form of angular momentum carried by elementary particles, and thus by composite particles such as hadrons, atomic nuclei, and atoms. Spin is quantized, and accurate models for the interaction with spin require relativistic quantum mechanics or quantum field theory.
In physics, the Leggett inequalities, named for Anthony James Leggett, who derived them, are a related pair of mathematical expressions concerning the correlations of properties of entangled particles.
In quantum physics, a quantum state is a mathematical entity that embodies the knowledge of a quantum system. Quantum mechanics specifies the construction, evolution, and measurement of a quantum state. The result is a quantum-mechanical prediction for the system represented by the state. Knowledge of the quantum state, and the quantum mechanical rules for the system's evolution in time, exhausts all that can be known about a quantum system.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. As explained in this entry, however, the historical origins of the theory and its name are complicated.