| Author | John von Neumann |
| --- | --- |
| Original title | Mathematische Grundlagen der Quantenmechanik |
| Language | German |
| Subject | Quantum mechanics |
| Published | 1932 |
| Publisher | Springer |
| Publication place | Berlin, Germany |
Mathematical Foundations of Quantum Mechanics (German : Mathematische Grundlagen der Quantenmechanik) is a quantum mechanics book written by John von Neumann in 1932. It is an important early work in the development of the mathematical formulation of quantum mechanics. [1] The book mainly summarizes results that von Neumann had published in earlier papers. [2]
The book was originally published in German in 1932 by Springer. [2] An English translation by Robert T. Beyer was published in 1955 by Princeton University Press. A Russian translation, edited by Nikolay Bogolyubov, was published by Nauka in 1964. A new English edition, edited by Nicholas A. Wheeler, was published in 2018 by Princeton University Press. [3]
According to the 2018 version, the main chapters are: [3]
1. Introductory considerations
2. Abstract Hilbert space
3. The quantum statistics
4. Deductive development of the theory
5. General considerations
6. The measuring process
One significant passage is its mathematical argument against the idea of hidden variables. Von Neumann's claim rested on the assumption that any linear combination of Hermitian operators represents an observable, and that the expectation value of such a combined operator is the corresponding combination of the expectation values of the individual operators. [4]
Von Neumann makes the following assumptions: [5]
Von Neumann then shows that one can write

$$\operatorname{Exp}(\mathbf{R}) = \sum_{m,n} U_{nm} R_{mn} = \operatorname{Tr}(U R)$$

for some operator $U$, where $U_{nm}$ and $R_{mn}$ are the matrix elements in some basis. The proof concludes by noting that $U$ must be Hermitian and non-negative definite ($U \geq 0$) by construction. [5] For von Neumann, this meant that the statistical operator representation of states could be deduced from the postulates. Consequently, there are no "dispersion-free" states: [a] it is impossible to prepare a system in such a way that all measurements have predictable results. But if hidden variables existed, then knowing the values of the hidden variables would make the results of all measurements predictable, and hence there can be no hidden variables. [5] Von Neumann argues that if dispersion-free states were found, assumptions 1 to 3 should be modified. [6]
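As a small numerical check of this trace formula (a sketch, not drawn from the book; the qubit state and observable below are arbitrary illustrative choices), one can verify that the projector onto a pure state acts as the statistical operator $U$:

```python
import numpy as np

# Illustrative qubit example: a normalized pure state and a Hermitian
# observable (Pauli-Z). Neither is taken from the book itself.
psi = np.array([[np.sqrt(0.7)], [np.sqrt(0.3)]])  # column vector
R = np.array([[1.0, 0.0], [0.0, -1.0]])           # observable

# For a pure state, the statistical operator U is the projector |psi><psi|.
U = psi @ psi.conj().T

# U is Hermitian and non-negative definite by construction.
assert np.allclose(U, U.conj().T)
assert np.all(np.linalg.eigvalsh(U) >= -1e-12)

# Expectation value two ways: <psi|R|psi> and Tr(U R) agree.
print((psi.conj().T @ R @ psi).item())  # 0.7 - 0.3 = 0.4
print(np.trace(U @ R))                  # 0.4
```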
Von Neumann concludes: [7]
Nor would it help if there existed other, as yet undiscovered, physical quantities, in addition to those represented by the operators in quantum mechanics, because the relations assumed by quantum mechanics would have to fail already for the by now known quantities, those that we discussed above. It is therefore not, as is often assumed, a question of a re-interpretation of quantum mechanics, the present system of quantum mechanics would have to be objectively false, in order that another description of the elementary processes than the statistical one be possible.
— pp. 324-325
However, this proof was rejected as early as 1935 by Grete Hermann, who found a flaw in it. [6] The additivity postulate above holds for quantum states, but it need not apply to measurements on dispersion-free states, specifically when considering non-commuting observables. [5] [4] Dispersion-free states are only required to recover additivity when averaging over the hidden parameters. [5] [4] For example, for a spin-1/2 system, measurements of $\sigma_x + \sigma_y$ can take the values $\pm\sqrt{2}$ for a dispersion-free state, but independent measurements of $\sigma_x$ and $\sigma_y$ can only take the values $\pm 1$ (so their sum can be $\pm 2$ or $0$). [8] Thus there remains the possibility that a hidden-variable theory could reproduce quantum mechanics statistically. [4] [5] [6]
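The spin-1/2 example is easy to check numerically (a minimal sketch, with the Pauli matrices standing in for the spin observables):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # sigma_x
sy = np.array([[0, -1j], [1j, 0]])               # sigma_y

# Each observable alone has eigenvalues +/-1 ...
print(np.linalg.eigvalsh(sx))        # [-1.  1.]
print(np.linalg.eigvalsh(sy))        # [-1.  1.]

# ... but their sum has eigenvalues +/-sqrt(2), which are not sums of the
# individual eigenvalues (+/-2 or 0), because sx and sy do not commute.
print(np.linalg.eigvalsh(sx + sy))   # [-1.41421356  1.41421356]
```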
However, Hermann's critique remained relatively unknown until 1974, when it was rediscovered by Max Jammer. [6] In 1952, David Bohm constructed the Bohmian interpretation of quantum mechanics, a hidden-variable theory that reproduces the statistical predictions of quantum mechanics, suggesting a limit to the validity of von Neumann's proof. [5] [4] The problem was brought back to wider attention by John Stewart Bell in 1966. [4] [5] Bell showed that the consequences of the additivity assumption are at odds with the results of incompatible measurements, which von Neumann's argument does not explicitly take into account. [5]
It was considered the most complete book on quantum mechanics at the time of its release. [2] It was praised for its axiomatic approach. [2]
The mathematical formulations of quantum mechanics are those mathematical formalisms that permit a rigorous description of quantum mechanics. This formalism uses mainly a part of functional analysis, especially Hilbert spaces, which are a kind of linear space. These formalisms are distinguished from those developed for physics theories prior to the early 1900s by the use of abstract mathematical structures, such as infinite-dimensional Hilbert spaces and operators on these spaces. In brief, values of physical observables such as energy and momentum were no longer considered as values of functions on phase space, but as eigenvalues; more precisely, as spectral values of linear operators in Hilbert space.
The uncertainty principle, also known as Heisenberg's indeterminacy principle, is a fundamental concept in quantum mechanics. It states that there is a limit to the precision with which certain pairs of physical properties, such as position and momentum, can be simultaneously known. In other words, the more accurately one property is measured, the less accurately the other property can be known.
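As a concrete illustration (a sketch using spin observables rather than position and momentum, which keeps the example finite-dimensional), the Robertson bound $\Delta A \, \Delta B \geq |\langle [A, B] \rangle| / 2$ can be verified for a qubit:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
psi = np.array([1, 0], dtype=complex)  # spin-up along z

def mean(op):
    return (psi.conj() @ op @ psi).real

def stdev(op):
    return np.sqrt(mean(op @ op) - mean(op) ** 2)

# Robertson uncertainty relation for sigma_x and sigma_y.
comm = sx @ sy - sy @ sx               # equals 2i * sigma_z
lhs = stdev(sx) * stdev(sy)
rhs = abs(psi.conj() @ comm @ psi) / 2
print(lhs, rhs)                        # 1.0 1.0: the bound is saturated
```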
Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are supposed properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."
The Schrödinger equation is a partial differential equation that governs the wave function of a non-relativistic quantum-mechanical system. Its discovery was a significant landmark in the development of quantum mechanics. It is named after Erwin Schrödinger, who postulated the equation in 1925 and published it in 1926, forming the basis for the work that resulted in his Nobel Prize in Physics in 1933.
In quantum mechanics, a density matrix is a matrix that describes an ensemble of physical systems as quantum states. It allows for the calculation of the probabilities of the outcomes of any measurements performed upon the systems of the ensemble using the Born rule. It is a generalization of the more usual state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed ensembles. Mixed ensembles arise in quantum mechanics in two different situations: first, when the preparation of the system is not fully known, so that it must be described as a statistical mixture of possible pure states; and second, when one wants to describe a physical system that is entangled with another, since such a subsystem cannot be assigned a pure state of its own.
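A minimal single-qubit sketch (the states and weights below are arbitrary choices) of a density matrix for a statistical mixture, with an outcome probability computed via the Born rule:

```python
import numpy as np

# Pure states |0> and |+> = (|0> + |1>)/sqrt(2).
ket0 = np.array([[1], [0]], dtype=complex)
ketp = np.array([[1], [1]], dtype=complex) / np.sqrt(2)

# Mixed ensemble: the system was prepared as |0> or |+> with probability
# 1/2 each; the density matrix is the weighted sum of the two projectors.
rho = 0.5 * (ket0 @ ket0.conj().T) + 0.5 * (ketp @ ketp.conj().T)

# Born rule: the probability of outcome "0" is Tr(P0 rho),
# with P0 the projector onto |0>.
P0 = ket0 @ ket0.conj().T
print(np.trace(P0 @ rho).real)  # 0.5 * 1 + 0.5 * 0.5 = 0.75
```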
In quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.
In quantum physics, a wave function is a mathematical description of the quantum state of an isolated quantum system. The most common symbols for a wave function are the Greek letters ψ and Ψ. Wave functions are complex-valued. For example, a wave function might assign a complex number to each point in a region of space. The Born rule provides the means to turn these complex probability amplitudes into actual probabilities. In one common form, it says that the squared modulus of a wave function that depends upon position is the probability density of measuring a particle as being at a given place. The integral of a wavefunction's squared modulus over all the system's degrees of freedom must be equal to 1, a condition called normalization. Since the wave function is complex-valued, only its relative phase and relative magnitude can be measured; its value does not, in isolation, tell anything about the magnitudes or directions of measurable observables. One has to apply quantum operators, whose eigenvalues correspond to sets of possible results of measurements, to the wave function ψ and calculate the statistical distributions for measurable quantities.
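A short numeric sketch of normalization and the Born rule for a one-dimensional Gaussian wave packet (the grid, width, and discretization are arbitrary choices):

```python
import numpy as np

# 1-D Gaussian wave packet, normalized so the integral of |psi|^2 is 1.
x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]
sigma = 1.0
psi = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2 * sigma**2))

density = np.abs(psi) ** 2           # Born rule: probability density
print(np.sum(density) * dx)          # ~1.0: normalization condition
print(np.sum(density[x >= 0]) * dx)  # ~0.5: probability of finding x >= 0
```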
In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden-variable theories. Experimental verification of the inequality being violated is seen as confirmation that nature cannot be described by such theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in a much-cited paper published in 1969. They derived the CHSH inequality, which, as with John Stewart Bell's original inequality, is a constraint—on the statistical occurrence of "coincidences" in a Bell test—which is necessarily true if an underlying local hidden-variable theory exists. In practice, the inequality is routinely violated by modern experiments in quantum mechanics.
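The quantum-mechanical violation can be reproduced in a few lines (a sketch, assuming the singlet state and the standard measurement angles; the local hidden-variable bound is |S| <= 2):

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def spin(theta):
    # Spin measurement along an axis at angle theta in the x-z plane.
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2).
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # Correlation <psi| A(a) (x) B(b) |psi> between the two sides.
    return (psi.conj() @ np.kron(spin(a), spin(b)) @ psi).real

a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, 3 * np.pi / 4

S = E(a0, b0) + E(a1, b0) + E(a1, b1) - E(a0, b1)
print(abs(S))  # 2*sqrt(2) ~ 2.828: above the local bound of 2
```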
Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning out of attempts to extend the understanding of quantum mechanics, the theory has developed in several directions, and experimental studies have confirmed some of its key predictions. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
In quantum mechanics, a probability amplitude is a complex number used for describing the behaviour of systems. The square of the modulus of this quantity represents a probability density.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is $S = -\operatorname{Tr}(\rho \ln \rho)$.
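A direct computation from the eigenvalues of $\rho$ (a minimal sketch; the two test states are standard examples):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S = -Tr(rho ln rho), evaluated via the eigenvalues of rho.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # by convention, 0 * ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state: entropy 0
mixed = np.eye(2) / 2                      # maximally mixed: entropy ln 2
print(von_neumann_entropy(pure))           # 0.0
print(von_neumann_entropy(mixed))          # 0.693... = ln 2
```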
In functional analysis and quantum information science, a positive operator-valued measure (POVM) is a measure whose values are positive semi-definite operators on a Hilbert space. POVMs are a generalization of projection-valued measures (PVM) and, correspondingly, quantum measurements described by POVMs are a generalization of quantum measurement described by PVMs.
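A standard illustration is the symmetric three-outcome "trine" POVM on a qubit (a sketch; this construction is a common textbook example, not tied to any particular source above):

```python
import numpy as np

# Trine POVM: E_k = (2/3)|psi_k><psi_k| with the |psi_k> spaced 120
# degrees apart. Each E_k is positive semi-definite but not a projector,
# so this is not a projection-valued measure.
def trine_element(k):
    theta = 2 * np.pi * k / 3
    psi = np.array([[np.cos(theta / 2)], [np.sin(theta / 2)]])
    return (2 / 3) * (psi @ psi.T)

E = [trine_element(k) for k in range(3)]
print(np.allclose(sum(E), np.eye(2)))  # True: elements sum to the identity

# Outcome probabilities for a state rho follow the Born rule Tr(E_k rho).
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
print([round(float(np.trace(Ek @ rho)), 3) for Ek in E])  # [0.667, 0.167, 0.167]
```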
The Born rule is a postulate of quantum mechanics that gives the probability that a measurement of a quantum system will yield a given result. In one commonly used application, it states that the probability density for finding a particle at a given position is proportional to the square of the amplitude of the system's wavefunction at that position. It was formulated and published by German physicist Max Born in July 1926.
In quantum mechanics, notably in quantum information theory, fidelity quantifies the "closeness" between two density matrices. It expresses the probability that one state will pass a test to identify as the other. It is not a metric on the space of density matrices, but it can be used to define the Bures metric on this space.
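A short sketch under the squared-trace convention (so that the value reads as a probability); the helper for the positive-semi-definite square root is illustrative:

```python
import numpy as np

def sqrtm_psd(A):
    # Square root of a Hermitian positive semi-definite matrix via eigh.
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.conj().T

def fidelity(rho, sigma):
    # F(rho, sigma) = (Tr sqrt( sqrt(rho) sigma sqrt(rho) ))^2
    s = sqrtm_psd(rho)
    return float(np.trace(sqrtm_psd(s @ sigma @ s)).real ** 2)

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # |0><0|
sigma = np.eye(2, dtype=complex) / 2             # maximally mixed state
print(fidelity(rho, rho))    # 1.0: identical states
print(fidelity(rho, sigma))  # 0.5
```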
In quantum mechanics, the expectation value is the probabilistic expected value of the result (measurement) of an experiment. It can be thought of as an average of all the possible outcomes of a measurement as weighted by their likelihood, and as such it is not the most probable value of a measurement; indeed the expectation value may have zero probability of occurring. It is a fundamental concept in all areas of quantum physics.
In quantum physics, a quantum state is a mathematical entity that embodies the knowledge of a quantum system. Quantum mechanics specifies the construction, evolution, and measurement of a quantum state. The result is a prediction for the system represented by the state. Knowledge of the quantum state, and the rules for the system's evolution in time, exhausts all that can be known about a quantum system.
This is a glossary for the terminology often encountered in undergraduate quantum mechanics courses.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. As explained in this entry, however, the historical origins of the theory and its name are complicated.
In mathematical physics, the Dirac–von Neumann axioms give a mathematical formulation of quantum mechanics in terms of operators on a Hilbert space. They were introduced by Paul Dirac in 1930 and John von Neumann in 1932.