In quantum mechanics, einselection, short for "environment-induced superselection", is a name coined by Wojciech H. Zurek [1] for a process claimed to explain the appearance of wavefunction collapse and the emergence of classical descriptions of reality from quantum descriptions. In this approach, classicality is described as an emergent property induced in open quantum systems by their environments. Because of the entangling interaction with the environment, which in effect monitors selected observables of the system, the vast majority of states in the Hilbert space of an open quantum system become highly unstable. After a decoherence time, which for macroscopic objects is typically many orders of magnitude shorter than any other dynamical timescale, [2] a generic quantum state decays into an uncertain state that can be expressed as a mixture of simple pointer states. In this way the environment induces effective superselection rules: einselection precludes the stable existence of pure superpositions of pointer states, while the pointer states themselves remain stable despite environmental interaction. The einselected states lack coherence and therefore do not exhibit the quantum behaviours of entanglement and superposition.
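The suppression of coherence between pointer states can be illustrated with a minimal numpy sketch. The two-level system, the overlap parameter, and the `decohere` helper below are illustrative assumptions, not taken from the source: the off-diagonal terms of the density matrix are multiplied by the overlap of the corresponding environment states, which vanishes once the environment has fully "monitored" the system.

```python
import numpy as np

# A qubit starts in an equal superposition of pointer states |0> and |1>.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())          # pure-state density matrix

def decohere(rho, overlap):
    """Suppress coherence terms by the environment-state overlap <e0|e1>."""
    out = rho.copy().astype(complex)
    out[0, 1] *= overlap
    out[1, 0] *= np.conj(overlap)
    return out

# Fully orthogonal environment states (overlap 0): the superposition
# decays into a diagonal mixture of the two pointer states.
mixed = decohere(rho, 0.0)
print(mixed)
```

The surviving diagonal entries are the classical probabilities of the two pointer states; the vanished off-diagonal entries are exactly the interference terms that einselection removes.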
Advocates of this approach argue that since only quasi-local, essentially classical states survive the decoherence process, einselection can in many ways explain the emergence of a (seemingly) classical reality in a fundamentally quantum universe (at least to local observers). However, the basic program has been criticized as relying on a circular argument (e.g. by Ruth Kastner). [3] Whether the einselection account can really explain the phenomenon of wave function collapse therefore remains unsettled.
Zurek has defined einselection as follows: "Decoherence leads to einselection when the states of the environment corresponding to different pointer states become orthogonal: ⟨ε_i|ε_j⟩ = δ_ij". [1]
Einselected pointer states are distinguished by their ability to persist in spite of environmental monitoring, and therefore are the ones in which open quantum systems are observed. Understanding the nature of these states and the process of their dynamical selection is of fundamental importance. The process was first studied in a measurement situation: when the system is an apparatus whose intrinsic dynamics can be neglected, pointer states turn out to be eigenstates of the interaction Hamiltonian between the apparatus and its environment. [4] In more general situations, when the system's own dynamics is relevant, einselection is more complicated: pointer states result from the interplay between self-evolution and environmental monitoring.
To study einselection, an operational definition of pointer states has been introduced. [5] [6] This is the "predictability sieve" criterion, based on an intuitive idea: pointer states can be defined as the ones which become minimally entangled with the environment in the course of their evolution. The predictability sieve criterion quantifies this idea by the following algorithmic procedure: for every initial pure state |Ψ⟩, one measures the entanglement generated dynamically between the system and the environment by computing the entropy H_Ψ(t) = −Tr(ρ_Ψ(t) log ρ_Ψ(t)), or some other measure of predictability, [5] [6] [7] from the reduced density matrix of the system ρ_Ψ(t) (which is initially ρ_Ψ(0) = |Ψ⟩⟨Ψ|). The entropy is a function of the time t and a functional of the initial state |Ψ⟩. Pointer states are obtained by minimizing H_Ψ over |Ψ⟩ and demanding that the answer be robust under variation of the time t.
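The sieve procedure can be sketched numerically. In the toy model below (a single qubit under pure dephasing, with an invented decay rate `gamma`; none of this is from the source), the candidate pointer state stays pure while the superposition accumulates entropy, so the sieve selects the former:

```python
import numpy as np

def von_neumann_entropy(rho):
    """H = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def dephase(rho, t, gamma=1.0):
    """Toy dephasing environment: off-diagonals decay as exp(-gamma*t)."""
    out = rho.astype(complex).copy()
    out[0, 1] *= np.exp(-gamma * t)
    out[1, 0] *= np.exp(-gamma * t)
    return out

def sieve_entropy(psi, t):
    """Entropy generated by time t for initial pure state psi."""
    rho0 = np.outer(psi, psi.conj())
    return von_neumann_entropy(dephase(rho0, t))

basis_state = np.array([1.0, 0.0])                 # candidate pointer state
superposition = np.array([1.0, 1.0]) / np.sqrt(2)

# The pointer state stays pure (entropy ~ 0); the superposition entangles
# with the environment and its entropy grows - the sieve rejects it.
print(sieve_entropy(basis_state, t=2.0))
print(sieve_entropy(superposition, t=2.0))
```

Minimizing this entropy over all initial states, and checking that the minimizers do not change as `t` is varied, is the sieve criterion in miniature.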
The nature of pointer states has been investigated using the predictability sieve criterion only for a limited number of examples. [5] [6] [7] Apart from the already mentioned measurement situation (where pointer states are simply eigenstates of the interaction Hamiltonian), the most notable example is that of a quantum Brownian particle coupled through its position to a bath of independent harmonic oscillators. In that case pointer states are localized in phase space, even though the interaction Hamiltonian involves the position of the particle; [6] arising from the interplay between self-evolution and interaction with the environment, they turn out to be coherent states.
There is also a quantum limit of decoherence: When the spacing between energy levels of the system is large compared to the frequencies present in the environment, energy eigenstates are einselected nearly independently of the nature of the system-environment coupling. [8]
There has been significant work on correctly identifying the pointer states in the case of a massive particle decohered by collisions with a fluid environment, often known as collisional decoherence. In particular, Busse and Hornberger have identified certain solitonic wavepackets as being unusually stable in the presence of such decoherence. [9] [10]
In physics, the no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state, a statement which has profound implications in the field of quantum computing among others. The theorem is an evolution of the 1970 no-go theorem authored by James Park, in which he demonstrates that a non-disturbing measurement scheme which is both simple and perfect cannot exist. These theorems do not preclude the state of one system becoming entangled with the state of another, as cloning specifically refers to the creation of a separable state with identical factors. For example, one might use the controlled NOT gate and the Walsh–Hadamard gate to entangle two qubits without violating the no-cloning theorem, since no well-defined pure state can be attributed to a subsystem of an entangled state. The no-cloning theorem concerns only pure states; the generalized statement for mixed states is known as the no-broadcast theorem.
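The distinction between copying and entangling can be checked directly in numpy. The sketch below (the `try_clone` helper and the choice of states are illustrative) shows that a CNOT duplicates basis states onto a blank qubit but maps a superposition to an entangled Bell state rather than to two independent copies:

```python
import numpy as np

# 2-qubit CNOT (control = first qubit), in the tensor-product basis.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

zero = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

def try_clone(psi):
    """Attempt to 'copy' psi onto a blank |0> qubit with a CNOT."""
    return CNOT @ np.kron(psi, zero)

# Basis states copy fine: CNOT|0>|0> = |0>|0>, CNOT|1>|0> = |1>|1>.
# A superposition does not: the output is the entangled Bell state
# (|00> + |11>)/sqrt(2), not the product state |+>|+>.
out = try_clone(plus)
target = np.kron(plus, plus)
print(np.allclose(out, target))
```

The failure is not a defect of the CNOT: the theorem guarantees that no unitary at all can map an arbitrary unknown `psi` to `psi ⊗ psi`.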
Quantum entanglement is the phenomenon that occurs when a group of particles are generated, interact, or share spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
In quantum mechanics, wave function collapse occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation, and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation. Collapse is a black box for a thermodynamically irreversible interaction with a classical environment.
Quantum decoherence is the loss of quantum coherence, the process in which a system's behaviour changes from that which can be explained by quantum mechanics to that which can be explained by classical mechanics. In quantum mechanics, particles such as electrons are described by a wave function, a mathematical representation of the quantum state of a system; a probabilistic interpretation of the wave function is used to explain various quantum effects. As long as there exists a definite phase relation between different states, the system is said to be coherent. A definite phase relationship is necessary to perform quantum computing on quantum information encoded in quantum states. Coherence is preserved under the laws of quantum physics.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
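The Born-rule calculation described above takes only a few lines of numpy. The three-"bin" position state and its amplitudes below are invented for illustration:

```python
import numpy as np

# A normalized state over three position bins, with complex amplitudes.
amplitudes = np.array([0.6, 0.8j, 0.0])
assert np.isclose(np.sum(np.abs(amplitudes) ** 2), 1.0)

# Born rule: the probability of each outcome is |amplitude|^2.
probs = np.abs(amplitudes) ** 2
print(probs)  # [0.36, 0.64, 0.]
```

Note that the phases of the amplitudes drop out of any single-basis measurement; they matter only when amplitudes interfere, which is why losing them (decoherence) makes the statistics look classical.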
Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorised as essential to achieve fault tolerant quantum computing that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum preparation, and faulty measurements. This would allow algorithms of greater circuit depth.
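The core idea of redundant encoding can be sketched with a classical toy version of the three-qubit bit-flip (repetition) code. This is only an analogy: a real quantum code measures stabilizer parities to locate the error without collapsing the encoded superposition, which the majority vote below does not capture.

```python
def encode(bit):
    """Encode one logical bit redundantly: 0 -> 000, 1 -> 111."""
    return [bit, bit, bit]

def apply_error(codeword, position):
    """Flip one bit of the codeword (a single bit-flip error)."""
    flipped = codeword.copy()
    flipped[position] ^= 1
    return flipped

def decode(codeword):
    """Majority vote recovers the logical bit from any single flip."""
    return int(sum(codeword) >= 2)

# Every single bit-flip error on either logical value is corrected.
for bit in (0, 1):
    for pos in range(3):
        assert decode(apply_error(encode(bit), pos)) == bit
print("single bit-flip errors corrected")
```

The quantum bit-flip code uses the same redundancy (|0⟩ → |000⟩, |1⟩ → |111⟩) but diagnoses errors by measuring the parities Z₁Z₂ and Z₂Z₃, which reveal the flip location while leaving the logical amplitudes intact.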
Quantum Darwinism is a theory meant to explain the emergence of the classical world from the quantum world as due to a process of Darwinian natural selection induced by the environment interacting with the quantum system, wherein the many possible quantum states are selected against in favor of a stable pointer state. It was proposed in 2003 by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz and Blume-Kohout. The development of the theory is due to the integration of a number of Zurek's research topics pursued over the course of 25 years, including pointer states, einselection and decoherence.
A qutrit is a unit of quantum information that is realized by a 3-level quantum system, that may be in a superposition of three mutually orthogonal quantum states.
The Peres–Horodecki criterion is a necessary condition for the joint density matrix ρ of two quantum mechanical systems A and B to be separable. It is also called the PPT criterion, for positive partial transpose. In the 2×2 and 2×3 dimensional cases the condition is also sufficient. It is used to decide the separability of mixed states, where the Schmidt decomposition does not apply. The criterion was discovered in 1996 by Asher Peres and the Horodecki family.
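The test itself is mechanical: transpose one subsystem's indices and look for a negative eigenvalue. A short numpy sketch (the `partial_transpose` helper and the example states are illustrative):

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2)):
    """Transpose only the second subsystem of a bipartite density matrix."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)   # indices <i j| rho |k l>
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

# Bell state (|00> + |11>)/sqrt(2): entangled, so its partial transpose
# must have a negative eigenvalue (PPT is exact in the 2x2 case).
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_ent = np.outer(bell, bell)
print(np.linalg.eigvalsh(partial_transpose(rho_ent)).min())  # negative

# A classical mixture of |00> and |11> is separable: its partial
# transpose stays positive semidefinite.
rho_sep = np.diag([0.5, 0.0, 0.0, 0.5])
print(np.linalg.eigvalsh(partial_transpose(rho_sep)).min())
```

In dimensions above 2×3 a positive partial transpose no longer guarantees separability, which is why PPT is stated only as a necessary condition in general.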
In quantum mechanics, separable states are multipartite quantum states that can be written as a convex combination of product states. Product states are multipartite quantum states that can be written as a tensor product of states in each space. The physical intuition behind these definitions is that product states have no correlation between the different degrees of freedom, while separable states might have correlations, but all such correlations can be explained as due to a classical random variable, as opposed as being due to entanglement.
The Born rule is a postulate of quantum mechanics which gives the probability that a measurement of a quantum system will yield a given result. In its simplest form, it states that the probability density of finding a system in a given state, when measured, is proportional to the square of the amplitude of the system's wavefunction at that state. It was formulated and published by German physicist Max Born in July 1926.
Time-bin encoding is a technique used in quantum information science to encode a qubit of information on a photon. Quantum information science makes use of qubits as a basic resource similar to bits in classical computing. Qubits are any two-level quantum mechanical system; there are many different physical implementations of qubits, one of which is time-bin encoding.
In quantum information and quantum computing, a cluster state is a type of highly entangled state of multiple qubits. Cluster states are generated in lattices of qubits with Ising-type interactions. A cluster C is a connected subset of a d-dimensional lattice, and a cluster state is a pure state of the qubits located on C. Cluster states differ from other types of entangled states, such as GHZ states or W states, in that it is more difficult to eliminate their quantum entanglement. Another way of thinking of cluster states is as a particular instance of graph states, where the underlying graph is a connected subset of a d-dimensional lattice. Cluster states are especially useful in the context of the one-way quantum computer.
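The standard construction (all qubits in |+⟩, then a controlled-Z on each lattice edge) can be simulated directly for a small linear cluster. The `cz_on_pair` helper and the 3-qubit example below are illustrative:

```python
import numpy as np

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

def cz_on_pair(state, i, j, n):
    """Apply a controlled-Z between qubits i and j of an n-qubit state:
    flip the sign of every amplitude where both qubits are 1."""
    state = state.copy()
    for idx in range(2 ** n):
        if (idx >> (n - 1 - i)) & 1 and (idx >> (n - 1 - j)) & 1:
            state[idx] *= -1
    return state

# 1D cluster state on 3 qubits: start every qubit in |+> and apply CZ
# along each nearest-neighbour edge of the (linear) lattice.
n = 3
state = plus
for _ in range(n - 1):
    state = np.kron(state, plus)
for edge in [(0, 1), (1, 2)]:
    state = cz_on_pair(state, *edge, n)

# All 8 amplitudes have magnitude 1/sqrt(8); only their signs differ.
print(np.round(state * 2 ** (n / 2)).real)
```

Because the CZ gates commute, the construction generalizes unchanged to any graph: one CZ per edge, which is exactly the graph-state picture mentioned above.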
The Ghirardi–Rimini–Weber theory (GRW) is a spontaneous collapse theory in quantum mechanics, proposed in 1986 by Giancarlo Ghirardi, Alberto Rimini, and Tullio Weber.
A decoherence-free subspace (DFS) is a subspace of a quantum system's Hilbert space that is invariant under non-unitary dynamics. Alternatively stated, a DFS is a region of the system's Hilbert space in which the system is decoupled from its environment, so that its evolution is completely unitary. DFSs can also be characterized as a special class of quantum error-correcting codes. In this representation they are passive error-preventing codes, since the information encoded in these subspaces (possibly) requires no active stabilization. By isolating quantum information, these subspaces prevent destructive environmental interactions. As such, they are an important subject in quantum computing, where (coherent) control of quantum systems is the desired goal. Decoherence causes loss of coherence between the quantum states of a system and hence the decay of their interference terms, leading to loss of information from the (open) quantum system to the surrounding environment. Since quantum computers cannot be isolated from their environment and information can be lost, the study of DFSs is important for the real-world implementation of quantum computers.
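A textbook example is collective dephasing, where every qubit acquires the same unknown phase. The sketch below (the two-qubit model and the chosen logical state are illustrative) shows that any state in the span of |01⟩ and |10⟩ is untouched, because the phases on the two qubits cancel:

```python
import numpy as np

def collective_dephasing(state, phi, n=2):
    """Apply the same unknown phase error exp(i*phi*Z) to every qubit."""
    U1 = np.diag(np.exp(1j * phi * np.array([1.0, -1.0])))  # exp(i*phi*Z)
    U = U1
    for _ in range(n - 1):
        U = np.kron(U, U1)
    return U @ state

# The subspace spanned by |01> and |10> is decoherence-free under this
# noise: both basis vectors pick up e^{i*phi} * e^{-i*phi} = 1, so any
# logical state alpha|01> + beta|10> is exactly invariant.
logical = np.array([0, 0.6, 0.8, 0], dtype=complex)   # 0.6|01> + 0.8|10>
out = collective_dephasing(logical, phi=1.234)
print(np.allclose(out, logical))
```

States outside the subspace (e.g. with |00⟩ or |11⟩ components) pick up relative phases e^{±2iφ} and therefore decohere once φ is averaged over; encoding one logical qubit in {|01⟩, |10⟩} sidesteps this entirely.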
In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system. It includes correlations that are due to quantum physical effects but do not necessarily involve quantum entanglement.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. The historical origins of the theory and its name are, however, complicated.
In quantum Darwinism and similar theories, pointer states are quantum states, sometimes of a measuring apparatus, if present, that are less perturbed by decoherence than other states, and are the quantum equivalents of the classical states of the system after decoherence has occurred through interaction with the environment. 'Pointer' refers to the reading of a recording or measuring device, which in old analog versions would often have a gauge or pointer display.
In quantum mechanics, weak measurements are a type of quantum measurement in which the observer obtains very little information about the system on average, but also disturbs the state very little. By Busch's theorem, any measurement necessarily disturbs the system to some degree. In the literature weak measurements are also known as unsharp, fuzzy, dull, noisy, approximate, and gentle measurements. Weak measurements are also often confused with the distinct but related concept of the weak value.
This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.