Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning as an attempt to extend the understanding of quantum mechanics, the theory has developed in several directions, and experimental studies have confirmed some of its key aspects. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
In quantum mechanics, physical systems are described by a mathematical representation called a quantum state. Probabilities for the outcomes of experiments upon a system are calculated by applying the Born rule to the quantum state describing that system. Quantum states are either pure or mixed; pure states are also known as wavefunctions. Assigning a pure state to a quantum system implies certainty about the outcome of some measurement on that system, i.e., that there exists a measurement for which one of the possible outcomes will occur with probability 1. In the absence of outside forces or interactions, a quantum state evolves unitarily over time; consequently, a pure quantum state remains pure. However, if the system is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time, a process called quantum decoherence or environmental decoherence. The quantum coherence is not lost but rather mixed with many more degrees of freedom in the environment, analogous to the way energy appears to be lost by friction in classical mechanics when it has actually been converted into heat in the environment.
Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath), [1] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion). [2] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.
An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum physics might correspond to experienced reality. [3] Decoherence calculations can be done in any interpretation of quantum mechanics, since those calculations are an application of the standard mathematical tools of quantum theory. However, the subject of decoherence has been closely related to the problem of interpretation throughout its history. [4] [5]
Decoherence has been used to understand the possibility of the collapse of the wave function in quantum mechanics. Decoherence does not generate actual wave-function collapse. It only provides a framework for apparent wave-function collapse, as the components of a quantum system entangle with other quantum systems within the same environment. That is, components of the wave function are decoupled from a coherent system and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue.
With respect to the measurement problem, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive. Moreover, observation indicates that this mixture looks like a proper quantum ensemble in a measurement situation, as the measurements lead to the "realization" of precisely one state in the "ensemble".
The philosophical views of Werner Heisenberg and Niels Bohr have often been grouped together as the "Copenhagen interpretation", despite significant divergences between them on important points. [6] [7] In 1955, Heisenberg suggested that the interaction of a system with its surrounding environment would eliminate quantum interference effects. However, Heisenberg did not provide a detailed account of how this might transpire, nor did he make explicit the importance of entanglement in the process. [7] [8]
Nevill Mott's solution to the iconic Mott problem in 1929 is considered in retrospect to be the first quantum decoherence work. [9] It was cited by the first modern theoretical treatment. [10]
Although he did not use the term, the concept of quantum decoherence was first introduced in 1951 by the American physicist David Bohm, [11] [12] who called it the "destruction of interference in the process of measurement". Bohm later used decoherence to handle the measurement process in the de Broglie-Bohm interpretation of quantum theory. [13]
The significance of decoherence was further highlighted in 1970 by the German physicist H. Dieter Zeh, [14] and it has been a subject of active research since the 1980s. [15] Decoherence has been developed into a complete framework, but there is controversy as to whether it solves the measurement problem, as the founders of decoherence theory admit in their seminal papers. [16]
The study of decoherence as a proper subject began in 1970, with H. Dieter Zeh's paper "On the Interpretation of Measurement in Quantum Theory". [4] [14] Zeh regarded the wavefunction as a physical entity, rather than a calculational device or a compendium of statistical information (as is typical for Copenhagen-type interpretations), and he proposed that it should evolve unitarily, in accord with the Schrödinger equation, at all times. Zeh was initially unaware of Hugh Everett III's earlier work, [17] which also proposed a universal wavefunction evolving unitarily; he revised his paper to reference Everett after learning of Everett's "relative-state interpretation" through an article by Bryce DeWitt. [4] (DeWitt was the one who termed Everett's proposal the many-worlds interpretation, by which name it is commonly known.) For Zeh, the question of how to interpret quantum mechanics was of key importance, and an interpretation along the lines of Everett's was the most natural. Partly because of a general lack of interest among physicists in interpretational questions, Zeh's work remained comparatively neglected until the early 1980s, when two papers by Wojciech Zurek [18] [19] invigorated the subject. Unlike Zeh's publications, Zurek's articles were fairly agnostic about interpretation, focusing instead on specific problems of density-matrix dynamics. Zurek's interest in decoherence stemmed from furthering Bohr's analysis of the double-slit experiment in his reply to the Einstein–Podolsky–Rosen paradox, work he had undertaken with Bill Wootters, [20] and he has since argued that decoherence brings a kind of rapprochement between Everettian and Copenhagen-type views. [4] [21]
Decoherence does not claim to provide a mechanism for some actual wave-function collapse; rather it puts forth a reasonable framework for the appearance of wave-function collapse. The quantum nature of the system is simply entangled into the environment so that a total superposition of the wave function still exists, but exists—at least for all practical purposes—beyond the realm of measurement. [22] [23] By definition, the claim that a merged but unmeasurable wave function still exists cannot be proven experimentally. Decoherence is needed to understand why a quantum system begins to obey classical probability rules after interacting with its environment (due to the suppression of the interference terms when applying Born's probability rules to the system).
Criticism of the adequacy of decoherence theory to solve the measurement problem has been expressed by Anthony Leggett. [24] [25]
To examine how decoherence operates, an "intuitive" model is presented below. The model requires some familiarity with quantum theory basics. Analogies are made between visualizable classical phase spaces and Hilbert spaces. A more rigorous derivation in Dirac notation shows how decoherence destroys interference effects and the "quantum nature" of systems. Next, the density matrix approach is presented for perspective.
An N-particle system can be represented in non-relativistic quantum mechanics by a wave function $\psi(x_1, x_2, \ldots, x_N)$, where each $x_i$ is a point in 3-dimensional space. This has analogies with the classical phase space. A classical phase space contains a real-valued function in 6N dimensions (each particle contributes 3 spatial coordinates and 3 momenta), whereas a "quantum" phase space involves a complex-valued function $\psi$ on a 3N-dimensional space. The positions and momenta are represented by operators that do not commute, and $\psi$ lives in the mathematical structure of a Hilbert space. Aside from these differences, however, the rough analogy holds.
Different previously isolated, non-interacting systems occupy different phase spaces. Alternatively we can say that they occupy different lower-dimensional subspaces in the phase space of the joint system. The effective dimensionality of a system's phase space is the number of degrees of freedom present, which in non-relativistic models is 6 times the number of a system's free particles. For a macroscopic system this will be a very large dimensionality. When two systems (the environment being one system) start to interact, though, their associated state vectors are no longer constrained to the subspaces. Instead the combined state vector time-evolves a path through the "larger volume", whose dimensionality is the sum of the dimensions of the two subspaces. The extent to which two vectors interfere with each other is a measure of how "close" they are to each other (formally, their overlap or Hilbert-space inner product) in the phase space. When a system couples to an external environment, the dimensionality of, and hence "volume" available to, the joint state vector increases enormously. Each environmental degree of freedom contributes an extra dimension.
The original system's wave function can be expanded in many different ways as a sum of elements in a quantum superposition. Each expansion corresponds to a projection of the wave vector onto a basis. The basis can be chosen at will. If the expansion is chosen so that the resulting basis elements interact with the environment in an element-specific way, then such elements will, with overwhelming probability, be rapidly separated from each other by their natural unitary time evolution along their own independent paths. After a very short interaction, there is almost no chance of further interference. The process is effectively irreversible. The different elements effectively become "lost" from each other in the expanded phase space created by coupling with the environment. In phase space, this decoupling is monitored through the Wigner quasi-probability distribution. The original elements are said to have decohered. The environment has effectively selected out those expansions or decompositions of the original state vector that decohere (or lose phase coherence) with each other. This is called "environmentally-induced superselection", or einselection. [26] The decohered elements of the system no longer exhibit quantum interference between each other, as in a double-slit experiment. Any elements that decohere from each other via environmental interactions are said to be quantum-entangled with the environment. The converse is not true: not all entangled states are decohered from each other.
Any measuring device or apparatus acts as an environment, since at some stage along the measuring chain, it has to be large enough to be read by humans. It must possess a very large number of hidden degrees of freedom. In effect, the interactions may be considered to be quantum measurements. As a result of an interaction, the wave functions of the system and the measuring device become entangled with each other. Decoherence happens when different portions of the system's wave function become entangled in different ways with the measuring device. For two einselected elements of the entangled system's state to interfere, both the original-system and the measuring-device portions of the two elements must significantly overlap, in the scalar-product sense. If the measuring device has many degrees of freedom, it is very unlikely for this to happen.
As a consequence, the system behaves as a classical statistical ensemble of the different elements rather than as a single coherent quantum superposition of them. From the perspective of each ensemble member's measuring device, the system appears to have irreversibly collapsed onto a state with a precise value for the measured attributes, relative to that element. This provides one explanation of how the Born rule coefficients effectively act as probabilities as per the measurement postulate, thereby constituting a solution to the quantum measurement problem.
Using Dirac notation, let the system initially be in the state
$$|\psi\rangle = \sum_i |i\rangle \langle i|\psi\rangle,$$
where the $|i\rangle$ form an einselected basis (environmentally induced selected eigenbasis [26] ), and let the environment initially be in the state $|\epsilon\rangle$. The vector basis of the combination of the system and the environment consists of the tensor products of the basis vectors of the two subsystems. Thus, before any interaction between the two subsystems, the joint state can be written as
$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle,$$
where $|i\rangle|\epsilon\rangle$ is shorthand for the tensor product $|i\rangle \otimes |\epsilon\rangle$. There are two extremes in the way the system can interact with its environment: either (1) the system loses its distinct identity and merges with the environment (e.g. photons in a cold, dark cavity get converted into molecular excitations within the cavity walls), or (2) the system is not disturbed at all, even though the environment is disturbed (e.g. the idealized non-disturbing measurement). In general, an interaction is a mixture of these two extremes, which we now examine.
If the environment absorbs the system, each element of the total system's basis interacts with the environment such that
$$|i\rangle |\epsilon\rangle \;\longrightarrow\; |\epsilon_i\rangle,$$
and so
$$|\text{before}\rangle \;\longrightarrow\; |\text{after}\rangle = \sum_i |\epsilon_i\rangle \langle i|\psi\rangle.$$
The unitarity of time evolution demands that the total state basis remains orthonormal, i.e. the scalar or inner products of the basis vectors must vanish, since $\langle i|j\rangle = \delta_{ij}$:
$$\langle \epsilon_i|\epsilon_j\rangle = \delta_{ij}.$$
This orthonormality of the environment states is the defining characteristic required for einselection. [26]
In an idealized measurement, the system disturbs the environment, but is itself undisturbed by the environment. In this case, each element of the basis interacts with the environment such that
$$|i\rangle |\epsilon\rangle \;\longrightarrow\; |i\rangle |\epsilon_i\rangle,$$
and so
$$|\text{before}\rangle \;\longrightarrow\; |\text{after}\rangle = \sum_i |i\rangle |\epsilon_i\rangle \langle i|\psi\rangle.$$
In this case, unitarity demands that
$$\langle i|j\rangle \langle \epsilon_i|\epsilon_j\rangle = \delta_{ij},$$
where $\langle \epsilon_i|\epsilon_i\rangle = 1$ was used. Additionally, decoherence requires, by virtue of the large number of hidden degrees of freedom in the environment, that
$$\langle \epsilon_i|\epsilon_j\rangle \approx \delta_{ij}.$$
As before, this is the defining characteristic for decoherence to become einselection. [26] The approximation becomes more exact as the number of environmental degrees of freedom affected increases.
Note that if the system basis were not an einselected basis, then the last condition is trivial, since the disturbed environment is not a function of $i$, and we have the trivial disturbed-environment basis $|\epsilon_i\rangle = |\epsilon'\rangle$ for all $i$. This would correspond to the system basis being degenerate with respect to the environmentally defined measurement observable. For a complex environmental interaction (which would be expected for a typical macroscale interaction) a non-einselected basis would be hard to define.
The utility of decoherence lies in its application to the analysis of probabilities, before and after environmental interaction, and in particular to the vanishing of quantum interference terms after decoherence has occurred. If we ask what is the probability of observing the system making a transition from $|\psi\rangle$ to $|\phi\rangle$ before $|\psi\rangle$ has interacted with its environment, then application of the Born probability rule states that the transition probability is the squared modulus of the scalar product of the two states:
$$\operatorname{prob}_{\text{before}}(\psi \to \phi) = |\langle\psi|\phi\rangle|^2 = \Big|\sum_i \psi_i^* \phi_i\Big|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i,$$
where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, $\psi_i^* = \langle\psi|i\rangle$, etc.
The above expansion of the transition probability has terms that involve $\psi_i^* \psi_j \phi_j^* \phi_i$ with $i \ne j$; these can be thought of as representing interference between the different basis elements or quantum alternatives. This is a purely quantum effect and represents the non-additivity of the probabilities of quantum alternatives.
To calculate the probability of observing the system making a quantum leap from $|\psi\rangle$ to $|\phi\rangle$ after $|\psi\rangle$ has interacted with its environment, application of the Born probability rule states that we must sum over all the relevant possible states $|\epsilon_j\rangle$ of the environment before squaring the modulus:
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) = \sum_j \big|\langle\text{after}|\big(|\phi\rangle|\epsilon_j\rangle\big)\big|^2 = \sum_j \Big|\sum_i \psi_i^* \phi_i \langle \epsilon_i|\epsilon_j\rangle\Big|^2.$$
The internal summation vanishes when we apply the decoherence/einselection condition $\langle \epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, and the formula simplifies to
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) \approx \sum_j \big|\psi_j^* \phi_j\big|^2.$$
If we compare this with the formula we derived before the environment introduced decoherence, we can see that the effect of decoherence has been to move the summation sign from inside of the modulus sign to outside. As a result, all the cross- or quantum interference terms
$$\sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i$$
have vanished from the transition-probability calculation. The decoherence has irreversibly converted quantum behaviour (additive probability amplitudes) to classical behaviour (additive probabilities). [26] [27] [28] However, Ballentine [29] shows that, although decoherence significantly reduces interference, this need not be significant for the transition of quantum systems to classical limits.
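A short numerical check of this suppression can be written directly from the two formulas above. The sketch below uses an illustrative three-level system with arbitrarily chosen amplitudes (the numbers are assumptions, not from the text); it compares the coherent transition probability with the decohered one and isolates the interference terms as their difference.

```python
import numpy as np

# Illustrative amplitudes for a three-level system (arbitrary choice, then normalized).
psi = np.array([0.6, 0.48 + 0.36j, 0.52j])
phi = np.array([0.1j, 0.7, 0.3 - 0.64j])
psi /= np.linalg.norm(psi)
phi /= np.linalg.norm(phi)

# Before decoherence: amplitudes add first, then the modulus is squared.
prob_before = abs(np.vdot(psi, phi)) ** 2

# After decoherence/einselection: moduli are squared first, then summed.
prob_after = np.sum(np.abs(np.conj(psi) * phi) ** 2)

# The difference is exactly the sum of the cross (interference) terms.
print(prob_before, prob_after, prob_before - prob_after)
```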
In terms of density matrices, the loss of interference effects corresponds to the diagonalization of the "environmentally traced-over" density matrix. [26]
The effect of decoherence on density matrices is essentially the decay or rapid vanishing of the off-diagonal elements of the partial trace of the joint system's density matrix, i.e. the trace, with respect to any environmental basis, of the density matrix of the combined system and its environment. The decoherence irreversibly converts the "averaged" or "environmentally traced-over" [26] density matrix from a pure state to a reduced mixture; it is this that gives the appearance of wave-function collapse. Again, this is called "environmentally induced superselection", or einselection. [26] The advantage of taking the partial trace is that this procedure is indifferent to the environmental basis chosen.
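For a single two-state system the effect can be written out explicitly (a minimal sketch with generic amplitudes $a$ and $b$, not taken from the sources above): if the system starts in the pure state $a|0\rangle + b|1\rangle$ and the environment states correlated with $|0\rangle$ and $|1\rangle$ become nearly orthogonal, the environmentally traced-over density matrix evolves as
$$\begin{pmatrix} |a|^2 & a b^* \\ a^* b & |b|^2 \end{pmatrix} \;\longrightarrow\; \begin{pmatrix} |a|^2 & a b^*\,\langle\epsilon_1|\epsilon_0\rangle \\ a^* b\,\langle\epsilon_0|\epsilon_1\rangle & |b|^2 \end{pmatrix} \;\approx\; \begin{pmatrix} |a|^2 & 0 \\ 0 & |b|^2 \end{pmatrix},$$
so the coherences (off-diagonal elements) are suppressed exactly by the overlap of the corresponding environment states.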
Initially, the density matrix of the combined system can be denoted as
$$\rho = |\text{before}\rangle\langle\text{before}| = |\psi\rangle\langle\psi| \otimes |\epsilon\rangle\langle\epsilon|,$$
where $|\epsilon\rangle$ is the state of the environment. Then if the transition happens before any interaction takes place between the system and the environment, the environment subsystem has no part and can be traced out, leaving the reduced density matrix for the system:
$$\rho_{\text{sys}} = \operatorname{Tr}_{\text{env}}(\rho) = |\psi\rangle\langle\psi|\,\langle\epsilon|\epsilon\rangle = |\psi\rangle\langle\psi|.$$
Now the transition probability will be given as
$$\operatorname{prob}_{\text{before}}(\psi \to \phi) = \langle\phi|\rho_{\text{sys}}|\phi\rangle = |\langle\psi|\phi\rangle|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i,$$
where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, $\psi_i^* = \langle\psi|i\rangle$, etc.
Now consider the case in which the transition takes place after the interaction of the system with the environment. The combined density matrix will be
$$\rho = |\text{after}\rangle\langle\text{after}| = \sum_{i,j} \psi_i \psi_j^* \, |i\rangle\langle j| \otimes |\epsilon_i\rangle\langle\epsilon_j|.$$
To get the reduced density matrix of the system, we trace out the environment and employ the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, and we see that the off-diagonal terms vanish (a result obtained by Erich Joos and H. D. Zeh in 1985): [30]
$$\rho_{\text{sys}} = \operatorname{Tr}_{\text{env}}\big(|\text{after}\rangle\langle\text{after}|\big) = \sum_{i,j} \psi_i \psi_j^* \, |i\rangle\langle j|\,\langle\epsilon_j|\epsilon_i\rangle \approx \sum_i |\psi_i|^2 \, |i\rangle\langle i|.$$
Similarly, the final reduced density matrix after the transition will be
$$\rho_{\text{final}} = \sum_j |\phi_j|^2 \, |j\rangle\langle j|.$$
The transition probability will then be given as
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) = \operatorname{Tr}\big(\rho_{\text{sys}}\,\rho_{\text{final}}\big) \approx \sum_i |\psi_i|^2 |\phi_i|^2,$$
which has no contribution from the interference terms
$$\sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i.$$
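The vanishing of the off-diagonal terms under the partial trace can also be checked numerically. The toy model below (one system qubit entangled with one environment qubit; the amplitudes and the small overlap parameter are illustrative assumptions) builds the post-interaction state and traces out the environment.

```python
import numpy as np

def reduced_density_matrix(psi, env_states):
    """Build |after> = sum_i psi_i |i> (x) |eps_i> and trace out the environment."""
    dim_s, dim_e = env_states.shape
    after = sum(psi[i] * np.kron(np.eye(dim_s)[i], env_states[i]) for i in range(dim_s))
    rho = np.outer(after, after.conj()).reshape(dim_s, dim_e, dim_s, dim_e)
    return np.einsum('iaja->ij', rho)   # partial trace over the environment index a

psi = np.array([0.8, 0.6])               # illustrative system amplitudes
theta = 0.1                               # small angle: nearly orthogonal environment states
eps0 = np.array([1.0, 0.0])
eps1 = np.array([np.sin(theta), np.cos(theta)])
print(reduced_density_matrix(psi, np.array([eps0, eps1])))
# The off-diagonal element is ~ psi_0 * psi_1 * <eps_1|eps_0>, i.e. suppressed as the overlap -> 0.
```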
The density-matrix approach has been combined with the Bohmian approach to yield a reduced-trajectory approach, taking into account the system reduced density matrix and the influence of the environment. [31]
Consider a system S and environment (bath) B, which are closed and can be treated quantum-mechanically. Let $\mathcal{H}_S$ and $\mathcal{H}_B$ be the system's and bath's Hilbert spaces respectively. Then the Hamiltonian for the combined system is
$$\hat{H} = \hat{H}_S \otimes \hat{I}_B + \hat{I}_S \otimes \hat{H}_B + \hat{H}_I,$$
where $\hat{H}_S, \hat{H}_B$ are the system and bath Hamiltonians respectively, $\hat{H}_I$ is the interaction Hamiltonian between the system and bath, and $\hat{I}_S, \hat{I}_B$ are the identity operators on the system and bath Hilbert spaces respectively. The time-evolution of the density operator of this closed system is unitary and, as such, is given by
$$\rho_{SB}(t) = \hat{U}(t)\,\rho_{SB}(0)\,\hat{U}^\dagger(t),$$
where the unitary operator is $\hat{U}(t) = e^{-i\hat{H}t/\hbar}$. If the system and bath are not entangled initially, then we can write $\rho_{SB}(0) = \rho_S(0) \otimes \rho_B(0)$. Therefore, the evolution of the system becomes
$$\rho_{SB}(t) = \hat{U}(t)\,\big[\rho_S(0) \otimes \rho_B(0)\big]\,\hat{U}^\dagger(t).$$
The system–bath interaction Hamiltonian can be written in a general form as
$$\hat{H}_I = \sum_i \hat{S}_i \otimes \hat{B}_i,$$
where $\hat{S}_i \otimes \hat{B}_i$ is the operator acting on the combined system–bath Hilbert space, and $\hat{S}_i$, $\hat{B}_i$ are the operators that act on the system and bath respectively. This coupling of the system and bath is the cause of decoherence in the system alone. To see this, a partial trace is performed over the bath to give a description of the system alone:
$$\rho_S(t) = \operatorname{Tr}_B\!\big[\hat{U}(t)\,\big(\rho_S(0) \otimes \rho_B(0)\big)\,\hat{U}^\dagger(t)\big].$$
$\rho_S(t)$ is called the reduced density matrix and gives information about the system only. If the bath is written in terms of its set of orthogonal basis kets, that is, if it has been initially diagonalized, then $\rho_B(0) = \sum_j a_j |j\rangle\langle j|$. Computing the partial trace with respect to this (computational) basis gives
$$\rho_S(t) = \sum_l \hat{A}_l\,\rho_S(0)\,\hat{A}_l^\dagger,$$
where the $\hat{A}_l$ are defined as the Kraus operators and are represented as (the index $l$ combines the indices $k$ and $j$):
$$\hat{A}_l = \sqrt{a_j}\,\langle k|\hat{U}(t)|j\rangle.$$
This is known as the operator-sum representation (OSR). A condition on the Kraus operators can be obtained by using the fact that $\operatorname{Tr}[\rho_S(t)] = 1$; this then gives
$$\sum_l \hat{A}_l^\dagger \hat{A}_l = \hat{I}_S.$$
This restriction determines whether decoherence will occur or not in the OSR. In particular, when there is more than one term present in the sum for $\rho_S(t)$, then the dynamics of the system will be non-unitary, and hence decoherence will take place.
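As a minimal sketch of the operator-sum representation (the channel and its parameter are illustrative assumptions, not from the sources above): a single-qubit phase-flip channel has two Kraus operators, satisfies the completeness condition, and damps the off-diagonal elements of the density matrix.

```python
import numpy as np

p = 0.2                                   # illustrative dephasing probability
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

# Kraus operators for a phase-flip (dephasing) channel.
A = [np.sqrt(1 - p) * I2, np.sqrt(p) * Z]

# Completeness condition of the OSR: sum_l A_l^dagger A_l = identity.
assert np.allclose(sum(a.conj().T @ a for a in A), I2)

rho = np.array([[0.5, 0.5], [0.5, 0.5]])  # pure |+><+| state
rho_out = sum(a @ rho @ a.conj().T for a in A)
print(rho_out)                            # off-diagonals shrink by a factor (1 - 2p)
```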
A more general consideration for the existence of decoherence in a quantum system is given by the master equation, which determines how the density matrix of the system alone evolves in time (see also the Belavkin equation [32] [33] [34] for the evolution under continuous measurement). This uses the Schrödinger picture, where evolution of the state (represented by its density matrix) is considered. The master equation is
$$\frac{\mathrm{d}\rho_S(t)}{\mathrm{d}t} = -\frac{i}{\hbar}\big[\tilde{H}_S,\,\rho_S(t)\big] + L_D\big[\rho_S(t)\big],$$
where $\tilde{H}_S$ is the system Hamiltonian along with a (possible) unitary contribution from the bath, and $L_D[\rho_S(t)]$ is the Lindblad decohering term. [2] The Lindblad decohering term is represented as
$$L_D\big[\rho_S(t)\big] = \frac{1}{2}\sum_{\alpha,\beta=1}^{M} b_{\alpha\beta}\Big(\big[\mathbf{F}_\alpha,\,\rho_S(t)\,\mathbf{F}_\beta^\dagger\big] + \big[\mathbf{F}_\alpha\,\rho_S(t),\,\mathbf{F}_\beta^\dagger\big]\Big).$$
The $\mathbf{F}_\alpha$ are basis operators for the M-dimensional space of bounded operators that act on the system Hilbert space $\mathcal{H}_S$ and are the error generators. [35] The matrix elements $b_{\alpha\beta}$ represent the elements of a positive semi-definite Hermitian matrix; they characterize the decohering processes and, as such, are called the noise parameters. [35] The semigroup approach is particularly nice, because it distinguishes between the unitary and decohering (non-unitary) processes, which is not the case with the OSR. In particular, the non-unitary dynamics are represented by $L_D[\rho_S(t)]$, whereas the unitary dynamics of the state are represented by the usual Heisenberg commutator. Note that when $L_D[\rho_S(t)] = 0$, the dynamical evolution of the system is unitary. The conditions for the evolution of the system density matrix to be described by the master equation are given in [2].
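Because the noise matrix $b_{\alpha\beta}$ is positive semi-definite and Hermitian, it can be diagonalized. In terms of the resulting Lindblad operators $L_k$ and non-negative rates $\gamma_k$ (a standard rewriting, sketched here rather than quoted from the sources above), the decohering term takes the familiar diagonal form
$$L_D[\rho_S] = \sum_k \gamma_k \Big( L_k\,\rho_S\,L_k^\dagger - \tfrac{1}{2}\big\{L_k^\dagger L_k,\,\rho_S\big\} \Big),$$
and the evolution is unitary precisely when all the rates $\gamma_k$ vanish.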
Decoherence can be modelled as a non-unitary process by which a system couples with its environment (although the combined system plus environment evolves in a unitary fashion). [2] Thus the dynamics of the system alone, treated in isolation, are non-unitary and, as such, are represented by irreversible transformations acting on the system's Hilbert space . Since the system's dynamics are represented by irreversible representations, then any information present in the quantum system can be lost to the environment or heat bath. Alternatively, the decay of quantum information caused by the coupling of the system to the environment is referred to as decoherence. [1] Thus decoherence is the process by which information of a quantum system is altered by the system's interaction with its environment (which form a closed system), hence creating an entanglement between the system and heat bath (environment). As such, since the system is entangled with its environment in some unknown way, a description of the system by itself cannot be made without also referring to the environment (i.e. without also describing the state of the environment).
Consider a system of N qubits that is coupled to a bath symmetrically. Suppose this system of N qubits undergoes a rotation around the eigenstates $\{|0\rangle, |1\rangle\}$ of $\hat{J}_z$. Then under such a rotation, a random phase $\phi$ will be created between the eigenstates $|0\rangle$, $|1\rangle$ of $\hat{J}_z$. Thus these basis qubits $|0\rangle$ and $|1\rangle$ will transform in the following way:
$$|0\rangle \to |0\rangle, \qquad |1\rangle \to e^{i\phi}\,|1\rangle.$$
This transformation is performed by the rotation operator
$$R_z(\phi) = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\phi} \end{pmatrix}.$$
Since any qubit in this space can be expressed in terms of the basis qubits, then all such qubits will be transformed under this rotation. Consider the $j$th qubit in a pure state $|\psi_j\rangle = a|0\rangle + b|1\rangle$, where $|a|^2 + |b|^2 = 1$. Before application of the rotation this state is:
$$\rho_j(0) = |\psi_j\rangle\langle\psi_j| = \begin{pmatrix} |a|^2 & a b^* \\ a^* b & |b|^2 \end{pmatrix}.$$
This state will decohere, since it is not ‘encoded’ with (dependent upon) the dephasing factor $e^{i\phi}$. This can be seen by examining the density matrix averaged over the random phase $\phi$:
$$\rho_j = \int \mathrm{d}\phi\; p(\phi)\; R_z(\phi)\,|\psi_j\rangle\langle\psi_j|\,R_z^\dagger(\phi),$$
where $p(\phi)$ is a probability measure of the random phase $\phi$. Although not entirely necessary, let us assume for simplicity that this is given by the Gaussian distribution, i.e. $p(\phi) = \tfrac{1}{\sqrt{2\pi}\,\sigma}\,e^{-\phi^2/(2\sigma^2)}$, where $\sigma$ represents the spread of the random phase. Then the density matrix computed as above is
$$\rho_j = \begin{pmatrix} |a|^2 & a b^*\,e^{-\sigma^2/2} \\ a^* b\,e^{-\sigma^2/2} & |b|^2 \end{pmatrix}.$$
Observe that the off-diagonal elements (the coherence terms) decay as the spread of the random phase, $\sigma$, increases over time (which is a realistic expectation). Thus the density matrices for each qubit of the system become indistinguishable over time. This means that no measurement can distinguish between the qubits, thus creating decoherence between the various qubit states. In particular, this dephasing process causes the qubits to collapse to one of the pure states in $\{|0\rangle, |1\rangle\}$. This is why this type of decoherence process is called collective dephasing, because the mutual phases between all qubits of the N-qubit system are destroyed.
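The phase-averaging mechanism above can be reproduced numerically for a single qubit. The sketch below (initial amplitudes and phase spread are illustrative assumptions) averages the rotated state over many Gaussian random phases and recovers the suppressed off-diagonal element.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)     # illustrative qubit amplitudes
sigma = 1.5                                # spread (standard deviation) of the random phase

# Average |psi(phi)><psi(phi)| over many Gaussian random phases phi.
rho = np.zeros((2, 2), dtype=complex)
samples = 100_000
for phi in rng.normal(0.0, sigma, samples):
    rotated = np.array([a, b * np.exp(1j * phi)])
    rho += np.outer(rotated, rotated.conj()) / samples

print(rho)
# The off-diagonal elements approach a*b*exp(-sigma**2/2); they vanish as the spread grows.
print(a * b * np.exp(-sigma**2 / 2))
```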
Depolarizing is a non-unitary transformation on a quantum system which maps pure states to mixed states. This is a non-unitary process because any transformation that reverses this process will map states out of their respective Hilbert space thus not preserving positivity (i.e. the original probabilities are mapped to negative probabilities, which is not allowed). The 2-dimensional case of such a transformation would consist of mapping pure states on the surface of the Bloch sphere to mixed states within the Bloch sphere. This would contract the Bloch sphere by some finite amount and the reverse process would expand the Bloch sphere, which cannot happen.
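For a single qubit, the depolarizing map can be written explicitly (a standard sketch; the depolarizing probability $p$ is an assumed parameter):
$$\rho \;\mapsto\; (1-p)\,\rho + p\,\frac{I}{2}, \qquad \vec{r} \;\mapsto\; (1-p)\,\vec{r},$$
where $\vec{r}$ is the Bloch vector of $\rho = \tfrac{1}{2}(I + \vec{r}\cdot\vec{\sigma})$. The Bloch sphere contracts uniformly by the factor $1-p$; the inverse map, a rescaling by $1/(1-p)$, would push valid states outside the sphere, which is why it is not a physical (positive) operation.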
Dissipation is a decohering process by which the populations of quantum states are changed due to entanglement with a bath. An example of this would be a quantum system that can exchange its energy with a bath through the interaction Hamiltonian. If the system is not in its ground state and the bath is at a temperature lower than that of the system, then the system will give off energy to the bath, and thus higher-energy eigenstates of the system Hamiltonian will decohere to the ground state after cooling. Since distinct initial states then end up in the same final state, they are no longer distinguishable, and thus this process is irreversible (non-unitary).
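A standard example of such a dissipative process is the single-qubit amplitude-damping channel (a sketch; the decay probability $\gamma$ is an assumed parameter), whose Kraus operators are
$$K_0 = \begin{pmatrix} 1 & 0 \\ 0 & \sqrt{1-\gamma} \end{pmatrix}, \qquad K_1 = \begin{pmatrix} 0 & \sqrt{\gamma} \\ 0 & 0 \end{pmatrix},$$
which transfer population from the excited state $|1\rangle$ to the ground state $|0\rangle$ with probability $\gamma$, changing the populations rather than only the phases.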
Decoherence represents an extremely fast process for macroscopic objects, since these are interacting with many microscopic objects, with an enormous number of degrees of freedom in their natural environment. The process is needed if we are to understand why we tend not to observe quantum behavior in everyday macroscopic objects and why we do see classical fields emerge from the properties of the interaction between matter and radiation for large amounts of matter. The time taken for off-diagonal components of the density matrix to effectively vanish is called the decoherence time. It is typically extremely short for everyday, macroscale processes. [26] [27] [28] A modern basis-independent definition of the decoherence time relies on the short-time behavior of the fidelity between the initial and the time-dependent state [36] or, equivalently, the decay of the purity. [37]
Assume for the moment that the system in question consists of a subsystem A being studied and the "environment" $\epsilon$, and the total Hilbert space is the tensor product of a Hilbert space $\mathcal{H}_A$ describing A and a Hilbert space $\mathcal{H}_\epsilon$ describing $\epsilon$, that is,
$$\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_\epsilon.$$
This is a reasonably good approximation in the case where A and $\epsilon$ are relatively independent (e.g. there is nothing like parts of A mixing with parts of $\epsilon$ or conversely). The point is, the interaction with the environment is for all practical purposes unavoidable (e.g. even a single excited atom in a vacuum would emit a photon, which would then go off). Let's say this interaction is described by a unitary transformation U acting upon $\mathcal{H}$. Assume that the initial state of the environment is $|\text{in}\rangle$, and the initial state of A is the superposition state
$$c_1 |\psi_1\rangle + c_2 |\psi_2\rangle,$$
where $|\psi_1\rangle$ and $|\psi_2\rangle$ are orthogonal, and there is no entanglement initially. Also, choose an orthonormal basis $\{|e_i\rangle\}$ for $\mathcal{H}_A$. (This could be a "continuously indexed basis" or a mixture of continuous and discrete indexes, in which case we would have to use a rigged Hilbert space and be more careful about what we mean by orthonormal, but that's an inessential detail for expository purposes.) Then, we can expand
$$U\big(|\psi_1\rangle \otimes |\text{in}\rangle\big)$$
and
$$U\big(|\psi_2\rangle \otimes |\text{in}\rangle\big)$$
uniquely as
$$\sum_i |e_i\rangle \otimes |f_{1i}\rangle$$
and
$$\sum_i |e_i\rangle \otimes |f_{2i}\rangle,$$
respectively. One thing to realize is that the environment contains a huge number of degrees of freedom, a good number of them interacting with each other all the time. This makes the following assumption reasonable in a handwaving way, which can be shown to be true in some simple toy models. Assume that there exists a basis for $\mathcal{H}_A$ such that $|f_{1i}\rangle$ and $|f_{1j}\rangle$ are all approximately orthogonal to a good degree if $i \ne j$, and the same thing for $|f_{2i}\rangle$ and $|f_{2j}\rangle$, and also for $|f_{1i}\rangle$ and $|f_{2j}\rangle$ for any $i$ and $j$ (the decoherence property).
This often turns out to be true (as a reasonable conjecture) in the position basis because how A interacts with the environment would often depend critically upon the position of the objects in A. Then, if we take the partial trace over the environment, we would find the density state is approximately described by
$$\rho_A \approx \sum_i \big(|c_1|^2\,\langle f_{1i}|f_{1i}\rangle + |c_2|^2\,\langle f_{2i}|f_{2i}\rangle\big)\,|e_i\rangle\langle e_i|,$$
that is, we have a diagonal mixed state, there is no constructive or destructive interference, and the "probabilities" add up classically. The time it takes for U(t) (the unitary operator as a function of time) to display the decoherence property is called the decoherence time.
The decoherence rate depends on a number of factors, including temperature or uncertainty in position, and many experiments have tried to measure it depending on the external environment. [38]
The process of a quantum superposition gradually obliterated by decoherence was quantitatively measured for the first time by Serge Haroche and his co-workers at the École Normale Supérieure in Paris in 1996. [39] Their approach involved sending individual rubidium atoms, each in a superposition of two states, through a microwave-filled cavity. The two quantum states both cause shifts in the phase of the microwave field, but by different amounts, so that the field itself is also put into a superposition of two states. Due to photon scattering on cavity-mirror imperfection, the cavity field loses phase coherence to the environment. Haroche and his colleagues measured the resulting decoherence via correlations between the states of pairs of atoms sent through the cavity with various time delays between the atoms.
In July 2011, researchers from University of British Columbia and University of California, Santa Barbara showed that applying high magnetic fields to single molecule magnets suppressed two of three known sources of decoherence. [40] [41] [42] They were able to measure the dependence of decoherence on temperature and magnetic field strength.
Decoherence causes the system to lose its quantumness, which invalidates the superposition principle and turns 'quantum' to 'classical'. [43] It is a major challenge in quantum computing.
A real quantum system inevitably interacts with its surrounding environment, and this interaction shows up as noise in the physical process. Quantum states are extremely sensitive to environmental noise, such as electromagnetic fields, temperature fluctuations, and other external perturbations, as well as to measurement, all of which lead to decoherence.
Decoherence is a challenge for the practical realization of quantum computers, since such machines are expected to rely heavily on the undisturbed evolution of quantum coherences. They require that the coherence of states be preserved and that decoherence be managed, in order to actually perform quantum computation. Because of decoherence, the quantum computation must be completed before the qubit states decay. [44]
The physical quantity coherence time is defined as the time over which a quantum state retains its superposition, i.e., its phase coherence.
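For a single qubit this is commonly summarized by a phenomenological exponential decay of the off-diagonal density-matrix element (a conventional model, not a result from the sources above):
$$\rho_{01}(t) = \rho_{01}(0)\,e^{-t/T_2},$$
where $T_2$ denotes the coherence (dephasing) time; a computation must finish in a time short compared with $T_2$.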
The purpose of combating decoherence is to extend the coherence time of quantum systems, which improves the stability of quantum information processing. [45]
Researchers have developed many methods and tools to mitigate or eliminate the negative influence of decoherence. Several typical approaches are listed below.
The most basic and direct way to reduce decoherence is to prevent the quantum system from interacting with the environment by any type of isolation.
One of the most powerful tools for combating quantum decoherence is Quantum Error Correction (QEC). QEC schemes encode quantum information redundantly across multiple physical qubits, allowing for the detection and correction of errors without directly measuring the quantum state. These QEC protocols rely on the assumption that errors affect only a small fraction of qubits at any given time, enabling the detection and correction of errors through redundant encoding. Here are some representative QEC protocols.
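The simplest illustration of the redundant-encoding idea is the three-qubit bit-flip repetition code. The sketch below is a minimal state-vector simulation of it (the amplitudes, the error location, and the helper names are illustrative assumptions, not a prescription from the sources above): the logical state is encoded across three physical qubits, a bit-flip error is detected from the stabilizer syndrome without measuring the logical amplitudes, and the error is undone.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, site, n=3):
    """Embed a single-qubit operator at the given site of an n-qubit register."""
    mats = [single if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Encode a|0> + b|1> as a|000> + b|111> (illustrative amplitudes).
a, b = 0.6, 0.8
logical = np.zeros(8)
logical[0b000] = a
logical[0b111] = b

# A bit-flip error on qubit 1 (assumed error location).
corrupted = op(X, 1) @ logical

# Syndrome: expectation values of the stabilizers Z0 Z1 and Z1 Z2.
s1 = corrupted @ (op(Z, 0) @ op(Z, 1)) @ corrupted
s2 = corrupted @ (op(Z, 1) @ op(Z, 2)) @ corrupted

# (-1, -1) -> flip qubit 1; (-1, +1) -> qubit 0; (+1, -1) -> qubit 2; (+1, +1) -> no error.
flip = {(-1, -1): 1, (-1, 1): 0, (1, -1): 2}.get((int(round(s1)), int(round(s2))))
recovered = op(X, flip) @ corrupted if flip is not None else corrupted

print(np.allclose(recovered, logical))  # True: the logical state is restored
```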
However, QEC comes at a significant cost: it requires a large number of physical qubits to encode a single logical qubit, and fault-tolerant error correction methods introduce additional computational overhead.
Dynamical Decoupling (DD) is another typical quantum-control technique used against decoherence, especially for systems that are coupled to noisy environments. DD involves applying an external sequence of control pulses to the quantum system at strategically timed intervals to average out environmental interactions. This technique effectively averages out the irreversible component of the system–environment interaction by means of external, controllable interactions [52] . Dynamical decoupling has been experimentally demonstrated in various systems, including trapped ions [53] and superconducting qubits [54] . The simplest representative sequence is the Hahn (spin) echo, sketched below.
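The sketch below is a minimal numerical illustration of the spin-echo idea (the frequency offsets, times, and state choices are illustrative assumptions): a π pulse applied halfway through the evolution refocuses a static, unknown frequency offset, so the ensemble-averaged coherence survives, whereas without the pulse it is washed out.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)  # pi pulse about the x axis

def free_evolution(delta, t):
    """Phase accumulation exp(-i * delta * sigma_z * t / 2) from a static offset delta."""
    return np.diag([np.exp(-1j * delta * t / 2), np.exp(1j * delta * t / 2)])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # initial superposition state
tau = 1.0
coherence_free, coherence_echo = [], []
for delta in np.random.default_rng(1).normal(0.0, 3.0, 5000):  # random static offsets
    no_echo = free_evolution(delta, 2 * tau) @ plus
    echo = free_evolution(delta, tau) @ X @ free_evolution(delta, tau) @ plus
    coherence_free.append(np.vdot(plus, no_echo))
    coherence_echo.append(np.vdot(plus, echo))

print(np.abs(np.mean(coherence_free)))  # small: ensemble dephasing washes out the coherence
print(np.abs(np.mean(coherence_echo)))  # ~1: the pi pulse refocuses the static offsets
```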
Quantum teleportation is a technique for transferring quantum information from a sender at one location to a receiver some distance away. While teleportation is commonly portrayed in science fiction as a means to transfer physical objects from one location to the next, quantum teleportation only transfers quantum information. The sender does not have to know the particular quantum state being transferred. Moreover, the location of the recipient can be unknown, but to complete the quantum teleportation, classical information needs to be sent from sender to receiver. Because classical information needs to be sent, quantum teleportation cannot occur faster than the speed of light.
In quantum mechanics, a density matrix is a matrix that describes an ensemble of physical systems as quantum states. It allows for the calculation of the probabilities of the outcomes of any measurements performed upon the systems of the ensemble using the Born rule. It is a generalization of the more usual state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed ensembles. Mixed ensembles arise in quantum mechanics in two different situations:
In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden-variable theories. Experimental verification of the inequality being violated is seen as confirmation that nature cannot be described by such theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in a much-cited paper published in 1969. They derived the CHSH inequality, which, as with John Stewart Bell's original inequality, is a constraint—on the statistical occurrence of "coincidences" in a Bell test—which is necessarily true if an underlying local hidden-variable theory exists. In practice, the inequality is routinely violated by modern experiments in quantum mechanics.
An operator is a function over a space of physical states onto another space of states. The simplest example of the utility of operators is the study of symmetry. Because of this, they are useful tools in classical mechanics. Operators are even more important in quantum mechanics, where they form an intrinsic part of the formulation of the theory.
In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
In quantum mechanics and computing, the Bloch sphere is a geometrical representation of the pure state space of a two-level quantum mechanical system (qubit), named after the physicist Felix Bloch.
Quantum error correction (QEC) is a set of techniques used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorised as essential to achieve fault tolerant quantum computing that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements. Effective quantum error correction would allow quantum computers with low qubit fidelity to execute algorithms of higher complexity or greater circuit depth.
In quantum information theory, a quantum channel is a communication channel which can transmit quantum information, as well as classical information. An example of quantum information is the general dynamics of a qubit. An example of classical information is a text document transmitted over the Internet.
In quantum mechanics, einselection, short for "environment-induced superselection", is a name coined by Wojciech H. Zurek for a process which is claimed to explain the appearance of wavefunction collapse and the emergence of classical descriptions of reality from quantum descriptions. In this approach, classicality is described as an emergent property induced in open quantum systems by their environments. The vast majority of states in the Hilbert space of an open quantum system become highly unstable due to the entangling interaction with the environment, which in effect monitors selected observables of the system. After a decoherence time, which for macroscopic objects is typically many orders of magnitude shorter than any other dynamical timescale, a generic quantum state decays into an uncertain state which can be expressed as a mixture of simple pointer states. In this way the environment induces effective superselection rules. Thus, einselection precludes stable existence of pure superpositions of pointer states. These 'pointer states' are stable despite environmental interaction. The einselected states lack coherence, and therefore do not exhibit the quantum behaviours of entanglement and superposition.
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is $S = -\operatorname{Tr}(\rho \ln \rho)$.
LOCC, or local operations and classical communication, is a method in quantum information theory where a local (product) operation is performed on part of the system, and where the result of that operation is "communicated" classically to another part where usually another local operation is performed conditioned on the information received.
In functional analysis and quantum information science, a positive operator-valued measure (POVM) is a measure whose values are positive semi-definite operators on a Hilbert space. POVMs are a generalization of projection-valued measures (PVM) and, correspondingly, quantum measurements described by POVMs are a generalization of quantum measurement described by PVMs.
The time-evolving block decimation (TEBD) algorithm is a numerical scheme used to simulate one-dimensional quantum many-body systems, characterized by at most nearest-neighbour interactions. It is dubbed Time-evolving Block Decimation because it dynamically identifies the relevant low-dimensional Hilbert subspaces of an exponentially larger original Hilbert space. The algorithm, based on the Matrix Product States formalism, is highly efficient when the amount of entanglement in the system is limited, a requirement fulfilled by a large class of quantum many-body systems in one dimension.
In quantum mechanics, the expectation value is the probabilistic expected value of the result (measurement) of an experiment. It can be thought of as an average of all the possible outcomes of a measurement as weighted by their likelihood, and as such it is not the most probable value of a measurement; indeed the expectation value may have zero probability of occurring. It is a fundamental concept in all areas of quantum physics.
In 1927, a year after the publication of the Schrödinger equation, Hartree formulated what are now known as the Hartree equations for atoms, using the concept of self-consistency that Lindsay had introduced in his study of many electron systems in the context of Bohr theory. Hartree assumed that the nucleus together with the electrons formed a spherically symmetric field. The charge distribution of each electron was the solution of the Schrödinger equation for an electron in a potential , derived from the field. Self-consistency required that the final field, computed from the solutions, was self-consistent with the initial field, and he thus called his method the self-consistent field method.
A decoherence-free subspace (DFS) is a subspace of a quantum system's Hilbert space that is invariant to non-unitary dynamics. Alternatively stated, they are a small section of the system Hilbert space where the system is decoupled from the environment and thus its evolution is completely unitary. DFSs can also be characterized as a special class of quantum error correcting codes. In this representation they are passive error-preventing codes since these subspaces are encoded with information that (possibly) won't require any active stabilization methods. These subspaces prevent destructive environmental interactions by isolating quantum information. As such, they are an important subject in quantum computing, where (coherent) control of quantum systems is the desired goal. Decoherence creates problems in this regard by causing loss of coherence between the quantum states of a system and therefore the decay of their interference terms, thus leading to loss of information from the (open) quantum system to the surrounding environment. Since quantum computers cannot be isolated from their environment and information can be lost, the study of DFSs is important for the implementation of quantum computers into the real world.
Entanglement distillation is the transformation of N copies of an arbitrary entangled state into some number of approximately pure Bell pairs, using only local operations and classical communication.
In quantum computation, the Hadamard test is a method used to create a random variable whose expected value is the expected real part $\operatorname{Re}\langle\psi|U|\psi\rangle$, where $|\psi\rangle$ is a quantum state and $U$ is a unitary gate acting on the space of $|\psi\rangle$. The Hadamard test produces a random variable whose image is in $\{\pm 1\}$ and whose expected value is exactly $\operatorname{Re}\langle\psi|U|\psi\rangle$. It is possible to modify the circuit to produce a random variable whose expected value is $\operatorname{Im}\langle\psi|U|\psi\rangle$ by applying an $S^\dagger$ gate after the first Hadamard gate.
The swap test is a procedure in quantum computation that is used to check how much two quantum states differ, appearing first in the work of Barenco et al. and later rediscovered by Harry Buhrman, Richard Cleve, John Watrous, and Ronald de Wolf. It appears commonly in quantum machine learning, and is a circuit used for proofs-of-concept in implementations of quantum computers.
In quantum information theory and quantum optics, the Schrödinger–HJW theorem is a result about the realization of a mixed state of a quantum system as an ensemble of pure quantum states and the relation between the corresponding purifications of the density operators. The theorem is named after physicists and mathematicians Erwin Schrödinger, Lane P. Hughston, Richard Jozsa and William Wootters. The result was also found independently by Nicolas Gisin, and by Nicolas Hadjisavvas building upon work by Ed Jaynes, while a significant part of it was likewise independently discovered by N. David Mermin. Thanks to its complicated history, it is also known by various other names such as the GHJW theorem, the HJW theorem, and the purification theorem.
Joos and Zeh (1985) state: "Of course no unitary treatment of the time dependence can explain why only one of these dynamically independent components is experienced." And in a recent review on decoherence, Joos (1999) states: "Does decoherence solve the measurement problem? Clearly not. What decoherence tells us is that certain objects appear classical when observed. But what is an observation? At some stage we still have to apply the usual probability rules of quantum theory."
"Our theory also predicted that we could suppress the decoherence, and push the decoherence rate in the experiment to levels far below the threshold necessary for quantum information processing, by applying high magnetic fields. (...) Magnetic molecules now suddenly appear to have serious potential as candidates for quantum computing hardware", said Susumu Takahashi, assistant professor of chemistry and physics at the University of Southern California. "This opens up a whole new area of experimental investigation with sizeable potential in applications, as well as for fundamental work."