Noiseless subsystems

The framework of noiseless subsystems has been developed as a tool to preserve fragile quantum information against decoherence.[1][2][3][4] In brief, when a quantum register (a Hilbert space) is subjected to decoherence due to an interaction with an external and uncontrollable environment, information stored in the register is, in general, degraded. It has been shown that when the source of decoherence exhibits some symmetries, certain subsystems of the quantum register are unaffected by the interactions with the environment and are thus noiseless. These noiseless subsystems are therefore very natural and robust tools that can be used for processing quantum information.
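As an illustration (a sketch, not drawn from the references above), consider collective dephasing, in which every qubit in a register acquires the same random phase. The symmetry of this noise leaves the span of |01⟩ and |10⟩ untouched, so a logical qubit encoded there survives perfectly. The following minimal numpy sketch, with an assumed noise model and illustrative sample count, demonstrates this numerically:

```python
import numpy as np

# Collective dephasing: both qubits acquire the same random phase phi,
# U(phi) = exp(-i * phi * (Z1 + Z2) / 2).  The states |01> and |10> share
# the eigenvalue 0 of Z1 + Z2, so span{|01>, |10>} is a decoherence-free
# subspace: a logical qubit stored there is untouched by this noise.

Z = np.diag([1.0, -1.0])
I2 = np.eye(2)
Ztot = np.kron(Z, I2) + np.kron(I2, Z)   # Z1 + Z2 on two qubits

def collective_dephasing(rho, samples=2000, rng=np.random.default_rng(0)):
    """Average rho over random collective phases (illustrative noise model)."""
    out = np.zeros_like(rho, dtype=complex)
    for phi in rng.uniform(0, 2 * np.pi, samples):
        U = np.diag(np.exp(-1j * phi * np.diag(Ztot) / 2))
        out += U @ rho @ U.conj().T
    return out / samples

def ket(bits):
    v = np.zeros(4, dtype=complex)
    v[int(bits, 2)] = 1.0
    return v

# Logical |+> encoded inside the noiseless subspace: (|01> + |10>)/sqrt(2)
dfs = (ket("01") + ket("10")) / np.sqrt(2)
# The same superposition encoded outside it: (|00> + |11>)/sqrt(2)
bad = (ket("00") + ket("11")) / np.sqrt(2)

for name, v in [("noiseless encoding", dfs), ("unprotected encoding", bad)]:
    rho = np.outer(v, v.conj())
    fid = np.real(v.conj() @ collective_dephasing(rho) @ v)
    print(f"{name}: fidelity after noise = {fid:.3f}")
# The noiseless encoding keeps fidelity ~1.000; the other drops to ~0.5.
```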

Related Research Articles

Quantum Zeno effect

The quantum Zeno effect is a feature of quantum-mechanical systems allowing a particle's time evolution to be slowed down by measuring it frequently enough with respect to some chosen measurement setting.
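A minimal sketch of the textbook Zeno calculation: a qubit rotating from |0⟩ toward |1⟩ is projectively measured N times during a total rotation angle θ, and its survival probability cos²(θ/N)^N approaches 1 as N grows. The angle and measurement counts below are illustrative:

```python
import numpy as np

# Quantum Zeno sketch: a qubit rotating from |0> toward |1> is projected
# onto {|0>, |1>} N times during a total rotation angle theta.
# Survival in |0> is cos(theta/N)**(2*N), which tends to 1 as N grows.

theta = np.pi / 2  # with no intermediate measurements the qubit ends in |1>
for N in [1, 2, 10, 100, 1000]:
    p_survive = np.cos(theta / N) ** (2 * N)
    print(f"N = {N:5d} measurements: P(still |0>) = {p_survive:.4f}")
```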

The conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information theory. For a bipartite state ρ^AB, the conditional entropy is written S(A|B)_ρ, or H(A|B)_ρ, depending on the notation being used for the von Neumann entropy, and is defined as S(A|B) = S(AB) − S(B). The quantum conditional entropy was defined in terms of a conditional density operator by Nicolas Cerf and Chris Adami, who showed that quantum conditional entropies can be negative, something that is forbidden in classical physics. The negativity of quantum conditional entropy is a sufficient criterion for quantum non-separability.
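For example, for the Bell state (|00⟩ + |11⟩)/√2 the joint state is pure, so S(AB) = 0, while the reduced state of B is maximally mixed, so S(B) = 1 bit, giving S(A|B) = −1. A small numpy check (a sketch, not part of the original article):

```python
import numpy as np

# Conditional entropy S(A|B) = S(AB) - S(B) for a Bell state: the joint
# state is pure (S(AB) = 0) while B alone is maximally mixed (S(B) = 1),
# so S(A|B) = -1, which is impossible classically.

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)  # (|00> + |11>)/sqrt2
rho_ab = np.outer(bell, bell)
# Partial trace over A: reshape to (2, 2, 2, 2) and trace out the A indices.
rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)

s_ab, s_b = von_neumann_entropy(rho_ab), von_neumann_entropy(rho_b)
print("S(AB) =", s_ab)          # 0.0
print("S(B)  =", s_b)           # 1.0
print("S(A|B) =", s_ab - s_b)   # -1.0
```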

In quantum mechanics, einselection, short for "environment-induced superselection", is a name coined by Wojciech H. Zurek for a process claimed to explain the appearance of wavefunction collapse and the emergence of classical descriptions of reality from quantum descriptions. In this approach, classicality is an emergent property induced in open quantum systems by their environments. Through entangling interactions with the environment, which in effect monitors selected observables of the system, the vast majority of states in the Hilbert space of an open quantum system become highly unstable. After a decoherence time, which for macroscopic objects is typically many orders of magnitude shorter than any other dynamical timescale, a generic quantum state decays into a mixture of simple pointer states. In this way the environment induces effective superselection rules: einselection precludes the stable existence of pure superpositions of pointer states, while the pointer states themselves remain stable despite environmental interaction. The einselected states lack coherence, and therefore do not exhibit the quantum behaviours of entanglement and superposition.
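A toy dephasing model (an illustrative sketch, with an assumed decay rate Γ) shows the mechanism: coherences in the pointer basis decay while pointer-state populations survive, turning a pure superposition into a classical mixture:

```python
import numpy as np

# Einselection toy model: the environment monitors the sigma_z "pointer"
# observable, so off-diagonal coherences in the pointer basis decay at an
# assumed rate Gamma while the diagonal pointer-state populations survive.

Gamma, times = 1.0, [0.0, 0.5, 2.0, 10.0]
plus = np.array([1, 1]) / np.sqrt(2)
rho0 = np.outer(plus, plus)                # pure superposition |+><+|

for t in times:
    rho = rho0.astype(complex)
    rho[0, 1] *= np.exp(-Gamma * t)        # coherence decays
    rho[1, 0] *= np.exp(-Gamma * t)
    purity = np.real(np.trace(rho @ rho))
    print(f"t = {t:4.1f}: coherence = {abs(rho[0, 1]):.3f}, purity = {purity:.3f}")
# The state decays toward the classical mixture (|0><0| + |1><1|)/2.
```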

Quantum Darwinism is a theory meant to explain the emergence of the classical world from the quantum world as the result of a process of Darwinian natural selection induced by the environment interacting with the quantum system, in which the many possible quantum states are selected against in favor of a stable pointer state. It was proposed in 2003 by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz and Blume-Kohout. The development of the theory is due to the integration of a number of Zurek's research topics pursued over the course of 25 years, including pointer states, einselection and decoherence.

Artur Ekert

Artur Konrad Ekert is a British-Polish professor of quantum physics at the Mathematical Institute, University of Oxford, professorial fellow in quantum physics and cryptography at Merton College, Oxford, Lee Kong Chian Centennial Professor at the National University of Singapore and the founding director of the Centre for Quantum Technologies (CQT). His research interests extend over most aspects of information processing in quantum-mechanical systems, with a focus on quantum communication and quantum computation. He is best known as one of the pioneers of quantum cryptography.

An atom interferometer is an interferometer which uses the wave character of atoms. Similar to optical interferometers, atom interferometers measure the difference in phase between atomic matter waves along different paths. Today, atomic interference is typically controlled with laser beams. Atom interferometers have many uses in fundamental physics, including measurements of the gravitational constant, the fine-structure constant, and the universality of free fall, and they have been proposed as a method to detect gravitational waves. They also have applied uses as accelerometers, rotation sensors, and gravity gradiometers.
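As a hedged illustration, the standard textbook expression for a light-pulse atom gravimeter gives an interference phase Δφ = k_eff·g·T²; the transition wavelength, gravity value, and pulse separation below are assumed for illustration only:

```python
import numpy as np

# Light-pulse atom gravimeter (textbook model): a Mach-Zehnder pulse
# sequence gives an interference phase Delta_phi = k_eff * g * T**2,
# and an exit-port population oscillating as (1 - cos(Delta_phi)) / 2.

k_eff = 2 * 2 * np.pi / 780e-9   # two-photon effective wavevector (assumed Rb D2 line)
g, T = 9.81, 0.1                 # assumed local gravity (m/s^2), pulse separation (s)

delta_phi = k_eff * g * T ** 2
p_excited = (1 - np.cos(delta_phi)) / 2
print(f"interferometer phase = {delta_phi:.1f} rad, port population = {p_excited:.3f}")
```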

Topological order

In physics, topological order is a kind of order in the zero-temperature phase of matter. Macroscopically, topological order is defined and described by robust ground state degeneracy and quantized non-Abelian geometric phases of degenerate ground states. Microscopically, topological orders correspond to patterns of long-range quantum entanglement. States with different topological orders cannot change into each other without a phase transition.

Quantum metrology is the study of making high-resolution and highly sensitive measurements of physical parameters using quantum theory to describe the physical systems, particularly exploiting quantum entanglement and quantum squeezing. This field promises to develop measurement techniques that give better precision than the same measurement performed in a classical framework. Together with quantum hypothesis testing, it represents an important theoretical model at the basis of quantum sensing.
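A sketch of the scaling argument behind this promise (idealised, ignoring noise): N independent probes give a phase uncertainty at the standard quantum limit 1/√N, while maximally entangled probes can in principle reach the Heisenberg limit 1/N:

```python
import numpy as np

# Quantum metrology scaling sketch: phase uncertainty with N independent
# probes obeys the standard quantum limit 1/sqrt(N), while maximally
# entangled (e.g. GHZ/NOON) probes can in principle reach the Heisenberg
# limit 1/N.  Real devices sit between these idealised bounds.

for N in [10, 100, 1000, 10000]:
    sql = 1 / np.sqrt(N)         # separable probes
    heisenberg = 1 / N           # entangled probes (idealised)
    print(f"N = {N:6d}: SQL = {sql:.4f}, Heisenberg = {heisenberg:.6f}")
```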

Quantum cloning is a process that takes an arbitrary, unknown quantum state and makes an exact copy without altering the original state in any way. Quantum cloning is forbidden by the laws of quantum mechanics, as shown by the no-cloning theorem, which states that there is no operation for cloning any arbitrary state perfectly. In Dirac notation, the (impossible) process of quantum cloning is described by U(|ψ⟩_A |e⟩_B) = |ψ⟩_A |ψ⟩_B, where |e⟩ is the initial blank state of the copy system.
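The obstruction is linearity, as the sketch below illustrates: a CNOT gate copies the basis states |0⟩ and |1⟩ into a blank target, yet applied to a superposition it produces an entangled state rather than two independent copies (a minimal numpy check, not a description of any particular cloning machine):

```python
import numpy as np

# No-cloning by linearity: CNOT copies |0> and |1> into a blank |0>
# target, but applied to a superposition it produces an entangled state,
# not the product state that perfect cloning would require.

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

psi = np.array([1, 1]) / np.sqrt(2)          # state to "clone": |+>
blank = np.array([1, 0], dtype=complex)      # blank target |0>

cloned = CNOT @ np.kron(psi, blank)          # what the circuit actually makes
target = np.kron(psi, psi)                   # what perfect cloning would make

# Overlap is 0.5, not 1: the basis-state copier fails on superpositions.
print("overlap with |+>|+>:", abs(cloned @ target.conj()) ** 2)
```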

Daniel Amihud Lidar is the holder of the Viterbi Professorship of Engineering at the University of Southern California, where he is a professor of electrical engineering, chemistry, physics and astronomy. He is the director and co-founder of the USC Center for Quantum Information Science & Technology (CQIST) as well as scientific director of the USC-Lockheed Martin Quantum Computing Center, notable for his research on control of quantum systems and quantum information processing.

In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system. It includes correlations that are due to quantum physical effects but do not necessarily involve quantum entanglement.

Within quantum cryptography, the decoy-state quantum key distribution (QKD) protocol is the most widely implemented QKD scheme. Practical QKD systems use multi-photon sources, in contrast to the standard BB84 protocol, making them susceptible to photon-number-splitting (PNS) attacks, which would significantly limit the secure transmission rate or the maximum channel length. The decoy-state technique addresses this fundamental weakness by using multiple intensity levels at the transmitter's source: Alice transmits qubits using randomly chosen intensity levels, resulting in varying photon-number statistics throughout the channel. At the end of the transmission Alice announces publicly which intensity level was used for each qubit. A successful PNS attack requires maintaining the bit error rate (BER) at the receiver's end, which cannot be accomplished consistently across multiple photon-number statistics. By monitoring the BER associated with each intensity level, the two legitimate parties can detect a PNS attack, with greatly increased secure transmission rates or maximum channel lengths, making QKD systems suitable for practical applications.
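The intuition can be sketched with Poissonian photon statistics for a weak laser pulse (the intensity values below are illustrative, not from any specific implementation): signal and decoy pulses have different multi-photon fractions, so an attack that acts on photon number shifts their observed yields in an inconsistent, detectable way:

```python
from math import exp, factorial

# Decoy-state intuition: a weak laser pulse of mean photon number mu has
# Poissonian photon statistics.  Signal and decoy pulses therefore carry
# different multi-photon fractions, so an eavesdropper who suppresses
# single-photon pulses (a PNS attack) perturbs the yields of the two
# intensity levels differently, which the legitimate parties can detect.

def poisson(n, mu):
    return exp(-mu) * mu ** n / factorial(n)

for name, mu in [("signal", 0.5), ("decoy", 0.1)]:   # illustrative intensities
    p0, p1 = poisson(0, mu), poisson(1, mu)
    p_multi = 1 - p0 - p1
    print(f"{name} (mu = {mu}): P(0) = {p0:.3f}, P(1) = {p1:.3f}, "
          f"P(n >= 2) = {p_multi:.3f}")
```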

Dynamical decoupling (DD) is an open-loop quantum control technique employed in quantum computing to suppress decoherence by taking advantage of rapid, time-dependent control modulation. In its simplest form, DD is implemented by periodic sequences of instantaneous control pulses, whose net effect is to approximately average the unwanted system-environment coupling to zero. Different schemes exist for designing DD protocols that use realistic bounded-strength control pulses, as well as for achieving high-order error suppression, and for making DD compatible with quantum gates. In spin systems in particular, commonly used protocols for dynamical decoupling include the Carr-Purcell and the Carr-Purcell-Meiboom-Gill schemes. They are based on the Hahn spin echo technique of applying periodic pulses to enable refocusing and hence extend the coherence times of qubits.
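A minimal numpy sketch of the Hahn-echo mechanism underlying these protocols, assuming quasi-static, Gaussian-distributed detunings (illustrative parameters): free evolution dephases an ensemble average, while a π pulse at the midpoint refocuses it:

```python
import numpy as np

# Hahn-echo sketch: an ensemble of qubits with random, quasi-static
# detunings dephases under free evolution, but a pi pulse at t/2 reverses
# the accumulated phase, so the coherence revives at time t.

rng = np.random.default_rng(1)
detunings = rng.normal(0.0, 1.0, 10000)     # assumed static frequency offsets

def coherence(t, echo):
    if echo:
        # +delta * t/2 before the pi pulse, -delta * t/2 after it
        phases = detunings * t / 2 - detunings * t / 2
    else:
        phases = detunings * t
    return abs(np.mean(np.exp(1j * phases)))

for t in [0.5, 1.0, 2.0, 4.0]:
    print(f"t = {t}: free decay = {coherence(t, False):.3f}, "
          f"with echo = {coherence(t, True):.3f}")
# Free decay falls off as exp(-(sigma*t)**2 / 2); the echo stays at 1.000
# because quasi-static noise refocuses perfectly.
```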

Quantum illumination is a paradigm for target detection that employs quantum entanglement between a signal electromagnetic mode and an idler electromagnetic mode, as well as joint measurement of these modes. The signal mode is propagated toward a region of space, and it is either lost or reflected, depending on whether a target is absent or present, respectively. In principle, quantum illumination can be beneficial even if the original entanglement is completely destroyed by a lossy and noisy environment.

Julia Kempe

Julia Kempe is a French, German, and Israeli researcher in quantum computing. She is currently the Director of the Center for Data Science at NYU and Professor at the Courant Institute.

Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.
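A small numpy sketch of this point, assuming a Fock basis truncated to an illustrative dimension d: quadrature operators built from the truncated ladder operators satisfy the canonical commutator [x, p] = i only away from the truncation edge, a reminder that the true CV Hilbert space is infinite-dimensional:

```python
import numpy as np

# Continuous-variable sketch: quadratures x and p built from a Fock space
# truncated to dimension d.  The canonical commutator [x, p] = i holds
# away from the truncation edge but fails at the corner, illustrating
# that CV systems really live in an infinite-dimensional Hilbert space.

d = 20                                        # illustrative truncation
a = np.diag(np.sqrt(np.arange(1, d)), k=1)    # annihilation operator
x = (a + a.conj().T) / np.sqrt(2)
p = (a - a.conj().T) / (1j * np.sqrt(2))

comm = x @ p - p @ x
print("commutator[0, 0]     =", comm[0, 0])    # ~ 1j, as expected
print("commutator[d-1, d-1] =", comm[-1, -1])  # truncation artefact, 1j*(1-d)
```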

In quantum computing, a qubit is a unit of information analogous to a bit in classical computing, but it is governed by quantum mechanical properties such as superposition and entanglement, which make qubits more powerful than classical bits for some tasks. Qubits are used in quantum circuits and quantum algorithms composed of quantum logic gates to solve computational problems, where they are used for input/output and intermediate computations.
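A minimal sketch (illustrative, representing states as numpy vectors): a Hadamard gate puts |0⟩ into an equal superposition, and measurement probabilities are the squared amplitudes:

```python
import numpy as np

# Minimal qubit sketch: a Hadamard gate maps |0> to the equal
# superposition (|0> + |1>)/sqrt(2); measuring then yields 0 or 1
# with probability given by the squared amplitudes.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])

psi = H @ zero                   # (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2
print("amplitudes:", psi, " P(0), P(1):", probs)  # 0.5 each
```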

Katherine Birgitta Whaley is a professor of chemistry at the University of California Berkeley and a senior faculty scientist in the Division of Chemical Sciences at Lawrence Berkeley National Laboratory. At UC Berkeley, Whaley is the director of the Berkeley Quantum Information and Computation Center, a member of the executive board for the Center for Quantum Coherent Science, and a member of the Kavli Energy Nanosciences Institute. At Lawrence Berkeley National Laboratory, Whaley is a member of the Quantum Algorithms Team for Chemical Sciences in the research area of resource-efficient algorithms.

The Eastin–Knill theorem is a no-go theorem that states: "No quantum error correcting code can have a continuous symmetry which acts transversally on physical qubits". In other words, no quantum error correcting code can transversally implement a universal gate set, where a transversal logical gate is one that can be implemented on a logical qubit by the independent action of separate physical gates on corresponding physical qubits.

Bound entanglement is a weak form of quantum entanglement, from which no singlets can be distilled with local operations and classical communication (LOCC).

References

  1. Zanardi, P.; Rasetti, M. (1997), "Noiseless quantum codes", Physical Review Letters, 79 (17): 3306–3309, arXiv:quant-ph/9705044, Bibcode:1997PhRvL..79.3306Z, doi:10.1103/physrevlett.79.3306, S2CID 44477408
  2. Lidar, D. A.; Chuang, I. L.; Whaley, K. B. (1998), "Decoherence-free subspaces for quantum computation", Physical Review Letters, 81 (12): 2594–2597, arXiv:quant-ph/9807004, Bibcode:1998PhRvL..81.2594L, doi:10.1103/physrevlett.81.2594, S2CID 13979882
  3. Knill, Emanuel; Laflamme, Raymond; Viola, Lorenza (2000), "Theory of quantum error correction for general noise", Physical Review Letters, 84 (11): 2525–2528, arXiv:quant-ph/9604034, Bibcode:2000PhRvL..84.2525K, doi:10.1103/PhysRevLett.84.2525, MR 1745959, PMID 11018926, S2CID 119102213
  4. Kempe, J.; Bacon, D.; Lidar, D. A.; Whaley, K. B. (2001), "Theory of decoherence-free fault-tolerant universal quantum computation", Physical Review A, 63 (4): 042307, arXiv:quant-ph/0004064, Bibcode:2001PhRvA..63d2307K, doi:10.1103/physreva.63.042307, S2CID 44200695