The toric code is a topological quantum error correcting code, and an example of a stabilizer code, defined on a two-dimensional spin lattice. It is the simplest and most well-studied of the quantum double models. It is also the simplest example of topological order, namely Z2 topological order (first studied in the context of the Z2 spin liquid in 1991). The toric code can also be considered to be a Z2 lattice gauge theory in a particular limit. It was introduced by Alexei Kitaev.
The toric code gets its name from its periodic boundary conditions, giving it the shape of a torus. These conditions give the model translational invariance, which is useful for analytic study. However, experimental realization requires open boundary conditions, allowing the system to be embedded on a 2D surface. The resulting code is typically known as the planar code. This has identical behaviour to the toric code in most, but not all, cases.
The toric code is defined on a two-dimensional lattice, usually chosen to be the square lattice, with a spin-½ degree of freedom located on each edge. The boundary conditions are chosen to be periodic. Stabilizer operators are defined on the spins around each vertex $v$ and each plaquette $p$ (or face, i.e., a vertex of the dual lattice) of the lattice as follows,

$$A_v = \prod_{i \in v} \sigma^x_i, \qquad B_p = \prod_{i \in p} \sigma^z_i.$$

Here $i \in v$ denotes the edges touching the vertex $v$, and $i \in p$ denotes the edges surrounding the plaquette $p$. The stabilizer space of the code is that for which all stabilizers act trivially, hence

$$A_v |\psi\rangle = |\psi\rangle, \quad B_p |\psi\rangle = |\psi\rangle, \quad \forall\, v, p,$$
for any state $|\psi\rangle$ in this space. For the toric code, this space is four-dimensional, and so can be used to store two qubits of quantum information. This can be proven by counting the independent stabilizer operators. The occurrence of errors will move the state out of the stabilizer space, resulting in vertices and plaquettes for which the above condition does not hold. The positions of these violations constitute the syndrome of the code, which can be used for error correction.
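The counting argument above can be checked concretely. The following sketch (a hypothetical small example, using binary support vectors rather than any particular library) builds the vertex and plaquette operators on an L × L torus, verifies that all stabilizers commute, and shows that each family has one dependency over GF(2), leaving 2L² − (2L² − 2) = 2 encoded qubits:

```python
import numpy as np

L = 3                       # linear size of the torus; a small illustrative example
n = 2 * L * L               # one spin-1/2 on each edge of the lattice

def edge(x, y, d):
    """Index of the edge leaving vertex (x, y) in direction d (0: +x, 1: +y)."""
    return 2 * ((x % L) * L + (y % L)) + d

# Binary support vectors: row v of A marks the four edges in the vertex
# operator A_v (a product of sigma^x), row p of B marks the four edges in
# the plaquette operator B_p (a product of sigma^z).
A = np.zeros((L * L, n), dtype=int)
B = np.zeros((L * L, n), dtype=int)
for x in range(L):
    for y in range(L):
        A[x * L + y, [edge(x, y, 0), edge(x, y, 1),
                      edge(x - 1, y, 0), edge(x, y - 1, 1)]] = 1
        B[x * L + y, [edge(x, y, 0), edge(x, y, 1),
                      edge(x + 1, y, 1), edge(x, y + 1, 0)]] = 1

# An X-type and a Z-type operator commute iff their supports share an even
# number of edges; every A_v/B_p pair overlaps on 0 or 2 edges.
assert (A @ B.T % 2 == 0).all()

def gf2_rank(M):
    """Rank over GF(2) by Gaussian elimination."""
    M, r = M.copy() % 2, 0
    for c in range(M.shape[1]):
        pivots = np.nonzero(M[r:, c])[0]
        if len(pivots) == 0:
            continue
        M[[r, r + pivots[0]]] = M[[r + pivots[0], r]]
        M[r + 1:] ^= np.outer(M[r + 1:, c], M[r])
        r += 1
    return r

# The product of all A_v (or of all B_p) is the identity, so each family has
# rank L^2 - 1: 2L^2 - 2 independent stabilizers on 2L^2 spins, 2 logical qubits.
print(gf2_rank(A), gf2_rank(B))   # 8 8 for L = 3
```

The edge-indexing convention here is an assumption made for illustration; any consistent labelling of the torus edges gives the same ranks.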
The unique nature of topological codes, such as the toric code, is that stabilizer violations can be interpreted as quasiparticles. Specifically, if the code is in a state $|\phi\rangle$ such that

$$A_v |\phi\rangle = -|\phi\rangle,$$

a quasiparticle known as an $e$ anyon can be said to exist on the vertex $v$. Similarly, violations of the $B_p$ are associated with so-called $m$ anyons on the plaquettes. The stabilizer space therefore corresponds to the anyonic vacuum. Single spin errors cause pairs of anyons to be created and transported around the lattice.
When errors create an anyon pair and move the anyons, one can imagine a path connecting the two composed of all links acted upon. If the anyons then meet and are annihilated, this path describes a loop. If the loop is topologically trivial, it has no effect on the stored information. The annihilation of the anyons, in this case, corrects all of the errors involved in their creation and transport. However, if the loop is topologically non-trivial, then although re-annihilation of the anyons returns the state to the stabilizer space, it also implements a logical operation on the stored information. The errors, in this case, are therefore not corrected but consolidated.
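The picture of error strings ending on anyons can be sketched in a few lines. This illustrative example (coordinates and lattice size are assumptions, not from the original text) computes the $e$-anyon syndrome as the set of vertices touched by an odd number of error edges, showing that an open string leaves anyons only at its endpoints while a topologically trivial loop leaves none:

```python
from collections import Counter

L = 5   # torus size; an illustrative choice

def endpoints(x, y, d):
    """The two vertices joined by edge (x, y, d); d = 0 is +x, d = 1 is +y."""
    if d == 0:
        return [(x % L, y % L), ((x + 1) % L, y % L)]
    return [(x % L, y % L), (x % L, (y + 1) % L)]

def syndrome(error_edges):
    """Vertices whose A_v is violated: touched by an odd number of error edges."""
    counts = Counter(v for e in error_edges for v in endpoints(*e))
    return sorted(v for v, c in counts.items() if c % 2 == 1)

# A string of three sigma^z errors along x: e anyons appear only at its endpoints.
string = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
print(syndrome(string))   # [(0, 0), (3, 0)]

# A topologically trivial loop (the boundary of one plaquette): no anyons remain,
# so annihilating a pair along such a closed path corrects the errors completely.
loop = [(0, 0, 0), (0, 0, 1), (1, 0, 1), (0, 1, 0)]
print(syndrome(loop))     # []
```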
Consider the noise model for which bit and phase errors occur independently on each spin, both with probability p. When p is low, this will create sparsely distributed pairs of anyons which have not moved far from their point of creation. Correction can be achieved by identifying the pairs that the anyons were created in (up to an equivalence class), and then re-annihilating them to remove the errors. As p increases, however, it becomes more ambiguous as to how the anyons may be paired without risking the formation of topologically non-trivial loops. This gives a threshold probability, under which the error correction will almost certainly succeed. Through a mapping to the random-bond Ising model, this critical probability has been found to be around 11%.
Other error models may also be considered, and thresholds found. In all cases studied so far, the code has been found to saturate the Hashing bound. For some error models, such as biased errors where bit errors occur more often than phase errors or vice versa, lattices other than the square lattice must be used to achieve the optimal thresholds.
These thresholds are upper limits, and are useless unless efficient algorithms can be found to achieve them. The most well-used algorithm is minimum weight perfect matching. When applied to the noise model with independent bit and phase errors, a threshold of around 10.5% is achieved. This falls only a little short of the 11% maximum. However, matching does not work so well when there are correlations between the bit and phase errors, such as with depolarizing noise.
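The idea behind matching-based decoding can be shown with a toy sketch. Production decoders use a polynomial-time algorithm (e.g. blossom-based matching); the brute-force pairing below is only an assumed minimal stand-in that finds the pairing of anyons minimizing the total length of the correction paths, with all coordinates illustrative:

```python
L = 10   # torus size; anyon positions below are illustrative

def torus_dist(a, b):
    """Shortest path length between two anyons on the periodic lattice."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return min(dx, L - dx) + min(dy, L - dy)

def min_weight_pairing(anyons):
    """Exhaustive minimum-weight perfect matching; fine for a handful of anyons."""
    if not anyons:
        return [], 0
    a, rest = anyons[0], anyons[1:]
    best, best_cost = None, float('inf')
    for i, b in enumerate(rest):
        pairs, cost = min_weight_pairing(rest[:i] + rest[i + 1:])
        cost += torus_dist(a, b)
        if cost < best_cost:
            best, best_cost = [(a, b)] + pairs, cost
    return best, best_cost

# Four e anyons left by sparse errors: the decoder pairs nearby anyons and
# re-annihilates each pair along a shortest connecting path.
anyons = [(1, 1), (1, 3), (8, 2), (9, 9)]
pairs, cost = min_weight_pairing(anyons)
print(pairs, cost)   # [((1, 1), (1, 3)), ((8, 2), (9, 9))] 6
```

At low error rates the minimum-weight pairing almost always reproduces the true creation pairs; near the threshold, ambiguous pairings risk completing topologically non-trivial loops.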
The means to perform quantum computation on logical information stored within the toric code has been considered, with the properties of the code providing fault-tolerance. It has been shown that extending the stabilizer space using 'holes', vertices or plaquettes on which stabilizers are not enforced, allows many qubits to be encoded into the code. However, a universal set of unitary gates cannot be fault-tolerantly implemented by unitary operations, and so additional techniques are required to achieve universal quantum computing. For example, universal quantum computing can be achieved by preparing magic states, which are then used to teleport the required additional gates into the code. Furthermore, the preparation of magic states must itself be fault tolerant, which can be achieved by magic state distillation on noisy magic states. A measurement based scheme for quantum computation based upon this principle has been found, whose error threshold is the highest known for a two-dimensional architecture.
Since the stabilizer operators of the toric code are quasilocal, acting only on spins located near each other on a two-dimensional lattice, it is not unrealistic to define the following Hamiltonian,

$$H = -J \sum_v A_v - J \sum_p B_p.$$

The ground state space of this Hamiltonian is the stabilizer space of the code. Excited states correspond to the presence of anyons, with the energy proportional to their number. Local errors are therefore energetically suppressed by the gap, which has been shown to be stable against local perturbations. However, the dynamic effects of such perturbations can still cause problems for the code.
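For a minimal illustration, a Hamiltonian of this form can be diagonalized exactly on the smallest torus, L = 2 (8 spins, a 256-dimensional Hilbert space). The sketch below (with an assumed edge-indexing convention and J = 1 for illustration) confirms the four-fold degenerate ground space, with ground energy −2L²J = −8J:

```python
import numpy as np

L = 2                     # smallest torus: 8 spins, a 256-dimensional Hilbert space
n = 2 * L * L
J = 1.0                   # coupling strength; units are illustrative
I2, X, Z = np.eye(2), np.array([[0., 1.], [1., 0.]]), np.diag([1., -1.])

def edge(x, y, d):
    return 2 * ((x % L) * L + (y % L)) + d

def op_on(sites, P):
    """Tensor product acting with Pauli P on the listed spins, identity elsewhere."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, P if i in sites else I2)
    return out

# H = -J sum_v A_v - J sum_p B_p on the periodic square lattice.
H = np.zeros((2 ** n, 2 ** n))
for x in range(L):
    for y in range(L):
        A_v = op_on({edge(x, y, 0), edge(x, y, 1),
                     edge(x - 1, y, 0), edge(x, y - 1, 1)}, X)
        B_p = op_on({edge(x, y, 0), edge(x, y, 1),
                     edge(x + 1, y, 1), edge(x, y + 1, 0)}, Z)
        H -= J * (A_v + B_p)

evals = np.linalg.eigvalsh(H)
degeneracy = int(np.sum(np.isclose(evals, evals[0])))
print(evals[0], degeneracy)   # -8.0 4: a four-fold degenerate ground space
```

The four ground states are the stabilizer space storing two qubits; each anyon pair raises the energy by a gap proportional to J.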
The gap also gives the code a certain resilience against thermal errors, allowing it to be correctable almost surely for a certain critical time. This time increases with the coupling $J$, but since arbitrary increases of this coupling are unrealistic, the protection given by the Hamiltonian still has its limits.
The means to make the toric code, or the planar code, into a fully self-correcting quantum memory is often considered. Self-correction means that the Hamiltonian will naturally suppress errors indefinitely, leading to a lifetime that diverges in the thermodynamic limit. It has been found that this is possible in the toric code only if long range interactions are present between the anyons. Proposals have been made for realizing these in the lab. Another approach is the generalization of the model to higher dimensions, with self-correction possible in 4D using only quasi-local interactions.
As mentioned above, so-called $e$ and $m$ quasiparticles are associated with the vertices and plaquettes of the model, respectively. These quasiparticles can be described as anyons, due to the non-trivial effect of their braiding. Specifically, though both species of anyons are bosonic with respect to themselves, the braiding of two $e$'s or two $m$'s having no effect, a full monodromy of an $e$ and an $m$ will yield a phase of $-1$. Such a result is not consistent with either bosonic or fermionic statistics, and hence is anyonic.
The anyonic mutual statistics of the quasiparticles demonstrate the logical operations performed by topologically non-trivial loops. Consider the creation of a pair of $e$ anyons followed by the transport of one around a topologically nontrivial loop, such as that shown on the torus in blue on the figure above, before the pair are reannihilated. The state is returned to the stabilizer space, but the loop implements a logical operation on one of the stored qubits. If $m$ anyons are similarly moved through the red loop above, a logical operation will also result. The phase of $-1$ arising when the anyons are braided shows that these operations do not commute, but rather anticommute. They may therefore be interpreted as logical $X$ and $Z$ Pauli operators on one of the stored qubits. The corresponding logical Paulis on the other qubit correspond to an $e$ anyon following the red loop and an $m$ anyon following the blue. No braiding occurs when $e$ and $m$ pass through parallel paths; the phase of $-1$ therefore does not arise, and the corresponding logical operations commute. This is as should be expected, since these form operations acting on different qubits.
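The commutation structure of these loop operators can be verified combinatorially: an $X$-type string and a $Z$-type string anticommute exactly when they share an odd number of edges. The sketch below (with an assumed edge-indexing convention on an L × L torus, and the blue/red naming purely illustrative) exhibits the two anticommuting logical pairs:

```python
L = 4   # illustrative torus size

def edge(x, y, d):   # d = 0: horizontal edge, d = 1: vertical edge
    return 2 * ((x % L) * L + (y % L)) + d

# Z-type logicals: sigma^z strings (e-anyon loops) along noncontractible
# cycles of the lattice.
Z_blue = {edge(x, 0, 0) for x in range(L)}   # winds around the torus in x
Z_red  = {edge(0, y, 1) for y in range(L)}   # winds around the torus in y

# X-type logicals: sigma^x strings (m-anyon loops) along conjugate cycles
# of the dual lattice.
X_red  = {edge(0, y, 0) for y in range(L)}   # crosses Z_blue exactly once
X_blue = {edge(x, 0, 1) for x in range(L)}   # crosses Z_red exactly once

def anticommute(x_string, z_string):
    """An X string and a Z string anticommute iff they share an odd number of edges."""
    return len(x_string & z_string) % 2 == 1

# Crossing loops anticommute: the logical X and Z of one stored qubit.
assert anticommute(X_red, Z_blue) and anticommute(X_blue, Z_red)
# Parallel loops share no edges and commute: operators on different qubits.
assert not anticommute(X_red, Z_red) and not anticommute(X_blue, Z_blue)
print("two anticommuting logical X/Z pairs: two encoded qubits")
```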
Due to the fact that both $e$ and $m$ anyons can be created in pairs, it is clear that both of these quasiparticles are their own antiparticles. A composite particle composed of two $e$ anyons is therefore equivalent to the vacuum, since the vacuum can yield such a pair and such a pair will annihilate to the vacuum. Accordingly, these composites have bosonic statistics, since their braiding is always completely trivial. A composite of two $m$ anyons is similarly equivalent to the vacuum. The creation of such composites is known as the fusion of anyons, and the results can be written in terms of fusion rules. In this case, these take the form

$$e \times e = 1, \qquad m \times m = 1,$$

where $1$ denotes the vacuum. A composite of an $e$ and an $m$ is not trivial. This therefore constitutes another quasiparticle in the model, sometimes denoted $\varepsilon$, with fusion rule

$$e \times m = \varepsilon.$$

From the braiding statistics of the anyons we see that, since any single exchange of two $\varepsilon$'s will involve a full monodromy of a constituent $e$ and $m$, a phase of $-1$ will result. This implies fermionic self-statistics for the $\varepsilon$'s.
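The fusion rules and statistics above can be encoded compactly: the toric code anyons $\{1, e, m, \varepsilon\}$ form the group Z2 × Z2 under fusion, labelling each particle by its $e$ and $m$ content mod 2. The following sketch (the labelling scheme is an illustrative choice) checks the fusion table and the monodromy phases:

```python
# Label each particle by (n_e, n_m) mod 2; fusion is componentwise addition.
VAC, E, M, EPS = (0, 0), (1, 0), (0, 1), (1, 1)

def fuse(a, b):
    return ((a[0] + b[0]) % 2, (a[1] + b[1]) % 2)

assert fuse(E, E) == VAC and fuse(M, M) == VAC   # e and m are their own antiparticles
assert fuse(E, M) == EPS                          # e x m = epsilon
assert fuse(EPS, EPS) == VAC

def monodromy(a, b):
    """Phase from a full monodromy: -1 for every e-m crossing between composites."""
    return (-1) ** (a[0] * b[1] + a[1] * b[0])

assert monodromy(E, M) == -1     # the mutual statistics phase of e and m
assert monodromy(E, E) == 1      # e (and likewise m) is bosonic with itself
assert monodromy(EPS, EPS) == 1  # trivial full monodromy, but a single exchange of
                                 # two epsilons already involves one e-m monodromy,
                                 # giving -1: fermionic self-statistics
print("fusion rules and statistics consistent")
```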
The use of a torus is not required to form an error correcting code. Other surfaces may also be used, with their topological properties determining the degeneracy of the stabilizer space. In general, quantum error correcting codes defined on two-dimensional spin lattices according to the principles above are known as surface codes.
It is also possible to define similar codes using higher-dimensional spins. These are the quantum double models and string-net models, which allow a greater richness in the behaviour of anyons, and so may be used for more advanced quantum computation and error correction proposals. These not only include models with Abelian anyons, but also those with non-Abelian statistics.
The most explicit demonstration of the properties of the toric code has been in state based approaches. Rather than attempting to realize the Hamiltonian, these simply prepare the code in the stabilizer space. Using this technique, experiments have been able to demonstrate the creation, transport and statistics of the anyons. More recent experiments have also been able to demonstrate the error correction properties of the code.
For realizations of the toric code and its generalizations with a Hamiltonian, much progress has been made using Josephson junctions. The theory of how the Hamiltonians may be implemented has been developed for a wide class of topological codes. An experiment has also been performed, realizing the toric code Hamiltonian for a small lattice, and demonstrating the quantum memory provided by its degenerate ground state.
Other theoretical and experimental works towards realizations are based on cold atoms. A toolkit of methods that may be used to realize topological codes with optical lattices has been explored, as have experiments concerning minimal instances of topological order. Such minimal instances of the toric code have been realized experimentally within isolated square plaquettes. Progress is also being made into simulations of the toric model with Rydberg atoms, in which the Hamiltonian and the effects of dissipative noise can be demonstrated.