Adiabatic quantum computation (AQC) is a form of quantum computing which relies on the adiabatic theorem to perform calculations [1] and is closely related to quantum annealing. [2] [3] [4] [5]
First, a (potentially complicated) Hamiltonian is found whose ground state describes the solution to the problem of interest. Next, a system with a simple Hamiltonian is prepared and initialized to the ground state. Finally, the simple Hamiltonian is adiabatically evolved to the desired complicated Hamiltonian. By the adiabatic theorem, the system remains in the ground state, so at the end, the state of the system describes the solution to the problem. Adiabatic quantum computing has been shown to be polynomially equivalent to conventional quantum computing in the circuit model. [6]
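To make the procedure concrete, the following minimal Python/NumPy sketch simulates such an evolution for two qubits. The driver Hamiltonian, the problem Hamiltonian, the linear schedule, and the run time are illustrative assumptions rather than choices taken from the references; the point is only that a sufficiently slow interpolation keeps the state close to the instantaneous ground state.

```python
# Minimal sketch of adiabatic evolution H(s) = (1 - s)*H_initial + s*H_problem for two qubits.
# All Hamiltonians, the schedule and the run time below are illustrative, not from the references.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Simple driver Hamiltonian: transverse field; its ground state is the uniform superposition.
H_initial = -(np.kron(X, I2) + np.kron(I2, X))
# Problem Hamiltonian: diagonal in the computational basis; its ground state encodes the "answer".
H_problem = np.kron(Z, Z) + 0.5 * np.kron(Z, I2)

def H(s):
    """Interpolating Hamiltonian at schedule parameter s in [0, 1]."""
    return (1 - s) * H_initial + s * H_problem

T, steps = 50.0, 2000                        # total run time and number of small time steps
dt = T / steps
psi = np.linalg.eigh(H_initial)[1][:, 0]     # start in the ground state of H_initial
for k in range(steps):
    s = (k + 0.5) * dt / T                   # linear schedule s(t) = t/T
    psi = expm(-1j * H(s) * dt) @ psi        # Schroedinger evolution over one step

# If T is large enough, the final state overlaps strongly with the ground state of H_problem.
ground = np.linalg.eigh(H_problem)[1][:, 0]
print("ground-state overlap:", abs(ground.conj() @ psi) ** 2)
```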
The time complexity for an adiabatic algorithm is the time taken to complete the adiabatic evolution, which is dependent on the gap in the energy eigenvalues (spectral gap) of the Hamiltonian. Specifically, if the system is to be kept in the ground state, the energy gap between the ground state and the first excited state of $H(t)$ provides an upper bound on the rate at which the Hamiltonian can be evolved at time $t$. [7] When the spectral gap is small, the Hamiltonian has to be evolved slowly. The runtime for the entire algorithm can be bounded by:

$$T = O\!\left(\frac{1}{g_{\min}^{2}}\right)$$

where $g_{\min}$ is the minimum spectral gap for $H(t)$.
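Numerically, the minimum gap can be estimated by diagonalizing the interpolating Hamiltonian along the schedule. The sketch below (using the same kind of illustrative two-qubit Hamiltonians as in the sketch above) scans $s$ and records the gap between the two lowest eigenvalues; $g_{\min}$ then sets the $1/g_{\min}^{2}$ scale of the required run time.

```python
# Sketch: estimating the minimum spectral gap g_min of H(s) = (1 - s)*H_initial + s*H_problem
# by dense diagonalization on a grid of s values (illustrative two-qubit Hamiltonians).
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

H_initial = -(np.kron(X, I2) + np.kron(I2, X))
H_problem = np.kron(Z, Z) + 0.5 * np.kron(Z, I2)

gaps = []
for s in np.linspace(0.0, 1.0, 501):
    evals = np.linalg.eigvalsh((1 - s) * H_initial + s * H_problem)   # sorted ascending
    gaps.append(evals[1] - evals[0])          # gap between ground and first excited state

g_min = min(gaps)
print(f"g_min = {g_min:.4f}, runtime scale ~ 1/g_min^2 = {1.0 / g_min**2:.1f}")
```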
AQC is a possible method to get around the problem of energy relaxation. Since the quantum system is in the ground state, interference with the outside world cannot make it move to a lower state. If the energy of the outside world (that is, the "temperature of the bath") is kept lower than the energy gap between the ground state and the next higher energy state, the system has a proportionally lower probability of going to a higher energy state. Thus the system can stay in a single system eigenstate as long as needed.
Universality results in the adiabatic model are tied to quantum complexity and QMA-hard problems. The k-local Hamiltonian problem is QMA-complete for k ≥ 2. [8] QMA-hardness results are known for physically realistic lattice models of qubits such as [9]

$$H = \sum_{i} h_{i} Z_{i} + \sum_{i<j} J^{ij} Z_{i} Z_{j} + \sum_{i<j} K^{ij} X_{i} X_{j}$$

where $Z_{i}$, $X_{i}$ represent the Pauli matrices $\sigma_z$, $\sigma_x$ acting on qubit $i$. Such models are used for universal adiabatic quantum computation. The Hamiltonians for the QMA-complete problem can also be restricted to act on a two-dimensional grid of qubits [10] or a line of quantum particles with 12 states per particle. [11] If such models were found to be physically realizable, they too could be used to form the building blocks of a universal adiabatic quantum computer.
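As a concrete illustration of such a lattice model, the short sketch below builds a dense-matrix version of the Hamiltonian above for a few qubits in Python/NumPy. The coefficients $h_{i}$, $J^{ij}$, $K^{ij}$ are arbitrary placeholders rather than values from the cited constructions.

```python
# Sketch: dense-matrix construction of
#   H = sum_i h_i Z_i + sum_{i<j} J_ij Z_i Z_j + sum_{i<j} K_ij X_i X_j
# for a handful of qubits. Coefficients h, J, K are arbitrary placeholders.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def op_on(single_qubit_op, site, n):
    """Embed a single-qubit operator acting on `site` into an n-qubit Hilbert space."""
    ops = [I2] * n
    ops[site] = single_qubit_op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

n = 3
rng = np.random.default_rng(0)
h = rng.normal(size=n)
J = rng.normal(size=(n, n))
K = rng.normal(size=(n, n))

H = sum(h[i] * op_on(Z, i, n) for i in range(n))
for i in range(n):
    for j in range(i + 1, n):
        H = H + J[i, j] * op_on(Z, i, n) @ op_on(Z, j, n)
        H = H + K[i, j] * op_on(X, i, n) @ op_on(X, j, n)

print("ground-state energy:", np.linalg.eigvalsh(H)[0])
```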
In practice, there are problems during a computation. As the Hamiltonian is gradually changed, the interesting parts (quantum behavior as opposed to classical) occur when multiple qubits are close to a tipping point. It is exactly at this point that the ground state (one set of qubit orientations) gets very close to the first excited state (a different arrangement of orientations). Adding a slight amount of energy (from the external bath, or as a result of the slowly changing Hamiltonian) could take the system out of the ground state and ruin the calculation. Trying to perform the calculation more quickly increases the external energy; scaling up the number of qubits makes the energy gap at the tipping points smaller.
Adiabatic quantum computation can solve satisfiability problems and other combinatorial search problems. Specifically, these kinds of problems seek a state that satisfies $C_1 \wedge C_2 \wedge \cdots \wedge C_M$. This expression encodes the satisfiability of M clauses, each clause $C_i$ having the value True or False and possibly involving n bits. Each bit is a variable $x_j \in \{0, 1\}$, so that $C_i$ is a Boolean-valued function of $x_1, x_2, \ldots, x_n$. The quantum adiabatic algorithm (QAA) solves this kind of problem using quantum adiabatic evolution. It starts with an initial Hamiltonian $H_B$:

$$H_B = H_{B,1} + H_{B,2} + \cdots + H_{B,M}$$

where $H_{B,i}$ denotes the Hamiltonian corresponding to the clause $C_i$. Usually the choice of $H_{B,i}$ does not depend on the individual clause, so only the total number of times each bit is involved in all clauses matters. Next, the system goes through an adiabatic evolution, ending in the problem Hamiltonian $H_P$:

$$H_P = \sum_{C} H_{P,C}$$

where $H_{P,C}$ is the satisfying Hamiltonian of clause C. It has eigenvalues:

$$h_C(z_1, z_2, \ldots, z_n) = \begin{cases} 0 & \text{if clause } C \text{ is satisfied,} \\ 1 & \text{if clause } C \text{ is violated.} \end{cases}$$

For a simple path of adiabatic evolution with run time T, consider the linear schedule

$$s(t) = \frac{t}{T}$$

and let $\tilde{H}(s) = (1 - s) H_B + s H_P$. This results in:

$$\tilde{H}(t/T) = \left(1 - \frac{t}{T}\right) H_B + \frac{t}{T} H_P$$

which is the adiabatic evolution Hamiltonian of the algorithm.

In accordance with the adiabatic theorem, the system starts in the ground state of the initial Hamiltonian $H_B$, proceeds through an adiabatic evolution, and ends in the ground state of the problem Hamiltonian $H_P$.

Then the z-component of each of the n spins is measured in the final state. This produces a string $z_1, z_2, \ldots, z_n$ which is highly likely to be a solution of the satisfiability problem. The run time T must be sufficiently long to ensure the correctness of the result: according to the adiabatic theorem, T is of order $\varepsilon / g_{\min}^{2}$, where $g_{\min}$ is the minimum energy gap between the ground state and the first excited state of $\tilde{H}(s)$, and $\varepsilon$ bounds the matrix element of $d\tilde{H}/ds$ between these two states. [12]
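The following Python sketch runs this procedure end-to-end on a toy three-bit satisfiability instance using dense matrices. The particular clauses, the penalty form of $H_{P,C}$, the transverse-field choice of $H_B$, and the schedule are illustrative assumptions; this is not the construction of [12].

```python
# Sketch: the quantum adiabatic algorithm on a toy 3-bit satisfiability instance.
# Clause encoding, driver Hamiltonian and schedule are illustrative choices.
import numpy as np
from scipy.linalg import expm

n = 3
dim = 2 ** n

def bits(z):
    """Bit string (x0, x1, x2) of computational basis state z, most significant bit first."""
    return [(z >> (n - 1 - i)) & 1 for i in range(n)]

# Clauses as Boolean functions of the bit string:
# (x0 OR x1) AND (NOT x1 OR x2) AND (x0 != x2)
clauses = [
    lambda b: b[0] or b[1],
    lambda b: (not b[1]) or b[2],
    lambda b: b[0] != b[2],
]

# Problem Hamiltonian: diagonal, adding a penalty of 1 for every violated clause (eigenvalue h_C).
H_P = np.diag([float(sum(0 if c(bits(z)) else 1 for c in clauses)) for z in range(dim)])

# Initial ("driver") Hamiltonian: transverse field; ground state = uniform superposition.
X = np.array([[0, 1], [1, 0]], dtype=float)
I2 = np.eye(2)
def x_on(i):
    out = np.eye(1)
    for k in range(n):
        out = np.kron(out, X if k == i else I2)
    return out
H_B = sum((np.eye(dim) - x_on(i)) / 2 for i in range(n))

# Adiabatic evolution with the linear schedule s(t) = t / T.
T, steps = 100.0, 4000
dt = T / steps
psi = np.ones(dim) / np.sqrt(dim)            # ground state of H_B
for k in range(steps):
    s = (k + 0.5) * dt / T
    psi = expm(-1j * ((1 - s) * H_B + s * H_P) * dt) @ psi

# "Measure" the z-components: the most probable bit string should satisfy all clauses.
z_best = int(np.argmax(np.abs(psi) ** 2))
print("assignment:", bits(z_best), "violated clauses:", H_P[z_best, z_best])
```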
Adiabatic quantum computing is equivalent in power to standard gate-based quantum computing, which implements arbitrary unitary operations. However, mapping a problem onto a gate-based quantum device differs substantially from mapping it onto a quantum annealer, since logical variables are mapped to single qubits rather than to chains of physical qubits. [13]
The D-Wave One is a device made by the Canadian company D-Wave Systems, which claims that it uses quantum annealing to solve optimization problems. [14] [15] On 25 May 2011, Lockheed Martin purchased a D-Wave One for about US$10 million. [15] In May 2013, Google purchased a 512-qubit D-Wave Two. [16]
The question of whether the D-Wave processors offer a speedup over a classical processor is still unanswered. Tests performed by researchers at Quantum Artificial Intelligence Lab (NASA), USC, ETH Zurich, and Google show that as of 2015, there is no evidence of a quantum advantage. [17] [18] [19]
(Some of the authors of these studies are employees of D-Wave Systems Inc.)
A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. In particular, a large-scale quantum computer could break widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.
Superconducting quantum computing is a branch of solid state quantum computing that implements superconducting electronic circuits using superconducting qubits as artificial atoms, or quantum dots. For superconducting qubits, the two logic states are the ground state and the excited state, denoted $|g\rangle$ and $|e\rangle$ respectively. Research in superconducting quantum computing is conducted by companies such as Google, IBM, IMEC, BBN Technologies, Rigetti, and Intel. Many recently developed quantum processing units (QPUs) use a superconducting architecture.
A trapped-ion quantum computer is one proposed approach to a large-scale quantum computer. Ions, or charged atomic particles, can be confined and suspended in free space using electromagnetic fields. Qubits are stored in stable electronic states of each ion, and quantum information can be transferred through the collective quantized motion of the ions in a shared trap. Lasers are applied to induce coupling between the qubit states or coupling between the internal qubit states and the external motional states.
Nuclear magnetic resonance quantum computing (NMRQC) is one of several proposed approaches for constructing a quantum computer; it uses the spin states of nuclei within molecules as qubits. The quantum states are probed through the nuclear magnetic resonances, allowing the system to be implemented as a variation of nuclear magnetic resonance spectroscopy. NMR differs from other implementations of quantum computers in that it uses an ensemble of systems, in this case molecules, rather than a single pure state.
Quantum annealing (QA) is an optimization process for finding the global minimum of a given objective function over a given set of candidate solutions, by a process using quantum fluctuations. Quantum annealing is used mainly for problems where the search space is discrete with many local minima; such as finding the ground state of a spin glass or solving the traveling salesman problem. The term "quantum annealing" was first proposed in 1988 by B. Apolloni, N. Cesa Bianchi and D. De Falco as a quantum-inspired classical algorithm. It was formulated in its present form by T. Kadowaki and H. Nishimori in 1998, though an imaginary-time variant without quantum coherence had been discussed by A. B. Finnila, M. A. Gomez, C. Sebenik and J. D. Doll in 1994.
D-Wave Quantum Systems Inc. is a quantum computing company with locations in Palo Alto, California and Burnaby, British Columbia. D-Wave claims to be the world's first company to sell computers that exploit quantum effects in their operation. D-Wave's early customers include Lockheed Martin, the University of Southern California, Google/NASA, and Los Alamos National Laboratory.
The one-way quantum computer, also known as measurement-based quantum computer (MBQC), is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.
The Landau–Zener formula is an analytic solution to the equations of motion governing the transition dynamics of a two-state quantum system, with a time-dependent Hamiltonian varying such that the energy separation of the two states is a linear function of time. The formula, giving the probability of a diabatic transition between the two energy states, was published separately by Lev Landau, Clarence Zener, Ernst Stueckelberg, and Ettore Majorana, in 1932.
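In the notation commonly used for this result, the probability of a diabatic transition during a linear sweep is $P = \exp\!\left(-2\pi a^{2} / (\hbar |\alpha|)\right)$, where $a$ is the coupling between the two states (half the minimum avoided-crossing gap) and $\alpha$ is the rate at which the diabatic energy separation changes. The small helper below evaluates this expression; the variable names and example numbers are this sketch's own choices.

```python
# Landau-Zener diabatic transition probability P = exp(-2*pi*a^2 / (hbar*|alpha|)),
# with a = off-diagonal coupling (half the minimum gap) and alpha = d(E2 - E1)/dt.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, in J*s

def landau_zener_probability(coupling, sweep_rate, hbar=HBAR):
    """Probability of leaving the adiabatic (lower) branch during a linear sweep."""
    return math.exp(-2 * math.pi * coupling**2 / (hbar * abs(sweep_rate)))

# Slower sweeps or larger gaps suppress the transition exponentially.
print(landau_zener_probability(coupling=1e-24, sweep_rate=1e-14))   # J and J/s, respectively
```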
In computational complexity theory, QMA, which stands for Quantum Merlin Arthur, is the set of languages for which, when a string is in the language, there is a polynomial-size quantum proof that convinces a polynomial time quantum verifier of this fact with high probability. Moreover, when the string is not in the language, every polynomial-size quantum state is rejected by the verifier with high probability.
The toric code is a topological quantum error correcting code, and an example of a stabilizer code, defined on a two-dimensional spin lattice. It is the simplest and most well studied of the quantum double models. It is also the simplest example of topological order—Z2 topological order (first studied in the context of Z2 spin liquid in 1991). The toric code can also be considered to be a Z2 lattice gauge theory in a particular limit. It was introduced by Alexei Kitaev.
Quantum simulators permit the study of a quantum system in a programmable fashion. In this instance, simulators are special purpose devices designed to provide insight about specific physics problems. Quantum simulators may be contrasted with generally programmable "digital" quantum computers, which would be capable of solving a wider class of quantum problems.
The Harrow–Hassidim–Lloyd algorithm or HHL algorithm is a quantum algorithm for numerically solving a system of linear equations, designed by Aram Harrow, Avinatan Hassidim, and Seth Lloyd. The algorithm estimates the result of a scalar measurement on the solution vector to a given linear system of equations.
Quantum machine learning is the integration of quantum algorithms within machine learning programs.
Quantum optimization algorithms are quantum algorithms that are used to solve optimization problems. Mathematical optimization deals with finding the best solution to a problem from a set of possible solutions. Mostly, the optimization problem is formulated as a minimization problem, where one tries to minimize an error which depends on the solution: the optimal solution has the minimal error. Different optimization techniques are applied in various fields such as mechanics, economics and engineering, and as the complexity and amount of data involved rise, more efficient ways of solving optimization problems are needed. Quantum computing may allow problems which are not practically feasible on classical computers to be solved, or suggest a considerable speed up with respect to the best known classical algorithm.
Hamiltonian simulation is a problem in quantum information science that attempts to find the computational complexity and quantum algorithms needed for simulating quantum systems. Hamiltonian simulation is a problem that demands algorithms which implement the evolution of a quantum state efficiently. The Hamiltonian simulation problem was proposed by Richard Feynman in 1982, who suggested a quantum computer as a possible solution, since the cost of simulating general Hamiltonians on a classical computer seems to grow exponentially with the system size.
In quantum computing, the variational quantum eigensolver (VQE) is a quantum algorithm for quantum chemistry, quantum simulations and optimization problems. It is a hybrid algorithm that uses both classical computers and quantum computers to find the ground state of a given physical system. Given a guess or ansatz, the quantum processor calculates the expectation value of the system with respect to an observable, often the Hamiltonian, and a classical optimizer is used to improve the guess. The algorithm is based on the variational method of quantum mechanics.
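A minimal sketch of this hybrid loop, with the quantum processor replaced by exact state-vector arithmetic in NumPy, is shown below; the one-parameter ansatz, the toy Hamiltonian, and the choice of classical optimizer are all illustrative assumptions.

```python
# Sketch of the VQE loop: a classical optimizer minimizes the expectation value <psi(theta)|H|psi(theta)>.
# The "quantum processor" is emulated exactly with NumPy; ansatz, H and optimizer are illustrative.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.5 * Z + 0.3 * X                        # toy single-qubit Hamiltonian

def ansatz(theta):
    """|psi(theta)> = Ry(theta)|0>: a one-parameter trial state."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    psi = ansatz(params[0])
    return float(np.real(psi.conj() @ H @ psi))   # expectation value <psi|H|psi>

# The classical optimizer improves the guess; the minimum approximates the ground-state energy.
result = minimize(energy, x0=[0.1], method="COBYLA")
print("VQE energy:", result.fun, "exact ground energy:", np.linalg.eigvalsh(H)[0])
```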
In quantum information theory, the no low-energy trivial state (NLTS) conjecture is a precursor to a quantum PCP theorem (qPCP) and posits the existence of families of Hamiltonians with all low-energy states of non-trivial complexity. It was formulated by Michael Freedman and Matthew Hastings in 2013. An NLTS proof would be a consequence of one aspect of qPCP problems – the inability to certify an approximation of local Hamiltonians via NP completeness. In other words, an NLTS proof would be one consequence of the QMA complexity of qPCP problems. On a high level, if proved, NLTS would be one property of the non-Newtonian complexity of quantum computation. The NLTS and qPCP conjectures posit the near-infinite complexity involved in predicting the outcome of quantum systems with many interacting states. These calculations of complexity would have implications for quantum computing, such as the stability of entangled states at higher temperatures and the occurrence of entanglement in natural systems. A proof of the NLTS conjecture has been published as a preprint.
This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.
Quantum computational chemistry is an emerging field that exploits quantum computing to simulate chemical systems. Despite quantum mechanics' foundational role in understanding chemical behaviors, traditional computational approaches face significant challenges, largely due to the complexity and computational intensity of quantum mechanical equations. This complexity arises from the exponential growth of a quantum system's wave function with each added particle, making exact simulations on classical computers inefficient.
A neutral atom quantum computer is a modality of quantum computers built out of Rydberg atoms; this modality has many commonalities with trapped-ion quantum computers. As of December 2023, the concept has been used to demonstrate a 48-logical-qubit processor.