Quantum random circuits (QRC) incorporate an element of randomness into the local unitary operations and measurements of a quantum circuit. The idea is similar in spirit to random matrix theory: by averaging over an ensemble of outcomes, QRC can yield nearly exact results for non-integrable, hard-to-solve problems. Incorporating randomness into circuits has several possible advantages, among them (i) the validation of quantum computers, which is the method Google used in its 2019 claim of quantum supremacy, [1] and (ii) understanding the universal structure of non-equilibrium and thermalization processes in quantum many-body dynamics. [2]
The constituents of a general quantum circuit are qubits, unitary gates, and measurements. The time evolution of a quantum circuit is discrete in time, and the state is evolved step by step through the application of unitary operators, under which a pure state evolves according to $|\psi(t)\rangle = U(t)\,|\psi(t-1)\rangle$ (note that unitary operators can entangle qubits). Thus, the time evolution from a starting time, say $t = 0$, to some time $t$ is given by $|\psi(t)\rangle = U(t)\,U(t-1)\cdots U(1)\,|\psi(0)\rangle$, where at each step the unitary operator is represented by a tensor product of local unitary gates, $U(t) = \bigotimes_{x} u_{x,t}$, in which the index $x$ specifies the lattice link connecting a pair of qubits and $t$ is the time step.
Figure 1 shows a space-time diagram of a quantum circuit, displaying the local interactions at each time step. In the language of quantum information theory, the number of qubits is the circuit's width, and its depth is defined as the number of layers of unitary gates; both can be read off directly from the configuration in Figure 1. Another way to interpret the circuit is as a tensor network, in which each purple box is a local gate $u_{x,t}$ operating on two qubits, and the full contraction of the qubit indices from the initial time to the final time $t$ over the lattice links gives the full unitary time evolution $U(t)$. Thus, the propagation amplitude from an initial state labelled by the indices $\{s\}$ to a final state labelled by the indices $\{s'\}$ is $\langle \{s'\}|\,U(t)\,|\{s\}\rangle$. Measurements, on the other hand, disentangle the qubits. [3] The measurements used are projective measurements, defined as observations that leave the measured degrees of freedom in an eigenstate of the measured operator.
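To make this concrete, the following is a minimal sketch in Python/NumPy, not a standard library routine: Haar-random two-qubit gates are drawn via QR decomposition, applied in a brick-wall pattern of local gates, and a single propagation amplitude $\langle\{s'\}|U|\{s\}\rangle$ is read off at the end. The function names and the chosen sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def haar_unitary(dim):
    """Draw a Haar-random unitary via QR decomposition of a Gaussian matrix."""
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the distribution is uniform

def apply_two_qubit(state, gate, x, n):
    """Apply a 4x4 gate to neighboring qubits (x, x+1) of an n-qubit state vector.

    Convention: qubit 0 is the most significant bit of the basis-state index.
    """
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, [x, x + 1], [0, 1]).reshape(4, -1)
    psi = gate @ psi
    psi = np.moveaxis(psi.reshape([2, 2] + [2] * (n - 2)), [0, 1], [x, x + 1])
    return psi.reshape(2 ** n)

n, depth = 6, 4
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # initial product state |000000>

for t in range(depth):
    # brick-wall pattern: even links on even layers, odd links on odd layers
    for x in range(t % 2, n - 1, 2):
        state = apply_two_qubit(state, haar_unitary(4), x, n)

# propagation amplitude <{s'}|U|{s}> for s = 000000 and s' = 100000
print(state[0b100000])
```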
Measurements in quantum mechanics are stochastic by nature, which means that circuits with exactly the same structure (qubits and gates) give different outcomes on different runs; see Figure 2. This stochastic nature should, however, be distinguished from the randomness of the gates themselves. Let $\{m\}$ be the set of measurement outcomes; then different runs on a fixed set of unitary gates yield distinct records $\{m\}$. The schematic diagram in Figure 2 sketches a tree in which each branch represents a possible outcome of the measurements shown on the circuit. Each run thus produces a different record $\{m\}$, rather like a trajectory of a random walk. If the system is a single qubit, each measurement causes a jump on the Bloch sphere; in the many-body case, however, the situation is complicated by correlations between different qubits. [3] [4]
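This stochastic collapse can be simulated directly with the Born rule: the outcome is drawn with probability given by the squared amplitudes, and the state is projected onto the observed branch and renormalized. The sketch below (NumPy, with illustrative names) measures one qubit of a state vector in the computational basis.

```python
import numpy as np

rng = np.random.default_rng()

def measure_qubit(state, i, n):
    """Projective Z-basis measurement of qubit i of an n-qubit state vector.

    Draws the outcome with its Born-rule probability, then projects the
    state onto the observed branch and renormalizes it.
    """
    psi = state.reshape([2] * n)
    p0 = np.sum(np.abs(np.take(psi, 0, axis=i)) ** 2)  # P(outcome = 0)
    outcome = 0 if rng.random() < p0 else 1
    projected = np.zeros_like(psi)
    idx = [slice(None)] * n
    idx[i] = outcome
    projected[tuple(idx)] = psi[tuple(idx)]
    norm = np.sqrt(p0 if outcome == 0 else 1.0 - p0)
    return outcome, (projected / norm).reshape(2 ** n)

# identical circuits, different records: two runs measuring a Bell pair
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
print(measure_qubit(bell, 0, 2)[0], measure_qubit(bell, 0, 2)[0])
```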
As we are currently in the Noisy Intermediate-Scale Quantum (NISQ) era, in which quantum computers are not fault-tolerant and not large enough to reach supremacy, we are looking for tasks that have two features:
The task must be feasible on a quantum computer yet classically resource-consuming, for example in time. For instance, the task could involve a system that is solvable in a short time on a classical computer when small, but whose computation time grows much faster than linearly as the system's complexity (size or dimension) increases. In that case a state-of-the-art classical computer would take an unreasonable amount of time (years), while a quantum computer is believed to give an exponential reduction in the required computation time. [5] Research aimed at finding such a task has focused on sampling problems. One theoretically compelling candidate is boson sampling, which is supported by strong complexity-theoretic evidence of classical hardness. [6] However, researchers faced experimental difficulties in achieving the desired results with this sampling method. [5] Another candidate is random circuit sampling, in which the main task is to sample the output of a random quantum circuit. Results have shown that this approach is more experimentally feasible given recent developments in superconducting qubits, and it also has strong complexity-theoretic evidence of hardness. [5] In its claim of quantum supremacy, Google used its Sycamore processor, which took about 200 seconds to sample one instance of a quantum circuit a million times, a task Google estimated would take a state-of-the-art classical supercomputer 10,000 years. IBM challenged the claim, showing that a better classical algorithm could do the same calculation in 2.5 days. [7]
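In a classical simulation, the sampling task itself amounts to drawing bitstrings from the Born-rule distribution of the final state. A minimal sketch, assuming a state vector `state` such as the one produced by the brick-wall simulation above:

```python
import numpy as np

rng = np.random.default_rng()

def sample_bitstrings(state, shots):
    """Draw measurement outcomes (bitstrings) from a final state vector."""
    probs = np.abs(state) ** 2
    probs /= probs.sum()                     # guard against rounding error
    n = int(np.log2(state.size))
    hits = rng.choice(state.size, size=shots, p=probs)
    return [format(h, f"0{n}b") for h in hits]

# e.g. a handful of shots from the brick-wall state built earlier:
# print(sample_bitstrings(state, 10))
```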
One of the pressing questions in many-body dynamics is how entanglement spreads with time, for example after a quantum quench, in which an initially prepared system evolves unitarily following a sudden change in the parameters of its Hamiltonian. [8] The answer to this question forms a fundamental part of understanding thermalization and would provide a numerical tool for simulating quantum dynamics. Quantum random circuits serve as a playground for experimenting with and understanding such processes. [2] Results using QRC methods have shown that there is a universal structure behind noisy entanglement growth. [2] [9]
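In such simulations, entanglement growth is typically tracked through the von Neumann entropy of half the chain, computed from the singular values of the state reshaped across the bipartition. A minimal sketch, assuming a state vector as in the examples above:

```python
import numpy as np

def half_chain_entropy(state, n):
    """Von Neumann entanglement entropy (in bits) of the left half of the chain."""
    cut = n // 2
    m = state.reshape(2 ** cut, 2 ** (n - cut))
    s = np.linalg.svd(m, compute_uv=False)  # Schmidt coefficients
    p = s ** 2
    p = p[p > 1e-12]                         # drop numerical zeros
    return float(-np.sum(p * np.log2(p)))

# recording this after every layer of a random circuit traces out the
# entanglement-growth curve whose statistics show the universal structure
```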
In computational complexity theory, bounded-error quantum polynomial time (BQP) is the class of decision problems solvable by a quantum computer in polynomial time, with an error probability of at most 1/3 for all instances. It is the quantum analogue to the complexity class BPP.
In physics, the no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state, a statement which has profound implications in the field of quantum computing, among others. The theorem is an evolution of the 1970 no-go theorem authored by James Park, in which he demonstrated that a non-disturbing measurement scheme which is both simple and perfect cannot exist. The aforementioned theorems do not preclude the state of one system becoming entangled with the state of another, as cloning specifically refers to the creation of a separable state with identical factors. For example, one might use the controlled NOT gate and the Walsh–Hadamard gate to entangle two qubits without violating the no-cloning theorem, as no well-defined state may be attributed to a subsystem of an entangled state. The no-cloning theorem concerns only pure states, whereas the generalized statement regarding mixed states is known as the no-broadcast theorem. The no-cloning theorem has a time-reversed dual, the no-deleting theorem.
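The entangling example mentioned above can be written out explicitly: a Hadamard followed by a CNOT maps $|00\rangle$ to the Bell state $(|00\rangle + |11\rangle)/\sqrt{2}$, an entangled state rather than a separable "copy", so no cloning occurs. A minimal sketch in NumPy:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Walsh-Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.zeros(4)
psi[0] = 1.0                                   # |00>
bell = CNOT @ np.kron(H, I2) @ psi             # (|00> + |11>) / sqrt(2)
print(bell)  # [0.707..., 0, 0, 0.707...]: entangled, not a cloned product state
```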
A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. Theoretically a large-scale quantum computer could break some widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.
Shor's algorithm is a quantum algorithm for finding the prime factors of an integer. It was developed in 1994 by the American mathematician Peter Shor. It is one of the few known quantum algorithms with compelling potential applications and strong evidence of superpolynomial speedup compared to best known classical (non-quantum) algorithms. On the other hand, factoring numbers of practical significance requires far more qubits than available in the near future. Another concern is that noise in quantum circuits may undermine results, requiring additional qubits for quantum error correction.
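The quantum speedup in Shor's algorithm comes entirely from the order-finding subroutine; the surrounding reduction to factoring is classical. The sketch below substitutes brute-force order finding for the quantum step, so it only illustrates the reduction on toy numbers (function names are illustrative):

```python
import math

def order(a, N):
    """Multiplicative order of a mod N: a brute-force stand-in for the
    quantum order-finding subroutine at the heart of Shor's algorithm."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_reduction(N, a):
    """Recover a nontrivial factor of N from the order of a, when possible."""
    if math.gcd(a, N) != 1:
        return math.gcd(a, N)          # lucky: a already shares a factor with N
    r = order(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                    # bad choice of a; retry with another
    return math.gcd(pow(a, r // 2, N) - 1, N)

print(shor_classical_reduction(15, 7))  # 3, since 7 has order 4 mod 15
```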
Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning out of attempts to extend the understanding of quantum mechanics, the theory has developed in several directions and experimental studies have confirmed some of the key issues. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
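Decoherence is often illustrated as the decay of the off-diagonal (coherence) elements of a qubit's density matrix. A minimal sketch, assuming a simple phase-damping channel; the Kraus operators and the parameter value are illustrative:

```python
import numpy as np

def phase_damping(rho, lam):
    """One step of a phase-damping channel with strength lam on a qubit."""
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - lam)]])
    K1 = np.array([[0.0, 0.0], [0.0, np.sqrt(lam)]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

rho = np.full((2, 2), 0.5, dtype=complex)  # the coherent state |+><+|
for _ in range(10):
    rho = phase_damping(rho, 0.3)
print(rho)  # populations (diagonal) unchanged; coherences have decayed
```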
In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.
In quantum computing and specifically the quantum circuit model of computation, a quantum logic gate is a basic quantum circuit operating on a small number of qubits. Quantum logic gates are the building blocks of quantum circuits, like classical logic gates are for conventional digital circuits.
Quantum error correction (QEC) is a set of techniques used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorised as essential to achieve fault tolerant quantum computing that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements. Effective quantum error correction would allow quantum computers with low qubit fidelity to execute algorithms of higher complexity or greater circuit depth.
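The simplest illustration of the idea is the three-qubit bit-flip repetition code: the logical state is spread over three qubits, stabilizer parities locate a single flipped qubit, and a corrective flip restores the state. A minimal state-vector sketch (not a fault-tolerant implementation; names and the injected error are illustrative):

```python
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)

def op(gate, i):
    """Embed a single-qubit gate at position i of a 3-qubit register."""
    mats = [I2, I2, I2]
    mats[i] = gate
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

# encode a|0> + b|1> as a|000> + b|111>
a, b = 0.6, 0.8
logical = np.zeros(8)
logical[0b000], logical[0b111] = a, b

corrupted = op(X, 1) @ logical  # inject a bit flip on qubit 1

# syndrome: signs of the stabilizer expectations <Z0 Z1> and <Z1 Z2>
s1 = np.sign(corrupted @ op(Z, 0) @ op(Z, 1) @ corrupted)
s2 = np.sign(corrupted @ op(Z, 1) @ op(Z, 2) @ corrupted)
flipped = {(-1, 1): 0, (-1, -1): 1, (1, -1): 2}.get((s1, s2))

recovered = corrupted if flipped is None else op(X, flipped) @ corrupted
print(np.allclose(recovered, logical))  # True
```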
Superconducting quantum computing is a branch of solid state physics and quantum computing that implements superconducting electronic circuits using superconducting qubits as artificial atoms, or quantum dots. For superconducting qubits, the two logic states are the ground state and the excited state, denoted $|g\rangle$ and $|e\rangle$ respectively. Research in superconducting quantum computing is conducted by companies such as Google, IBM, IMEC, BBN Technologies, Rigetti, and Intel. Many recently developed QPUs use superconducting architecture.
Quantum neural networks are computational neural network models which are based on the principles of quantum mechanics. The first ideas on quantum neural computation were published independently in 1995 by Subhash Kak and Ron Chrisley, engaging with the theory of quantum mind, which posits that quantum effects play a role in cognitive function. However, typical research in quantum neural networks involves combining classical artificial neural network models with the advantages of quantum information in order to develop more efficient algorithms. One important motivation for these investigations is the difficulty to train classical neural networks, especially in big data applications. The hope is that features of quantum computing such as quantum parallelism or the effects of interference and entanglement can be used as resources. Since the technological implementation of a quantum computer is still in a premature stage, such quantum neural network models are mostly theoretical proposals that await their full implementation in physical experiments.
The one-way quantum computer, also known as measurement-based quantum computer (MBQC), is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.
Entanglement distillation is the transformation of N copies of an arbitrary entangled state into some number of approximately pure Bell pairs, using only local operations and classical communication. Entanglement distillation can overcome the degenerative influence of noisy quantum channels by transforming previously shared, less-entangled pairs into a smaller number of maximally-entangled pairs.
Quantum block codes are useful in quantum computing and in quantum communications. The encoding circuit for a large block code typically has a high complexity although those for modern codes do have lower complexity.
Linear optical quantum computing or linear optics quantum computation (LOQC), also photonic quantum computing (PQC), is a paradigm of quantum computation, allowing (under certain conditions, described below) universal quantum computation. LOQC uses photons as information carriers, mainly uses linear optical elements, or optical instruments (including reciprocal mirrors and waveplates) to process quantum information, and uses photon detectors and quantum memories to detect and store quantum information.
Quantum machine learning is the integration of quantum algorithms within machine learning programs.
Boson sampling is a restricted model of non-universal quantum computation introduced by Scott Aaronson and Alex Arkhipov after the original work of Lidror Troyansky and Naftali Tishby, which explored the possible use of boson scattering to evaluate expectation values of permanents of matrices. The model consists of sampling from the probability distribution of identical bosons scattered by a linear interferometer. Although the problem is well defined for any bosonic particles, its photonic version is currently considered the most promising platform for a scalable implementation of a boson sampling device, which makes it a non-universal approach to linear optical quantum computing. Moreover, while not universal, the boson sampling scheme is strongly believed to implement computing tasks which are hard to implement with classical computers by using far fewer physical resources than a full linear-optical quantum computing setup. This advantage makes it an ideal candidate for demonstrating the power of quantum computation in the near term.
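The classical hardness of boson sampling traces back to the matrix permanent, whose best known exact algorithms run in exponential time. A compact illustration is Ryser's formula:

```python
from itertools import combinations

import numpy as np

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula: O(2^n * n^2) time."""
    n = A.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            total += (-1) ** k * np.prod(A[:, cols].sum(axis=1))
    return (-1) ** n * total

print(permanent(np.ones((4, 4))))  # 24.0 = 4!; the cost doubles with each row
```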
Quantum optimization algorithms are quantum algorithms that are used to solve optimization problems. Mathematical optimization deals with finding the best solution to a problem from a set of possible solutions. Mostly, the optimization problem is formulated as a minimization problem, where one tries to minimize an error which depends on the solution: the optimal solution has the minimal error. Different optimization techniques are applied in various fields such as mechanics, economics and engineering, and as the complexity and amount of data involved rise, more efficient ways of solving optimization problems are needed. Quantum computing may allow problems which are not practically feasible on classical computers to be solved, or suggest a considerable speed up with respect to the best known classical algorithm.
In quantum computing, quantum supremacy or quantum advantage is the goal of demonstrating that a programmable quantum computer can solve a problem that no classical computer can solve in any feasible amount of time, irrespective of the usefulness of the problem. The term was coined by John Preskill in 2011, but the concept dates to Yuri Manin's 1980 and Richard Feynman's 1981 proposals of quantum computing.
Cross-entropy benchmarking (XEB) is a quantum benchmarking protocol which can be used to demonstrate quantum supremacy. In XEB, a random quantum circuit is executed on a quantum computer multiple times in order to collect a set of samples in the form of bitstrings $\{x_i\}$. The bitstrings are then used to calculate the cross-entropy benchmark fidelity via a classical computer, given by $F_{\mathrm{XEB}} = 2^{n}\,\langle P(x_i)\rangle_i - 1$, where $n$ is the number of qubits in the circuit and $P(x_i)$ is the probability of bitstring $x_i$ computed for the ideal quantum circuit.
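A minimal sketch of the estimate, assuming the ideal output probabilities $P(x)$ are available from a classical simulation of the same circuit (function name illustrative):

```python
import numpy as np

def linear_xeb_fidelity(samples, ideal_state):
    """Linear XEB fidelity from sampled bitstring indices.

    samples:     integer indices of the measured bitstrings x_i
    ideal_state: classically simulated state vector of the same circuit
    """
    n = int(np.log2(ideal_state.size))
    probs = np.abs(ideal_state) ** 2          # ideal P(x)
    return 2 ** n * probs[np.asarray(samples)].mean() - 1

# ~1 for samples from the ideal circuit, ~0 for uniformly random bitstrings
```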
This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.