Cross-entropy benchmarking


Cross-entropy benchmarking (also referred to as XEB) is a quantum benchmarking protocol which can be used to demonstrate quantum supremacy. [1] In XEB, a random quantum circuit is executed on a quantum computer multiple times in order to collect a set of $k$ samples in the form of bitstrings $\{x_1, \ldots, x_k\}$. The bitstrings are then used to calculate the cross-entropy benchmark fidelity ($\mathcal{F}_{\text{XEB}}$) via a classical computer, given by

$\mathcal{F}_{\text{XEB}} = 2^n \left\langle P(x_i) \right\rangle_i - 1 = \frac{2^n}{k} \sum_{i=1}^{k} P(x_i) - 1,$

where $n$ is the number of qubits in the circuit and $P(x_i)$ is the probability of bitstring $x_i$ computed for the ideal quantum circuit. If $\mathcal{F}_{\text{XEB}} = 1$, the samples were collected from a noiseless quantum computer. If $\mathcal{F}_{\text{XEB}} = 0$, then the samples could have been obtained via random guessing. [2] This means that if a quantum computer did generate those samples, then the quantum computer is too noisy and thus has no chance of performing beyond-classical computations. Since it takes an exponential amount of resources to classically simulate a quantum circuit, there comes a point when the biggest supercomputer running the best classical algorithm for simulating quantum circuits cannot compute the XEB. Crossing this point is known as achieving quantum supremacy; and after entering the quantum supremacy regime, XEB can only be estimated. [3]
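
The estimator above is straightforward to evaluate once the ideal output probabilities are known. The following is a minimal numpy sketch of the idea, not the protocol as run on hardware: a Haar-random unitary stands in for the random circuit, and the qubit count, sample size, and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim, rng):
    """Haar-random unitary from the QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix for the Haar measure

n = 10                               # number of qubits (illustrative)
dim = 2 ** n
u = random_unitary(dim, rng)

# Ideal output probabilities P(x) = |<x|U|0...0>|^2 of the "circuit".
ideal_probs = np.abs(u[:, 0]) ** 2

def xeb_fidelity(samples, ideal_probs, n):
    """F_XEB = 2^n <P(x_i)>_i - 1, averaged over the observed bitstrings."""
    return 2 ** n * ideal_probs[samples].mean() - 1

k = 100_000
noiseless = rng.choice(dim, size=k, p=ideal_probs)  # a perfect device
guessing = rng.integers(dim, size=k)                # uniform random bitstrings

print(xeb_fidelity(noiseless, ideal_probs, n))  # close to 1
print(xeb_fidelity(guessing, ideal_probs, n))   # close to 0
```

In the supremacy regime the bottleneck is exactly the `ideal_probs` array: computing it requires simulating the circuit, which is what becomes classically infeasible.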

The Sycamore processor was the first to demonstrate quantum supremacy via XEB. Instances of random circuits with $n = 53$ qubits and 20 cycles were run to obtain an XEB of $(2.24 \pm 0.21) \times 10^{-3}$. [3] Generating $10^6$ samples took 200 seconds on the quantum processor, when it would have taken 10,000 years on Summit at the time of the experiment. Improvements in classical algorithms have since shortened the runtime to about a week on Sunway TaihuLight, thus collapsing Sycamore's claim to quantum supremacy. [4] As of 2021, the latest demonstration of quantum supremacy, by Zuchongzhi 2.1 with $n = 60$ qubits, 24 cycles and an XEB of $(3.66 \pm 0.345) \times 10^{-4}$, holds. It takes around 4.2 hours to generate $10^7$ samples on Zuchongzhi 2.1, when it would take an estimated 10,000 years on Sunway. [4]


Related Research Articles

<span class="mw-page-title-main">Quantum computing</span> Computer hardware technology that uses quantum mechanics

A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. Theoretically a large-scale quantum computer could break some widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.

<span class="mw-page-title-main">Quantum teleportation</span> Physical phenomenon

Quantum teleportation is a technique for transferring quantum information from a sender at one location to a receiver some distance away. While teleportation is commonly portrayed in science fiction as a means to transfer physical objects from one location to the next, quantum teleportation only transfers quantum information. The sender does not have to know the particular quantum state being transferred. Moreover, the location of the recipient can be unknown, but to complete the quantum teleportation, classical information needs to be sent from sender to receiver. Because classical information needs to be sent, quantum teleportation cannot occur faster than the speed of light.
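
As a sanity check of the protocol's logic, here is a minimal statevector simulation under the usual textbook conventions (qubit 0 holds the state to send, qubits 1 and 2 share a Bell pair); all names and the random input state are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)   # |0><0|
P1 = np.array([[0, 0], [0, 1]], dtype=complex)   # |1><1|

def op(gates):
    """Tensor single-qubit operators into a 3-qubit operator (qubit 0 first)."""
    out = np.array([[1]], dtype=complex)
    for g in gates:
        out = np.kron(out, g)
    return out

CNOT01 = op([P0, I, I]) + op([P1, X, I])   # control qubit 0, target qubit 1

# Qubit 0: a random state to teleport. Qubits 1 and 2: a shared Bell pair.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Sender: CNOT(0 -> 1), then H on qubit 0, then measure qubits 0 and 1.
state = op([H, I, I]) @ CNOT01 @ state
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1   # the two classical bits

# Project onto the observed values of qubits 0 and 1 and renormalize.
keep = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)])
state = np.where(keep, state, 0)
state /= np.linalg.norm(state)

# Receiver: corrections conditioned on the classically transmitted bits.
if m1:
    state = op([I, I, X]) @ state
if m0:
    state = op([I, I, Z]) @ state

received = state.reshape(4, 2)[m0 * 2 + m1]   # amplitudes of qubit 2
print(np.allclose(received, psi))             # True: the state arrived intact
```

The two `if` branches are where the classical bits enter; without them the receiver holds a maximally mixed qubit, which is why teleportation cannot transmit information faster than light.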

Shor's algorithm is a quantum algorithm for finding the prime factors of an integer. It was developed in 1994 by the American mathematician Peter Shor. It is one of the few known quantum algorithms with compelling potential applications and strong evidence of superpolynomial speedup compared to the best known classical (non-quantum) algorithms. On the other hand, factoring numbers of practical significance requires far more qubits than will be available in the near future. Another concern is that noise in quantum circuits may undermine results, requiring additional qubits for quantum error correction.

In quantum computing, Grover's algorithm, also known as the quantum search algorithm, is a quantum algorithm for unstructured search that finds with high probability the unique input to a black box function that produces a particular output value, using just $O(\sqrt{N})$ evaluations of the function, where $N$ is the size of the function's domain. It was devised by Lov Grover in 1996.
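
A quadratic speedup of this kind is easy to see in a direct matrix simulation. The sketch below implements the standard Grover iteration (an oracle phase flip followed by inversion about the mean) for an assumed toy search space of $N = 8$; the marked index is arbitrary.

```python
import numpy as np

n = 3                       # qubits, so the search space has N = 8 entries
N = 2 ** n
marked = 5                  # the index the oracle recognizes (illustrative)

state = np.full(N, 1 / np.sqrt(N))           # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1                  # phase-flip the marked entry

s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

# About (pi/4) * sqrt(N) iterations are optimal; here that rounds to 2.
for _ in range(round(np.pi / 4 * np.sqrt(N))):
    state = diffusion @ (oracle @ state)

print(np.abs(state) ** 2)   # probability ~0.95 concentrated on index 5
```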

In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
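
As a worked instance of the Born rule, the snippet below takes a two-outcome state with complex amplitudes, squares their magnitudes to get probabilities, and samples repeated measurements; the particular amplitudes are arbitrary.

```python
import numpy as np

# A qubit state a|0> + b|1>; the amplitudes a, b are complex numbers.
amps = np.array([1 + 1j, 1 - 1j]) / 2   # normalized: |a|^2 + |b|^2 = 1

# Born rule: each outcome occurs with probability |amplitude|^2.
probs = np.abs(amps) ** 2
print(probs)                            # [0.5, 0.5]

# Measuring many identically prepared systems reproduces these frequencies.
rng = np.random.default_rng(0)
counts = np.bincount(rng.choice(2, size=10_000, p=probs))
print(counts / 10_000)                  # approximately [0.5, 0.5]
```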

<span class="mw-page-title-main">Quantum circuit</span> Model of quantum computing

In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.
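
Concretely, a circuit in this model is just an initialization followed by a sequence of unitary gates and a final measurement. A minimal numpy sketch, using the standard two-qubit example of preparing a Bell state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # initialize qubits to |00>
state = np.kron(H, I) @ state                  # gate 1: H on qubit 0
state = CNOT @ state                           # gate 2: CNOT(0 -> 1)

# Measurement statistics of the resulting Bell state (|00> + |11>)/sqrt(2).
print(np.abs(state) ** 2)                      # [0.5, 0, 0, 0.5]
```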

Quantum error correction (QEC) is a set of techniques used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorised as essential to achieve fault tolerant quantum computing that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements. Effective quantum error correction would allow quantum computers with low qubit fidelity to execute algorithms of higher complexity or greater circuit depth.
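
The simplest illustration of the idea is the three-qubit bit-flip code: the logical state is spread over three physical qubits, parity checks locate a single flipped qubit without measuring the encoded amplitudes, and a correcting gate undoes the error. A self-contained numpy sketch, with the error position and amplitudes chosen arbitrarily:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op(gates):
    """Tensor single-qubit operators into a 3-qubit operator (qubit 0 first)."""
    out = np.array([[1]], dtype=complex)
    for g in gates:
        out = np.kron(out, g)
    return out

# Encode a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000], encoded[0b111] = a, b

corrupted = op([I, X, I]) @ encoded     # a bit-flip error on qubit 1

# Syndrome: expectation values of the parity checks Z0Z1 and Z1Z2.
# They are +/-1 and reveal the error position, not the amplitudes a, b.
s01 = np.vdot(corrupted, op([Z, Z, I]) @ corrupted).real
s12 = np.vdot(corrupted, op([I, Z, Z]) @ corrupted).real

correction = {(1, 1): [I, I, I], (-1, 1): [X, I, I],
              (-1, -1): [I, X, I], (1, -1): [I, I, X]}
recovered = op(correction[round(s01), round(s12)]) @ corrupted
print(np.allclose(recovered, encoded))  # True: the error is undone
```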

In quantum computing, a graph state is a special type of multi-qubit state that can be represented by a graph. Each qubit is represented by a vertex of the graph, and there is an edge between every interacting pair of qubits. In particular, they are a convenient way of representing certain types of entangled states.

In physics, a quantum amplifier is an amplifier that uses quantum mechanical methods to amplify a signal; examples include the active elements of lasers and optical amplifiers.

<span class="mw-page-title-main">One-way quantum computer</span> Method of quantum computing

The one-way quantum computer, also known as measurement-based quantum computer (MBQC), is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.

In quantum information and quantum computing, a cluster state is a type of highly entangled state of multiple qubits. Cluster states are generated in lattices of qubits with Ising-type interactions. A cluster C is a connected subset of a d-dimensional lattice, and a cluster state is a pure state of the qubits located on C. They differ from other types of entangled states, such as GHZ states or W states, in that it is more difficult to eliminate the quantum entanglement of a cluster state. Another way of thinking of cluster states is as a particular instance of graph states, where the underlying graph is a connected subset of a d-dimensional lattice. Cluster states are especially useful in the context of the one-way quantum computer.
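
Operationally, a cluster state (like any graph state) is prepared by putting every qubit in $|+\rangle$ and applying a controlled-Z across every edge of the graph. A minimal sketch for a three-qubit chain, the smallest 1-D lattice with two edges:

```python
import numpy as np

n = 3                        # a 1-D chain of 3 qubits
edges = [(0, 1), (1, 2)]     # neighbouring pairs on the lattice

# Every qubit starts in |+> = (|0> + |1>)/sqrt(2).
state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)

# Controlled-Z across each edge: negate amplitudes where both qubits are 1.
for a, b in edges:
    for i in range(2 ** n):
        if (i >> (n - 1 - a)) & 1 and (i >> (n - 1 - b)) & 1:
            state[i] *= -1

# The sign pattern on the basis states encodes the entanglement.
print(np.round(state * np.sqrt(2 ** n)).real)   # [1 1 1 -1 1 1 -1 1]
```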

<span class="mw-page-title-main">Mutually unbiased bases</span>

In quantum information theory, a set of bases in the Hilbert space $\mathbb{C}^d$ are said to be mutually unbiased if, when a system is prepared in an eigenstate of one of the bases, all outcomes of a measurement with respect to another basis in the set are predicted to occur with equal probability $1/d$.
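
For $d = 2$, the computational basis and the Hadamard basis are the standard example; the check below verifies that every pairwise overlap has squared magnitude $1/d$.

```python
import numpy as np

d = 2
computational = np.eye(d)                            # columns: |0>, |1>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(d)  # columns: |+>, |->

# Mutual unbiasedness: |<e_i|f_j>|^2 = 1/d for every pair of basis vectors.
overlaps = np.abs(computational.conj().T @ hadamard) ** 2
print(overlaps)   # every entry equals 1/d = 0.5
```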

In quantum computing, the quantum Fourier transform (QFT) is a linear transformation on quantum bits, and is the quantum analogue of the discrete Fourier transform. The quantum Fourier transform is a part of many quantum algorithms, notably Shor's algorithm for factoring and computing the discrete logarithm, the quantum phase estimation algorithm for estimating the eigenvalues of a unitary operator, and algorithms for the hidden subgroup problem. The quantum Fourier transform was discovered by Don Coppersmith. With small modifications to the QFT, it can also be used for performing fast integer arithmetic operations such as addition and multiplication.
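
On $n$ qubits the QFT acts as the $2^n \times 2^n$ discrete Fourier transform matrix with unitary normalization. A small numpy sketch constructing it and applying it to a basis state:

```python
import numpy as np

def qft_matrix(dim):
    """The DFT matrix with 1/sqrt(dim) normalization, i.e. the QFT as a matrix."""
    omega = np.exp(2j * np.pi / dim)
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

F = qft_matrix(8)                               # QFT on 3 qubits
print(np.allclose(F.conj().T @ F, np.eye(8)))   # unitary: True

# On a basis state |x> the QFT produces a uniform-magnitude phase ramp;
# interference between such ramps is what exposes periodicity.
state = np.zeros(8)
state[5] = 1
print(np.round(F @ state, 3))
```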

Linear optical quantum computing or linear optics quantum computation (LOQC), also photonic quantum computing (PQC), is a paradigm of quantum computation, allowing (under certain conditions, described below) universal quantum computation. LOQC uses photons as information carriers, mainly uses linear optical elements, or optical instruments (including reciprocal mirrors and waveplates) to process quantum information, and uses photon detectors and quantum memories to detect and store quantum information.

The Harrow–Hassidim–Lloyd algorithm or HHL algorithm is a quantum algorithm for numerically solving a system of linear equations, designed by Aram Harrow, Avinatan Hassidim, and Seth Lloyd. The algorithm estimates the result of a scalar measurement on the solution vector to a given linear system of equations.
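
For orientation, the quantity HHL estimates can be written down classically: solve $Ax = b$, normalize the solution to a quantum state $|x\rangle$, and evaluate an expectation value $\langle x|M|x\rangle$. The sketch below computes that target value directly; $A$, $b$, and $M$ are illustrative stand-ins, not part of the algorithm itself.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])     # Hermitian system matrix
b = np.array([1.0, 0.0])
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])    # the observable to be measured

x = np.linalg.solve(A, b)      # classical solve of A x = b
x /= np.linalg.norm(x)         # HHL prepares the unit vector |x>
print(x @ M @ x)               # the scalar HHL would estimate
```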

Quantum optimization algorithms are quantum algorithms that are used to solve optimization problems. Mathematical optimization deals with finding the best solution to a problem from a set of possible solutions. Mostly, the optimization problem is formulated as a minimization problem, where one tries to minimize an error which depends on the solution: the optimal solution has the minimal error. Different optimization techniques are applied in various fields such as mechanics, economics and engineering, and as the complexity and amount of data involved rise, more efficient ways of solving optimization problems are needed. Quantum computing may allow problems which are not practically feasible on classical computers to be solved, or suggest a considerable speed up with respect to the best known classical algorithm.

The one clean qubit model of computation is performed on an $n$-qubit system with one qubit in a pure state and $n-1$ qubits in the maximally mixed state. This model was motivated by the highly mixed states that are prevalent in nuclear magnetic resonance quantum computers. It is described by the density matrix $\rho = |0\rangle\langle 0| \otimes \frac{I}{2^{n-1}}$, where $I$ is the identity matrix on the remaining $n-1$ qubits. In computational complexity theory, DQC1, also known as deterministic quantum computation with one clean qubit, is the class of decision problems solvable by a one clean qubit machine in polynomial time, upon measuring the first qubit, with an error probability of at most 1/poly(n) for all instances.
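
A minimal numpy sketch of this state and of the canonical DQC1 task, trace estimation, with a Haar-random unitary as the illustrative $U$: after a Hadamard on the clean qubit and a controlled-$U$ on the mixed register, the expectation of $X$ on the clean qubit equals $\mathrm{Re}\,\mathrm{tr}(U)/2^{n-1}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                       # 1 clean qubit + (n - 1) maximally mixed qubits
d = 2 ** (n - 1)

# The DQC1 input state: rho = |0><0| (x) I / 2^(n-1).
ket0 = np.array([[1, 0], [0, 0]], dtype=complex)
rho = np.kron(ket0, np.eye(d) / d)
print(np.isclose(np.trace(rho).real, 1.0))       # a valid density matrix

# Trace estimation: H on the clean qubit, then a controlled-U.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
U, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
CU = np.block([[np.eye(d), np.zeros((d, d))],
               [np.zeros((d, d)), U]])
circuit = CU @ np.kron(H, np.eye(d))
rho = circuit @ rho @ circuit.conj().T

# <X> on the clean qubit reveals the (normalized) trace of U.
X = np.array([[0, 1], [1, 0]], dtype=complex)
expect_x = np.trace(np.kron(X, np.eye(d)) @ rho).real
print(np.isclose(expect_x, np.trace(U).real / d))   # True
```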

Feynman's algorithm is an algorithm that is used to simulate the operations of a quantum computer on a classical computer. It is based on the path integral formulation of quantum mechanics, which was formulated by Richard Feynman.
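
The core trick is to expand the output amplitude of a layered circuit as a sum over intermediate basis states ("paths"), trading memory for a sum that grows exponentially with depth. A small sketch with illustrative random layers:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim):
    q, _ = np.linalg.qr(rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim)))
    return q

dim = 4                                            # a 2-qubit toy circuit
layers = [random_unitary(dim) for _ in range(3)]   # three gate layers
x, y = 0, 3                                        # input and output bitstrings

# Sum over all intermediate basis states ("paths") of products of
# per-layer transition amplitudes <z'|U|z>.
amp = 0
for z1 in range(dim):
    for z2 in range(dim):
        amp += layers[2][y, z2] * layers[1][z2, z1] * layers[0][z1, x]

# Matches direct matrix multiplication, but needs only O(1) amplitude storage.
print(np.isclose(amp, (layers[2] @ layers[1] @ layers[0])[y, x]))   # True
```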

This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.

Quantum random circuits (QRC) are a way of incorporating an element of randomness into the local unitary operations and measurements of a quantum circuit. The idea is similar to that of random matrix theory: use the QRC to obtain almost exact results for non-integrable, hard-to-solve problems by averaging over an ensemble of outcomes. This incorporation of randomness into the circuits has many possible advantages, some of which are (i) the validation of quantum computers, which is the method that Google used when it claimed quantum supremacy in 2019, and (ii) understanding the universal structure of non-equilibrium and thermalization processes in quantum many-body dynamics.
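
The statistical regularity that makes such ensemble averaging useful, and that XEB validation relies on, is that the output probabilities of deep random circuits approach the Porter-Thomas (exponential) distribution. A sketch checking its first two moments, with Haar-random unitaries standing in for random circuits:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(dim):
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

n, trials = 6, 200
dim = 2 ** n

# Output probabilities |<x|U|0...0>|^2 pooled over an ensemble of circuits.
probs = np.concatenate([np.abs(haar_unitary(dim)[:, 0]) ** 2
                        for _ in range(trials)])

# Porter-Thomas predicts that dim * p is exponentially distributed:
# mean 1 and second moment 2.
scaled = dim * probs
print(scaled.mean(), (scaled ** 2).mean())   # ~1.0 and ~2.0
```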

References

  1. Boixo, S.; et al. (2018). "Characterizing Quantum Supremacy in Near-Term Devices". Nature Physics. 14 (6): 595–600. arXiv:1608.00263. Bibcode:2018NatPh..14..595B. doi:10.1038/s41567-018-0124-x. S2CID 4167494.
  2. Aaronson, S. (2021). "Open Problems Related to Quantum Query Complexity". arXiv:2109.06917 [quant-ph].
  3. Arute, F.; et al. (2019). "Quantum supremacy using a programmable superconducting processor". Nature. 574 (7779): 505–510. arXiv:1910.11333. Bibcode:2019Natur.574..505A. doi:10.1038/s41586-019-1666-5. PMID 31645734. S2CID 204836822.
  4. Liu, X.; et al. (2021). "Redefining the Quantum Supremacy Baseline With a New Generation Sunway Supercomputer". arXiv:2111.01066 [quant-ph].