Cross-entropy benchmarking

Cross-entropy benchmarking (also referred to as XEB) is a quantum benchmarking protocol which can be used to demonstrate quantum supremacy. [1] In XEB, a random quantum circuit is executed on a quantum computer multiple times in order to collect a set of samples in the form of bitstrings $x_1, \ldots, x_k$. The bitstrings are then used to calculate the cross-entropy benchmark fidelity ($F_{\text{XEB}}$) via a classical computer, given by

$F_{\text{XEB}} = 2^{n} \langle P(x_i) \rangle_i - 1,$

where $n$ is the number of qubits in the circuit, $P(x_i)$ is the probability of bitstring $x_i$ for the ideal quantum circuit, and the average $\langle \cdot \rangle_i$ is taken over the observed bitstrings. If $F_{\text{XEB}} = 1$, the samples were collected from a noiseless quantum computer. If $F_{\text{XEB}} = 0$, then the samples could have been obtained via random guessing. [2] This means that if a quantum computer did generate those samples, then the quantum computer is too noisy and thus has no chance of performing beyond-classical computations. Since it takes an exponential amount of resources to classically simulate a quantum circuit, there comes a point when even the biggest supercomputer running the best classical algorithm for simulating quantum circuits cannot compute the XEB. Crossing this point is known as achieving quantum supremacy; after entering the quantum supremacy regime, XEB can only be estimated. [3]
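
The calculation can be illustrated with a short numerical sketch. In the example below (an illustration only, not drawn from any published implementation), a Haar-random unitary generated with NumPy stands in for the ideal random circuit, and the helper names `haar_random_unitary` and `xeb_fidelity` are illustrative:

```python
import numpy as np

def haar_random_unitary(dim, rng):
    """Sample a Haar-random unitary by QR-decomposing a complex Gaussian matrix."""
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # scale column j by phases[j] so the result is exactly Haar

def xeb_fidelity(ideal_probs, samples, n_qubits):
    """Linear XEB fidelity: F_XEB = 2^n * <P(x_i)>_i - 1."""
    return 2 ** n_qubits * np.mean(ideal_probs[samples]) - 1.0

n = 5                                 # number of qubits
dim = 2 ** n
rng = np.random.default_rng(seed=0)

# Stand-in for the ideal random circuit: output distribution of U|0...0>.
u = haar_random_unitary(dim, rng)
ideal_probs = np.abs(u[:, 0]) ** 2
ideal_probs /= ideal_probs.sum()

# A noiseless device samples bitstrings from the ideal distribution;
# a hopelessly noisy device effectively samples uniformly at random.
noiseless_samples = rng.choice(dim, size=50_000, p=ideal_probs)
random_samples = rng.integers(dim, size=50_000)

print("F_XEB (noiseless):", xeb_fidelity(ideal_probs, noiseless_samples, n))      # close to 1
print("F_XEB (random guessing):", xeb_fidelity(ideal_probs, random_samples, n))   # close to 0
```

For a noiseless sampler the estimate approaches 1 as the number of qubits grows (for Porter-Thomas statistics its expectation is $(2^n - 1)/(2^n + 1)$, about 0.94 for the five qubits used here), while uniform random guessing gives a value near 0.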

The Sycamore processor was the first to demonstrate quantum supremacy via XEB. Instances of random circuits with $n = 53$ qubits and 20 cycles were run to obtain an XEB of roughly $2 \times 10^{-3}$. [3] Generating the samples took 200 seconds on the quantum processor, whereas it was estimated to take 10,000 years on Summit at the time of the experiment. Improvements in classical algorithms have since shortened the estimated runtime to about a week on Sunway TaihuLight, thus collapsing Sycamore's claim to quantum supremacy. [4] As of 2021, the latest demonstration of quantum supremacy, by Zuchongzhi 2.1 with $n = 60$ qubits, 24 cycles and an XEB of roughly $3.7 \times 10^{-4}$, still holds. It takes around 4 hours to generate the samples on Zuchongzhi 2.1, whereas it would take an estimated 10,000 years on Sunway. [4]

Related Research Articles

Quantum computing: Technology that uses quantum mechanics

A quantum computer is a computer that takes advantage of quantum mechanical phenomena.

Shor's algorithm is a quantum algorithm for finding the prime factors of an integer. It was developed in 1994 by the American mathematician Peter Shor. It is one of the few known quantum algorithms with compelling potential applications and strong evidence of superpolynomial speedup compared to best known classical algorithms. On the other hand, factoring numbers of practical significance requires far more qubits than available in the near future. Another concern is that noise in quantum circuits may undermine results, requiring additional qubits for quantum error correction.

In quantum computing, Grover's algorithm, also known as the quantum search algorithm, is a quantum algorithm for unstructured search that finds with high probability the unique input to a black box function that produces a particular output value, using just $O(\sqrt{N})$ evaluations of the function, where $N$ is the size of the function's domain. It was devised by Lov Grover in 1996.
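
As a rough illustration of the quadratic saving in queries, here is a minimal classical statevector simulation of the algorithm (a sketch; the qubit count and marked item are arbitrary choices, and the helper name `grover_search` is illustrative):

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm on a statevector and return the success probability."""
    big_n = 2 ** n_qubits
    state = np.full(big_n, 1 / np.sqrt(big_n))           # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(big_n)))
    for _ in range(iterations):
        state[marked] *= -1                               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state                  # diffusion: inversion about the mean
    return iterations, abs(state[marked]) ** 2

iters, p_success = grover_search(n_qubits=10, marked=123)
print(f"{iters} oracle queries, success probability {p_success:.4f}")
# ~25 queries for N = 1024 entries, versus ~512 expected classical lookups.
```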

In quantum physics, a measurement is the testing or manipulation of a physical system to yield a numerical result. A fundamental feature of quantum theory is that the predictions it makes are probabilistic. The procedure for finding a probability involves combining a quantum state, which mathematically describes a quantum system, with a mathematical representation of the measurement to be performed on that system. The formula for this calculation is known as the Born rule. For example, a quantum particle like an electron can be described by a quantum state that associates to each point in space a complex number called a probability amplitude. Applying the Born rule to these amplitudes gives the probabilities that the electron will be found in one region or another when an experiment is performed to locate it. This is the best the theory can do; it cannot say for certain where the electron will be found. The same quantum state can also be used to make a prediction of how the electron will be moving, if an experiment is performed to measure its momentum instead of its position. The uncertainty principle implies that, whatever the quantum state, the range of predictions for the electron's position and the range of predictions for its momentum cannot both be narrow. Some quantum states imply a near-certain prediction of the result of a position measurement, but the result of a momentum measurement will be highly unpredictable, and vice versa. Furthermore, the fact that nature violates the statistical conditions known as Bell inequalities indicates that the unpredictability of quantum measurement results cannot be explained away as due to ignorance about "local hidden variables" within quantum systems.
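
The Born rule calculation described above can be illustrated numerically. The sketch below (an illustration only; the grid, wavepacket width, and normalization conventions are arbitrary choices) turns discretized position amplitudes into probabilities by squaring their magnitudes, and uses a Fourier transform to show that a state sharply localized in position has a broad momentum distribution:

```python
import numpy as np

# Discretize space and build a narrow Gaussian wavepacket (position amplitudes).
x = np.linspace(-10, 10, 2048)
dx = x[1] - x[0]
sigma = 0.2
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)             # normalize

# Born rule: probability density is the squared magnitude of the amplitude.
prob_x = np.abs(psi) ** 2
print("P(-1 < x < 1) =", np.sum(prob_x[(x > -1) & (x < 1)]) * dx)   # ~1: position is sharp

# Momentum amplitudes are the Fourier transform of the position amplitudes.
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi
dk = k[1] - k[0]
prob_k = np.abs(phi) ** 2
prob_k /= np.sum(prob_k) * dk                              # normalize the momentum density

# A narrow position distribution implies a wide momentum distribution.
std_x = np.sqrt(np.sum(x**2 * prob_x) * dx)
std_k = np.sqrt(np.sum(k**2 * prob_k) * dk)
print("std(x) =", std_x, " std(k) =", std_k, " product =", std_x * std_k)  # product ~0.5
```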

Quantum circuit: Model of quantum computing

In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.

Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorised as essential to achieve fault tolerant quantum computing that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum preparation, and faulty measurements. This would allow algorithms of greater circuit depth.

In physics, a quantum amplifier is an amplifier that uses quantum mechanical methods to amplify a signal; examples include the active elements of lasers and optical amplifiers.

One-way quantum computer: Method of quantum computing

The one-way or measurement-based quantum computer (MBQC) is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.

A quantum t-design is a probability distribution over either pure quantum states or unitary operators which can duplicate properties of the probability distribution over the Haar measure for polynomials of degree t or less. Specifically, the average of any polynomial function of degree t over the design is exactly the same as the average over Haar measure. Here the Haar measure is a uniform probability distribution over all quantum states or over all unitary operators. Quantum t-designs are so called because they are analogous to t-designs in classical statistics, which arose historically in connection with the problem of design of experiments. Two particularly important types of t-designs in quantum mechanics are projective and unitary t-designs.
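
As a small concrete check of the simplest case, the sketch below verifies numerically that the single-qubit Pauli group acts as a unitary 1-design: averaging the degree-1 expression $U\rho U^{\dagger}$ over the four Paulis gives the same result as the Haar average, namely the maximally mixed state $I/2$ (a minimal illustration, not drawn from any particular library):

```python
import numpy as np

# Single-qubit Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

def random_density_matrix(rng):
    """A random single-qubit pure state |psi><psi|."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)
    return np.outer(v, v.conj())

rng = np.random.default_rng(1)
rho = random_density_matrix(rng)

# Degree-1 polynomial in U: the twirl U rho U^dagger, averaged over the design.
design_avg = sum(p @ rho @ p.conj().T for p in paulis) / len(paulis)

# The Haar average of the same quantity is the maximally mixed state I/2.
print(np.allclose(design_avg, I / 2))   # True: the Pauli group is a unitary 1-design
```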

Entanglement distillation is the transformation of N copies of an arbitrary entangled state into some number of approximately pure Bell pairs, using only local operations and classical communication.

Linear optical quantum computing or linear optics quantum computation (LOQC), also photonic quantum computing (PQC), is a paradigm of quantum computation, allowing (under certain conditions, described below) universal quantum computation. LOQC uses photons as information carriers, mainly uses linear optical elements, or optical instruments (including reciprocal mirrors and waveplates) to process quantum information, and uses photon detectors and quantum memories to detect and store quantum information.

The Harrow–Hassidim–Lloyd algorithm or HHL algorithm is a quantum algorithm for numerically solving a system of linear equations, designed by Aram Harrow, Avinatan Hassidim, and Seth Lloyd. The algorithm estimates the result of a scalar measurement on the solution vector to a given linear system of equations.

Boson sampling is a restricted model of non-universal quantum computation introduced by Scott Aaronson and Alex Arkhipov after the original work of Lidror Troyansky and Naftali Tishby, that explored possible usage of boson scattering to evaluate expectation values of permanents of matrices. The model consists of sampling from the probability distribution of identical bosons scattered by a linear interferometer. Although the problem is well defined for any bosonic particles, its photonic version is currently considered as the most promising platform for a scalable implementation of a boson sampling device, which makes it a non-universal approach to linear optical quantum computing. Moreover, while not universal, the boson sampling scheme is strongly believed to implement computing tasks which are hard to implement with classical computers by using far fewer physical resources than a full linear-optical quantum computing setup. This advantage makes it an ideal candidate for demonstrating the power of quantum computation in the near term.
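
The sampling probabilities can be illustrated with the smallest interesting instance, the two-photon Hong-Ou-Mandel experiment on a 50:50 beam splitter. The sketch below (illustrative only; the helper names `permanent` and `outcome_probability` are not from any library) computes outcome probabilities from permanents of submatrices of the interferometer unitary:

```python
import numpy as np
from itertools import permutations
from math import factorial

def permanent(m):
    """Brute-force permanent, fine for the tiny matrices used here."""
    n = m.shape[0]
    return sum(np.prod([m[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

def outcome_probability(u, in_modes, out_modes):
    """P(output pattern | input pattern) = |Perm(U_ST)|^2 / (prod s_i! prod t_j!)."""
    sub = u[np.ix_(in_modes, out_modes)]     # rows: input photons, columns: output photons
    norm = 1.0
    for modes in (in_modes, out_modes):
        for mode in set(modes):
            norm *= factorial(modes.count(mode))
    return abs(permanent(sub)) ** 2 / norm

# 50:50 beam splitter acting on two modes.
bs = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
one_photon_per_mode = [0, 1]

print(outcome_probability(bs, one_photon_per_mode, [0, 1]))  # 0.0: the photons never split up
print(outcome_probability(bs, one_photon_per_mode, [0, 0]))  # 0.5: both photons exit mode 0
print(outcome_probability(bs, one_photon_per_mode, [1, 1]))  # 0.5: both photons exit mode 1
```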

Quantum optimization algorithms are quantum algorithms that are used to solve optimization problems. Mathematical optimization deals with finding the best solution to a problem from a set of possible solutions. Mostly, the optimization problem is formulated as a minimization problem, where one tries to minimize an error which depends on the solution: the optimal solution has the minimal error. Different optimization techniques are applied in various fields such as mechanics, economics and engineering, and as the complexity and amount of data involved rise, more efficient ways of solving optimization problems are needed. Quantum computing may allow problems which are not practically feasible on classical computers to be solved, or suggest a considerable speed up with respect to the best known classical algorithm.

In quantum computing, quantum supremacy or quantum advantage is the goal of demonstrating that a programmable quantum computer can solve a problem that no classical computer can solve in any feasible amount of time, irrespective of the usefulness of the problem. The term was coined by John Preskill in 2012, but the concept dates back to Yuri Manin's 1980 and Richard Feynman's 1981 proposals of quantum computing.

The One Clean Qubit model of computation is performed on an $n + 1$ qubit system with one pure state and $n$ maximally mixed states. This model was motivated by the highly mixed states that are prevalent in nuclear magnetic resonance quantum computers. It is described by the density matrix $\rho = |0\rangle\langle 0| \otimes \frac{I}{2^{n}}$, where $I$ is the identity matrix on the $n$ mixed qubits. In computational complexity theory, DQC1, also known as deterministic quantum computation with one clean qubit, is the class of decision problems solvable by a one clean qubit machine in polynomial time, upon measuring the first qubit, with an error probability of at most 1/poly(n) for all instances.
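
The initial state of this model can be written down directly. The sketch below (a minimal illustration; the register size is an arbitrary choice) builds $\rho$ with NumPy and checks that the full state is highly mixed while the clean qubit itself is pure:

```python
import numpy as np

n = 3                                                    # number of maximally mixed qubits
clean = np.array([[1, 0], [0, 0]], dtype=complex)        # |0><0| for the one clean qubit
mixed = np.eye(2 ** n, dtype=complex) / 2 ** n           # maximally mixed state I / 2^n

rho = np.kron(clean, mixed)                              # rho = |0><0| (x) I / 2^n

print(rho.shape)                        # (16, 16) for the 4-qubit system
print(np.trace(rho).real)               # 1.0: a valid density matrix
# Purity of the full state is low because of the mixed register...
print(np.trace(rho @ rho).real)         # 1 / 2^n = 0.125
# ...but tracing out the mixed register leaves the clean qubit in a pure state.
reduced = rho.reshape(2, 2 ** n, 2, 2 ** n).trace(axis1=1, axis2=3)
print(np.allclose(reduced, clean))      # True
```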

Feynman's algorithm is an algorithm that is used to simulate the operations of a quantum computer on a classical computer. It is based on the Path integral formulation of quantum mechanics, which was formulated by Richard Feynman.
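
The idea can be illustrated on a toy circuit (a sketch only; the helper `path_sum_amplitude` and the choice of gates are illustrative): an amplitude $\langle y|U_L \cdots U_1|x\rangle$ is computed as a sum over all intermediate computational-basis "paths", each path contributing a product of single-gate matrix elements, and compared against ordinary matrix multiplication.

```python
import numpy as np
from itertools import product

def path_sum_amplitude(gates, x_in, y_out, dim):
    """Feynman-style sum over paths: <y| U_L ... U_1 |x> as a sum of products
    of matrix elements over all intermediate basis states."""
    amp = 0.0 + 0.0j
    # One intermediate basis state between each pair of consecutive gates.
    for path in product(range(dim), repeat=len(gates) - 1):
        states = (x_in,) + path + (y_out,)
        term = 1.0 + 0.0j
        for k, gate in enumerate(gates):
            term *= gate[states[k + 1], states[k]]
        amp += term
    return amp

# Two qubits (dim = 4): Hadamard on qubit 0, then CNOT (control 0, target 1).
h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
h_on_q0 = np.kron(h, np.eye(2))
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
gates = [h_on_q0, cnot]

# Amplitudes of the Bell state (|00> + |11>)/sqrt(2) produced from |00>.
for y in range(4):
    direct = (cnot @ h_on_q0)[y, 0]                      # ordinary matrix multiplication
    via_paths = path_sum_amplitude(gates, x_in=0, y_out=y, dim=4)
    print(format(y, "02b"), np.round(via_paths, 3), np.isclose(direct, via_paths))
```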

The five-qubit error correcting code is the smallest quantum error correcting code that can protect a logical qubit from any arbitrary single qubit error. In this code, 5 physical qubits are used to encode the logical qubit. With $X$ and $Z$ being Pauli matrices and $I$ the identity matrix, this code's generators are $\langle XZZXI,\ IXZZX,\ XIXZZ,\ ZXIXZ \rangle$. Its logical operators are $\bar{X} = XXXXX$ and $\bar{Z} = ZZZZZ$. Once the logical qubit is encoded, errors on the physical qubits can be detected via stabilizer measurements. A lookup table that maps the results of the stabilizer measurements to the types and locations of the errors gives the control system of the quantum computer enough information to correct errors.
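
These algebraic relations can be checked with a few lines of code (a sketch; it uses only the textbook rule that two Pauli strings anticommute exactly when they differ, both letters being non-identity, at an odd number of positions):

```python
def commute(p, q):
    """True if two Pauli strings commute: they anticommute iff they differ,
    with both letters non-identity, at an odd number of positions."""
    clashes = sum(1 for a, b in zip(p, q) if a != "I" and b != "I" and a != b)
    return clashes % 2 == 0

generators = ["XZZXI", "IXZZX", "XIXZZ", "ZXIXZ"]
logical_x, logical_z = "XXXXX", "ZZZZZ"

# The four stabilizer generators commute with one another...
print(all(commute(g, h) for g in generators for h in generators))              # True
# ...and with both logical operators, so the logicals act within the code space...
print(all(commute(g, l) for g in generators for l in (logical_x, logical_z)))  # True
# ...while the logical X and Z anticommute, as a valid logical qubit requires.
print(commute(logical_x, logical_z))                                           # False
```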

This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.

Quantum random circuits (QRC) is a concept of incorporating an element of randomness into the local unitary operations and measurements of a quantum circuit. The idea is similar to that of random matrix theory: the QRC is used to obtain almost exact results for non-integrable, hard-to-solve problems by averaging over an ensemble of outcomes. This incorporation of randomness into the circuits has many possible advantages, among them (i) the validation of quantum computers, which is the method that Google used when they claimed quantum supremacy in 2019, and (ii) understanding the universal structure of non-equilibrium and thermalization processes in quantum many-body dynamics.
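
As an illustration of how randomness enters the local operations, the sketch below (illustrative only; the "brickwork" layout and gate choices are arbitrary) builds a small random circuit from Haar-random single-qubit gates interleaved with CZ entanglers and computes its output distribution:

```python
import numpy as np

def random_single_qubit_gate(rng):
    """Haar-random 2x2 unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def random_circuit_unitary(n_qubits, n_layers, rng):
    """Brickwork random circuit: random single-qubit gates, then CZ on alternating pairs."""
    dim = 2 ** n_qubits
    cz = np.diag([1, 1, 1, -1]).astype(complex)
    u = np.eye(dim, dtype=complex)
    for layer in range(n_layers):
        # Layer of independent Haar-random single-qubit gates.
        layer_u = np.array([[1.0 + 0.0j]])
        for _ in range(n_qubits):
            layer_u = np.kron(layer_u, random_single_qubit_gate(rng))
        u = layer_u @ u
        # Layer of CZ gates on neighbouring pairs, offset on alternate layers.
        for first in range(layer % 2, n_qubits - 1, 2):
            gate = np.kron(np.kron(np.eye(2 ** first), cz),
                           np.eye(2 ** (n_qubits - first - 2)))
            u = gate @ u
    return u

rng = np.random.default_rng(2)
u = random_circuit_unitary(n_qubits=4, n_layers=5, rng=rng)
probs = np.abs(u[:, 0]) ** 2          # output distribution of the circuit applied to |0000>
print(np.isclose(probs.sum(), 1.0))   # True: a valid probability distribution
print(probs.round(3))                 # a "speckled", Porter-Thomas-like distribution
```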

References

  1. Boixo, S.; et al. (2018). "Characterizing Quantum Supremacy in Near-Term Devices". Nature Physics. 14 (6): 595–600. arXiv:1608.00263. Bibcode:2018NatPh..14..595B. doi:10.1038/s41567-018-0124-x. S2CID 4167494.
  2. Aaronson, S. (2021). "Open Problems Related to Quantum Query Complexity". arXiv:2109.06917 [quant-ph].
  3. Arute, F.; et al. (2019). "Quantum supremacy using a programmable superconducting processor". Nature. 574 (7779): 505–510. arXiv:1910.11333. Bibcode:2019Natur.574..505A. doi:10.1038/s41586-019-1666-5. PMID 31645734. S2CID 204836822.
  4. Liu, X.; et al. (2021). "Redefining the Quantum Supremacy Baseline With a New Generation Sunway Supercomputer". arXiv:2111.01066 [quant-ph].