In quantum information theory, the no low-energy trivial state (NLTS) conjecture is a precursor to a quantum PCP theorem (qPCP) and posits the existence of families of Hamiltonians whose low-energy states all have non-trivial complexity. [1] [2] [3] [4] It was formulated by Michael Freedman and Matthew Hastings in 2013. NLTS is a consequence of one aspect of qPCP problems – the inability to certify an approximation of local Hamiltonians via NP completeness. [2] In other words, it is a consequence of the QMA complexity of qPCP problems. [5] On a high level, it is one property of the complexity of quantum computation. [5] The NLTS and qPCP conjectures concern the computational intractability of predicting the behaviour of quantum systems with many interacting particles. [6] Such complexity results would have implications for quantum computing, such as the stability of entangled states at higher temperatures, and for the occurrence of entanglement in natural systems. [7] [6] A proof of the NLTS conjecture, by Anurag Anshu, Nikolas Breuckmann and Chinmay Nirkhe, was presented and published as part of STOC 2023. [8]
The NLTS property is the underlying set of constraints that forms the basis for the NLTS conjecture.[citation needed]
A k-local Hamiltonian is a Hermitian matrix acting on n qubits which can be represented as the sum of Hamiltonian terms acting upon at most k qubits each:

H = H1 + H2 + ⋯ + Hm.

The general k-local Hamiltonian problem is, given a k-local Hamiltonian H, to find the smallest eigenvalue λ of H. [9] λ is also called the ground-state energy of the Hamiltonian.
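For small systems the ground-state energy in this definition can be computed directly by exact diagonalization of the full 2^n × 2^n matrix. The following sketch is purely illustrative – the nearest-neighbour ZZ/X model and all names in it are assumptions made here for concreteness, not part of the cited definitions – and uses NumPy to build a 2-local Hamiltonian on a few qubits and find its smallest eigenvalue.

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(ops, n):
    """Tensor a {qubit_index: single-qubit operator} map into an n-qubit operator."""
    full = np.array([[1.0 + 0j]])
    for q in range(n):
        full = np.kron(full, ops.get(q, I2))
    return full

n = 4  # small enough for exact diagonalization (2^n x 2^n matrices)

# A 2-local Hamiltonian: nearest-neighbour ZZ couplings plus single-qubit X fields.
terms = [embed({i: Z, i + 1: Z}, n) for i in range(n - 1)]   # each acts on 2 qubits
terms += [embed({i: X}, n) for i in range(n)]                # each acts on 1 qubit

H = sum(terms)                               # H = H1 + H2 + ... + Hm, Hermitian
ground_energy = np.linalg.eigvalsh(H)[0]     # smallest eigenvalue = ground-state energy
print(f"ground-state energy of the {n}-qubit example: {ground_energy:.4f}")
```

This brute-force approach scales exponentially in n, which is precisely why the k-local Hamiltonian problem is studied as a computationally hard (QMA-complete for k ≥ 2) problem.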
The family of local Hamiltonians thus arises out of the k-local problem. Kliesch states the following as a definition for local Hamiltonians in the context of NLTS: [2]
Let I ⊂ N be an index set. A family of local Hamiltonians is a set of Hamiltonians {H(n)}, n ∈ I, where each H(n) is defined on n finite-dimensional subsystems (in the following taken to be qubits) and is of the form

H(n) = ∑m Hm(n),

where each Hm(n) acts non-trivially on O(1) qubits. In addition, the operator norm of each Hm(n) is bounded by a constant independent of n, and each qubit is involved in only a constant number of the terms Hm(n).
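These constraints can be made concrete with a toy family such as a transverse-field Ising chain indexed by the number of qubits n; the model and the code below are illustrative assumptions made here, not part of the cited definition. For every n, each term acts on at most two qubits, has operator norm 1, and each qubit appears in at most three terms.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
ZZ = np.kron(Z, Z)

def ising_family_terms(n):
    """Terms Hm(n) of a transverse-field Ising chain on n qubits, kept as
    (support, local matrix) pairs so no 2^n-dimensional object is ever built."""
    terms = [((i, i + 1), ZZ) for i in range(n - 1)]   # two-qubit couplings
    terms += [((i,), X) for i in range(n)]             # single-qubit fields
    return terms

for n in (4, 16, 64, 256):
    terms = ising_family_terms(n)
    locality = max(len(support) for support, _ in terms)
    norm_bound = max(np.linalg.norm(mat, ord=2) for _, mat in terms)   # spectral norm
    participation = max(sum(q in support for support, _ in terms) for q in range(n))
    print(f"n={n:4d}: locality={locality}, max term norm={norm_bound:.1f}, "
          f"max terms per qubit={participation}")
```

All three quantities stay constant as n grows, which is exactly what the definition requires of a family of local Hamiltonians.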
In physics, topological order [10] is a kind of order in the zero-temperature phase of matter (also known as quantum matter). In the context of NLTS, Kliesch states: "a family of local gapped Hamiltonians is called topologically ordered if any ground states cannot be prepared from a product state by a constant-depth circuit". [2]
Kliesch defines the NLTS property thus: [2]
Let I be an infinite set of system sizes. A family of local Hamiltonians {H(n)}, n ∈ I has the NLTS property if there exists ε > 0 and a function f : N → N such that
- for all n ∈ I, H(n) has ground energy 0,
- ⟨0n|U†H(n)U|0n⟩ > εn for every depth-d circuit U consisting of two-qubit gates and for every n ∈ I with n ≥ f(d).
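The quantity bounded in the second condition, ⟨0n|U†H(n)U|0n⟩, is simply the energy of the state obtained by applying a shallow circuit U to |0…0⟩. The sketch below evaluates this energy for random brickwork circuits of two-qubit gates on a small instance; the Hamiltonian and circuit layout are illustrative assumptions made here and are not the construction used in the NLTS proof.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(ops, n):
    """Tensor a {qubit_index: single-qubit operator} map into an n-qubit operator."""
    full = np.array([[1.0 + 0j]])
    for q in range(n):
        full = np.kron(full, ops.get(q, np.eye(2)))
    return full

def random_two_qubit_gate():
    """A Haar-random 4x4 unitary obtained from a QR decomposition."""
    m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))

n = 6
# Example local Hamiltonian with ground energy 0:
# H = sum_i (1 - Z_i Z_{i+1}) / 2, a sum of positive semidefinite 2-local terms.
H = sum(0.5 * (embed({}, n) - embed({i: Z, i + 1: Z}, n)) for i in range(n - 1))

for depth in (1, 2, 4):
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                                  # the state |0...0>
    for layer in range(depth):                    # brickwork layers of two-qubit gates
        for i in range(layer % 2, n - 1, 2):
            # Full 2^n x 2^n unitary acting on the adjacent pair (i, i+1).
            U = np.kron(np.eye(2 ** i),
                        np.kron(random_two_qubit_gate(), np.eye(2 ** (n - i - 2))))
            psi = U @ psi
    energy = np.real(np.vdot(psi, H @ psi))
    print(f"depth {depth}: energy of the circuit-prepared state = {energy:.3f}")
```

For this toy Hamiltonian the identity circuit already reaches energy 0, so it is not an NLTS family; the conjecture stated next asserts that families exist for which no constant-depth circuit can avoid an energy of at least εn.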
The NLTS conjecture states that there exists a family of local Hamiltonians with the NLTS property. [2]
Proving the NLTS conjecture was regarded as a necessary step toward resolving the qPCP conjecture, an even harder statement to prove. [1] The qPCP conjecture is a quantum analogue of the classical PCP theorem. The classical PCP theorem states that for satisfiability problems like 3SAT it is NP-hard not only to find an optimal assignment but even to estimate, within a constant factor, the maximal number of clauses that can be simultaneously satisfied. [7] In layman's terms, classical PCP describes the enormous complexity involved in predicting the outcome of a system with many interacting parts, such as a water bath full of hundreds of magnets. [6] qPCP increases the complexity by trying to solve PCP for quantum states. [6] Though it has not been proven yet, a positive proof of qPCP would imply that quantum entanglement in Gibbs states could remain stable at temperatures above absolute zero. [7]
NLTS on its own is difficult to prove, though a simpler no low-error trivial states (NLETS) theorem has been proven, and that proof is a precursor for NLTS. [11]
NLETS is defined analogously to NLTS, with low-energy states replaced by low-error states – states that coincide with a ground state except on a small fraction of the qubits. [11]
In physics, the no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state, a statement which has profound implications in the field of quantum computing among others. The theorem is an evolution of the 1970 no-go theorem authored by James Park, in which he demonstrates that a non-disturbing measurement scheme which is both simple and perfect cannot exist. The aforementioned theorems do not preclude the state of one system becoming entangled with the state of another as cloning specifically refers to the creation of a separable state with identical factors. For example, one might use the controlled NOT gate and the Walsh–Hadamard gate to entangle two qubits without violating the no-cloning theorem as no well-defined state may be defined in terms of a subsystem of an entangled state. The no-cloning theorem concerns only pure states whereas the generalized statement regarding mixed states is known as the no-broadcast theorem. The no-cloning theorem has a time-reversed dual, the no-deleting theorem.
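The distinction drawn above between entangling and cloning can be checked directly on two qubits. The following sketch is a small numerical illustration added here (not taken from any cited source): it applies a CNOT to |ψ⟩⊗|0⟩ and compares the result with the would-be clone |ψ⟩⊗|ψ⟩.

```python
import numpy as np

# CNOT with qubit 0 as control and qubit 1 as target (basis order |00>, |01>, |10>, |11>)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

alpha, beta = 0.6, 0.8                    # an arbitrary qubit state, |alpha|^2 + |beta|^2 = 1
psi = np.array([alpha, beta], dtype=complex)
zero = np.array([1, 0], dtype=complex)

entangled = CNOT @ np.kron(psi, zero)     # alpha|00> + beta|11>
clone = np.kron(psi, psi)                 # alpha^2|00> + alpha*beta(|01> + |10>) + beta^2|11>

print("CNOT output :", entangled.round(3))
print("true clone  :", clone.round(3))
print("identical?  :", np.allclose(entangled, clone))   # False unless alpha*beta == 0
```

The two states agree only when α or β vanishes, i.e. only for the basis states, which is consistent with the no-cloning theorem: the gate entangles the qubits rather than producing a separable copy.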
A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. Theoretically a large-scale quantum computer could break some widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.
Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems which can be explained by classical mechanics. Beginning out of attempts to extend the understanding of quantum mechanics, the theory has developed in several directions and experimental studies have confirmed some of the key issues. Quantum computing relies on quantum coherence and is one of the primary practical applications of the concept.
In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.
A trapped-ion quantum computer is one proposed approach to a large-scale quantum computer. Ions, or charged atomic particles, can be confined and suspended in free space using electromagnetic fields. Qubits are stored in stable electronic states of each ion, and quantum information can be transferred through the collective quantized motion of the ions in a shared trap. Lasers are applied to induce coupling between the qubit states or coupling between the internal qubit states and the external motional states.
In computational complexity theory, the PCP theorem states that every decision problem in the NP complexity class has probabilistically checkable proofs of constant query complexity and logarithmic randomness complexity.
MAX-3SAT is a problem in the computational complexity subfield of computer science. It generalises the Boolean satisfiability problem (SAT) which is a decision problem considered in complexity theory. It is defined as: given a 3-CNF formula Φ (i.e. with at most three literals per clause), find an assignment that satisfies the largest number of clauses.
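As a small, self-contained illustration (the formula below is an arbitrary example chosen here, not taken from the article), the optimum of a MAX-3SAT instance can be found by brute force for a handful of variables.

```python
from itertools import product

# Each clause is a tuple of literals; the literal (v, True) means variable v,
# and (v, False) means its negation.
clauses = [
    ((0, True), (1, True), (2, False)),
    ((0, False), (1, False), (2, True)),
    ((0, True), (1, False), (2, True)),
    ((0, False), (1, True), (2, False)),
]
num_vars = 3

def satisfied(clause, assignment):
    """A clause is satisfied if at least one of its literals evaluates to True."""
    return any(assignment[v] == positive for v, positive in clause)

best = max(
    sum(satisfied(c, assignment) for c in clauses)
    for assignment in product([False, True], repeat=num_vars)
)
print(f"MAX-3SAT optimum: {best} of {len(clauses)} clauses satisfiable simultaneously")
```

Exhaustive search over all 2^n assignments quickly becomes infeasible; the PCP theorem implies that even approximating this optimum beyond a certain constant factor is NP-hard.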
The one-way quantum computer, also known as measurement-based quantum computer (MBQC), is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.
Adiabatic quantum computation (AQC) is a form of quantum computing which relies on the adiabatic theorem to perform calculations and is closely related to quantum annealing.
In computational complexity theory, QMA, which stands for Quantum Merlin Arthur, is the set of languages for which, when a string is in the language, there is a polynomial-size quantum proof that convinces a polynomial time quantum verifier of this fact with high probability. Moreover, when the string is not in the language, every polynomial-size quantum state is rejected by the verifier with high probability.
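A minimal formalization of these acceptance conditions, using the customary 2/3 and 1/3 thresholds (any constants bounded away from 1/2 define the same class), reads:

```latex
% Completeness and soundness for a language L in QMA, with a polynomial-time
% quantum verifier V and polynomial-size quantum witness states |\psi\rangle.
\begin{align*}
x \in L    &\;\Rightarrow\; \exists\, |\psi\rangle : \Pr\big[V(x, |\psi\rangle)\ \text{accepts}\big] \ge \tfrac{2}{3}, \\
x \notin L &\;\Rightarrow\; \forall\, |\psi\rangle : \Pr\big[V(x, |\psi\rangle)\ \text{accepts}\big] \le \tfrac{1}{3}.
\end{align*}
```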
The toric code is a topological quantum error correcting code, and an example of a stabilizer code, defined on a two-dimensional spin lattice. It is the simplest and most well studied of the quantum double models. It is also the simplest example of topological order—Z2 topological order (first studied in the context of Z2 spin liquid in 1991). The toric code can also be considered to be a Z2 lattice gauge theory in a particular limit. It was introduced by Alexei Kitaev.
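The stabilizer structure is usually summarized by the following Hamiltonian, quoted here in its standard textbook form for orientation (conventions vary slightly between references): A_v are star operators acting on the four edges meeting at a vertex v, and B_p are plaquette operators acting on the four edges bounding a face p.

```latex
% Toric code Hamiltonian on a square lattice with one spin on every edge.
H = -\sum_{v} A_v \;-\; \sum_{p} B_p,
\qquad
A_v = \prod_{j \in \mathrm{star}(v)} \sigma^x_j,
\qquad
B_p = \prod_{j \in \partial p} \sigma^z_j .
```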
Symmetry-protected topological (SPT) order is a kind of order in zero-temperature quantum-mechanical states of matter that have a symmetry and a finite energy gap.
A fracton is an emergent topological quasiparticle excitation which is immobile when in isolation. Many theoretical systems have been proposed in which fractons exist as elementary excitations. Such systems are known as fracton models. Fractons have been identified in various CSS codes as well as in symmetric tensor gauge theories.
Hamiltonian simulation is a problem in quantum information science that attempts to find the computational complexity and quantum algorithms needed for simulating quantum systems. Hamiltonian simulation is a problem that demands algorithms which implement the evolution of a quantum state efficiently. The Hamiltonian simulation problem was posed by Richard Feynman in 1982, who suggested a quantum computer as a possible solution, since the cost of simulating general Hamiltonians on classical computers seems to grow exponentially with the system size.
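A common entry point to the problem is the Lie–Trotter product formula, which approximates e^(−i(A+B)t) by (e^(−iAt/r) e^(−iBt/r))^r. The sketch below compares the Trotterized evolution of a small two-qubit Hamiltonian with the exact one; the matrices and parameters are illustrative choices made here, not tied to any specific result mentioned in the article.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Two non-commuting pieces of a simple two-qubit Hamiltonian H = A + B.
A = np.kron(Z, Z)                                    # coupling term
B = np.kron(X, np.eye(2)) + np.kron(np.eye(2), X)    # transverse fields

t = 1.0
exact = expm(-1j * (A + B) * t)

for r in (1, 4, 16, 64):
    step = expm(-1j * A * t / r) @ expm(-1j * B * t / r)
    trotter = np.linalg.matrix_power(step, r)
    error = np.linalg.norm(trotter - exact, ord=2)   # spectral-norm error
    print(f"r = {r:3d}: first-order Trotter error = {error:.2e}")
```

The error shrinks roughly as O(t²/r), illustrating why the number of gates, and hence the cost of the simulation, is governed by the desired accuracy and evolution time.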
Hamiltonian complexity or quantum Hamiltonian complexity is a topic which deals with problems in quantum complexity theory and condensed matter physics. It mostly studies constraint satisfaction problems related to ground states of local Hamiltonians; that is, Hermitian matrices that act locally on a system of interest. The constraint satisfaction problems in quantum Hamiltonian complexity have led to the quantum version of the Cook–Levin theorem. Quantum Hamiltonian complexity has helped physicists understand the difficulty of simulating physical systems.
This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields.
Quantum computational chemistry is an emerging field that exploits quantum computing to simulate chemical systems. Despite quantum mechanics' foundational role in understanding chemical behaviors, traditional computational approaches face significant challenges, largely due to the complexity and computational intensity of quantum mechanical equations. This complexity arises from the exponential growth of a quantum system's wave function with each added particle, making exact simulations on classical computers inefficient.
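The exponential scaling referred to here is easy to quantify: a system of n two-level degrees of freedom has a state vector with 2^n complex amplitudes. A short back-of-the-envelope calculation (assuming 16 bytes per double-precision complex amplitude, an implementation detail chosen purely for illustration):

```python
# Memory needed to store the full state vector of n two-level systems,
# assuming 16 bytes (one complex128 number) per amplitude.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"n = {n:2d}: {amplitudes:>16,d} amplitudes  ~ {gib:>12,.1f} GiB")
```

Already at around fifty particles the state vector exceeds the memory of even the largest supercomputers, which is the scaling argument behind using quantum hardware for such simulations.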
Nikolas P. Breuckmann is a German mathematical physicist affiliated with the University of Bristol, England. He is, as of Spring 2024, a visiting scientist and program organizer at the Simons Institute for the Theory of Computing at the University of California, Berkeley. His research focuses on quantum information theory, in particular quantum error correction and quantum complexity theory. He is known for his work on proving the NLTS conjecture, which had been a famous open problem in quantum information theory.
The quantum metrological gain is defined in the context of carrying out a metrological task using a quantum state of a multiparticle system. It is the sensitivity of parameter estimation using the state compared to what can be reached using separable states, i.e., states without quantum entanglement. Hence, the quantum metrological gain is given as the ratio of the sensitivity achieved by the state to the maximal sensitivity achievable with separable states. The best separable state is often the trivial fully polarized state, in which all spins point in the same direction. If the metrological gain is larger than one then the quantum state is more useful for making precise measurements than separable states. Clearly, in this case the quantum state is also entangled.
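In the common setting of N spin-1/2 particles and phase estimation generated by a collective spin component J_l, the gain is often expressed through the quantum Fisher information F_Q; this formulation is a standard one from the metrology literature and is added here only for concreteness. Separable states obey F_Q[ϱ, J_l] ≤ N, so

```latex
% Metrological gain via the quantum Fisher information; g > 1 certifies that
% the state outperforms every separable state (and is therefore entangled).
g \;=\; \frac{F_Q[\varrho, J_l]}{\max_{\varrho_{\mathrm{sep}}} F_Q[\varrho_{\mathrm{sep}}, J_l]}
  \;=\; \frac{F_Q[\varrho, J_l]}{N}.
```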
In condensed matter physics, the Kitaev chain is a simplified model for a topological superconductor. It models a one-dimensional lattice featuring Majorana bound states. The Kitaev chain has been used as a toy model of semiconductor nanowires for the development of topological quantum computers. The model was first proposed by Alexei Kitaev in 2000.
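The model is usually written as a tight-binding chain of spinless fermions with p-wave pairing; the standard form is reproduced below for orientation, noting that sign and normalization conventions vary between references.

```latex
% Kitaev chain: spinless fermions c_j on N sites with hopping amplitude t,
% p-wave pairing \Delta and chemical potential \mu.
H = \sum_{j=1}^{N-1} \left( -t\, c_j^\dagger c_{j+1} + \Delta\, c_j c_{j+1} + \mathrm{h.c.} \right)
    \;-\; \mu \sum_{j=1}^{N} c_j^\dagger c_j .
```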