In quantum information theory, the no low-energy trivial state (NLTS) conjecture is a precursor to a quantum PCP theorem (qPCP) and posits the existence of families of Hamiltonians with all low-energy states of non-trivial complexity. [1] [2] [3] [4] It was formulated by Michael Freedman and Matthew Hastings in 2013. An NLTS proof would be a consequence of one aspect of qPCP problems – the inability to certify an approximation of local Hamiltonians via NP completeness. [2] In other words, an NLTS proof would be one consequence of the QMA complexity of qPCP problems. [5] On a high level, if proved, NLTS would be one property of the non-Newtonian complexity of quantum computation. [5] The NLTS and qPCP conjectures posit the near-infinite complexity involved in predicting the outcome of quantum systems with many interacting states. [6] These calculations of complexity would have implications for quantum computing, such as the stability of entangled states at higher temperatures and the occurrence of entanglement in natural systems. [7] [6] A proof of the NLTS conjecture, by Anshu, Breuckmann, and Nirkhe, was published in preprint in 2022. [8]
The NLTS property is the underlying set of constraints that forms the basis for the NLTS conjecture.[ citation needed ]
A k-local Hamiltonian is a Hermitian matrix acting on n qubits which can be represented as the sum of Hamiltonian terms acting upon at most k qubits each:

H = Σm Hm.
The general k-local Hamiltonian problem is, given a k-local Hamiltonian H, to find the smallest eigenvalue λ of H. [9] λ is also called the ground-state energy of the Hamiltonian.
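As a minimal sketch of the k-local Hamiltonian problem (an assumed toy instance, not drawn from the cited literature), the ground-state energy of a small 2-local Hamiltonian can be computed by exact diagonalization:

```python
import numpy as np

# Illustrative 2-local Hamiltonian on two qubits: H = Z(x)Z + X(x)I.
# Its ground-state energy is the smallest eigenvalue of the 4x4 matrix.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
I = np.eye(2)

H = np.kron(Z, Z) + np.kron(X, I)   # each term acts on at most k = 2 qubits
ground_energy = np.linalg.eigvalsh(H).min()
print(ground_energy)  # -sqrt(2), about -1.414
```

Here the two terms anticommute, so H squares to 2·I and the spectrum is ±√2; exact diagonalization recovers −√2 as the ground-state energy.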
The family of local Hamiltonians thus arises out of the k-local problem. Kliesch states the following as a definition for local Hamiltonians in the context of NLTS: [2]
Let I ⊂ N be an index set. A family of local Hamiltonians is a set of Hamiltonians {H(n)}, n ∈ I, where each H(n) is defined on n finite-dimensional subsystems (in the following taken to be qubits), that are of the form

H(n) = Σm Hm(n),

where each Hm(n) acts non-trivially on O(1) qubits. In addition, the operator norm of Hm(n) is bounded by a constant independent of n, and each qubit is involved in only a constant number of terms Hm(n).
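Such a family can be sketched numerically. The following assumed toy example (a nearest-neighbour ZZ chain, not taken from the source) satisfies all three constraints: every term acts on 2 = O(1) qubits, has operator norm 1 independent of n, and each qubit appears in at most two terms.

```python
import numpy as np

I = np.eye(2)
Z = np.diag([1.0, -1.0])

def H_family(n):
    """A family {H(n)}: nearest-neighbour ZZ couplings on a chain of n qubits.
    Each term acts on 2 qubits, has norm 1, and each qubit is in <= 2 terms."""
    dim = 2 ** n
    H = np.zeros((dim, dim))
    for m in range(n - 1):
        factors = [Z if q in (m, m + 1) else I for q in range(n)]
        term = factors[0]
        for f in factors[1:]:
            term = np.kron(term, f)
        H += term
    return H

for n in (2, 3, 4):
    # Ground energy of the ZZ chain is -(n-1), attained by alternating spins.
    print(n, np.linalg.eigvalsh(H_family(n)).min())
```

The family is indexed by the system size n, matching the definition: one Hamiltonian per n ∈ I, built from uniformly bounded local terms.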
In physics, topological order [10] is a kind of order in the zero-temperature phase of matter (also known as quantum matter). In the context of NLTS, Kliesch states: "a family of local gapped Hamiltonians is called topologically ordered if any ground states cannot be prepared from a product state by a constant-depth circuit". [2]
Kliesch defines the NLTS property thus: [2]
Let I be an infinite set of system sizes. A family of local Hamiltonians {H(n)}, n ∈ I has the NLTS property if there exists ε > 0 and a function f : N → N such that
- for all n ∈ I, H(n) has ground energy 0,
- ⟨0n|U†H(n)U|0n⟩ > εn for any depth-d circuit U consisting of two-qubit gates and for any n ∈ I with n ≥ f(d).
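The energy ⟨0n|U†H(n)U|0n⟩ in the second condition can be evaluated directly for small systems. As an assumed toy illustration, the shifted ferromagnetic ZZ chain below has ground energy 0 but fails the NLTS property, because the depth-0 circuit U = identity already prepares the zero-energy state |0n⟩:

```python
import numpy as np

# The NLTS definition requires the energy of every shallow-circuit state to be
# at least eps*n. This simple family FAILS that: U = identity (depth 0) yields 0.
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

def kron_all(ops):
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

n = 4
dim = 2 ** n
# H(n) = sum_m (I - Z_m Z_{m+1}) / 2: each term penalises disagreeing neighbours,
# and the shift makes the ground energy exactly 0, as the definition requires.
H = np.zeros((dim, dim))
for m in range(n - 1):
    ZZ = kron_all([Z if q in (m, m + 1) else I2 for q in range(n)])
    H += (np.eye(dim) - ZZ) / 2

zero_state = np.zeros(dim); zero_state[0] = 1.0   # |0^n>
energy = zero_state @ H @ zero_state               # U = identity, depth 0
print(energy)  # 0.0 -> this family has trivial low-energy states
```

This is the sense in which NLTS Hamiltonians are special: for them, no constant-depth circuit can reach energy below εn, whereas here a trivial product state already attains the ground energy.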
The NLTS conjecture states that there exists a family of local Hamiltonians with the NLTS property. [2]
Proving the NLTS conjecture is a necessary step toward resolving the qPCP conjecture, an even harder problem. [1] The qPCP conjecture is a quantum analogue of the classical PCP theorem. The classical PCP theorem implies that for satisfiability problems like 3SAT, it is NP-hard to estimate, even approximately, the maximal number of clauses that can be simultaneously satisfied. [7] In layman's terms, classical PCP describes the near-infinite complexity involved in predicting the outcome of a system with many interacting parts, such as a water bath full of hundreds of magnets. [6] qPCP increases the complexity by trying to solve PCP for quantum states. [6] Though it has not been proven yet, a positive proof of qPCP would imply that quantum entanglement in Gibbs states could remain stable at temperatures above absolute zero. [7]
NLTS on its own is difficult to prove, though a simpler no low-error trivial states (NLETS) theorem has been proven, and that proof is a precursor for NLTS. [11]
NLETS is the analogous statement for "low-error" states: states that differ from a ground state on only a small fraction of qubits, rather than all states of low energy. [11]
In physics, the no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state, a statement which has profound implications in the field of quantum computing among others. The theorem is an evolution of the 1970 no-go theorem authored by James Park, in which he demonstrates that a non-disturbing measurement scheme which is both simple and perfect cannot exist. The aforementioned theorems do not preclude the state of one system becoming entangled with the state of another as cloning specifically refers to the creation of a separable state with identical factors. For example, one might use the controlled NOT gate and the Walsh–Hadamard gate to entangle two qubits without violating the no-cloning theorem as no well-defined state may be defined in terms of a subsystem of an entangled state. The no-cloning theorem concerns only pure states whereas the generalized statement regarding mixed states is known as the no-broadcast theorem.
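The CNOT-plus-Hadamard construction mentioned above is standard and can be checked numerically: the resulting Bell state is entangled, and neither qubit has a well-defined pure state of its own (each reduced state is maximally mixed). The gate matrices below are the usual textbook ones.

```python
import numpy as np

# Entangle two qubits with a Hadamard followed by a controlled-NOT.
H_gate = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi0 = np.array([1.0, 0.0, 0.0, 0.0])            # |00>
bell = CNOT @ np.kron(H_gate, np.eye(2)) @ psi0  # (|00> + |11>) / sqrt(2)

rho = np.outer(bell, bell)                        # two-qubit density matrix
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out qubit B
print(bell)   # [0.707, 0, 0, 0.707]
print(rho_A)  # 0.5 * identity: the subsystem is maximally mixed
```

The maximally mixed reduced state is exactly the point made in the passage: no pure state can be assigned to a subsystem of an entangled state, which is why this construction does not conflict with the no-cloning theorem.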
Quantum entanglement is the phenomenon of a group of particles being generated, interacting, or sharing spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between classical and quantum physics: entanglement is a primary feature of quantum mechanics not present in classical mechanics.
Quantum superposition is a fundamental principle of quantum mechanics that states that linear combinations of solutions to the Schrödinger equation are also solutions of the Schrödinger equation. This follows from the fact that the Schrödinger equation is a linear differential equation in time and position. More precisely, the state of a system is given by a linear combination of all the eigenfunctions of the Schrödinger equation governing that system.
In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.
A trapped-ion quantum computer is one proposed approach to a large-scale quantum computer. Ions, or charged atomic particles, can be confined and suspended in free space using electromagnetic fields. Qubits are stored in stable electronic states of each ion, and quantum information can be transferred through the collective quantized motion of the ions in a shared trap. Lasers are applied to induce coupling between the qubit states or coupling between the internal qubit states and the external motional states.
In computational complexity theory, the PCP theorem states that every decision problem in the NP complexity class has probabilistically checkable proofs of constant query complexity and logarithmic randomness complexity.
MAX-3SAT is a problem in the computational complexity subfield of computer science. It generalises the Boolean satisfiability problem (SAT), which is a decision problem considered in complexity theory. It is defined as: given a Boolean formula in conjunctive normal form with at most three literals per clause, find an assignment that satisfies the maximum number of clauses.
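The definition can be made concrete with a brute-force sketch. The clause encoding below is an assumption for illustration: a literal +i or −i means variable i appears positively or negated.

```python
from itertools import product

def max_3sat(clauses, n_vars):
    """Brute-force MAX-3SAT: try all 2^n assignments and return the maximum
    number of simultaneously satisfied clauses (exponential, for tiny inputs)."""
    best = 0
    for bits in product([False, True], repeat=n_vars):
        satisfied = sum(
            any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
            for clause in clauses
        )
        best = max(best, satisfied)
    return best

# (x1 v x2 v x3) ^ (~x1 v x2 v ~x3) ^ (x1 v ~x2 v x3) ^ (~x1 v ~x2 v ~x3)
clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]
print(max_3sat(clauses, 3))  # 4: the assignment x1=F, x2=T, x3=T satisfies all
```

The PCP theorem concerns exactly this quantity: approximating the maximum number of satisfiable clauses to within some constant factor is NP-hard, even though checking one assignment is easy.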
The one-way or measurement-based quantum computer (MBQC) is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single qubit measurements on it. It is "one-way" because the resource state is destroyed by the measurements.
Adiabatic quantum computation (AQC) is a form of quantum computing which relies on the adiabatic theorem to perform calculations and is closely related to quantum annealing.
In quantum computing, the threshold theorem states that a quantum computer with a physical error rate below a certain threshold can, through application of quantum error correction schemes, suppress the logical error rate to arbitrarily low levels. This shows that quantum computers can be made fault-tolerant, as an analogue to von Neumann's threshold theorem for classical computation. This result was proven by the groups of Dorit Aharonov and Michael Ben-Or; Emanuel Knill, Raymond Laflamme, and Wojciech Zurek; and Alexei Kitaev independently. These results built on a paper of Peter Shor, which proved a weaker version of the threshold theorem.
Entanglement distillation is the transformation of N copies of an arbitrary entangled state into some number of approximately pure Bell pairs, using only local operations and classical communication.
In computational complexity theory, QMA, which stands for Quantum Merlin Arthur, is the set of languages for which, when a string is in the language, there is a polynomial-size quantum proof that convinces a polynomial time quantum verifier of this fact with high probability. Moreover, when the string is not in the language, every polynomial-size quantum state is rejected by the verifier with high probability.
The toric code is a topological quantum error correcting code, and an example of a stabilizer code, defined on a two-dimensional spin lattice. It is the simplest and most well studied of the quantum double models. It is also the simplest example of topological order—Z2 topological order (first studied in the context of Z2 spin liquid in 1991). The toric code can also be considered to be a Z2 lattice gauge theory in a particular limit. It was introduced by Alexei Kitaev.
Symmetry-protected topological (SPT) order is a kind of order in zero-temperature quantum-mechanical states of matter that have a symmetry and a finite energy gap.
In lattice field theory, the Nielsen–Ninomiya theorem is a no-go theorem about placing chiral fermions on a lattice. In particular, under very general assumptions such as locality, hermiticity, and translational symmetry, any lattice formulation of chiral fermions necessarily leads to fermion doubling, where there are the same number of left-handed and right-handed fermions. It was first proved by Holger Bech Nielsen and Masao Ninomiya in 1981 using two methods, one that relied on homotopy theory and another that relied on differential topology. Another proof provided by Daniel Friedan uses differential geometry. The theorem was also generalized to any regularization scheme of chiral theories. One consequence of the theorem is that the Standard Model cannot be put on a lattice. Common methods for overcoming the fermion doubling problem are the use of modified fermion formulations such as staggered fermions, Wilson fermions, or Ginsparg–Wilson fermions, among others.
A fracton is an emergent topological quasiparticle excitation which is immobile when in isolation. Many theoretical systems have been proposed in which fractons exist as elementary excitations. Such systems are known as fracton models. Fractons have been identified in various CSS codes as well as in symmetric tensor gauge theories.
Hamiltonian simulation is a problem in quantum information science that attempts to find the computational complexity and quantum algorithms needed for simulating quantum systems. Hamiltonian simulation is a problem that demands algorithms which implement the evolution of a quantum state efficiently. The Hamiltonian simulation problem was proposed by Richard Feynman in 1982, who suggested a quantum computer as a possible solution, since the cost of simulating general Hamiltonians seems to grow exponentially with respect to the system size.
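A common classical baseline for Hamiltonian simulation is Trotterization. The toy model below is an assumption for illustration: for H = A + B with non-commuting Hermitian A and B, the first-order product formula exp(−iHt) ≈ (exp(−iAt/r) exp(−iBt/r))^r converges as the number of steps r grows.

```python
import numpy as np

def expm_herm(M, t):
    """exp(-i M t) for a Hermitian matrix M, via eigendecomposition."""
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
# A two-qubit toy Hamiltonian split into non-commuting pieces A and B.
A = np.kron(Z, Z)
B = np.kron(X, np.eye(2)) + np.kron(np.eye(2), X)
H, t = A + B, 1.0

exact = expm_herm(H, t)
for r in (1, 4, 16):
    step = expm_herm(A, t / r) @ expm_herm(B, t / r)
    approx = np.linalg.matrix_power(step, r)
    # First-order Trotter error shrinks roughly like 1/r.
    print(r, np.linalg.norm(approx - exact))
```

On a quantum computer each Trotter step would be a short circuit of local gates, which is what makes the approach efficient for local Hamiltonians; the classical version above is exponential in qubit count, illustrating Feynman's point.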
Hamiltonian complexity or quantum Hamiltonian complexity is a topic which deals with problems in quantum complexity theory and condensed matter physics. It mostly studies constraint satisfaction problems related to ground states of local Hamiltonians; that is, Hermitian matrices that act locally on a system of interest. The constraint satisfaction problems in quantum Hamiltonian complexity have led to the quantum version of the Cook–Levin theorem. Quantum Hamiltonian complexity has helped physicists understand the difficulty of simulating physical systems.
Quantum computational chemistry is an emerging field that exploits quantum computing to simulate chemical systems. Despite quantum mechanics' foundational role in understanding chemical behaviors, traditional computational approaches face significant challenges, largely due to the complexity and computational intensity of quantum mechanical equations. This complexity arises from the exponential growth of a quantum system's wave function with each added particle, making exact simulations on classical computers inefficient.
Nikolas P. Breuckmann is a German mathematical physicist affiliated with the University of Bristol, England. He is, as of Spring 2024, a visiting scientist and program organizer at the Simons Institute for the Theory of Computing at the University of California, Berkeley. His research focuses on quantum information theory, in particular quantum error correction and quantum complexity theory. He is known for his work on proving the NLTS conjecture, a famous open problem in quantum information theory.