HPO formalism

The history projection operator (HPO) formalism is an approach to temporal quantum logic developed by Chris Isham. It deals with the logical structure of quantum mechanical propositions asserted at different points in time.

Introduction

In standard quantum mechanics a physical system is associated with a Hilbert space $\mathcal{H}$. States of the system at a fixed time are represented by normalised vectors in $\mathcal{H}$ and physical observables are represented by Hermitian operators on $\mathcal{H}$.

A physical proposition $P$ about the system at a fixed time can be represented by an orthogonal projection operator $\hat{P}$ on $\mathcal{H}$ (see quantum logic). This representation ties the lattice of logical propositions to the lattice of projection operators on a Hilbert space.
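
As a concrete, hedged illustration (not part of Isham's presentation), consider a qubit and the single-time proposition "the spin along z is up", represented by the projector $|0\rangle\langle 0|$. A minimal NumPy sketch checking that this operator is an orthogonal projection:

```python
import numpy as np

# Projector representing the single-time proposition "spin along z is up"
# on a two-dimensional Hilbert space: |0><0|.
ket0 = np.array([[1.0], [0.0]])
P_up = ket0 @ ket0.conj().T

# An orthogonal projection operator satisfies P^2 = P and P = P^dagger.
assert np.allclose(P_up @ P_up, P_up)
assert np.allclose(P_up, P_up.conj().T)
```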

The HPO formalism is a natural extension of these ideas to propositions about the system that are concerned with more than one time.

History propositions

Homogeneous histories

A homogeneous history proposition $\alpha$ is a sequence of single-time propositions $\alpha_{t_i}$ specified at different times $t_1 < t_2 < \ldots < t_n$. These times are called the temporal support of the history. We shall denote the proposition $\alpha$ as $(\alpha_{t_1}, \alpha_{t_2}, \ldots, \alpha_{t_n})$ and read it as

"$\alpha_{t_1}$ at time $t_1$ is true, and then $\alpha_{t_2}$ at time $t_2$ is true, and then ..., and then $\alpha_{t_n}$ at time $t_n$ is true".

Inhomogeneous histories

Not all history propositions can be represented by a sequence of single-time propositions at different times; those that cannot are called inhomogeneous history propositions. An example is the proposition "$\alpha$ OR $\beta$" for two homogeneous histories $\alpha$ and $\beta$.

History projection operators

The key observation of the HPO formalism is to represent history propositions by projection operators on a history Hilbert space. This is where the name "History Projection Operator" (HPO) comes from.

For a homogeneous history $\alpha = (\alpha_{t_1}, \alpha_{t_2}, \ldots, \alpha_{t_n})$ we can use the tensor product to define a projector

$$\hat{\alpha} := \hat{\alpha}_{t_1} \otimes \hat{\alpha}_{t_2} \otimes \cdots \otimes \hat{\alpha}_{t_n},$$

where $\hat{\alpha}_{t_i}$ is the projection operator on $\mathcal{H}$ that represents the proposition $\alpha_{t_i}$ at time $t_i$.

This $\hat{\alpha}$ is a projection operator on the tensor product "history Hilbert space"

$$H = \mathcal{H} \otimes \mathcal{H} \otimes \cdots \otimes \mathcal{H} \quad (n \text{ factors}).$$

Not all projection operators on $H$ can be written as a sum of tensor products of the form $\hat{\alpha}$. These other projection operators are used to represent inhomogeneous histories, which arise by applying lattice operations to homogeneous histories.
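
As a rough numerical sketch (the specific qubit projectors are this editor's illustrative assumptions), a homogeneous history projector can be built with NumPy's Kronecker product, and one can verify that it is again an orthogonal projection on the history Hilbert space:

```python
import numpy as np

# Two single-time projectors on a qubit Hilbert space:
# P1 = |0><0| (spin up along z), P2 = |+><+| (spin up along x).
P1 = np.array([[1.0, 0.0], [0.0, 0.0]])
P2 = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])

# Homogeneous history projector on H = H (x) H via the Kronecker product.
alpha_hat = np.kron(P1, P2)

# It is again an orthogonal projection operator on the history Hilbert space.
assert np.allclose(alpha_hat @ alpha_hat, alpha_hat)
assert np.allclose(alpha_hat, alpha_hat.conj().T)
```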

Temporal quantum logic

Representing history propositions by projectors on the history Hilbert space naturally encodes the logical structure of history propositions. The lattice operations on the set of projection operators on the history Hilbert space can be applied to model the lattice of logical operations on history propositions.

If two homogeneous histories $\alpha$ and $\beta$ do not share the same temporal support they can be modified so that they do. If $t_i$ is in the temporal support of $\alpha$ but not of $\beta$ (for example), then a new homogeneous history proposition which differs from $\beta$ only by including the "always true" proposition at each such time $t_i$ can be formed. In this way the temporal supports of $\alpha$ and $\beta$ can always be made to coincide. We shall therefore assume that all homogeneous histories share the same temporal support.
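
A minimal sketch of this padding step, assuming qubit factors purely for illustration: a history specified only at $t_1$ is extended to the support $\{t_1, t_2\}$ by inserting the "always true" proposition (the identity operator) at the missing time.

```python
import numpy as np

P1 = np.array([[1.0, 0.0], [0.0, 0.0]])   # proposition asserted at time t1
I = np.eye(2)                             # "always true" proposition

# History originally supported only on {t1}, padded to the support {t1, t2}
# by inserting the identity at t2.
alpha_padded = np.kron(P1, I)

# The padded history is still a projection operator on the enlarged
# history Hilbert space.
assert np.allclose(alpha_padded @ alpha_padded, alpha_padded)
```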

We now present the logical operations for homogeneous history propositions $\alpha$ and $\beta$ such that $\hat{\alpha}\hat{\beta} = \hat{\beta}\hat{\alpha}$ (i.e. the corresponding history projectors commute).

Conjunction (AND)

If $\alpha$ and $\beta$ are two homogeneous histories then the history proposition "$\alpha$ and $\beta$" is also a homogeneous history. It is represented by the projection operator

$$\widehat{\alpha \wedge \beta} := \hat{\alpha}\hat{\beta} \ (= \hat{\beta}\hat{\alpha}).$$
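
A brief NumPy check, using two commuting two-time history projectors chosen by this editor for illustration, that the product is again a projector:

```python
import numpy as np

I = np.eye(2)
P = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|

alpha_hat = np.kron(P, I)   # "P at t1, anything at t2"
beta_hat = np.kron(I, P)    # "anything at t1, P at t2"
assert np.allclose(alpha_hat @ beta_hat, beta_hat @ alpha_hat)  # they commute

and_hat = alpha_hat @ beta_hat                     # represents "alpha AND beta"
assert np.allclose(and_hat @ and_hat, and_hat)     # again a projector
assert np.allclose(and_hat, and_hat.conj().T)
```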

Disjunction (OR)

If $\alpha$ and $\beta$ are two homogeneous histories then the history proposition "$\alpha$ or $\beta$" is in general not a homogeneous history. It is represented by the projection operator

$$\widehat{\alpha \vee \beta} := \hat{\alpha} + \hat{\beta} - \hat{\alpha}\hat{\beta}.$$
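
The corresponding sketch for the disjunction, reusing the same illustrative pair of commuting histories, also shows why the result is generally inhomogeneous: on two qubit factors any product projector $Q_1 \otimes Q_2$ has rank $\operatorname{rank}(Q_1)\cdot\operatorname{rank}(Q_2) \in \{0, 1, 2, 4\}$, whereas this OR projector has rank 3.

```python
import numpy as np

I = np.eye(2)
P = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|

alpha_hat = np.kron(P, I)
beta_hat = np.kron(I, P)

or_hat = alpha_hat + beta_hat - alpha_hat @ beta_hat   # represents "alpha OR beta"
assert np.allclose(or_hat @ or_hat, or_hat)            # a projector...

# ...but not a homogeneous one: its rank is 3, while a single tensor product
# of qubit projectors can only have rank 0, 1, 2 or 4.
print(np.linalg.matrix_rank(or_hat))   # 3
```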

Negation (NOT)

The negation operation in the lattice of projection operators takes $\hat{P}$ to

$$\neg \hat{P} := \mathbb{1} - \hat{P},$$

where $\mathbb{1}$ is the identity operator on the Hilbert space. Thus the projector used to represent the proposition $\neg\alpha$ (i.e. "not $\alpha$") is

$$\widehat{\neg\alpha} := \mathbb{1} - \hat{\alpha},$$

where $\mathbb{1}$ here denotes the identity operator on the history Hilbert space $H$.

Example: Two-time history

As an example, consider the negation of the two-time homogeneous history proposition $\alpha = (\alpha_{t_1}, \alpha_{t_2})$. The projector representing the proposition $\neg\alpha$ is

$$\widehat{\neg\alpha} = \mathbb{1} \otimes \mathbb{1} - \hat{\alpha}_{t_1} \otimes \hat{\alpha}_{t_2} = (\mathbb{1} - \hat{\alpha}_{t_1}) \otimes \hat{\alpha}_{t_2} + \hat{\alpha}_{t_1} \otimes (\mathbb{1} - \hat{\alpha}_{t_2}) + (\mathbb{1} - \hat{\alpha}_{t_1}) \otimes (\mathbb{1} - \hat{\alpha}_{t_2}).$$

The three terms which appear in this expression,

$$(\mathbb{1} - \hat{\alpha}_{t_1}) \otimes \hat{\alpha}_{t_2}, \qquad \hat{\alpha}_{t_1} \otimes (\mathbb{1} - \hat{\alpha}_{t_2}), \qquad (\mathbb{1} - \hat{\alpha}_{t_1}) \otimes (\mathbb{1} - \hat{\alpha}_{t_2}),$$

can each be interpreted as follows: "$\alpha_{t_1}$ is false and then $\alpha_{t_2}$ is true", "$\alpha_{t_1}$ is true and then $\alpha_{t_2}$ is false", and "$\alpha_{t_1}$ is false and then $\alpha_{t_2}$ is false", respectively.

These three homogeneous histories, joined with the OR operation, include all the possibilities for how the proposition "$\alpha_{t_1}$ and then $\alpha_{t_2}$" can be false. We therefore see that the definition of $\widehat{\neg\alpha}$ agrees with what the proposition $\neg\alpha$ should mean.
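
Since the decomposition above is an operator identity, it can be verified directly; here is a short NumPy check (the particular qubit projectors are arbitrary illustrative choices).

```python
import numpy as np

I = np.eye(2)
P1 = np.array([[1.0, 0.0], [0.0, 0.0]])          # represents alpha_{t1}
P2 = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])    # represents alpha_{t2}

# Negation of the two-time homogeneous history (alpha_{t1}, alpha_{t2}).
not_alpha = np.kron(I, I) - np.kron(P1, P2)

# Sum of the three homogeneous histories describing how
# "alpha_{t1} and then alpha_{t2}" can fail.
decomposition = (np.kron(I - P1, P2)
                 + np.kron(P1, I - P2)
                 + np.kron(I - P1, I - P2))

assert np.allclose(not_alpha, decomposition)
```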

