Reflexive operator algebra

In functional analysis, a reflexive operator algebra A is an operator algebra that has enough invariant subspaces to characterize it. Formally, A is reflexive if it is equal to the algebra of bounded operators which leave invariant each subspace left invariant by every operator in A.

In functional analysis, an operator algebra is an algebra of continuous linear operators on a topological vector space with the multiplication given by the composition of mappings.

In mathematics, an invariant subspace of a linear mapping T : V → V from some vector space V to itself is a subspace W of V that is preserved by T; that is, T(W) ⊆ W.

This should not be confused with a reflexive space, which in functional analysis means a Banach space that coincides with the continuous dual of its continuous dual space, both as a linear space and as a topological space.

Examples

Nest algebras are examples of reflexive operator algebras. In finite dimensions, these are simply algebras of all matrices of a given size whose nonzero entries lie in an upper-triangular pattern.

In functional analysis, a branch of mathematics, nest algebras are a class of operator algebras that generalise the upper-triangular matrix algebras to a Hilbert space context. They were introduced by Ringrose (1965) and have many interesting properties. They are non-selfadjoint algebras, are closed in the weak operator topology and are reflexive.

In fact, if we fix any pattern of entries in an n by n matrix containing the diagonal, then the set of all n by n matrices whose nonzero entries lie in this pattern forms a reflexive algebra.
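As a small sanity check in finite dimensions (an illustration, not part of the original text; the upper-triangular pattern is an assumed example), the following NumPy sketch verifies that matrices supported on a pattern containing the diagonal are closed under products and contain the identity, so they do form an algebra:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Pattern: entries on or above the diagonal (contains the diagonal).
pattern = np.triu(np.ones((n, n), dtype=bool))

# Two arbitrary matrices whose nonzero entries lie in the pattern.
A = rng.standard_normal((n, n)) * pattern
B = rng.standard_normal((n, n)) * pattern

C = A @ B

# The product is again supported on the pattern, and the identity
# lies in the pattern since the pattern contains the diagonal.
assert np.allclose(C[~pattern], 0)
assert np.all(pattern[np.eye(n, dtype=bool)])
```

Closure under products works here because the upper-triangular pattern is "transitive": whenever positions (i, k) and (k, j) lie in the pattern, so does (i, j).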

An example of an algebra which is not reflexive is the set of 2 by 2 matrices

[ a  b ]
[ 0  a ]

where a and b range over the complex numbers. This algebra is smaller than the nest algebra of upper-triangular 2 by 2 matrices

[ a  b ]
[ 0  c ]

but has the same invariant subspaces (namely {0}, the span of the first basis vector, and the whole space), so it is not reflexive.
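To see this failure of reflexivity concretely, here is a short NumPy check (an illustration, not from the original text): every matrix of the form aI + bN, with N the 2 by 2 nilpotent Jordan block, leaves the span of e1 invariant, yet so does diag(1, 0), which is not of that form, so the algebra of all operators with the same invariant subspaces is strictly larger.

```python
import numpy as np

I2 = np.eye(2)
N = np.array([[0.0, 1.0], [0.0, 0.0]])   # nilpotent Jordan block

def leaves_e1_invariant(M):
    # span(e1) is invariant iff M e1 is a multiple of e1,
    # i.e. the (1, 0) entry vanishes.
    return np.isclose(M[1, 0], 0.0)

# Every a*I + b*N leaves span(e1) invariant.
for a, b in [(1.0, 0.0), (2.0, -3.0), (0.5, 7.0)]:
    assert leaves_e1_invariant(a * I2 + b * N)

# diag(1, 0) also leaves span(e1) invariant ...
D = np.diag([1.0, 0.0])
assert leaves_e1_invariant(D)

# ... but it is not of the form a*I + b*N: the diagonal of a*I + b*N
# is constant (both entries equal a), while diag(1, 0) has entries 1 and 0.
assert D[0, 0] != D[1, 1]
```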

If T is a fixed n by n matrix then the set of all polynomials in T and the identity operator forms a unital operator algebra. A theorem of Deddens and Fillmore states that this algebra is reflexive if and only if the largest two blocks in the Jordan normal form of T differ in size by at most one. For example, the algebra

[ a  b  0 ]
[ 0  a  0 ]
[ 0  0  a ]

where a and b range over the complex numbers, which is equal to the set of all polynomials in

T = [ 0  1  0 ]
    [ 0  0  0 ]
    [ 0  0  0 ]

and the identity, is reflexive: the Jordan form of T has blocks of sizes 2 and 1, which differ by one.
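For a nilpotent matrix, the Jordan block sizes can be read off from the ranks of its powers (the number of blocks of size at least k equals rank(T^(k−1)) − rank(T^k)), which makes the Deddens–Fillmore condition easy to check numerically. The sketch below (an illustration, assuming NumPy) does this for the 3 by 3 matrix with a single 1 above the diagonal:

```python
import numpy as np

# A nilpotent 3 x 3 matrix: one Jordan block of size 2 and one of size 1.
T = np.zeros((3, 3))
T[0, 1] = 1.0

def jordan_block_sizes(T):
    # For a nilpotent matrix, the number of Jordan blocks of size >= k
    # equals rank(T^(k-1)) - rank(T^k).
    n = T.shape[0]
    ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(T, k))
             for k in range(n + 1)]
    at_least = [ranks[k - 1] - ranks[k] for k in range(1, n + 1)]
    sizes = []
    for k in range(n, 0, -1):
        count_k = at_least[k - 1] - (at_least[k] if k < n else 0)
        sizes += [k] * count_k
    return sizes

sizes = jordan_block_sizes(T)
assert sizes == [2, 1]

# The largest two blocks differ in size by 1 <= 1, so by the
# Deddens-Fillmore theorem the algebra of polynomials in T and the
# identity is reflexive.
```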

Hyper-reflexivity

Let 𝒜 be a weak*-closed operator algebra contained in B(H), the set of all bounded operators on a Hilbert space H, and for T any operator in B(H), let

β(T, 𝒜) = sup { ‖(1 − P)TP‖ : P is a projection and (1 − P)AP = 0 for every A in 𝒜 }.

Observe that P is a projection involved in this supremum precisely if the range of P is an invariant subspace of 𝒜.

The algebra 𝒜 is reflexive if and only if for every T in B(H):

β(T, 𝒜) = 0 implies that T is in 𝒜.

We note that for any T in B(H) the following inequality is satisfied:

β(T, 𝒜) ≤ dist(T, 𝒜).

Here dist(T, 𝒜) is the distance of T from the algebra, namely the smallest norm of an operator T − A where A runs over the algebra. We call 𝒜 hyperreflexive if there is a constant K such that for every operator T in B(H),

dist(T, 𝒜) ≤ K β(T, 𝒜).

The smallest such K is called the distance constant for 𝒜. A hyper-reflexive operator algebra is automatically reflexive.
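In finite dimensions β can be computed directly. For the nest algebra of upper-triangular 3 by 3 matrices, the invariant subspaces are the spans of the first k basis vectors, so the supremum runs over just four projections; the sketch below (an illustrative setup, not from the original text) computes β for a random T and checks the inequality β(T, 𝒜) ≤ dist(T, 𝒜) against the crude bound dist(T, 𝒜) ≤ ‖T − triu(T)‖:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
T = rng.standard_normal((n, n))

def beta(T):
    # The invariant subspaces of the upper-triangular algebra are
    # span(e1, ..., ek) for k = 0, ..., n.
    vals = []
    for k in range(n + 1):
        P = np.diag([1.0] * k + [0.0] * (n - k))
        vals.append(np.linalg.norm((np.eye(n) - P) @ T @ P, 2))
    return max(vals)

b = beta(T)

# triu(T) lies in the algebra, so dist(T, A) <= ||T - triu(T)||,
# and combining with beta(T, A) <= dist(T, A):
assert b <= np.linalg.norm(T - np.triu(T), 2) + 1e-12
```

For nest algebras, Arveson's distance formula shows that the distance constant is in fact 1, i.e. β(T, 𝒜) equals dist(T, 𝒜).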

In the case of a reflexive algebra of matrices with nonzero entries specified by a given pattern, the problem of finding the distance constant can be rephrased as a matrix-filling problem: if we fill the entries in the complement of the pattern with arbitrary entries, what choice of entries in the pattern gives the smallest operator norm?
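A minimal numerical sketch of this matrix-filling problem (an illustrative choice of pattern and entries, assuming NumPy): for the upper-triangular 2 by 2 pattern, fix the lower-left entry, which lies in the complement of the pattern, to 1, and search over fill values for the pattern entries to minimize the operator norm. The first column always has norm at least 1, so the minimum is 1, attained by filling with zeros; this minimum is exactly the distance from [[0, 0], [1, 0]] to the algebra.

```python
import numpy as np

# Complement of the pattern: lower-left entry fixed to 1.
# Pattern entries (a, b, c) are free: [[a, b], [1, c]].
def op_norm(a, b, c):
    return np.linalg.norm(np.array([[a, b], [1.0, c]]), 2)

grid = np.linspace(-1.0, 1.0, 21)   # includes 0
best = min(op_norm(a, b, c) for a in grid for b in grid for c in grid)

# The first column (a, 1) has norm >= 1, so the operator norm is >= 1;
# filling the pattern entries with zeros attains this bound.
assert abs(best - 1.0) < 1e-9
```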

Examples

In mathematics, a von Neumann algebra or W*-algebra is a *-algebra of bounded operators on a Hilbert space that is closed in the weak operator topology and contains the identity operator. It is a special type of C*-algebra.
