# Functional calculus

In mathematics, a functional calculus is a theory allowing one to apply mathematical functions to mathematical operators. It is now a branch (more accurately, several related areas) of the field of functional analysis, connected with spectral theory. (Historically, the term was also used synonymously with calculus of variations; this usage is obsolete, except for functional derivative. Sometimes it is used in relation to types of functional equations, or in logic for systems of predicate calculus.)

If ${\displaystyle f}$ is a function, say a numerical function of a real number, and ${\displaystyle M}$ is an operator, there is no particular reason why the expression ${\displaystyle f(M)}$ should make sense. If it does, then we are no longer using ${\displaystyle f}$ on its original function domain. In the tradition of operational calculus, algebraic expressions in operators are handled irrespective of their meaning. This passes nearly unnoticed when we speak of 'squaring a matrix', which is the case ${\displaystyle f(x)=x^{2}}$ with ${\displaystyle M}$ an ${\displaystyle n\times n}$ matrix. The idea of a functional calculus is to create a principled approach to this kind of overloading of the notation.
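As a minimal sketch (assuming NumPy is available), "squaring a matrix" means applying ${\displaystyle f(x)=x^{2}}$ through matrix multiplication, not entrywise squaring:

```python
# "Squaring a matrix": the functional calculus for f(x) = x^2 applied to
# an n x n matrix M means the matrix product M @ M, not M**2 entrywise.
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

f_of_M = M @ M          # f(M) = M^2 as an operator
entrywise = M ** 2      # NOT f(M): this squares each entry separately

print(f_of_M)           # [[4. 5.] [0. 9.]]
print(entrywise)        # [[4. 1.] [0. 9.]] -- differs in the (0, 1) entry
```

The two results differ precisely because ${\displaystyle f(M)}$ is defined through the algebra of operators, not pointwise on entries.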

The most immediate case is to apply polynomial functions to a square matrix, extending what has just been discussed. In the finite-dimensional case, the polynomial functional calculus yields quite a bit of information about the operator. For example, consider the family of polynomials which annihilate an operator ${\displaystyle T}$. This family is an ideal in the ring of polynomials. Furthermore, it is a nontrivial ideal: let ${\displaystyle n}$ be the finite dimension of the algebra of matrices; then the ${\displaystyle n+1}$ operators ${\displaystyle \{I,T,T^{2},\ldots ,T^{n}\}}$ are linearly dependent. So ${\displaystyle \sum _{i=0}^{n}\alpha _{i}T^{i}=0}$ for some scalars ${\displaystyle \alpha _{i}}$, not all equal to 0. This implies that the polynomial ${\displaystyle \sum _{i=0}^{n}\alpha _{i}x^{i}}$ lies in the ideal. Since the ring of polynomials is a principal ideal domain, this ideal is generated by some polynomial ${\displaystyle m}$. Multiplying by a unit if necessary, we can choose ${\displaystyle m}$ to be monic. When this is done, the polynomial ${\displaystyle m}$ is precisely the minimal polynomial of ${\displaystyle T}$. This polynomial gives deep information about ${\displaystyle T}$. For instance, a scalar ${\displaystyle \alpha }$ is an eigenvalue of ${\displaystyle T}$ if and only if ${\displaystyle \alpha }$ is a root of ${\displaystyle m}$. Also, ${\displaystyle m}$ can sometimes be used to calculate the exponential of ${\displaystyle T}$ efficiently.
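A hedged illustration with NumPy: for the hypothetical example ${\displaystyle T=\operatorname {diag} (1,1,2)}$, the minimal polynomial is ${\displaystyle m(x)=(x-1)(x-2)}$, strictly smaller than the characteristic polynomial ${\displaystyle (x-1)^{2}(x-2)}$, and its roots are exactly the eigenvalues of ${\displaystyle T}$:

```python
# The minimal polynomial m annihilates T, and its roots are the
# eigenvalues of T.  For T = diag(1, 1, 2), m(x) = (x - 1)(x - 2).
import numpy as np

T = np.diag([1.0, 1.0, 2.0])
I = np.eye(3)

m_of_T = (T - 1 * I) @ (T - 2 * I)   # evaluate m at T
print(bool(np.allclose(m_of_T, 0)))  # True: m lies in the annihilating ideal

eigenvalues = np.linalg.eigvals(T)
roots = sorted({float(x) for x in np.round(eigenvalues, 8)})
print(roots)                          # [1.0, 2.0], the roots of m
```

Note that the degree-2 polynomial ${\displaystyle m}$ already annihilates ${\displaystyle T}$, even though the characteristic polynomial has degree 3.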

The polynomial calculus is not as informative in the infinite-dimensional case. Consider the unilateral shift with the polynomial calculus; the ideal defined above is now trivial. Thus one is interested in functional calculi more general than polynomials. The subject is closely linked to spectral theory, since for a diagonal matrix or multiplication operator, it is rather clear what the definitions should be.
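A suggestive (not conclusive) finite-dimensional sketch with NumPy: the ${\displaystyle n\times n}$ truncation of the unilateral shift is nilpotent with minimal polynomial ${\displaystyle x^{n}}$, so the degree of any annihilating polynomial grows without bound as ${\displaystyle n\to \infty }$, hinting at why no nonzero polynomial annihilates the shift on ${\displaystyle \ell ^{2}}$:

```python
# The n x n truncated shift S satisfies S^(n-1) != 0 but S^n == 0,
# so its minimal polynomial is x^n and its degree grows with n.
import numpy as np

def truncated_shift(n):
    """n x n shift matrix: maps e_k to e_{k+1} and kills e_n."""
    return np.eye(n, k=-1)

for n in (3, 5, 8):
    S = truncated_shift(n)
    almost = np.linalg.matrix_power(S, n - 1)   # nonzero
    killed = np.linalg.matrix_power(S, n)       # zero
    print(n, bool(np.any(almost)), bool(np.allclose(killed, 0)))
```

Each line prints `True True`: the truncation is annihilated only at degree ${\displaystyle n}$.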

## Related Research Articles

In mathematics, any vector space ${\displaystyle V}$ has a corresponding dual vector space consisting of all linear forms on ${\displaystyle V}$, together with the vector space structure of pointwise addition and scalar multiplication by constants.

In mathematics, a linear map is a mapping between two vector spaces that preserves the operations of vector addition and scalar multiplication. The same names and the same definition are also used for the more general case of modules over a ring; see Module homomorphism.

Linear algebra is the branch of mathematics concerning linear equations such as ${\displaystyle a_{1}x_{1}+\cdots +a_{n}x_{n}=b}$, and their representations in vector spaces and through matrices.

In mathematics, an operator is generally a mapping or function that acts on elements of a space to produce elements of another space. There is no general definition of an operator, but the term is often used in place of function when the domain is a set of functions or other structured objects. Also, the domain of an operator is often difficult to characterize explicitly, and may be extended to related objects. See Operator (physics) for other examples.

A vector space is a set of objects called vectors, which may be added together and multiplied ("scaled") by numbers, called scalars. Scalars are often taken to be real numbers, but there are also vector spaces with scalar multiplication by complex numbers, rational numbers, or generally any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called vector axioms. To specify that the scalars are real or complex numbers, the terms real vector space and complex vector space are often used.

In algebra and algebraic geometry, the spectrum of a commutative ring R, denoted by ${\displaystyle \operatorname {Spec} (R)}$, is the set of all prime ideals of R. It is commonly augmented with the Zariski topology and with a structure sheaf, turning it into a locally ringed space. A locally ringed space of this form is called an affine scheme.

In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized. This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modeled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.

In mathematics, a self-adjoint operator on a finite-dimensional complex vector space V with inner product is a linear map A that is its own adjoint: ${\displaystyle \langle Av,w\rangle =\langle v,Aw\rangle }$ for all vectors v and w. If V is finite-dimensional with a given orthonormal basis, this is equivalent to the condition that the matrix of A is a Hermitian matrix, i.e., equal to its conjugate transpose ${\displaystyle A^{*}}$. By the finite-dimensional spectral theorem, V has an orthonormal basis such that the matrix of A relative to this basis is a diagonal matrix with entries in the real numbers. In this article, we consider generalizations of this concept to operators on Hilbert spaces of arbitrary dimension.

In mathematics, the exterior product or wedge product of vectors is an algebraic construction used in geometry to study areas, volumes, and their higher-dimensional analogues. The exterior product of two vectors ${\displaystyle u}$ and ${\displaystyle v}$, denoted by ${\displaystyle u\wedge v}$, is called a bivector and lives in a space called the exterior square, a vector space that is distinct from the original space of vectors. The magnitude of ${\displaystyle u\wedge v}$ can be interpreted as the area of the parallelogram with sides ${\displaystyle u}$ and ${\displaystyle v}$, which in three dimensions can also be computed using the cross product of the two vectors. More generally, all parallel plane surfaces with the same orientation and area have the same bivector as a measure of their oriented area. Like the cross product, the exterior product is anticommutative, meaning that ${\displaystyle u\wedge v=-(v\wedge u)}$ for all vectors ${\displaystyle u}$ and ${\displaystyle v}$, but, unlike the cross product, the exterior product is associative.

In mathematics, a differential operator is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function.

In linear algebra, a Jordan normal form, also known as a Jordan canonical form or JCF, is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has each non-zero off-diagonal entry equal to 1, immediately above the main diagonal, and with identical diagonal entries to the left and below them.

In mathematics, especially in the field of algebra, a polynomial ring or polynomial algebra is a ring formed from the set of polynomials in one or more indeterminates with coefficients in another ring, often a field.

In mathematics, a linear differential equation is a differential equation that is defined by a linear polynomial in the unknown function and its derivatives, that is an equation of the form ${\displaystyle a_{0}(x)y+a_{1}(x)y'+a_{2}(x)y''+\cdots +a_{n}(x)y^{(n)}=b(x)}$.

In mathematics, operator theory is the study of linear operators on function spaces, beginning with differential operators and integral operators. The operators may be presented abstractly by their characteristics, such as bounded linear operators or closed operators, and consideration may be given to nonlinear operators. The study, which depends heavily on the topology of function spaces, is a branch of functional analysis.

In functional analysis, a branch of mathematics, the Borel functional calculus is a functional calculus with particularly broad scope. Thus, for instance, if T is an operator, applying the squaring function ${\displaystyle s\mapsto s^{2}}$ to T yields the operator ${\displaystyle T^{2}}$. Using the functional calculus for larger classes of functions, we can for example define rigorously the "square root" of the (negative) Laplacian operator ${\displaystyle -\Delta }$ or the exponential ${\displaystyle e^{it\Delta }}$.
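A finite-dimensional sketch of the same idea (assuming NumPy; the matrix here is a hypothetical positive-definite Hermitian example): the spectral theorem lets us define ${\displaystyle f(T)=Uf(D)U^{*}}$ by applying ${\displaystyle f}$ only to the eigenvalues, and with ${\displaystyle f(s)={\sqrt {s}}}$ this yields an operator square root, the matrix analogue of defining ${\displaystyle {\sqrt {-\Delta }}}$:

```python
# Functional calculus via the spectral theorem: diagonalize a Hermitian
# matrix T = U D U^*, apply f to the eigenvalues, and reassemble f(T).
import numpy as np

T = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # Hermitian, eigenvalues 1 and 3

eigvals, U = np.linalg.eigh(T)        # T = U @ diag(eigvals) @ U.T
sqrt_T = U @ np.diag(np.sqrt(eigvals)) @ U.T   # f(T) for f(s) = sqrt(s)

print(bool(np.allclose(sqrt_T @ sqrt_T, T)))   # True: (sqrt(T))^2 == T
```

The same recipe applies any Borel function ${\displaystyle f}$ to a self-adjoint matrix; the subtlety in infinite dimensions is making this construction rigorous for unbounded operators such as ${\displaystyle -\Delta }$.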