Differential operator

A harmonic function defined on an annulus. Harmonic functions are exactly those functions which lie in the kernel of the Laplace operator, an important differential operator.

In mathematics, a differential operator is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function (in the style of a higher-order function in computer science).

This article considers mainly linear differential operators, which are the most common type. However, non-linear differential operators also exist, such as the Schwarzian derivative.

Definition

Given a nonnegative integer m, an order-$m$ linear differential operator is a map $P$ from one function space on $\mathbb{R}^n$ to another that can be written as

$$P = \sum_{|\alpha| \le m} a_\alpha(x)\, D^\alpha,$$

where $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)$ is a multi-index of non-negative integers, $|\alpha| = \alpha_1 + \alpha_2 + \cdots + \alpha_n$, and for each $\alpha$, $a_\alpha(x)$ is a function on some open domain in n-dimensional space. The operator $D^\alpha$ is interpreted as

$$D^\alpha = \frac{\partial^{|\alpha|}}{\partial x_1^{\alpha_1}\, \partial x_2^{\alpha_2} \cdots \partial x_n^{\alpha_n}}.$$

Thus for a function $f$ in the domain of $P$:

$$Pf = \sum_{|\alpha| \le m} a_\alpha(x) \frac{\partial^{|\alpha|} f}{\partial x_1^{\alpha_1}\, \partial x_2^{\alpha_2} \cdots \partial x_n^{\alpha_n}}.$$

The notation $D^\alpha$ is justified (i.e., independent of the order of differentiation) because of the symmetry of second derivatives.
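As a quick illustration, here is a minimal sympy sketch (the helper `D_alpha` is hypothetical, written for this example) applying $D^\alpha$ for $\alpha = (2, 1)$ and confirming that the order of differentiation does not matter:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x * y) + x**3 * y     # arbitrary sample function

def D_alpha(expr, alpha, variables):
    """Hypothetical helper: apply D^alpha, differentiating alpha_i times in x_i."""
    for var, order in zip(variables, alpha):
        expr = sp.diff(expr, var, order)
    return expr

lhs = D_alpha(f, (2, 1), (x, y))       # d^3 f / dx^2 dy, x-derivatives first
rhs = sp.diff(sp.diff(f, y), x, 2)     # same derivatives, y first
assert sp.simplify(lhs - rhs) == 0     # symmetry of mixed partials
```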

The polynomial p obtained by replacing each $D$ by a variable $\xi$ in P is called the total symbol of P; i.e., the total symbol of P above is

$$p(x, \xi) = \sum_{|\alpha| \le m} a_\alpha(x)\, \xi^\alpha,$$

where $\xi^\alpha = \xi_1^{\alpha_1} \cdots \xi_n^{\alpha_n}$. The highest homogeneous component of the symbol, namely

$$\sigma(\xi) = \sum_{|\alpha| = m} a_\alpha(x)\, \xi^\alpha,$$

is called the principal symbol of P. While the total symbol is not intrinsically defined, the principal symbol is intrinsically defined (i.e., it is a function on the cotangent bundle). [1]
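The total and principal symbols can be read off mechanically from the coefficients. A small sympy sketch for an assumed one-variable operator $P = a_2(x)D^2 + a_1(x)D + a_0(x)$ (the coefficient choices are arbitrary):

```python
import sympy as sp

x, xi = sp.symbols('x xi')

# Hypothetical coefficients a_k(x) of P = a2(x) D^2 + a1(x) D + a0(x),
# keyed by derivative order; the choices are arbitrary.
coeffs = {2: x**2 + 1, 1: sp.sin(x), 0: sp.Integer(7)}

total_symbol = sum(a * xi**k for k, a in coeffs.items())
m = max(coeffs)                          # order of the operator
principal_symbol = coeffs[m] * xi**m     # highest homogeneous component

assert sp.simplify(total_symbol - ((x**2 + 1)*xi**2 + sp.sin(x)*xi + 7)) == 0
assert sp.simplify(principal_symbol - (x**2 + 1)*xi**2) == 0
```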

More generally, let E and F be vector bundles over a manifold X. Then the linear operator

$$P : C^\infty(E) \to C^\infty(F)$$

is a differential operator of order $k$ if, in local coordinates on X, we have

$$Pu = \sum_{|\alpha| = k} P^\alpha(x) \frac{\partial^{|\alpha|} u}{\partial x^\alpha} + \text{lower-order terms},$$

where, for each multi-index α, $P^\alpha(x) : E \to F$ is a bundle map, symmetric on the indices α.

The kth-order coefficients of P transform as a symmetric tensor

$$\sigma_P : S^k(T^*X) \otimes E \to F,$$

whose domain is the tensor product of the kth symmetric power of the cotangent bundle of X with E, and whose codomain is F. This symmetric tensor is known as the principal symbol (or just the symbol) of P.

The coordinate system $x^i$ permits a local trivialization of the cotangent bundle by the coordinate differentials $dx^i$, which determine fiber coordinates $\xi_i$. In terms of a basis of frames $e_\mu$, $f_\nu$ of E and F, respectively, the differential operator P decomposes into components

$$(Pu)_\nu = \sum_\mu P_{\nu\mu} u_\mu$$

on each section u of E. Here $P_{\nu\mu}$ is the scalar differential operator defined by

$$P_{\nu\mu} = \sum_\alpha P^\alpha_{\nu\mu} \frac{\partial^{|\alpha|}}{\partial x^\alpha}.$$

With this trivialization, the principal symbol can now be written

$$\sigma_P(\xi)(u)_\nu = \sum_{|\alpha| = k} \sum_\mu P^\alpha_{\nu\mu}(x)\, \xi^\alpha\, u_\mu.$$

In the cotangent space over a fixed point x of X, the symbol $\sigma_P$ defines a homogeneous polynomial of degree k in $T^*_x X$ with values in $\operatorname{Hom}(E_x, F_x)$.

Fourier interpretation

A differential operator P and its symbol appear naturally in connection with the Fourier transform as follows. Let ƒ be a Schwartz function, with Fourier transform $\hat f(\xi) = \int_{\mathbb{R}^n} f(x)\, e^{-2\pi i x \cdot \xi}\, dx$. Then by the inverse Fourier transform,

$$Pf(x) = \int_{\mathbb{R}^n} e^{2\pi i x \cdot \xi}\, p(x, 2\pi i \xi)\, \hat f(\xi)\, d\xi.$$

This exhibits P as a Fourier multiplier. The pseudo-differential operators are obtained from a more general class of functions p(x,ξ), satisfying at most polynomial growth conditions in ξ, for which this integral is still well-behaved.
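A numerical sketch of the multiplier viewpoint using NumPy's FFT (grid size and sample function are arbitrary choices, and angular frequencies are used, so the multiplier for $d/dx$ is $i\xi$): differentiating a periodic function by multiplying its transform pointwise, then comparing with the analytic derivative.

```python
import numpy as np

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)                                     # sample periodic function

# Angular frequencies xi for this grid (fftfreq returns cycles per unit length).
xi = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi

# Apply P = d/dx as the Fourier multiplier i*xi.
df = np.fft.ifft(1j * xi * np.fft.fft(f)).real

assert np.allclose(df, 3 * np.cos(3 * x), atol=1e-8)  # matches d/dx sin(3x)
```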

Examples

The del operator (∇) defines the gradient, and is used to calculate the curl, divergence, and Laplacian of various objects.
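A short sympy.vector sketch of these operators on an arbitrary sample scalar field, checking two standard identities: the curl of a gradient vanishes, and the divergence of a gradient is the Laplacian.

```python
import sympy as sp
from sympy.vector import CoordSys3D, Vector, gradient, divergence, curl

N = CoordSys3D('N')                 # Cartesian frame with coordinates N.x, N.y, N.z
phi = N.x**2 * N.y + N.z            # arbitrary sample scalar field

g = gradient(phi)                   # del applied to a scalar: the gradient
assert curl(g) == Vector.zero                      # curl of a gradient vanishes
assert sp.simplify(divergence(g) - 2 * N.y) == 0   # div grad phi = Laplacian of phi
```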

History

The conceptual step of writing a differential operator as something free-standing is attributed to Louis François Antoine Arbogast in 1800. [2]

Notations

The most common differential operator is the action of taking the derivative. Common notations for taking the first derivative with respect to a variable x include:

$\frac{d}{dx}$, $D$, $D_x$, and $\partial_x$.

When taking higher, nth order derivatives, the operator may be written:

$\frac{d^n}{dx^n}$, $D^n$, $D^n_x$, or $\partial^n_x$.

The derivative of a function f of an argument x is sometimes given as either of the following:

$$[Df](x) \quad \text{or} \quad Df(x).$$

The D notation's use and creation are credited to Oliver Heaviside, who considered differential operators of the form

$$\sum_{k=0}^n c_k D^k$$

in his study of differential equations.

One of the most frequently seen differential operators is the Laplacian operator, defined by

$$\Delta = \nabla^2 = \sum_{k=1}^n \frac{\partial^2}{\partial x_k^2}.$$
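As a check, the following sympy sketch verifies that $u = \log(x^2 + y^2)$ lies in the kernel of the two-dimensional Laplacian away from the origin, i.e. that it is harmonic, like the annulus example in the introduction:

```python
import sympy as sp

x, y = sp.symbols('x y')
u = sp.log(x**2 + y**2)            # harmonic away from the origin

# Apply the 2-D Laplace operator to u.
laplacian_u = sp.diff(u, x, 2) + sp.diff(u, y, 2)
assert sp.simplify(laplacian_u) == 0
```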

Another differential operator is the Θ operator, or theta operator, defined by [3]

$$\Theta = z \frac{d}{dz}.$$

This is sometimes also called the homogeneity operator, because its eigenfunctions are the monomials in z:

$$\Theta(z^k) = k z^k, \quad k = 0, 1, 2, \ldots$$

In n variables the homogeneity operator is given by

$$\Theta = \sum_{k=1}^n x_k \frac{\partial}{\partial x_k}.$$

As in one variable, the eigenspaces of Θ are the spaces of homogeneous functions (see Euler's homogeneous function theorem).
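A one-line sympy check of the eigenfunction property $\Theta(z^k) = k z^k$:

```python
import sympy as sp

z = sp.symbols('z')
theta = lambda f: z * sp.diff(f, z)     # the theta operator z * d/dz

# Each monomial z^k is an eigenfunction with eigenvalue k.
for k in range(5):
    assert sp.simplify(theta(z**k) - k * z**k) == 0
```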

In writing, following common mathematical convention, the argument of a differential operator is usually placed on the right side of the operator itself. Sometimes an alternative notation is used: the result of applying the operator to the function on the left side of the operator, the result of applying it to the function on the right side, and the difference obtained when applying the differential operator to the functions on both sides are denoted by arrows as follows:

$$f \overleftarrow{\partial_x} g = g \cdot \partial_x f$$
$$f \overrightarrow{\partial_x} g = f \cdot \partial_x g$$
$$f \overleftrightarrow{\partial_x} g = f \cdot \partial_x g - g \cdot \partial_x f.$$

Such a bidirectional-arrow notation is frequently used for describing the probability current of quantum mechanics.

Adjoint of an operator

Given a linear differential operator T,

$$Tu = \sum_{k=0}^n a_k(x)\, D^k u,$$

the adjoint of this operator is defined as the operator $T^*$ such that

$$\langle Tu, v \rangle = \langle u, T^* v \rangle,$$

where the notation $\langle \cdot, \cdot \rangle$ is used for the scalar product or inner product. This definition therefore depends on the definition of the scalar product (or inner product).

Formal adjoint in one variable

In the functional space of square-integrable functions on a real interval (a, b), the scalar product is defined by

$$\langle f, g \rangle = \int_a^b \overline{f(x)}\, g(x)\, dx,$$

where the line over f(x) denotes the complex conjugate of f(x). If one moreover adds the condition that f or g vanishes as $x \to a$ and $x \to b$, one can also define the adjoint of T by

$$T^* u = \sum_{k=0}^n (-1)^k D^k [\overline{a_k(x)}\, u].$$

This formula does not explicitly depend on the definition of the scalar product. It is therefore sometimes chosen as a definition of the adjoint operator. When $T^*$ is defined according to this formula, it is called the formal adjoint of T.
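The formal-adjoint formula can be sanity-checked symbolically. A sketch for the assumed first-order operator $T = a(x)D$ with $a(x) = x$, using real Gaussian-type test functions that vanish at infinity (so the boundary terms drop out):

```python
import sympy as sp

x = sp.symbols('x', real=True)
a = x                                  # coefficient a_1(x) of T = a(x) D
f = sp.exp(-x**2)                      # test functions vanishing at infinity
g = (1 + x) * sp.exp(-x**2)

Tf = a * sp.diff(f, x)                 # T f = a(x) f'
T_star_g = -sp.diff(a * g, x)          # formal adjoint: T* g = -(a g)'

# <Tf, g> should equal <f, T* g> (real functions, so no conjugation needed).
lhs = sp.integrate(Tf * g, (x, -sp.oo, sp.oo))
rhs = sp.integrate(f * T_star_g, (x, -sp.oo, sp.oo))
assert sp.simplify(lhs - rhs) == 0
```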

A (formally) self-adjoint operator is an operator equal to its own (formal) adjoint.

Several variables

If Ω is a domain in $\mathbb{R}^n$, and P a differential operator on Ω, then the adjoint of P is defined in $L^2(\Omega)$ by duality in the analogous manner:

$$\langle f, P^* g \rangle_{L^2(\Omega)} = \langle Pf, g \rangle_{L^2(\Omega)}$$

for all smooth $L^2$ functions f, g. Since smooth functions are dense in $L^2$, this defines the adjoint on a dense subset of $L^2$: P* is a densely defined operator.

Example

The Sturm–Liouville operator is a well-known example of a formally self-adjoint operator. This second-order linear differential operator L can be written in the form

$$Lu = -(pu')' + qu = -(pu'' + p'u') + qu = -pu'' - p'u' + qu.$$

This property can be proven using the formal adjoint definition above. [4]

This operator is central to Sturm–Liouville theory where the eigenfunctions (analogues to eigenvectors) of this operator are considered.
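Formal self-adjointness of L follows from Lagrange's identity, which the following sympy sketch verifies for generic p, q, f, g: the difference $g\,Lf - f\,Lg$ is an exact derivative, so it integrates to boundary terms only.

```python
import sympy as sp

x = sp.symbols('x')
p, q, f, g = (sp.Function(n)(x) for n in 'pqfg')   # generic smooth functions

L = lambda u: -sp.diff(p * sp.diff(u, x), x) + q * u   # Sturm-Liouville operator

# Lagrange's identity: g*L(f) - f*L(g) = -d/dx [ p * (g f' - f g') ],
# written here as an expression that should expand to zero.
identity = g * L(f) - f * L(g) \
    - sp.diff(p * (f * sp.diff(g, x) - g * sp.diff(f, x)), x)
assert sp.expand(identity) == 0
```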

Properties of differential operators

Differentiation is linear, i.e.

$$D(f + g) = Df + Dg, \qquad D(af) = a\,Df,$$

where f and g are functions, and a is a constant.

Any polynomial in D with function coefficients is also a differential operator. We may also compose differential operators by the rule

$$(D_1 \circ D_2)(f) = D_1(D_2(f)).$$

Some care is then required: firstly, any function coefficients in the operator D2 must be differentiable as many times as the application of D1 requires. To get a ring of such operators we must assume derivatives of all orders of the coefficients used. Secondly, this ring will not be commutative: an operator gD is not in general the same as Dg. For example we have the relation, basic in quantum mechanics:

$$Dx - xD = 1.$$
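A direct sympy check of the displayed relation $Dx - xD = 1$, applied to a generic function:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)             # generic smooth function

Dx_f = sp.diff(x * f, x)            # D applied after multiplication by x
xD_f = x * sp.diff(f, x)            # multiplication by x applied after D

# (Dx - xD) f = f, i.e. the commutator of D and x is the identity operator.
assert sp.simplify(Dx_f - xD_f - f) == 0
```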

The subring of operators that are polynomials in D with constant coefficients is, by contrast, commutative. It can be characterised another way: it consists of the translation-invariant operators.

The differential operators also obey the shift theorem.

Ring of polynomial differential operators

Ring of univariate polynomial differential operators

If R is a ring, let $R\langle D, X\rangle$ be the non-commutative polynomial ring over R in the variables D and X, and I the two-sided ideal generated by $DX - XD - 1$. Then the ring of univariate polynomial differential operators over R is the quotient ring $R\langle D, X\rangle / I$. This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form $X^a D^b \bmod I$. It supports an analogue of Euclidean division of polynomials.
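A toy implementation of this normal form (the helper `mul` is hypothetical, written for this sketch): elements are dicts mapping $(a, b)$ to the coefficient of $X^a D^b$, and multiplication commutes powers of D past powers of X via the standard identity $D^b X^c = \sum_k k!\binom{b}{k}\binom{c}{k}\, X^{c-k} D^{b-k}$.

```python
from math import comb, factorial
from collections import defaultdict

def mul(p, q):
    """Multiply two elements given as {(a, b): coeff} dicts (X^a D^b basis)."""
    out = defaultdict(int)
    for (a, b), cp in p.items():
        for (c, d), cq in q.items():
            # Commute D^b past X^c: D^b X^c = sum_k k! C(b,k) C(c,k) X^(c-k) D^(b-k)
            for k in range(min(b, c) + 1):
                coeff = factorial(k) * comb(b, k) * comb(c, k)
                out[(a + c - k, b + d - k)] += cp * cq * coeff
    return {m: c for m, c in out.items() if c}

D = {(0, 1): 1}
X = {(1, 0): 1}

# The defining relation in normal form: D X = X D + 1.
assert mul(D, X) == {(1, 1): 1, (0, 0): 1}
# Associativity sanity check: (D D) X = D (D X), i.e. D^2 X = X D^2 + 2 D.
assert mul(mul(D, D), X) == mul(D, mul(D, X)) == {(1, 2): 1, (0, 1): 2}
```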

Differential modules over $R[X]$ (for the standard derivation) can be identified with modules over $R\langle D, X\rangle / I$.

Ring of multivariate polynomial differential operators

If R is a ring, let $R\langle D_1, \ldots, D_n, X_1, \ldots, X_n\rangle$ be the non-commutative polynomial ring over R in the variables $D_1, \ldots, D_n, X_1, \ldots, X_n$, and I the two-sided ideal generated by the elements

$$(D_i X_j - X_j D_i) - \delta_{ij}, \qquad D_i D_j - D_j D_i, \qquad X_i X_j - X_j X_i$$

for all $1 \le i, j \le n$, where $\delta_{ij}$ is the Kronecker delta. Then the ring of multivariate polynomial differential operators over R is the quotient ring $R\langle D_1, \ldots, D_n, X_1, \ldots, X_n\rangle / I$.

This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form $X_1^{a_1} \cdots X_n^{a_n}\, D_1^{b_1} \cdots D_n^{b_n}$.

Coordinate-independent description

In differential geometry and algebraic geometry it is often convenient to have a coordinate-independent description of differential operators between two vector bundles. Let E and F be two vector bundles over a differentiable manifold M. An R-linear mapping of sections P : Γ(E) → Γ(F) is said to be a kth-order linear differential operator if it factors through the jet bundle Jk(E). In other words, there exists a linear mapping of vector bundles

$$i_P : J^k(E) \to F$$

such that

$$P = i_P \circ j^k,$$

where jk: Γ(E) → Γ(Jk(E)) is the prolongation that associates to any section of E its k-jet.

This just means that for a given section s of E, the value of P(s) at a point x ∈ M is fully determined by the kth-order infinitesimal behavior of s in x. In particular this implies that P(s)(x) is determined by the germ of s in x, which is expressed by saying that differential operators are local. A foundational result is the Peetre theorem, showing that the converse is also true: any (linear) local operator is differential.

Relation to commutative algebra

An equivalent, but purely algebraic, description of linear differential operators is as follows: an R-linear map P is a kth-order linear differential operator if, for any k + 1 smooth functions $f_0, \ldots, f_k \in C^\infty(M)$, we have

$$[f_k, [f_{k-1}, [\cdots [f_0, P] \cdots]]] = 0.$$

Here the bracket $[f, P] : \Gamma(E) \to \Gamma(F)$ is defined as the commutator

$$[f, P](s) = P(f \cdot s) - f \cdot P(s).$$
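A sympy sketch of this characterization for the order-1 operator $P = d/dx$: the once-nested bracket is already zero.

```python
import sympy as sp

x = sp.symbols('x')
f0, f1, s = (sp.Function(n)(x) for n in ('f0', 'f1', 's'))   # generic functions

P = lambda u: sp.diff(u, x)          # an order-1 differential operator

def bracket(f, Q):
    """The commutator [f, Q] as an operator: s |-> Q(f*s) - f*Q(s)."""
    return lambda u: Q(f * u) - f * Q(u)

inner = bracket(f0, P)               # [f0, P]: multiplication by f0', order 0
outer = bracket(f1, inner)           # [f1, [f0, P]]: should vanish identically
assert sp.simplify(outer(s)) == 0
```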

This characterization of linear differential operators shows that they are particular mappings between modules over a commutative algebra, allowing the concept to be seen as a part of commutative algebra.

Variants

A differential operator of infinite order

A differential operator of infinite order is (roughly) a differential operator whose total symbol is a power series instead of a polynomial.

Bidifferential operator

A differential operator acting on two functions is called a bidifferential operator. The notion appears, for instance, in an associative algebra structure on a deformation quantization of a Poisson algebra. [5]

Microdifferential operator

A microdifferential operator is a type of operator on an open subset of a cotangent bundle, as opposed to an open subset of a manifold. It is obtained by extending the notion of a differential operator to the cotangent bundle. [6]


References

  1. Schapira 1985, §1.1.7.
  2. James Gasser (editor), A Boole Anthology: Recent and Classical Studies in the Logic of George Boole (2000), p. 169.
  3. E. W. Weisstein. "Theta Operator". Retrieved 2009-06-12.
  5. Omori, Hideki; Maeda, Y.; Yoshioka, A. (1992). "Deformation quantization of Poisson algebras". Proceedings of the Japan Academy, Series A, Mathematical Sciences. 68 (5). doi:10.3792/PJAA.68.97.
  6. Schapira 1985, §1.2, §1.3.

Further reading