Ring of polynomial functions

In mathematics, the ring of polynomial functions on a vector space V over a field k gives a coordinate-free analog of a polynomial ring. It is denoted by k[V]. If V is finite-dimensional and is viewed as an algebraic variety, then k[V] is precisely the coordinate ring of V.

The explicit definition of the ring can be given as follows. If $k[t_1, \dots, t_n]$ is a polynomial ring, then we can view $t_1, \dots, t_n$ as coordinate functions on $k^n$; i.e., $t_i(x) = x_i$ when $x = (x_1, \dots, x_n)$. This suggests the following: given a vector space V, let k[V] be the commutative k-algebra generated by the dual space $V^*$, which is a subring of the ring of all functions $V \to k$. If we fix a basis for V and write $t_1, \dots, t_n$ for its dual basis, then k[V] consists of polynomials in $t_1, \dots, t_n$.
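As a concrete illustration, here is a minimal sketch with V = k^3 realized as Python tuples; the names t and f are illustrative, and indices are 0-based rather than the article's 1-based convention:

```python
# Coordinate functions on V = k^3: t(i) sends x to its i-th coordinate,
# and polynomial functions V -> k are built from them by + and *.

def t(i):
    """The i-th coordinate function on V = k^n (0-indexed here)."""
    return lambda x: x[i]

t0, t1, t2 = t(0), t(1), t(2)

# The polynomial function f = t0^2 + 3*t0*t2, viewed as a function V -> k:
def f(x):
    return t0(x)**2 + 3 * t0(x) * t2(x)

print(f((2, 5, 1)))  # 2**2 + 3*2*1 = 10
```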

If k is infinite, then k[V] is the symmetric algebra of the dual space $V^*$.

In applications, one also defines k[V] when V is defined over some subfield of k (e.g., k is the field of complex numbers and V is a real vector space). The same definition still applies.

Throughout the article, for simplicity, the base field k is assumed to be infinite.

Relation with polynomial ring

Let $A = K[x]$ be the set of all polynomials over a field K and let B be the set of all polynomial functions in one variable over K. Both A and B are algebras over K, with the standard multiplication and addition of polynomials and functions. We can map each f in A to $\hat{f}$ in B by the rule $\hat{f}(t) = f(t)$. A routine check shows that the mapping $f \mapsto \hat{f}$ is a homomorphism of the algebras A and B. This homomorphism is an isomorphism if and only if K is an infinite field. For example, if K is a finite field then let $p(x) = \prod_{t \in K} (x - t)$. Then p is a nonzero polynomial in K[x], yet $p(t) = 0$ for all t in K, so $\hat{p}$ is the zero function and our homomorphism is not an isomorphism (and, in fact, the algebras are not isomorphic, since the algebra of polynomials is infinite while that of polynomial functions is finite).
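The finite-field counterexample is easy to check computationally; a minimal sketch for K = GF(3):

```python
# Over K = GF(3), the nonzero polynomial p(x) = x(x-1)(x-2) = x^3 - x (mod 3)
# evaluates to 0 at every point of K, so it defines the zero *function*.

K = range(3)  # the elements 0, 1, 2 of GF(3)

def p(x):
    """Evaluate p(x) = x^3 - x modulo 3."""
    return (x**3 - x) % 3

print([p(t) for t in K])  # [0, 0, 0]
```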

If K is infinite, then choose a polynomial f such that $\hat{f} = 0$. We want to show this implies that $f = 0$. Let $n = \deg f$ and let $t_0, t_1, \dots, t_n$ be $n + 1$ distinct elements of K. Then $f(t_i) = 0$ for $i = 0, 1, \dots, n$, and by Lagrange interpolation we have $f = 0$. Hence the mapping $f \mapsto \hat{f}$ is injective. Since this mapping is clearly surjective, it is bijective and thus an algebra isomorphism of A and B.
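The interpolation step can be made concrete; a minimal sketch, where lagrange_coeffs is a hypothetical helper written for this illustration, not a library function:

```python
from fractions import Fraction

def lagrange_coeffs(points):
    """Coefficients (lowest degree first) of the unique polynomial of
    degree <= len(points) - 1 passing through the given (x_i, y_i) pairs."""
    n = len(points)
    coeffs = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        # Build the basis polynomial L_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j).
        basis, denom = [Fraction(1)], Fraction(1)
        for j, (xj, _) in enumerate(points):
            if j != i:
                shifted = [Fraction(0)] + basis      # multiply basis by x
                for m in range(len(basis)):
                    shifted[m] -= xj * basis[m]      # ... then subtract x_j * basis
                basis = shifted
                denom *= xi - xj
        for m in range(len(basis)):
            coeffs[m] += yi * basis[m] / denom
    return coeffs

# A polynomial of degree <= n vanishing at n + 1 distinct points is zero:
pts = [(t, 0) for t in range(4)]
print([int(c) for c in lagrange_coeffs(pts)])                        # [0, 0, 0, 0]
# Conversely, n + 1 values determine the polynomial: here f(x) = x^2 + 1.
print([int(c) for c in lagrange_coeffs([(0, 1), (1, 2), (2, 5)])])   # [1, 0, 1]
```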

Symmetric multilinear maps

Let k be an infinite field of characteristic zero (or at least of very large characteristic) and V a finite-dimensional vector space.

Let $S^q(V; k)$ denote the vector space of multilinear functionals $\lambda\colon \prod_1^q V \to k$ that are symmetric; i.e., $\lambda(v_1, \dots, v_q)$ is the same for all permutations of the $v_i$'s.

Any λ in $S^q(V; k)$ gives rise to a homogeneous polynomial function f of degree q: we just let $f(v) = \lambda(v, \dots, v)$. To see that f is a polynomial function, choose a basis $e_i$, $1 \le i \le n$, of V and let $t_i$ be its dual. Then

$f(v) = \lambda(v, \dots, v) = \sum_{i_1, \dots, i_q = 1}^{n} \lambda(e_{i_1}, \dots, e_{i_q})\, t_{i_1}(v) \cdots t_{i_q}(v),$

which implies f is a polynomial in the $t_i$'s.

Thus, there is a well-defined linear map

$\phi \colon S^q(V; k) \to k[V]_q, \quad \phi(\lambda)(v) = \lambda(v, \dots, v),$

where $k[V]_q$ denotes the space of homogeneous polynomial functions on V of degree q.
We show it is an isomorphism. Choosing a basis as before, any homogeneous polynomial function f of degree q can be written as

$f = \sum_{i_1, \dots, i_q = 1}^{n} a_{i_1 \cdots i_q}\, t_{i_1} \cdots t_{i_q}$

where the coefficients $a_{i_1 \cdots i_q}$ are symmetric in $i_1, \dots, i_q$. Let

$\psi(f)(v_1, \dots, v_q) = \sum_{i_1, \dots, i_q = 1}^{n} a_{i_1 \cdots i_q}\, t_{i_1}(v_1) \cdots t_{i_q}(v_q).$

Clearly, $\phi \circ \psi$ is the identity; in particular, φ is surjective. To see that φ is injective, suppose φ(λ) = 0. Consider

$\phi(\lambda)(t_1 v_1 + \cdots + t_q v_q) = \lambda(t_1 v_1 + \cdots + t_q v_q, \, \dots, \, t_1 v_1 + \cdots + t_q v_q),$

which is zero. The coefficient of $t_1 t_2 \cdots t_q$ in the above expression is $q!$ times $\lambda(v_1, \dots, v_q)$; it follows that λ = 0.
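This coefficient extraction can be checked symbolically for small q; a minimal sketch for q = 3 on V = k^2, where the tensor T and the vectors are arbitrary choices made for the illustration:

```python
import itertools
import sympy as sp

t1, t2, t3 = sp.symbols('t1 t2 t3')

# A symmetric trilinear functional on k^2, encoded by a symmetric 3-tensor:
# lambda(u, v, w) = sum_{i,j,l} T[i,j,l] * u_i * v_j * w_l.
values = {(0, 0, 0): 1, (0, 0, 1): 2, (0, 1, 1): -1, (1, 1, 1): 5}
T = {idx: values[tuple(sorted(idx))]
     for idx in itertools.product(range(2), repeat=3)}

def lam(u, v, w):
    return sum(T[i, j, l] * u[i] * v[j] * w[l]
               for i, j, l in itertools.product(range(2), repeat=3))

v1, v2, v3 = (1, 2), (3, -1), (0, 4)
w = tuple(t1*a + t2*b + t3*c for a, b, c in zip(v1, v2, v3))

# Coefficient of t1*t2*t3 in phi(lambda)(t1 v1 + t2 v2 + t3 v3):
coeff = sp.expand(lam(w, w, w)).coeff(t1, 1).coeff(t2, 1).coeff(t3, 1)
print(coeff == sp.factorial(3) * lam(v1, v2, v3))  # True: it is 3! * lambda(v1, v2, v3)
```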

Note: φ is independent of a choice of basis; so the above proof shows that ψ is also independent of a basis, a fact that is not a priori obvious.

Example: A symmetric bilinear functional gives rise to a quadratic form in a unique way, and any quadratic form arises in this way.
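For q = 2 this inverse is the classical polarization identity, which is easy to verify numerically; a minimal sketch, with the matrix A and the vectors chosen arbitrarily for the illustration:

```python
import numpy as np

# A symmetric bilinear functional lambda(u, v) = u^T A v and its
# quadratic form Q(v) = lambda(v, v); polarization recovers lambda from Q.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def lam(u, v):
    return u @ A @ v

def Q(v):
    return lam(v, v)

u = np.array([1.0, -2.0])
v = np.array([0.5, 4.0])

# lambda(u, v) = (Q(u + v) - Q(u) - Q(v)) / 2, i.e. half the coefficient
# of t1*t2 in Q(t1*u + t2*v), matching the q! = 2 factor in the proof.
print((Q(u + v) - Q(u) - Q(v)) / 2, lam(u, v))  # the two values agree
```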

Taylor series expansion

Given a smooth function, locally, one can get a partial derivative of the function from its Taylor series expansion and, conversely, one can recover the function from the series expansion. This fact continues to hold for polynomial functions on a vector space. If f is in k[V], then for x, y in V we write

$f(x + y) = \sum_{n=0}^{\infty} g_n(x, y)$

where $g_n(x, y)$ are homogeneous of degree n in y, and only finitely many of them are nonzero. We then let

$(P_y f)(x) = g_1(x, y),$

resulting in the linear endomorphism $P_y$ of k[V]. It is called the polarization operator. We then have, as promised:

Theorem. For each f in k[V] and x, y in V,

$f(x + y) = \sum_{n=0}^{\infty} \frac{1}{n!} P_y^n f(x).$

Proof: We first note that $(P_y f)(x)$ is the coefficient of t in $f(x + ty)$; in other words, since $g_0(x, y) = g_0(x, 0) = f(x)$,

$P_y f(x) = \frac{d}{dt}\Big|_{t=0} f(x + ty),$

where the right-hand side is, by definition,

$\frac{f(x + ty) - f(x)}{t}\,\Big|_{t=0}.$

The theorem follows from this. For example, for n = 2, we have:

$P_y^2 f(x) = \frac{\partial}{\partial t_1}\Big|_{t_1 = 0} \frac{\partial}{\partial t_2}\Big|_{t_2 = 0} f(x + (t_1 + t_2) y) = 2!\, g_2(x, y).$

The general case is similar.
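The theorem can be verified directly for a concrete polynomial; a minimal sketch on V = k^2, where Py is a hypothetical helper extracting the coefficient of t in f(x + ty):

```python
import sympy as sp

x1, x2, y1, y2, t = sp.symbols('x1 x2 y1 y2 t')

def Py(f):
    """Polarization operator: the coefficient of t in f(x + t*y)."""
    shifted = f.subs({x1: x1 + t*y1, x2: x2 + t*y2}, simultaneous=True)
    return sp.expand(shifted).coeff(t, 1)

f = x1**3 + 2*x1*x2   # a polynomial function on k^2

# Right-hand side sum_n (1/n!) P_y^n f(x); it terminates since f is polynomial.
rhs, term, n = sp.Integer(0), f, 0
while term != 0:
    rhs += term / sp.factorial(n)
    term, n = Py(term), n + 1

lhs = sp.expand(f.subs({x1: x1 + y1, x2: x2 + y2}, simultaneous=True))  # f(x + y)
print(sp.expand(lhs - rhs) == 0)  # True
```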

Operator product algebra

When the polynomials are valued not in a field k but in some algebra, one may define additional structure. Thus, for example, one may consider the ring of functions over GL(n,m), instead of for k = GL(1,m). In this case, one may impose an additional axiom.

The operator product algebra is an associative algebra of the form

$A^i(x) B^j(y) = \sum_k f^{ij}_k(x, y, z)\, C^k(z).$

The structure constants $f^{ij}_k(x, y, z)$ are required to be single-valued functions, rather than sections of some vector bundle. The fields (or operators) $A^i(x)$ are required to span the ring of functions. In practical calculations, it is usually required that the sums be analytic within some radius of convergence; typically, with a radius of convergence of $|x - y|$. Thus, the ring of functions can be taken to be the ring of polynomial functions.

The above can be considered to be an additional requirement imposed on the ring; it is sometimes called the bootstrap. In physics, a special case of the operator product algebra is known as the operator product expansion.
