Polarization of an algebraic form

In mathematics, in particular in algebra, polarization is a technique for expressing a homogeneous polynomial in a simpler fashion by adjoining more variables. Specifically, given a homogeneous polynomial, polarization produces a multilinear form from which the original polynomial can be recovered by evaluating along a certain diagonal.

Although the technique is deceptively simple, it has applications in many areas of abstract mathematics: in particular to algebraic geometry, invariant theory, and representation theory. Polarization and related techniques form the foundations for Weyl's invariant theory.

The technique

The fundamental ideas are as follows. Let $f(x_1, \dots, x_n)$ be a polynomial in $n$ variables $x = (x_1, \dots, x_n)$. Suppose that $f$ is homogeneous of degree $d$, which means that

$$f(t x) = t^d f(x) \quad \text{for all scalars } t.$$

Let $u^{(1)}, u^{(2)}, \dots, u^{(d)}$ be a collection of indeterminates with $u^{(i)} = (u^{(i)}_1, u^{(i)}_2, \dots, u^{(i)}_n)$, so that there are $dn$ variables altogether. The polar form of $f$ is a polynomial

$$F(u^{(1)}, u^{(2)}, \dots, u^{(d)})$$

which is linear separately in each $u^{(i)}$ (that is, $F$ is multilinear), symmetric in the $u^{(i)}$, and such that

$$F(x, x, \dots, x) = f(x).$$

The polar form of $f$ is given by the following construction:

$$F(u^{(1)}, \dots, u^{(d)}) = \frac{1}{d!} \frac{\partial}{\partial \lambda_1} \cdots \frac{\partial}{\partial \lambda_d} f(\lambda_1 u^{(1)} + \cdots + \lambda_d u^{(d)}) \,\Big|_{\lambda = 0}.$$

In other words, $F$ is a constant multiple ($1/d!$) of the coefficient of $\lambda_1 \lambda_2 \cdots \lambda_d$ in the expansion of $f(\lambda_1 u^{(1)} + \cdots + \lambda_d u^{(d)})$.
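This coefficient extraction can be carried out symbolically. The sketch below assumes sympy is available; the helper name `polar_form` is illustrative, not a standard API.

```python
# Sketch of the polarization construction via coefficient extraction,
# assuming sympy; polar_form is an illustrative helper name.
import sympy as sp

def polar_form(f, xs, d):
    """Polarize a homogeneous polynomial f of degree d in the variables xs.

    Returns (F, us), where us[i] holds the fresh indeterminates u^(i) and
    F is 1/d! times the coefficient of lam_1*...*lam_d in
    f(lam_1 u^(1) + ... + lam_d u^(d)).
    """
    n = len(xs)
    lams = sp.symbols(f"lam1:{d + 1}")
    us = [sp.symbols(f"u{i}_1:{n + 1}") for i in range(1, d + 1)]
    # Substitute x_j -> lam_1 u^(1)_j + ... + lam_d u^(d)_j and expand.
    g = sp.expand(f.subs(
        {xs[j]: sum(lams[i] * us[i][j] for i in range(d)) for j in range(n)},
        simultaneous=True))
    for lam in lams:  # pick out the lam_1*...*lam_d term
        g = g.coeff(lam, 1)
    return sp.expand(g / sp.factorial(d)), us
```

For instance, `polar_form(x**2, (x, y), 2)` returns the bilinear form `u1_1*u2_1` as its first component, and substituting $u^{(i)} = x$ back into the result recovers the original polynomial.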

Examples

A quadratic example. Suppose that $x = (x, y)$ and $f(x)$ is the quadratic form

$$f(x, y) = x^2 + 3xy + 2y^2.$$

Then the polarization of $f$ is a function in $u^{(1)} = (x^{(1)}, y^{(1)})$ and $u^{(2)} = (x^{(2)}, y^{(2)})$ given by

$$F(u^{(1)}, u^{(2)}) = x^{(1)} x^{(2)} + \tfrac{3}{2}\, x^{(1)} y^{(2)} + \tfrac{3}{2}\, x^{(2)} y^{(1)} + 2\, y^{(1)} y^{(2)}.$$

More generally, if $f$ is any quadratic form, then the polarization of $f$ agrees with the conclusion of the polarization identity:

$$F(u, v) = \tfrac{1}{2} \bigl( f(u + v) - f(u) - f(v) \bigr).$$
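Taking the quadratic form $f(x, y) = x^2 + 3xy + 2y^2$ as a sample (any quadratic form would do), the polarization identity can be checked symbolically; the sketch below assumes sympy.

```python
# Checking the polarization identity F(u, v) = (f(u+v) - f(u) - f(v))/2
# on a sample quadratic form (sympy assumed available).
import sympy as sp

u1, u2, v1, v2 = sp.symbols("u1 u2 v1 v2")

def f(a, b):
    # sample quadratic form
    return a**2 + 3*a*b + 2*b**2

F = sp.expand((f(u1 + v1, u2 + v2) - f(u1, u2) - f(v1, v2)) / 2)
# F is symmetric and bilinear, and restricting to u = v recovers f.
```

Here `F` comes out as `u1*v1 + 3*u1*v2/2 + 3*u2*v1/2 + 2*u2*v2`, matching the multilinear polar form of $f$.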

A cubic example. Let $f(x, y) = x^3 + 2xy^2$. Then the polarization of $f$ is given by

$$F(u^{(1)}, u^{(2)}, u^{(3)}) = x^{(1)} x^{(2)} x^{(3)} + \tfrac{2}{3} \bigl( x^{(1)} y^{(2)} y^{(3)} + x^{(2)} y^{(1)} y^{(3)} + x^{(3)} y^{(1)} y^{(2)} \bigr),$$

where $u^{(i)} = (x^{(i)}, y^{(i)})$.
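As a sanity check, for the cubic $f(x, y) = x^3 + 2xy^2$ with trilinear polar form $F = x^{(1)} x^{(2)} x^{(3)} + \tfrac{2}{3}\bigl(x^{(1)} y^{(2)} y^{(3)} + x^{(2)} y^{(1)} y^{(3)} + x^{(3)} y^{(1)} y^{(2)}\bigr)$, restricting to the diagonal $u^{(1)} = u^{(2)} = u^{(3)} = (x, y)$ recovers $f$:

```latex
F\bigl((x,y),(x,y),(x,y)\bigr)
  = x \cdot x \cdot x + \tfrac{2}{3}\,(x y y + x y y + x y y)
  = x^3 + 2 x y^2
  = f(x, y).
```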

Mathematical details and consequences

The polarization of a homogeneous polynomial of degree $d$ is valid over any commutative ring in which $d!$ is a unit. In particular, it holds over any field of characteristic zero, or whose characteristic is strictly greater than $d$.
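The invertibility hypothesis is genuinely needed. As a standard illustration (not from the text above): over a field of characteristic 2, the quadratic form $f(x, y) = xy$ admits no symmetric bilinear polar form, since for any symmetric bilinear candidate

```latex
F(u, v) = a\,u_1 v_1 + b\,(u_1 v_2 + u_2 v_1) + c\,u_2 v_2 ,
\qquad
F\bigl((x,y),(x,y)\bigr) = a x^2 + 2b\,xy + c y^2 = a x^2 + c y^2 \neq xy
\quad (\text{because } 2 = 0).
```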

The polarization isomorphism (by degree)

For simplicity, let $k$ be a field of characteristic zero and let $A = k[x_1, \dots, x_n]$ be the polynomial ring in $n$ variables over $k$. Then $A$ is graded by degree, so that

$$A = \bigoplus_{d} A_d.$$

The polarization of algebraic forms then induces an isomorphism of vector spaces in each degree:

$$A_d \cong \operatorname{Sym}^d k^n,$$

where $\operatorname{Sym}^d k^n$ is the $d$-th symmetric power of the $n$-dimensional space $k^n$.
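A quick consistency check for this isomorphism is a dimension count: both sides have dimension $\binom{n+d-1}{d}$. A minimal sketch using only the standard library (the helper name `dim_degree_d` is illustrative):

```python
# Dimension check: dim A_d = number of degree-d monomials in n variables,
# which equals dim Sym^d(k^n) = C(n + d - 1, d).
from math import comb
from itertools import combinations_with_replacement

def dim_degree_d(n, d):
    # degree-d monomials correspond to size-d multisets of the n variables
    return sum(1 for _ in combinations_with_replacement(range(n), d))

# spot-check the count against the binomial coefficient
for n in range(1, 5):
    for d in range(5):
        assert dim_degree_d(n, d) == comb(n + d - 1, d)
```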

These isomorphisms can be expressed independently of a basis as follows. If $V$ is a finite-dimensional vector space over $k$ and $k[V]$ is the ring of $k$-valued polynomial functions on $V$, graded by homogeneous degree, then polarization yields an isomorphism

$$k[V]_d \cong \operatorname{Sym}^d V^*.$$

The algebraic isomorphism

Furthermore, the polarization is compatible with the algebraic structure on $k[V]$, so that

$$k[V] \cong \operatorname{Sym} V^*,$$

where $\operatorname{Sym} V^*$ is the full symmetric algebra over $V^*$.

Remarks

For fields of positive characteristic $p$, the foregoing isomorphisms apply if the graded algebras are truncated at degree $p - 1$, since $d!$ is a unit exactly when $d < p$.

