Power sum symmetric polynomial

In mathematics, specifically in commutative algebra, the power sum symmetric polynomials are a type of basic building block for symmetric polynomials, in the sense that every symmetric polynomial with rational coefficients can be expressed as a sum and difference of products of power sum symmetric polynomials with rational coefficients. However, not every symmetric polynomial with integral coefficients is generated by integral combinations of products of power-sum polynomials: they are a generating set over the rationals, but not over the integers.

Definition

The power sum symmetric polynomial of degree k in variables x1, ..., xn, written pk for k = 0, 1, 2, ..., is the sum of all kth powers of the variables. Formally,

p_k(x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} x_i^k = x_1^k + x_2^k + \cdots + x_n^k.

The first few of these polynomials are

p_0(x_1, x_2, \ldots, x_n) = 1 + 1 + \cdots + 1 = n,
p_1(x_1, x_2, \ldots, x_n) = x_1 + x_2 + \cdots + x_n,
p_2(x_1, x_2, \ldots, x_n) = x_1^2 + x_2^2 + \cdots + x_n^2,
p_3(x_1, x_2, \ldots, x_n) = x_1^3 + x_2^3 + \cdots + x_n^3.

Thus, for each nonnegative integer k, there exists exactly one power sum symmetric polynomial of degree k in n variables.
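
As a concrete illustration (added here, not part of the original article), the definition can be checked with a short SymPy sketch; the helper name power_sum is hypothetical and introduced only for this example.

```python
# Minimal sketch of the definition: p_k is the sum of the k-th powers
# of the variables. Requires SymPy.
from sympy import symbols, expand

def power_sum(k, variables):
    """Return the power sum symmetric polynomial p_k in the given variables."""
    return expand(sum(v**k for v in variables))

x1, x2, x3 = symbols('x1 x2 x3')
print(power_sum(2, [x1, x2, x3]))  # x1**2 + x2**2 + x3**2
print(power_sum(0, [x1, x2, x3]))  # 3, i.e. p_0 = n
```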

The set formed by taking all integral linear combinations of products of the power sum symmetric polynomials is a commutative ring.

Examples

The following lists the power sum symmetric polynomials of positive degrees up to n for the first three positive values of n. In every case, p_n is one of the polynomials. The list goes up to degree n because the power sum symmetric polynomials of degrees 1 to n are basic in the sense of the theorem stated below.

For n = 1:

p_1 = x_1.

For n = 2:

p_1 = x_1 + x_2,
p_2 = x_1^2 + x_2^2.

For n = 3:

p_1 = x_1 + x_2 + x_3,
p_2 = x_1^2 + x_2^2 + x_3^2,
p_3 = x_1^3 + x_2^3 + x_3^3.
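
These lists can also be generated programmatically; the following SymPy sketch (an illustration added here, not part of the article) prints p_1 through p_n for n = 1, 2, 3.

```python
from sympy import symbols

# Print the power sum symmetric polynomials p_1, ..., p_n for n = 1, 2, 3.
for n in (1, 2, 3):
    xs = symbols(f'x1:{n + 1}')          # the tuple (x1, ..., xn)
    print(f"n = {n}:")
    for k in range(1, n + 1):
        print(f"  p_{k} =", sum(v**k for v in xs))
```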

Properties

The set of power sum symmetric polynomials of degrees 1, 2, ..., n in n variables generates the ring of symmetric polynomials in n variables. More specifically:

Theorem. The ring of symmetric polynomials with rational coefficients equals the rational polynomial ring Q[p_1, \ldots, p_n]. The same is true if the coefficients are taken in any field of characteristic 0.

However, this is not true if the coefficients must be integers. For example, for n = 2, the symmetric polynomial

P(x_1, x_2) = x_1^2 x_2 + x_1 x_2^2 + x_1 x_2

has the expression

P(x_1, x_2) = \frac{p_1^3 - p_1 p_2}{2} + \frac{p_1^2 - p_2}{2},

which involves fractions. According to the theorem, this is the only way to represent P in terms of p1 and p2; therefore, P does not belong to the integral polynomial ring Z[p_1, p_2]. For another example, the elementary symmetric polynomials e_k, expressed as polynomials in the power sum polynomials, do not all have integral coefficients. For instance,

e_2 := \sum_{1 \le i < j \le n} x_i x_j = \frac{p_1^2 - p_2}{2}.
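
Both identities above are easy to verify by machine; here is a brief SymPy check (illustrative code added here, not part of the article):

```python
from sympy import symbols, expand

x1, x2 = symbols('x1 x2')
p1 = x1 + x2
p2 = x1**2 + x2**2

P = x1**2*x2 + x1*x2**2 + x1*x2
rhs = (p1**3 - p1*p2)/2 + (p1**2 - p2)/2

print(expand(rhs - P))            # 0, so the expression for P holds
print(expand((p1**2 - p2)/2))     # x1*x2, i.e. e_2 = (p1**2 - p2)/2
```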

The theorem is also untrue if the field has characteristic different from 0. For example, if the field F has characteristic 2, then p_1^2 = (x_1 + x_2)^2 = x_1^2 + x_2^2 = p_2, so p1 and p2 cannot generate e2 = x1x2.
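
The characteristic-2 collapse can likewise be checked symbolically; the sketch below (an added illustration, not from the article) compares the two polynomials over GF(2) using SymPy's modulus option.

```python
from sympy import symbols, Poly

x1, x2 = symbols('x1 x2')

# Over a field of characteristic 2, p_1 squared already equals p_2.
p1_sq = Poly((x1 + x2)**2, x1, x2, modulus=2)
p2 = Poly(x1**2 + x2**2, x1, x2, modulus=2)
print(p1_sq == p2)   # True
```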

Sketch of a partial proof of the theorem: By Newton's identities the power sums are functions of the elementary symmetric polynomials; this is implied by the following recurrence relation, though the explicit function that gives the power sums in terms of the e_j is complicated:

p_k = \sum_{i=1}^{k-1} (-1)^{i-1} e_i\, p_{k-i} + (-1)^{k-1} k\, e_k \qquad (1 \le k \le n).

Rewriting the same recurrence, one has the elementary symmetric polynomials in terms of the power sums (also implicitly, the explicit formula being complicated):

e_k = \frac{1}{k} \sum_{i=1}^{k} (-1)^{i-1} e_{k-i}\, p_i \qquad (1 \le k \le n),

where e_0 = 1.
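
To make this recurrence concrete, the following sketch (illustrative code added here, not from the article) computes e_1, ..., e_n as polynomials in symbolic power sums p_1, ..., p_n; the rational coefficients mentioned above appear immediately.

```python
from sympy import symbols, Rational, expand

def elementary_in_power_sums(n):
    """Express e_1, ..., e_n in symbolic p_1, ..., p_n via
    k*e_k = sum_{i=1}^{k} (-1)**(i-1) * e_{k-i} * p_i."""
    p = symbols(f'p1:{n + 1}')           # p[0] is p_1, ..., p[n-1] is p_n
    e = [1]                              # e_0 = 1
    for k in range(1, n + 1):
        ek = Rational(1, k) * sum((-1)**(i - 1) * e[k - i] * p[i - 1]
                                  for i in range(1, k + 1))
        e.append(expand(ek))
    return e[1:]

for k, ek in enumerate(elementary_in_power_sums(3), start=1):
    print(f"e_{k} =", ek)
# e_1 = p1
# e_2 = p1**2/2 - p2/2
# e_3 = p1**3/6 - p1*p2/2 + p3/3
```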

This implies that the elementary symmetric polynomials are polynomial expressions, with rational though not in general integral coefficients, in the power sum polynomials of degrees 1, ..., n. Since the elementary symmetric polynomials are an algebraic basis for all symmetric polynomials with coefficients in a field, it follows that every symmetric polynomial in n variables is a polynomial function f(p_1, ..., p_n) of the power sum symmetric polynomials p_1, ..., p_n. That is, the ring of symmetric polynomials is contained in the ring generated by the power sums, Q[p_1, ..., p_n]. Because every power sum polynomial is symmetric, the two rings are equal.

(This does not show how to prove the polynomial f is unique.)

For another system of symmetric polynomials with similar properties see complete homogeneous symmetric polynomials.

