Symmetric tensor

In mathematics, a symmetric tensor is a tensor that is invariant under a permutation of its vector arguments:

$$T(v_1, v_2, \ldots, v_r) = T(v_{\sigma(1)}, v_{\sigma(2)}, \ldots, v_{\sigma(r)})$$

for every permutation σ of the symbols {1, 2, ..., r}. Alternatively, a symmetric tensor of order r represented in coordinates as a quantity with r indices satisfies

$$T_{i_1 i_2 \cdots i_r} = T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(r)}}.$$

The space of symmetric tensors of order r on a finite-dimensional vector space V is naturally isomorphic to the dual of the space of homogeneous polynomials of degree r on V. Over fields of characteristic zero, the graded vector space of all symmetric tensors can be naturally identified with the symmetric algebra on V. A related concept is that of the antisymmetric tensor or alternating form. Symmetric tensors occur widely in engineering, physics and mathematics.
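
As a sketch of the order-2 case of this correspondence (with coordinates x, y relative to a chosen basis of a 2-dimensional space), a symmetric tensor with components $T_{11} = a$, $T_{12} = T_{21} = b$, $T_{22} = c$ corresponds to the homogeneous quadratic polynomial

$$p(x, y) = a x^2 + 2 b x y + c y^2.$$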

Definition

Let V be a vector space and

$$T \in V \otimes V \otimes \cdots \otimes V \quad (k \text{ factors})$$

a tensor of order k. Then T is a symmetric tensor if

$$\tau_\sigma T = T$$

for the braiding maps $\tau_\sigma$ associated to every permutation σ on the symbols {1, 2, ..., k} (or equivalently for every transposition on these symbols).

Given a basis $\{e_i\}$ of V, any symmetric tensor T of order k can be written as

$$T = \sum_{i_1, \ldots, i_k} T_{i_1 i_2 \cdots i_k} \, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_k}$$

for some unique list of coefficients $T_{i_1 i_2 \cdots i_k}$ (the components of the tensor in the basis) that are symmetric on the indices. That is to say

$$T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}} = T_{i_1 i_2 \cdots i_k}$$

for every permutation σ.

The space of all symmetric tensors of order k defined on V is often denoted by $S^k(V)$ or $\operatorname{Sym}^k(V)$. It is itself a vector space, and if V has dimension N then the dimension of $\operatorname{Sym}^k(V)$ is the binomial coefficient

$$\dim \operatorname{Sym}^k(V) = \binom{N + k - 1}{k}.$$

We then construct Sym(V) as the direct sum of $\operatorname{Sym}^k(V)$ for k = 0, 1, 2, ...:

$$\operatorname{Sym}(V) = \bigoplus_{k=0}^{\infty} \operatorname{Sym}^k(V).$$
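
As a quick sanity check on this dimension count, here is a minimal Python sketch (the helper name `sym_dim` is ours, not part of any standard library):

```python
from math import comb

def sym_dim(N: int, k: int) -> int:
    """Dimension of Sym^k(V) when dim V = N: the number of
    multisets of size k drawn from N basis vectors."""
    return comb(N + k - 1, k)

# Symmetric order-2 tensors on a 3-dimensional space are just
# symmetric 3x3 matrices, a 6-dimensional space:
print(sym_dim(3, 2))  # 6
print(sym_dim(3, 4))  # 15
```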

Examples

There are many examples of symmetric tensors. Some include the metric tensor $g_{\mu\nu}$, the Einstein tensor $G_{\mu\nu}$, and the Ricci tensor $R_{\mu\nu}$.

Many material properties and fields used in physics and engineering can be represented as symmetric tensor fields; for example: stress, strain, and anisotropic conductivity. Also, in diffusion MRI one often uses symmetric tensors to describe diffusion in the brain or other parts of the body.

Ellipsoids are examples of algebraic varieties; and so, for general rank, symmetric tensors, in the guise of homogeneous polynomials, are used to define projective varieties, and are often studied as such.

Given a Riemannian manifold $(M, g)$ equipped with its Levi-Civita connection $\nabla$, the covariant curvature tensor is a symmetric order 2 tensor over the vector space of differential 2-forms. This corresponds to the fact that, viewing $R_{ijkl} \in \operatorname{Sym}^2(\Lambda^2 T^*M)$, we have the symmetry between the first and second pairs of arguments in addition to antisymmetry within each pair: $R_{ijkl} = R_{klij}$ and $R_{ijkl} = -R_{jikl} = -R_{ijlk}$. [1]

Symmetric part of a tensor

Suppose V is a vector space over a field of characteristic 0. If $T \in V^{\otimes k}$ is a tensor of order k, then the symmetric part of T is the symmetric tensor defined by

$$\operatorname{Sym} T = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} \tau_\sigma T,$$

the summation extending over the symmetric group on k symbols. In terms of a basis, and employing the Einstein summation convention, if

$$T = T_{i_1 i_2 \cdots i_k} \, e^{i_1} \otimes e^{i_2} \otimes \cdots \otimes e^{i_k},$$

then

$$\operatorname{Sym} T = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}} \, e^{i_1} \otimes e^{i_2} \otimes \cdots \otimes e^{i_k}.$$

The components of the tensor appearing on the right are often denoted by

$$T_{(i_1 i_2 \cdots i_k)} = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}}$$

with parentheses () around the indices being symmetrized. Square brackets [] are used to indicate anti-symmetrization.
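
In coordinates, the symmetrization above is simply an average of the component array over all permutations of its index positions. A minimal NumPy sketch, where `symmetric_part` is an illustrative helper rather than a library routine:

```python
from itertools import permutations
from math import factorial

import numpy as np

def symmetric_part(T: np.ndarray) -> np.ndarray:
    """Average an order-k tensor (a k-dimensional array) over all
    permutations of its axes, i.e. compute Sym T componentwise."""
    k = T.ndim
    return sum(np.transpose(T, p) for p in permutations(range(k))) / factorial(k)

# An arbitrary order-3 tensor on a 2-dimensional space:
T = np.arange(8, dtype=float).reshape(2, 2, 2)
S = symmetric_part(T)

# The result is invariant under any permutation of its indices:
assert np.allclose(S, np.transpose(S, (1, 0, 2)))
assert np.allclose(S, np.transpose(S, (2, 1, 0)))
```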

Symmetric product

If T is a simple tensor, given as a pure tensor product

$$T = v_1 \otimes v_2 \otimes \cdots \otimes v_k,$$

then the symmetric part of T is the symmetric product of the factors:

$$v_1 \odot v_2 \odot \cdots \odot v_k = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(k)}.$$

In general we can turn Sym(V) into an algebra by defining the commutative and associative product ⊙. [2] Given two tensors $T_1 \in \operatorname{Sym}^{k_1}(V)$ and $T_2 \in \operatorname{Sym}^{k_2}(V)$, we use the symmetrization operator to define:

$$T_1 \odot T_2 = \operatorname{Sym}(T_1 \otimes T_2) \in \operatorname{Sym}^{k_1 + k_2}(V).$$

It can be verified (as is done by Kostrikin and Manin [2]) that the resulting product is in fact commutative and associative. In some cases the ⊙ operator is omitted: $T_1 T_2 = T_1 \odot T_2$.

In some cases an exponential notation is used:

$$v^{\odot k} = \underbrace{v \odot v \odot \cdots \odot v}_{k \text{ times}} = \underbrace{v \otimes v \otimes \cdots \otimes v}_{k \text{ times}} = v^{\otimes k},$$

where v is a vector. Again, in some cases the ⊙ is left out:

$$v^k = \underbrace{v \odot v \odot \cdots \odot v}_{k \text{ times}} = v^{\odot k}.$$
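
With the same normalization (the 1/k! in the symmetrization operator), the symmetric product of two tensors given as component arrays can be sketched in NumPy as follows; `sym_product` and `symmetric_part` are illustrative names, not library functions:

```python
from itertools import permutations
from math import factorial

import numpy as np

def symmetric_part(T: np.ndarray) -> np.ndarray:
    """Average an order-k tensor over all permutations of its axes."""
    k = T.ndim
    return sum(np.transpose(T, p) for p in permutations(range(k))) / factorial(k)

def sym_product(T1: np.ndarray, T2: np.ndarray) -> np.ndarray:
    """T1 ⊙ T2 = Sym(T1 ⊗ T2): symmetrize the outer (tensor) product."""
    return symmetric_part(np.multiply.outer(T1, T2))

u = np.array([1.0, 2.0])
v = np.array([3.0, 5.0])

# For two vectors, u ⊙ v is the symmetrized outer product ...
assert np.allclose(sym_product(u, v), (np.outer(u, v) + np.outer(v, u)) / 2)
# ... and the product is commutative:
assert np.allclose(sym_product(u, v), sym_product(v, u))
```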

Decomposition

In analogy with the theory of symmetric matrices, a (real) symmetric tensor of order 2 can be "diagonalized". More precisely, for any tensor $T \in \operatorname{Sym}^2(V)$, there is an integer r, non-zero unit vectors $v_1, \ldots, v_r \in V$ and weights $\lambda_1, \ldots, \lambda_r$ such that

$$T = \sum_{i=1}^{r} \lambda_i \, v_i \otimes v_i.$$

The minimum number r for which such a decomposition is possible is the (symmetric) rank of T. The vectors appearing in this minimal expression are the principal axes of the tensor, and generally have an important physical meaning. For example, the principal axes of the inertia tensor define Poinsot's ellipsoid representing the moment of inertia. Also see Sylvester's law of inertia.
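
For order 2 this decomposition is just the spectral theorem for real symmetric matrices, so it can be computed with a standard eigendecomposition. A short NumPy illustration, assuming the tensor is stored as a symmetric matrix:

```python
import numpy as np

# A real symmetric order-2 tensor on R^3, stored as a symmetric matrix:
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns the weights (eigenvalues) and orthonormal unit vectors
# (eigenvectors, as columns) of a symmetric matrix:
weights, axes = np.linalg.eigh(T)

# Reassemble T = sum_i lambda_i * v_i ⊗ v_i:
reconstructed = sum(lam * np.outer(v, v) for lam, v in zip(weights, axes.T))
assert np.allclose(T, reconstructed)
```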

For symmetric tensors of arbitrary order k, decompositions

$$T = \sum_{i=1}^{r} \lambda_i \, v_i^{\otimes k}$$

are also possible. The minimum number r for which such a decomposition is possible is the symmetric rank of T. [3] This minimal decomposition is called a Waring decomposition; it is a symmetric form of the tensor rank decomposition. For second-order tensors this corresponds to the rank of the matrix representing the tensor in any basis, and it is well known that the maximum rank is equal to the dimension of the underlying vector space. However, for higher orders this need not hold: the rank can be higher than the number of dimensions in the underlying vector space. Moreover, the rank and symmetric rank of a symmetric tensor may differ. [4]
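
As a concrete illustration (identifying symmetric tensors with homogeneous polynomials as above, and stating the minimality without proof), the binary cubic $x^2 y$ admits the Waring decomposition

$$x^2 y = \tfrac{1}{6}\left[(x + y)^3 - (x - y)^3 - 2y^3\right],$$

a sum of three cubes of linear forms; its symmetric rank is 3, which exceeds the dimension 2 of the underlying space.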

Notes

  1. Carmo, Manfredo Perdigão do (1992). Riemannian geometry. Francis J. Flaherty. Boston: Birkhäuser. ISBN 0-8176-3490-8. OCLC 24667701.
  2. Kostrikin, Alexei I.; Manin, Iurii Ivanovich (1997). Linear algebra and geometry. Algebra, Logic and Applications. Vol. 1. Gordon and Breach. pp. 276–279. ISBN 9056990497.
  3. Comon, P.; Golub, G.; Lim, L. H.; Mourrain, B. (2008). "Symmetric Tensors and Symmetric Tensor Rank". SIAM Journal on Matrix Analysis and Applications. 30 (3): 1254. arXiv:0802.1681. doi:10.1137/060661569. S2CID 5676548.
  4. Shitov, Yaroslav (2018). "A Counterexample to Comon's Conjecture". SIAM Journal on Applied Algebra and Geometry. 2 (3): 428–443. arXiv:1705.08740. doi:10.1137/17m1131970. ISSN 2470-6566. S2CID 119717133.
