Symmetric tensor

In mathematics, a symmetric tensor is a tensor that is invariant under a permutation of its vector arguments:

${\displaystyle T(v_{1},v_{2},\ldots ,v_{r})=T(v_{\sigma 1},v_{\sigma 2},\ldots ,v_{\sigma r})}$

for every permutation σ of the symbols {1, 2, ..., r}. Alternatively, a symmetric tensor of order r represented in coordinates as a quantity with r indices satisfies

${\displaystyle T_{i_{1}i_{2}\cdots i_{r}}=T_{i_{\sigma 1}i_{\sigma 2}\cdots i_{\sigma r}}.}$

The space of symmetric tensors of order r on a finite-dimensional vector space V is naturally isomorphic to the dual of the space of homogeneous polynomials of degree r on V. Over fields of characteristic zero, the graded vector space of all symmetric tensors can be naturally identified with the symmetric algebra on V. A related concept is that of the antisymmetric tensor or alternating form. Symmetric tensors occur widely in engineering, physics and mathematics.

Definition

Let V be a vector space and

${\displaystyle T\in V^{\otimes k}}$

a tensor of order k. Then T is a symmetric tensor if

${\displaystyle \tau _{\sigma }T=T\,}$

for the braiding maps associated to every permutation σ on the symbols {1,2,...,k} (or equivalently for every transposition on these symbols).

Given a basis {ei} of V, any symmetric tensor T of order k can be written as

${\displaystyle T=\sum _{i_{1},\ldots ,i_{k}=1}^{N}T_{i_{1}i_{2}\cdots i_{k}}e^{i_{1}}\otimes e^{i_{2}}\otimes \cdots \otimes e^{i_{k}}}$

for some unique list of coefficients ${\displaystyle T_{i_{1}i_{2}\cdots i_{k}}}$ (the components of the tensor in the basis) that are symmetric on the indices. That is to say

${\displaystyle T_{i_{\sigma 1}i_{\sigma 2}\cdots i_{\sigma k}}=T_{i_{1}i_{2}\cdots i_{k}}}$

for every permutation σ.
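This component condition can be checked numerically: the following sketch (a hypothetical helper, not part of the article) tests whether a coefficient array is invariant under every permutation of its indices.

```python
import itertools
import numpy as np

def is_symmetric(T):
    """Return True if the component array T is unchanged by every
    permutation of its k indices."""
    k = T.ndim
    return all(np.allclose(T, np.transpose(T, perm))
               for perm in itertools.permutations(range(k)))

# Order-2 example: A + A.T is symmetric, while a generic A is not.
A = np.arange(9.0).reshape(3, 3)
assert is_symmetric(A + A.T)
assert not is_symmetric(A)
```

For order 2 this reduces to the familiar matrix condition T = Tᵀ; for higher orders all k! transposes are compared.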

The space of all symmetric tensors of order k defined on V is often denoted by Sk(V) or Symk(V). It is itself a vector space, and if V has dimension N then the dimension of Symk(V) is the binomial coefficient

${\displaystyle \dim \operatorname {Sym} ^{k}(V)={N+k-1 \choose k}.}$

We then construct Sym(V) as the direct sum of Symk(V) for k = 0,1,2,...

${\displaystyle \operatorname {Sym} (V)=\bigoplus _{k=0}^{\infty }\operatorname {Sym} ^{k}(V).}$
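The dimension formula above is easy to evaluate directly; the short sketch below (helper name is illustrative) computes it with a binomial coefficient.

```python
from math import comb

def sym_dim(N, k):
    # dim Sym^k(V) = C(N + k - 1, k), where N = dim V
    return comb(N + k - 1, k)

# Symmetric 3x3 matrices (N = 3, k = 2) form a 6-dimensional space.
assert sym_dim(3, 2) == 6
# Symmetric order-3 tensors on a 3-dimensional space: 10 dimensions.
assert sym_dim(3, 3) == 10
```

This counts the multisets of size k drawn from N basis indices, i.e. the independent components of a symmetric order-k tensor.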

Examples

There are many examples of symmetric tensors, including the metric tensor ${\displaystyle g_{\mu \nu }}$, the Einstein tensor ${\displaystyle G_{\mu \nu }}$, and the Ricci tensor ${\displaystyle R_{\mu \nu }}$.

Many material properties and fields used in physics and engineering can be represented as symmetric tensor fields; for example: stress, strain, and anisotropic conductivity. Also, in diffusion MRI one often uses symmetric tensors to describe diffusion in the brain or other parts of the body.

Ellipsoids are examples of algebraic varieties; and so, for general rank, symmetric tensors, in the guise of homogeneous polynomials, are used to define projective varieties, and are often studied as such.

Symmetric part of a tensor

Suppose ${\displaystyle V}$ is a vector space over a field of characteristic 0. If ${\displaystyle T\in V^{\otimes k}}$ is a tensor of order ${\displaystyle k}$, then the symmetric part of ${\displaystyle T}$ is the symmetric tensor defined by

${\displaystyle \operatorname {Sym} \,T={\frac {1}{k!}}\sum _{\sigma \in {\mathfrak {S}}_{k}}\tau _{\sigma }T,}$

the summation extending over the symmetric group on k symbols. In terms of a basis, and employing the Einstein summation convention, if

${\displaystyle T=T_{i_{1}i_{2}\cdots i_{k}}e^{i_{1}}\otimes e^{i_{2}}\otimes \cdots \otimes e^{i_{k}},}$

then

${\displaystyle \operatorname {Sym} \,T={\frac {1}{k!}}\sum _{\sigma \in {\mathfrak {S}}_{k}}T_{i_{\sigma 1}i_{\sigma 2}\cdots i_{\sigma k}}e^{i_{1}}\otimes e^{i_{2}}\otimes \cdots \otimes e^{i_{k}}.}$

The components of the tensor appearing on the right are often denoted by

${\displaystyle T_{(i_{1}i_{2}\cdots i_{k})}={\frac {1}{k!}}\sum _{\sigma \in {\mathfrak {S}}_{k}}T_{i_{\sigma 1}i_{\sigma 2}\cdots i_{\sigma k}}}$

with parentheses () around the indices being symmetrized. Square brackets [] are used to indicate anti-symmetrization.
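The symmetrization operator above can be sketched in code as an average of a component array over all index permutations (a minimal illustration, not from the article):

```python
import itertools
import numpy as np

def sym(T):
    """Symmetric part of T: the average of T over all permutations
    of its indices, i.e. Sym T = (1/k!) * sum over sigma of tau_sigma T."""
    perms = list(itertools.permutations(range(T.ndim)))
    return sum(np.transpose(T, p) for p in perms) / len(perms)

# For order 2 this is the familiar (T + T^t)/2.
A = np.random.default_rng(0).random((3, 3))
assert np.allclose(sym(A), (A + A.T) / 2)
```

Note that Sym is a projection: applying it twice gives the same result as applying it once.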

Symmetric product

If T is a simple tensor, given as a pure tensor product

${\displaystyle T=v_{1}\otimes v_{2}\otimes \cdots \otimes v_{r}}$

then the symmetric part of T is the symmetric product of the factors:

${\displaystyle v_{1}\odot v_{2}\odot \cdots \odot v_{r}:={\frac {1}{r!}}\sum _{\sigma \in {\mathfrak {S}}_{r}}v_{\sigma 1}\otimes v_{\sigma 2}\otimes \cdots \otimes v_{\sigma r}.}$

In general we can turn Sym(V) into an algebra by defining the commutative and associative product ⊙. [1] Given two tensors T1 ∈ Symk1(V) and T2 ∈ Symk2(V), we use the symmetrization operator to define:

${\displaystyle T_{1}\odot T_{2}=\operatorname {Sym} (T_{1}\otimes T_{2})\quad \left(\in \operatorname {Sym} ^{k_{1}+k_{2}}(V)\right).}$

It can be verified (as is done by Kostrikin and Manin [1] ) that the resulting product is in fact commutative and associative. In some cases the operator symbol is omitted: T1 ⊙ T2 = T1T2.
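The definition T1 ⊙ T2 = Sym(T1 ⊗ T2) translates directly into code; the sketch below (function names are illustrative) builds the tensor product with an outer product and then symmetrizes.

```python
import itertools
import numpy as np

def sym(T):
    # Average of T over all permutations of its indices.
    perms = list(itertools.permutations(range(T.ndim)))
    return sum(np.transpose(T, p) for p in perms) / len(perms)

def sym_product(T1, T2):
    # T1 (.) T2 = Sym(T1 (x) T2); tensordot with axes=0 is the outer product.
    return sym(np.tensordot(T1, T2, axes=0))

u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])
# u (.) v = (u(x)v + v(x)u)/2, so the product is commutative.
assert np.allclose(sym_product(u, v), sym_product(v, u))
```

Commutativity is immediate here because symmetrizing the outer product erases the order of the factors.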

In some cases an exponential notation is used:

${\displaystyle v^{\odot k}=\underbrace {v\odot v\odot \cdots \odot v} _{k{\text{ times}}}=\underbrace {v\otimes v\otimes \cdots \otimes v} _{k{\text{ times}}}=v^{\otimes k}.}$

where v is a vector. Again, in some cases the ⊙ is left out:

${\displaystyle v^{k}=\underbrace {v\,v\,\cdots \,v} _{k{\text{ times}}}=\underbrace {v\odot v\odot \cdots \odot v} _{k{\text{ times}}}.}$

Decomposition

In analogy with the theory of symmetric matrices, a (real) symmetric tensor of order 2 can be "diagonalized". More precisely, for any tensor T ∈ Sym2(V), there exist an integer r, unit vectors v1,...,vr ∈ V and weights λ1,...,λr such that

${\displaystyle T=\sum _{i=1}^{r}\lambda _{i}\,v_{i}\otimes v_{i}.}$

The minimum number r for which such a decomposition is possible is the (symmetric) rank of T. The vectors appearing in this minimal expression are the principal axes of the tensor, and generally have an important physical meaning. For example, the principal axes of the inertia tensor define Poinsot's ellipsoid representing the moment of inertia. See also Sylvester's law of inertia.
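For a real symmetric order-2 tensor, this decomposition is exactly the eigendecomposition of the matrix of components. A minimal sketch with numpy:

```python
import numpy as np

# For a real symmetric matrix, np.linalg.eigh returns weights lambda_i
# and orthonormal principal axes v_i with T = sum_i lambda_i v_i (x) v_i.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(T)          # columns of V are the v_i
rebuilt = sum(l * np.outer(v, v) for l, v in zip(lam, V.T))
assert np.allclose(rebuilt, T)
```

Here the number of nonzero weights is the rank of T, and the columns of V are its principal axes.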

For symmetric tensors of arbitrary order k, decompositions

${\displaystyle T=\sum _{i=1}^{r}\lambda _{i}\,v_{i}^{\otimes k}}$

are also possible. The minimum number r for which such a decomposition is possible is the symmetric rank of T. [2] This minimal decomposition is called a Waring decomposition; it is a symmetric form of the tensor rank decomposition. For second-order tensors this corresponds to the rank of the matrix representing the tensor in any basis, and it is well known that the maximum rank is equal to the dimension of the underlying vector space. However, for higher orders this need not hold: the rank can be higher than the number of dimensions in the underlying vector space. Moreover, the rank and symmetric rank of a symmetric tensor may differ. [3]

Notes

1. Kostrikin, Alexei I.; Manin, Iurii Ivanovich (1997). Linear Algebra and Geometry. Algebra, Logic and Applications. 1. Gordon and Breach. pp. 276–279. ISBN 9056990497.
2. Comon, P.; Golub, G.; Lim, L. H.; Mourrain, B. (2008). "Symmetric Tensors and Symmetric Tensor Rank". SIAM Journal on Matrix Analysis and Applications. 30 (3): 1254. doi:10.1137/060661569.
3. Shitov, Yaroslav (2018). "A Counterexample to Comon's Conjecture". SIAM Journal on Applied Algebra and Geometry. 2 (3): 428–443. doi:10.1137/17m1131970. ISSN 2470-6566.
