Tensor (intrinsic definition)


In mathematics, the modern component-free approach to the theory of a tensor views a tensor as an abstract object that expresses some definite type of multilinear concept. The properties of tensors can be derived from their definitions as multilinear maps, and the rules for manipulating tensors arise as an extension of linear algebra to multilinear algebra.


In differential geometry, an intrinsic geometric statement may be described by a tensor field on a manifold, and then need not make reference to coordinates at all. The same is true in general relativity, of tensor fields describing a physical property. The component-free approach is also used extensively in abstract algebra and homological algebra, where tensors arise naturally.

Definition via tensor products of vector spaces

Given a finite set {V1, ..., Vn} of vector spaces over a common field F, one may form their tensor product V1 ⊗ ... ⊗ Vn, an element of which is termed a tensor.

A tensor on the vector space V is then defined to be an element of (i.e., a vector in) a vector space of the form

V ⊗ … ⊗ V ⊗ V* ⊗ … ⊗ V*,

where V* is the dual space of V.

If there are m copies of V and n copies of V* in our product, the tensor is said to be of type (m, n), contravariant of order m, covariant of order n, and of total order m + n. The tensors of order zero are just the scalars (elements of the field F), those of contravariant order 1 are the vectors in V, and those of covariant order 1 are the one-forms in V* (for this reason, the elements of the last two spaces are often called the contravariant and covariant vectors). The space of all tensors of type (m, n) is denoted

T^m_n(V) = V ⊗ … ⊗ V ⊗ V* ⊗ … ⊗ V* (m copies of V, n copies of V*).

Example 1. The space of type (1, 1) tensors, T^1_1(V) = V ⊗ V*, is isomorphic in a natural way to the space of linear transformations from V to V.
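This identification can be made concrete with arrays. The following sketch (the names and the choice of V = R^3 are our own, with a covector acting via the dot product) shows that the simple tensor w ⊗ α acts as the rank-one linear map v ↦ α(v)·w, represented by an outer-product matrix:

```python
import numpy as np

# A type (1, 1) tensor on V = R^3, stored as a 3x3 array T, acts as the
# linear map v -> T v.  The identification sends the simple tensor
# w (x) alpha (w in V, alpha in V*) to the map v -> alpha(v) * w,
# whose matrix is the outer product of w and alpha.

w = np.array([1.0, 2.0, 0.0])       # a vector in V
alpha = np.array([0.0, 1.0, 3.0])   # a covector in V*, acting by the dot product

T = np.outer(w, alpha)              # the simple tensor w (x) alpha as a matrix

v = np.array([1.0, 1.0, 1.0])
# Applying the linear map T to v agrees with alpha(v) * w:
assert np.allclose(T @ v, alpha.dot(v) * w)
print(T @ v)                        # [4. 8. 0.]
```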

Example 2. A bilinear form on a real vector space V corresponds in a natural way to a type (0, 2) tensor in T^0_2(V) = V* ⊗ V*. When such a bilinear form is symmetric and nondegenerate it is termed the associated metric tensor, and is usually denoted g.
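As a sketch (the matrix G and the choice of V = R^2 are our own illustrative assumptions), a (0, 2) tensor stored as a symmetric positive-definite matrix G encodes the bilinear form g(u, v) = uᵀ G v:

```python
import numpy as np

# A (0, 2) tensor on V = R^2 stored as a 2x2 array G, viewed as the
# bilinear form g(u, v) = u^T G v.  A symmetric positive-definite choice
# of G plays the role of a metric tensor.

G = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # symmetric, positive definite

def g(u, v):
    """Evaluate the bilinear form encoded by the (0, 2) tensor G."""
    return u @ G @ v

u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

print(g(u, v))                      # 1.0  (the off-diagonal entry G[0, 1])
print(g(u, u))                      # 2.0  (squared length of u in this metric)
# Bilinearity: linear in each argument separately.
assert np.isclose(g(2 * u + v, v), 2 * g(u, v) + g(v, v))
```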

Tensor rank

A simple tensor (also called a tensor of rank one, elementary tensor or decomposable tensor [1] ) is a tensor that can be written as a product of tensors of the form

T = a ⊗ b ⊗ ⋯ ⊗ d,

where a, b, ..., d are nonzero and in V or V* – that is, if the tensor is nonzero and completely factorizable. Every tensor can be expressed as a sum of simple tensors. The rank of a tensor T is the minimum number of simple tensors that sum to T. [2]

The zero tensor has rank zero. A nonzero tensor of order 0 or 1 always has rank 1. The rank of a nonzero tensor of order 2 or higher is less than or equal to the product of the dimensions of all but the highest-dimensioned vector spaces among those in (a sum of products of) which the tensor can be expressed; this bound is d^(n−1) when each product is of n vectors from a finite-dimensional vector space of dimension d.
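The definitions above can be illustrated numerically (the vectors below are our own example). A simple order-3 tensor has entries a_i b_j c_k, and a sum of two such tensors has rank at most 2:

```python
import numpy as np

# Building simple (rank-one) tensors as outer products, and a tensor of
# rank at most 2 as a sum of two simple tensors.

a = np.array([1.0, 2.0])
b = np.array([1.0, 0.0, 1.0])
c = np.array([0.0, 1.0])

# The simple order-3 tensor a (x) b (x) c, with entries a[i] * b[j] * c[k]:
simple = np.einsum('i,j,k->ijk', a, b, c)
assert simple.shape == (2, 3, 2)
assert np.isclose(simple[1, 2, 1], a[1] * b[2] * c[1])   # entry-wise check

# A sum of two simple tensors, hence of rank at most 2:
T = simple + np.einsum('i,j,k->ijk', c, b, a)
assert T.shape == (2, 3, 2)
```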

The term rank of a tensor extends the notion of the rank of a matrix in linear algebra, although the term is also often used to mean the order (or degree) of a tensor. The rank of a matrix is the minimum number of column vectors needed to span the range of the matrix. A matrix thus has rank one if it can be written as an outer product of two nonzero vectors:

A = v wᵀ.

The rank of a matrix A is the smallest number of such outer products that can be summed to produce it:

A = v_1 w_1ᵀ + ⋯ + v_k w_kᵀ.
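For matrices this can be checked directly (the vectors below are our own example): the singular value decomposition writes A as a sum of rank-one outer products, with the number of nonzero singular values equal to the rank:

```python
import numpy as np

# A matrix built as a sum of two outer products has rank at most 2, and the
# SVD recovers it as a sum of rank-one terms.

v1, w1 = np.array([1.0, 2.0]), np.array([1.0, 1.0, 0.0])
v2, w2 = np.array([0.0, 1.0]), np.array([0.0, 1.0, 1.0])

A = np.outer(v1, w1) + np.outer(v2, w2)     # a sum of two outer products
assert np.linalg.matrix_rank(A) == 2

# Reassemble A from the rank-one terms of its SVD:
U, s, Vt = np.linalg.svd(A)
B = sum(s[r] * np.outer(U[:, r], Vt[r, :]) for r in range(len(s)))
assert np.allclose(A, B)
```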

In indices, a tensor of rank 1 is a tensor of the form

T^{i j ⋯ k}_{ℓ ⋯ m} = a^i b^j ⋯ c^k d_ℓ ⋯ e_m.

The rank of a tensor of order 2 agrees with the rank when the tensor is regarded as a matrix, [3] and can be determined from Gaussian elimination, for instance. The rank of an order 3 or higher tensor is however often very difficult to determine, and low-rank decompositions of tensors are sometimes of great practical interest. [4] In fact, the problem of finding the rank of an order 3 tensor over any finite field is NP-complete, and over the rationals, NP-hard. [5] Computational tasks such as the efficient multiplication of matrices and the efficient evaluation of polynomials can be recast as the problem of simultaneously evaluating a set of bilinear forms z_k = Σ_{ij} T_{ijk} x_i y_j for given inputs x_i and y_j. If a low-rank decomposition of the tensor T is known, then an efficient evaluation strategy is known. [6]
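The following sketch (a random example of our own construction) shows why a rank-R decomposition yields an efficient evaluation strategy: each of the R terms contributes one product (a_r·x)(b_r·y), which is the idea exploited by fast matrix-multiplication algorithms such as Strassen's:

```python
import numpy as np

# A set of bilinear forms z_k = sum_{ij} T[i,j,k] x_i y_j packaged as an
# order-3 tensor T.  Given a rank-R decomposition T = sum_r a_r (x) b_r (x) c_r,
# each evaluation needs only R scalar products (a_r . x)(b_r . y).

rng = np.random.default_rng(0)
R = 3
a, b, c = rng.normal(size=(3, R, 4))     # factor vectors a_r, b_r, c_r in R^4
T = np.einsum('ri,rj,rk->ijk', a, b, c)  # the tensor they decompose

x, y = rng.normal(size=(2, 4))

direct = np.einsum('ijk,i,j->k', T, x, y)                     # naive evaluation
via_rank = sum((a[r] @ x) * (b[r] @ y) * c[r] for r in range(R))
assert np.allclose(direct, via_rank)
```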

Universal property

The space T^m_n(V) can be characterized by a universal property in terms of multilinear mappings. Amongst the advantages of this approach is that it gives a way to show that many linear mappings are "natural" or "geometric" (in other words, independent of any choice of basis). Explicit computational information can then be written down using bases, and this order of priorities can be more convenient than proving that a formula gives rise to a natural mapping. Another aspect is that tensor products are not used only for free modules, so the "universal" approach carries over more easily to general situations.

A function on a Cartesian product (or direct sum) of vector spaces, with values in a vector space W, is multilinear if it is linear in each argument. The space of all multilinear mappings from V1 × ... × VN to W is denoted LN(V1, ..., VN; W). When N = 1, a multilinear mapping is just an ordinary linear mapping, and the space of all linear mappings from V to W is denoted L(V; W).

The universal characterization of the tensor product implies that, for each multilinear function f ∈ L^{m+n}(V*, …, V*, V, …, V; W) (where W can represent the field of scalars, a vector space, or a tensor space) there exists a unique linear function F on V* ⊗ ⋯ ⊗ V* ⊗ V ⊗ ⋯ ⊗ V such that

f(α_1, …, α_m, v_1, …, v_n) = F(α_1 ⊗ ⋯ ⊗ α_m ⊗ v_1 ⊗ ⋯ ⊗ v_n)

for all v_i in V and α_i in V*.
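The universal property can be checked numerically in the bilinear case (the coefficient matrix and spaces below are our own illustrative assumptions): a bilinear map f on R^2 × R^3 factors as f(v, w) = F(v ⊗ w), where F is linear on the 6-dimensional tensor product:

```python
import numpy as np

# The universal property for a bilinear map f: R^2 x R^3 -> R.  Every such f
# factors as f(v, w) = F(v (x) w) for a unique linear F on R^2 (x) R^3.

C = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])     # coefficients defining f(v, w) = v^T C w

def f(v, w):
    return v @ C @ w                # the bilinear map

def F(t):
    return np.sum(C * t)            # the induced linear map on 2x3 tensors

v = np.array([1.0, 2.0])
w = np.array([0.0, 1.0, 1.0])
# f agrees with F composed with the tensor (outer) product:
assert np.isclose(f(v, w), F(np.outer(v, w)))
print(f(v, w))                      # 10.0
```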

Using the universal property, it follows, when V is finite dimensional, that the space of (m, n)-tensors admits a natural isomorphism

T^m_n(V) ≅ L^{m+n}(V*, …, V*, V, …, V; F),

with m copies of V* and n copies of V in the argument list.

Each V in the definition of the tensor corresponds to a V* inside the argument of the linear maps, and vice versa. (Note that in the former case, there are m copies of V and n copies of V*, and in the latter case vice versa.) In particular, one has

T^1_0(V) ≅ L(V*; F) ≅ V,
T^0_1(V) ≅ L(V; F) = V*,
T^1_1(V) ≅ L(V; V).

Tensor fields

Differential geometry, physics and engineering must often deal with tensor fields on smooth manifolds. The term tensor is sometimes used as a shorthand for tensor field. A tensor field expresses the concept of a tensor that varies from point to point on the manifold.
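A minimal sketch of this idea (the grid and the point-dependent matrix are our own assumptions): a (0, 2) tensor field on a patch of R^2 stores a 2×2 matrix at each sample point, so that measurements such as the squared length of a fixed vector vary from point to point:

```python
import numpy as np

# A tensor field assigns a tensor to each point.  Here a (0, 2) tensor
# field on a grid in R^2: at each point (x, y) we store a 2x2 symmetric
# matrix g, a point-dependent metric.

xs = np.linspace(0.0, 1.0, 5)
ys = np.linspace(0.0, 1.0, 5)

# g has shape (5, 5, 2, 2): a 2x2 tensor at each grid point.
g = np.empty((len(xs), len(ys), 2, 2))
for i, x in enumerate(xs):
    for j, y in enumerate(ys):
        g[i, j] = np.array([[1.0 + x * x, x * y],
                            [x * y, 1.0 + y * y]])

v = np.array([1.0, 0.0])
# The squared length of v in this metric varies from point to point:
lengths_sq = np.einsum('xyij,i,j->xy', g, v, v)
assert lengths_sq.shape == (5, 5)
assert np.isclose(lengths_sq[0, 0], 1.0)   # at (0, 0) the metric is the identity
```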


References