Glossary of linear algebra

This is a glossary of linear algebra.

See also: glossary of module theory.

A

Affine transformation
The composition of a linear transformation between vector spaces and a translation. [1] Equivalently, a function between vector spaces that preserves affine combinations.
Affine combination
A linear combination in which the sum of the coefficients is 1.
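For illustration, a minimal NumPy sketch (the matrix A, translation b, and points here are arbitrary examples) showing that an affine map x ↦ Ax + b preserves affine combinations:

```python
import numpy as np

# An affine transformation of R^2: a linear part A followed by a translation b.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
b = np.array([1.0, -1.0])
f = lambda x: A @ x + b

# An affine combination of two points: coefficients (1 - t) and t sum to 1.
p, q = np.array([0.0, 0.0]), np.array([4.0, 2.0])
t = 0.25
combo = (1 - t) * p + t * q

# Affine maps preserve affine combinations: f((1-t)p + tq) = (1-t)f(p) + t*f(q).
assert np.allclose(f(combo), (1 - t) * f(p) + t * f(q))
```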

B

Basis
In a vector space, a linearly independent set of vectors spanning the whole vector space. [2]
Basis vector
An element of a given basis of a vector space. [2]
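As a concrete example (a small NumPy sketch added for illustration), the standard basis of R^3 is linearly independent and spans the space, and every vector is a unique linear combination of its elements:

```python
import numpy as np

# The rows of the 3x3 identity matrix are the standard basis vectors of R^3.
e1, e2, e3 = np.eye(3)

# Any vector is a unique linear combination of the basis vectors; for the
# standard basis, the coefficients are just the vector's entries.
v = np.array([2.0, -1.0, 5.0])
assert np.allclose(v, 2.0 * e1 - 1.0 * e2 + 5.0 * e3)
```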

C

Column vector
A matrix with only one column. [3]
Coordinate vector
The tuple of the coordinates of a vector with respect to a basis.
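For illustration, a minimal NumPy sketch (the basis here is an arbitrary example) that finds a coordinate vector by solving a linear system:

```python
import numpy as np

# A (non-standard) basis of R^2, stored as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
v = np.array([3.0, 4.0])

# The coordinate vector c of v with respect to this basis solves B @ c = v.
c = np.linalg.solve(B, v)
assert np.allclose(B @ c, v)
print(c)  # [1. 2.]: v = 1*(first basis vector) + 2*(second basis vector)
```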
Covector
An element of the dual space of a vector space (that is, a linear form), identified with an element of the vector space through an inner product.

D

Determinant
The unique scalar function on square matrices that is multiplicative (the determinant of a product of matrices is the product of their determinants), multilinear in the rows and in the columns, and takes the value 1 at the identity matrix.
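These defining properties can be checked numerically; a minimal NumPy sketch with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# The determinant is multiplicative: det(AB) = det(A) * det(B) ...
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
# ... and takes the value 1 at the identity matrix.
assert np.isclose(np.linalg.det(np.eye(2)), 1.0)
```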
Diagonal matrix
A matrix in which all entries outside the main diagonal are zero. [4]
Dimension
The number of elements of any basis of a vector space. [2]
Dual space
The vector space of all linear forms on a given vector space. [5]

E

Elementary matrix
A square matrix that differs from the identity matrix by a single elementary row operation.

I

Identity matrix
A diagonal matrix all of whose diagonal entries are equal to 1. [4]
Inverse matrix
Of a matrix A, another matrix B such that A multiplied by B and B multiplied by A both equal the identity matrix. [4]
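For illustration, a minimal NumPy sketch (with an arbitrary invertible example matrix) verifying that both products give the identity:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
B = np.linalg.inv(A)  # the inverse exists because det(A) = 10 is non-zero

# B is the inverse matrix of A: A @ B and B @ A both equal the identity.
assert np.allclose(A @ B, np.eye(2))
assert np.allclose(B @ A, np.eye(2))
```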
Isotropic vector
In a vector space with a quadratic form, a non-zero vector for which the form is zero.
Isotropic quadratic form
A quadratic form on a vector space that has a null vector, i.e., that takes the value zero at some non-zero vector.

L

Linear algebra
The branch of mathematics that deals with vectors, vector spaces, linear transformations and systems of linear equations.
Linear combination
A sum in which each summand is a vector multiplied by a scalar (or ring element). [6]
Linear dependence
A linear dependence of a tuple of vectors is a nonzero tuple of scalar coefficients for which the corresponding linear combination equals the zero vector.
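For illustration, a small NumPy sketch (the vectors are arbitrary examples) exhibiting a linear dependence:

```python
import numpy as np

u = np.array([1.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 1.0])
w = u + 2 * v  # w is a linear combination of u and v

# The coefficients (1, 2, -1) are a linear dependence: 1*u + 2*v - 1*w = 0.
assert np.allclose(1 * u + 2 * v - 1 * w, np.zeros(3))

# Equivalently, the matrix with columns u, v, w has rank 2, not 3.
M = np.column_stack([u, v, w])
print(np.linalg.matrix_rank(M))  # 2
```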
Linear equation
A polynomial equation of degree one (such as 2x + 3y = 5). [7]
Linear form
A linear map from a vector space to its field of scalars. [8]
Linear independence
The property of not being linearly dependent. [9]
Linear map
A function between vector spaces which respects addition and scalar multiplication.
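For illustration, a minimal NumPy check (the matrix and vectors are arbitrary examples) of the two defining properties, using the fact that every matrix defines a linear map x ↦ Ax:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
f = lambda x: A @ x  # every matrix defines a linear map

u, v, c = np.array([1.0, 2.0]), np.array([-3.0, 0.5]), 7.0
assert np.allclose(f(u + v), f(u) + f(v))  # respects addition
assert np.allclose(f(c * u), c * f(u))     # respects scalar multiplication
```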
Linear transformation
A linear map whose domain and codomain coincide; in some contexts it is additionally assumed to be invertible.

M

Matrix
Rectangular arrangement of numbers or other mathematical objects. [4]

N

Null vector
1.  Another term for an isotropic vector.
2.  Another term for a zero vector.

R

Row vector
A matrix with only one row. [4]

S

Singular-value decomposition
A factorization of an m × n complex matrix M as M = UΣV*, where U is an m × m complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V is an n × n complex unitary matrix (V* is the conjugate transpose of V). [10]
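For illustration, a minimal NumPy sketch (the matrix is an arbitrary real example, so the unitary factors are real orthogonal matrices) that reconstructs M from its factors:

```python
import numpy as np

M = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])  # a 3 x 2 real matrix

U, s, Vh = np.linalg.svd(M)    # s holds the non-negative singular values
Sigma = np.zeros(M.shape)      # embed them in a 3 x 2 rectangular diagonal
Sigma[:2, :2] = np.diag(s)

assert np.allclose(M, U @ Sigma @ Vh)  # M = U Sigma V*
```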
Spectrum
Set of the eigenvalues of a matrix. [11]
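For example (a one-line NumPy computation on an arbitrary example matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.eigvals(A))  # the spectrum of A: eigenvalues 3 and 1
```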
Square matrix
A matrix having the same number of rows as columns. [4]

U

Unit vector
A vector in a normed vector space whose norm is 1, or a Euclidean vector of length one. [12]
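For illustration, a minimal NumPy sketch normalizing an arbitrary example vector:

```python
import numpy as np

v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)  # dividing by the norm yields a unit vector

assert np.isclose(np.linalg.norm(u), 1.0)  # Euclidean length 1
```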

V

Vector
1.  A directed quantity, one with both magnitude and direction.
2.  An element of a vector space. [13]
Vector space
A set whose elements can be added together and multiplied by elements of a field (this is scalar multiplication); the set must be an abelian group under addition, and the scalar multiplication must distribute over both vector addition and scalar addition, be compatible with the field's multiplication, and have the field's multiplicative identity act as the identity map. [14]

Z

Zero vector
The additive identity in a vector space. In a normed vector space, it is the unique vector of norm zero. In a Euclidean vector space, it is the unique vector of length zero. [15]

Notes

    References