List of named matrices

Figure: Taxonomy of complex matrices. Several important classes of matrices are subsets of each other.

This article lists some important classes of matrices used in mathematics, science and engineering. A matrix (plural matrices, or less commonly matrixes) is a rectangular array of numbers called entries. Matrices have a long history of both study and application, leading to diverse ways of classifying matrices. A first group consists of matrices satisfying concrete conditions on their entries, including constant matrices. Important examples include the identity matrix given by

$$I_n = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}$$

and the zero matrix of dimension $m \times n$. For example:

$$O_{2 \times 3} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}.$$
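
As a minimal illustration (not part of the original list, and assuming NumPy as the numerics library), these two constant matrices can be constructed directly:

```python
import numpy as np

# Identity matrix I_3: ones on the main diagonal, zeros elsewhere.
I3 = np.eye(3)

# Zero matrix of dimension 2 x 3: every entry equal to zero.
Z23 = np.zeros((2, 3))

print(I3)
print(Z23)
```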

Further ways of classifying matrices are according to their eigenvalues, or by imposing conditions on the product of the matrix with other matrices. Finally, many domains, both in mathematics and other sciences including physics and chemistry, have particular matrices that are applied chiefly in these areas.

Constant matrices

The list below comprises matrices whose elements are constant for any given dimension (size) of matrix. The matrix entries will be denoted $a_{ij}$. The table below uses the Kronecker delta $\delta_{ij}$ for two integers i and j, which is 1 if i = j and 0 otherwise. A few of these matrices are constructed programmatically in the sketch after the table.

| Name | Explanation | Symbolic description of the entries | Notes |
|---|---|---|---|
| Commutation matrix | The matrix of the linear map that maps a matrix to its transpose. | | See vectorization. |
| Duplication matrix | The matrix of the linear map mapping the vector of the distinct entries of a symmetric matrix to the vector of all entries of the matrix. | | See vectorization. |
| Elimination matrix | The matrix of the linear map mapping the vector of the entries of a matrix to the vector of a part of the entries (for example the vector of the entries that are not below the main diagonal). | | See vectorization. |
| Exchange matrix | The binary matrix with ones on the anti-diagonal, and zeroes everywhere else. | $a_{ij} = \delta_{n+1-i,j}$ | A permutation matrix. |
| Hilbert matrix | | $a_{ij} = (i + j - 1)^{-1}$ | A Hankel matrix. |
| Identity matrix | A square diagonal matrix, with all entries on the main diagonal equal to 1, and the rest 0. | $a_{ij} = \delta_{ij}$ | |
| Lehmer matrix | | $a_{ij} = \min(i, j) / \max(i, j)$ | A positive symmetric matrix. |
| Matrix of ones | A matrix with all entries equal to one. | $a_{ij} = 1$ | |
| Pascal matrix | A matrix containing the entries of Pascal's triangle. | | |
| Pauli matrices | A set of three 2 × 2 complex Hermitian and unitary matrices. When combined with the identity matrix $I_2$, they form an orthogonal basis for the 2 × 2 complex Hermitian matrices. | | |
| Redheffer matrix | Encodes a Dirichlet convolution. Matrix entries are given by the divisor function; entries of the inverse are given by the Möbius function. | $a_{ij} = 1$ if $i$ divides $j$ or if $j = 1$; otherwise $a_{ij} = 0$. | A (0, 1)-matrix. |
| Shift matrix | A matrix with ones on the superdiagonal or subdiagonal and zeroes elsewhere. | $a_{ij} = \delta_{i+1,j}$ or $a_{ij} = \delta_{i-1,j}$ | Multiplication by it shifts matrix elements by one position. |
| Zero matrix | A matrix with all entries equal to zero. | $a_{ij} = 0$ | |
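
Several of the symbolic descriptions above translate directly into code. The following sketch (illustrative only; NumPy is an assumed dependency and the size n = 4 is arbitrary) builds a few of the tabulated matrices from their entrywise formulas:

```python
import numpy as np

n = 4
i, j = np.indices((n, n)) + 1  # 1-based row and column indices

# Exchange matrix: a_ij = delta_{n+1-i,j}, ones on the anti-diagonal.
exchange = (i + j == n + 1).astype(int)

# Hilbert matrix: a_ij = 1 / (i + j - 1).
hilbert = 1.0 / (i + j - 1)

# Lehmer matrix: a_ij = min(i, j) / max(i, j).
lehmer = np.minimum(i, j) / np.maximum(i, j)

# Shift matrix: a_ij = delta_{i+1,j}, ones on the superdiagonal.
shift = np.eye(n, k=1)

# Multiplying by the shift matrix moves vector entries by one position.
v = np.arange(1.0, n + 1)  # [1. 2. 3. 4.]
print(shift @ v)           # [2. 3. 4. 0.]
```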

Specific patterns for entries

The following lists matrices whose entries are subject to certain conditions. Many of them apply only to square matrices, that is, matrices with the same number of rows and columns. The main diagonal of a square matrix is the diagonal joining the upper left corner to the lower right one, or equivalently the entries $a_{i,i}$. The other diagonal is called the anti-diagonal (or counter-diagonal).

| Name | Explanation | Notes, references |
|---|---|---|
| (0,1)-matrix | A matrix with all elements either 0 or 1. | Synonym for binary matrix or logical matrix. |
| Alternant matrix | A matrix in which successive columns have a particular function applied to their entries. | |
| Alternating sign matrix | A square matrix with entries 0, +1 and −1 such that the sum of each row and column is 1 and the nonzero entries in each row and column alternate in sign. | |
| Anti-diagonal matrix | A square matrix with all entries off the anti-diagonal equal to zero. | |
| Anti-Hermitian matrix | Synonym for skew-Hermitian matrix. | |
| Anti-symmetric matrix | Synonym for skew-symmetric matrix. | |
| Arrowhead matrix | A square matrix containing zeros in all entries except for the first row, first column, and main diagonal. | |
| Band matrix | A square matrix whose non-zero entries are confined to a diagonal band. | |
| Bidiagonal matrix | A matrix with elements only on the main diagonal and either the superdiagonal or subdiagonal. | Sometimes defined differently; see article. |
| Binary matrix | A matrix whose entries are all either 0 or 1. | Synonym for (0,1)-matrix or logical matrix. [1] |
| Bisymmetric matrix | A square matrix that is symmetric with respect to its main diagonal and its main cross-diagonal. | |
| Block-diagonal matrix | A block matrix with entries only on the diagonal. | |
| Block matrix | A matrix partitioned in sub-matrices called blocks. | |
| Block tridiagonal matrix | A block matrix which is essentially a tridiagonal matrix but with submatrices in place of scalar elements. | |
| Boolean matrix | A matrix whose entries are taken from a Boolean algebra. | |
| Cauchy matrix | A matrix whose elements are of the form $1/(x_i + y_j)$ for injective sequences $(x_i)$, $(y_j)$ (i.e., sequences taking every value only once). | |
| Centrosymmetric matrix | A matrix symmetric about its center; i.e., $a_{ij} = a_{n-i+1,\,n-j+1}$. | |
| Circulant matrix | A matrix where each row is a circular shift of its predecessor. | |
| Conference matrix | A square matrix with zero diagonal and +1 and −1 off the diagonal, such that $C^{\mathsf T}C$ is a multiple of the identity matrix. | |
| Complex Hadamard matrix | A matrix with all rows and columns mutually orthogonal, whose entries are unimodular. | |
| Compound matrix | A matrix whose entries are generated by the determinants of all minors of a matrix. | |
| Copositive matrix | A square matrix $A$ with real coefficients, such that $x^{\mathsf T}Ax$ is nonnegative for every nonnegative vector $x$. | |
| Diagonally dominant matrix | A matrix whose entries satisfy $\lvert a_{ii} \rvert \geq \sum_{j \neq i} \lvert a_{ij} \rvert$ for each row $i$. | |
| Diagonal matrix | A square matrix with all entries outside the main diagonal equal to zero. | |
| Discrete Fourier-transform matrix | Multiplying by a vector gives the DFT of the vector as result. | |
| Elementary matrix | A square matrix derived by applying an elementary row operation to the identity matrix. | |
| Equivalent matrix | A matrix that can be derived from another matrix through a sequence of elementary row or column operations. | |
| Frobenius matrix | A square matrix in the form of an identity matrix but with arbitrary entries in one column below the main diagonal. | |
| GCD matrix | The matrix having the greatest common divisor as its entry: $a_{ij} = \gcd(i, j)$. | |
| Generalized permutation matrix | A square matrix with precisely one nonzero element in each row and column. | |
| Hadamard matrix | A square matrix with entries +1, −1 whose rows are mutually orthogonal. | |
| Hankel matrix | A matrix with constant skew-diagonals; also an upside-down Toeplitz matrix. | A square Hankel matrix is symmetric. |
| Hermitian matrix | A square matrix which is equal to its conjugate transpose, $A = A^*$. | |
| Hessenberg matrix | An "almost" triangular matrix; for example, an upper Hessenberg matrix has zero entries below the first subdiagonal. | |
| Hollow matrix | A square matrix whose main diagonal comprises only zero elements. | |
| Integer matrix | A matrix whose entries are all integers. | |
| Logical matrix | A matrix with all entries either 0 or 1. | Synonym for (0,1)-matrix, binary matrix or Boolean matrix. Can be used to represent a k-adic relation. |
| Markov matrix | A matrix of non-negative real numbers, such that the entries in each row sum to 1. | |
| Metzler matrix | A matrix whose off-diagonal entries are non-negative. | |
| Monomial matrix | A square matrix with exactly one non-zero entry in each row and column. | Synonym for generalized permutation matrix. |
| Moore matrix | A row consists of $a$, $a^q$, $a^{q^2}$, etc., and each row uses a different variable. | |
| Nonnegative matrix | A matrix with all nonnegative entries. | |
| Null-symmetric matrix | A square matrix whose null space (or kernel) is equal to that of its transpose, $N(A) = N(A^{\mathsf T})$ or $\ker(A) = \ker(A^{\mathsf T})$. | Synonym for kernel-symmetric matrix. Examples include (but are not limited to) symmetric, skew-symmetric, and normal matrices. |
| Null-Hermitian matrix | A square matrix whose null space (or kernel) is equal to that of its conjugate transpose, $N(A) = N(A^*)$ or $\ker(A) = \ker(A^*)$. | Synonym for kernel-Hermitian matrix. Examples include (but are not limited to) Hermitian, skew-Hermitian, and normal matrices. |
| Partitioned matrix | A matrix partitioned into sub-matrices, or equivalently, a matrix whose entries are themselves matrices rather than scalars. | Synonym for block matrix. |
| Parisi matrix | A block-hierarchical matrix. It consists of growing blocks placed along the diagonal, each block itself a Parisi matrix of smaller size. | In the theory of spin glasses it is also known as a replica matrix. |
| Pentadiagonal matrix | A matrix whose only nonzero entries are on the main diagonal and the two diagonals just above and below it. | |
| Permutation matrix | A matrix representation of a permutation: a square matrix with exactly one 1 in each row and column, and all other elements 0. | |
| Persymmetric matrix | A matrix that is symmetric about its northeast–southwest diagonal, i.e., $a_{ij} = a_{n-j+1,\,n-i+1}$. | |
| Polynomial matrix | A matrix whose entries are polynomials. | |
| Positive matrix | A matrix with all positive entries. | |
| Quaternionic matrix | A matrix whose entries are quaternions. | |
| Random matrix | A matrix whose entries are random variables. | |
| Sign matrix | A matrix whose entries are either +1, 0, or −1. | |
| Signature matrix | A diagonal matrix where the diagonal elements are either +1 or −1. | |
| Single-entry matrix | A matrix where a single element is one and the rest of the elements are zero. | |
| Skew-Hermitian matrix | A square matrix which is equal to the negative of its conjugate transpose, $A^* = -A$. | |
| Skew-symmetric matrix | A matrix which is equal to the negative of its transpose, $A^{\mathsf T} = -A$. | |
| Skyline matrix | A rearrangement of the entries of a banded matrix which requires less space. | |
| Sparse matrix | A matrix with relatively few non-zero elements. | Sparse matrix algorithms can tackle huge sparse matrices that are utterly impractical for dense matrix algorithms. |
| Symmetric matrix | A square matrix which is equal to its transpose, $A = A^{\mathsf T}$ ($a_{ij} = a_{ji}$). | |
| Toeplitz matrix | A matrix with constant diagonals. | |
| Totally positive matrix | A matrix with the determinants of all its square submatrices positive. | |
| Triangular matrix | A matrix with all entries above the main diagonal equal to zero (lower triangular) or with all entries below the main diagonal equal to zero (upper triangular). | |
| Tridiagonal matrix | A matrix whose only nonzero entries are on the main diagonal and the diagonals just above and below it. | |
| X–Y–Z matrix | A generalization to three dimensions of the concept of a two-dimensional array. | |
| Vandermonde matrix | A row consists of $1$, $a$, $a^2$, $a^3$, etc., and each row uses a different variable. | |
| Walsh matrix | A square matrix, with dimensions a power of 2, whose entries are +1 or −1, with the property that the dot product of any two distinct rows (or columns) is zero. | |
| Z-matrix | A matrix with all off-diagonal entries less than or equal to zero. | |

Matrices satisfying some equations

A number of matrix-related notions concern properties of products or inverses of a given matrix. The matrix product of an m-by-n matrix A and an n-by-k matrix B is the m-by-k matrix C given by

$$c_{ij} = \sum_{r=1}^{n} a_{ir} b_{rj}. \quad [2]$$

This matrix product is denoted AB. Unlike the product of numbers, matrix products are not commutative, that is to say, AB need not be equal to BA. [2] A number of notions are concerned with the failure of this commutativity. An inverse of a square matrix A is a matrix B (necessarily of the same dimension as A) such that AB = I; equivalently, BA = I. An inverse need not exist. If it exists, B is uniquely determined, and is also called the inverse of A, denoted $A^{-1}$.
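
To make these definitions concrete, here is a small sketch (illustrative, with hand-picked 2 × 2 matrices; NumPy assumed) showing the product formula, the failure of commutativity, and an inverse:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[0., 1.],
              [1., 0.]])   # the 2 x 2 exchange matrix

# Matrix product: c_ij = sum_r a_ir * b_rj.
print(A @ B)                         # AB permutes the columns of A
print(np.array_equal(A @ B, B @ A))  # False: AB need not equal BA

# A has nonzero determinant, so its inverse exists and is unique.
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True: A A^{-1} = I
```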

| Name | Explanation | Notes |
|---|---|---|
| Circular matrix or Coninvolutory matrix | A matrix whose inverse is equal to its entrywise complex conjugate: $A^{-1} = \bar{A}$. | Compare with unitary matrices. |
| Congruent matrix | Two matrices A and B are congruent if there exists an invertible matrix P such that $P^{\mathsf T}AP = B$. | Compare with similar matrices. |
| EP matrix or Range-Hermitian matrix | A square matrix that commutes with its Moore–Penrose inverse: $AA^+ = A^+A$. | |
| Idempotent matrix or Projection matrix | A matrix that has the property $A^2 = AA = A$. | The name projection matrix comes from the observation that projecting a point onto a subspace (a plane or a line) repeatedly gives the same result as projecting it once. |
| Invertible matrix | A square matrix having a multiplicative inverse, that is, a matrix B such that $AB = BA = I$. | Invertible matrices form the general linear group. |
| Involutory matrix | A square matrix which is its own inverse, i.e., $AA = I$. | Signature matrices and Householder matrices (also known as reflection matrices, which reflect a point about a plane or line) have this property. |
| Isometric matrix | A matrix that preserves distances, i.e., a matrix that satisfies $A^*A = I$, where $A^*$ denotes the conjugate transpose of A. | |
| Nilpotent matrix | A square matrix satisfying $A^q = 0$ for some positive integer q. | Equivalently, the only eigenvalue of A is 0. |
| Normal matrix | A square matrix that commutes with its conjugate transpose: $AA^* = A^*A$. | These are the matrices to which the spectral theorem applies. |
| Orthogonal matrix | A matrix whose inverse is equal to its transpose, $A^{-1} = A^{\mathsf T}$. | They form the orthogonal group. A few of the defining equations in this table are verified numerically in the sketch after it. |
| Orthonormal matrix | A matrix whose columns are orthonormal vectors. | |
| Partially isometric matrix | A matrix that is an isometry on the orthogonal complement of its kernel; equivalently, a matrix that satisfies $AA^*A = A$. | Equivalently, a matrix whose singular values are either 0 or 1. |
| Singular matrix | A square matrix that is not invertible. | |
| Unimodular matrix | An invertible matrix with entries in the integers (integer matrix). | Necessarily the determinant is +1 or −1. |
| Unipotent matrix | A square matrix with all eigenvalues equal to 1. | Equivalently, $A - I$ is nilpotent. See also unipotent group. |
| Unitary matrix | A square matrix whose inverse is equal to its conjugate transpose, $A^{-1} = A^*$. | |
| Totally unimodular matrix | A matrix for which every non-singular square submatrix is unimodular. | This has some implications in the linear programming relaxation of an integer program. |
| Weighing matrix | A square matrix whose entries are in {0, +1, −1}, such that $AA^{\mathsf T} = wI$ for some positive integer w. | |
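
A few of the defining equations above can be checked numerically. The sketch below (illustrative, with hand-picked example matrices; NumPy assumed) verifies an orthogonal, an involutory, and a nilpotent matrix:

```python
import numpy as np

# Orthogonal: A^{-1} = A^T. Here, a rotation by 90 degrees.
R = np.array([[0., -1.],
              [1.,  0.]])
print(np.allclose(R.T @ R, np.eye(2)))       # True

# Involutory: AA = I. Here, a signature matrix.
S = np.diag([1., -1.])
print(np.allclose(S @ S, np.eye(2)))         # True

# Nilpotent: A^q = 0 for some positive integer q. Here, q = 2.
N = np.array([[0., 1.],
              [0., 0.]])
print(np.allclose(N @ N, np.zeros((2, 2))))  # True
```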

Matrices with conditions on eigenvalues or eigenvectors

| Name | Explanation | Notes |
|---|---|---|
| Convergent matrix | A square matrix whose successive powers approach the zero matrix. | Its eigenvalues have magnitude less than one; see the sketch after this table. |
| Defective matrix | A square matrix that does not have a complete basis of eigenvectors, and is thus not diagonalizable. | |
| Derogatory matrix | A square matrix whose minimal polynomial has degree less than n. | Equivalently, at least one of its eigenvalues has at least two Jordan blocks. [3] |
| Diagonalizable matrix | A square matrix similar to a diagonal matrix. | It has an eigenbasis, that is, a complete set of linearly independent eigenvectors. |
| Hurwitz matrix | A matrix whose eigenvalues have strictly negative real part. | A stable system of differential equations may be represented by a Hurwitz matrix. |
| M-matrix | A Z-matrix with eigenvalues whose real parts are nonnegative. | |
| Positive-definite matrix | A Hermitian matrix with every eigenvalue positive. | |
| Stability matrix | Synonym for Hurwitz matrix. | |
| Stieltjes matrix | A real symmetric positive definite matrix with nonpositive off-diagonal entries. | Special case of an M-matrix. |
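
As referenced in the convergent-matrix row, when every eigenvalue of a square matrix has magnitude below one, its successive powers approach the zero matrix. A minimal sketch (the example matrix is hand-picked; NumPy assumed):

```python
import numpy as np

# A convergent matrix: its eigenvalues are 0.6 and 0.3.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
print(np.max(np.abs(np.linalg.eigvals(A))))  # spectral radius 0.6 < 1

# Successive powers approach the zero matrix.
A50 = np.linalg.matrix_power(A, 50)
print(np.allclose(A50, 0.0))                 # True
```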

Matrices generated by specific data

| Name | Definition | Comments |
|---|---|---|
| Adjugate matrix | The transpose of the cofactor matrix. | The inverse of a matrix is its adjugate matrix divided by its determinant. |
| Augmented matrix | A matrix whose rows are concatenations of the rows of two smaller matrices. | Used for performing the same row operations on two matrices. |
| Bézout matrix | A square matrix whose determinant is the resultant of two polynomials. | See also Sylvester matrix. |
| Carleman matrix | An infinite matrix of the Taylor coefficients of an analytic function and its integer powers. | The composition of two functions can be expressed as the product of their Carleman matrices. |
| Cartan matrix | A matrix associated with either a finite-dimensional associative algebra or a semisimple Lie algebra. | |
| Cofactor matrix | Formed by the cofactors of a square matrix, that is, the signed minors of the matrix. | Transpose of the adjugate matrix. |
| Companion matrix | A matrix having the coefficients of a polynomial as its last column, and having the polynomial as its characteristic polynomial. | Its eigenvalues are the roots of the polynomial; see the sketch after this table. |
| Coxeter matrix | A matrix which describes the relations between the involutions that generate a Coxeter group. | |
| Distance matrix | The square matrix formed by the pairwise distances of a set of points. | The Euclidean distance matrix is a special case. |
| Euclidean distance matrix | A matrix that describes the pairwise distances between points in Euclidean space. | See also distance matrix. |
| Fundamental matrix | The matrix formed from the fundamental solutions of a system of linear differential equations. | |
| Generator matrix | In coding theory, a matrix whose rows span a linear code. | |
| Gramian matrix | The symmetric matrix of the pairwise inner products of a set of vectors in an inner product space. | |
| Hessian matrix | The square matrix of second partial derivatives of a function of several variables. | |
| Householder matrix | The matrix of a reflection with respect to a hyperplane passing through the origin. | |
| Jacobian matrix | The matrix of the partial derivatives of a function of several variables. | |
| Moment matrix | Used in statistics and sum-of-squares optimization. | |
| Payoff matrix | A matrix in game theory and economics that represents the payoffs in a normal form game where players move simultaneously. | |
| Pick matrix | A matrix that occurs in the study of analytical interpolation problems. | |
| Rotation matrix | A matrix representing a rotation. | |
| Seifert matrix | A matrix in knot theory, primarily for the algebraic analysis of topological properties of knots and links. | Alexander polynomial. |
| Shear matrix | The matrix of a shear transformation. | |
| Similarity matrix | A matrix of scores which express the similarity between two data points. | Sequence alignment. |
| Sylvester matrix | A square matrix whose entries come from the coefficients of two polynomials. | The Sylvester matrix is nonsingular if and only if the two polynomials are coprime to each other. |
| Symplectic matrix | The real matrix of a symplectic transformation. | |
| Transformation matrix | The matrix of a linear transformation or a geometric transformation. | |
| Wedderburn matrix | A matrix of the form $A - \omega^{-1} A y x^{\mathsf T} A$, with $\omega = x^{\mathsf T} A y \neq 0$, used for rank reduction and biconjugate decompositions. | Analysis of matrix decompositions. |
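
As a concrete instance of the companion-matrix entry referenced in its row, the eigenvalues of a polynomial's companion matrix are exactly the polynomial's roots. A minimal sketch (illustrative; NumPy assumed):

```python
import numpy as np

# Companion matrix of p(x) = x^2 - 3x + 2 = (x - 1)(x - 2):
# the negated coefficients of p form the last column.
C = np.array([[0., -2.],
              [1.,  3.]])
print(np.sort(np.linalg.eigvals(C)))  # [1. 2.], the roots of p
```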

Matrices used in statistics

The following matrices find their main application in statistics and probability theory.

Matrices used in graph theory

The following matrices find their main application in graph and network theory.

Matrices used in science and engineering

Specific matrices

See also

Notes

  1. Hogben (2006), Ch. 31.3.
  2. Weisstein, Eric W. "Matrix Multiplication". mathworld.wolfram.com. Retrieved 2020-09-07.
  3. "Non-derogatory matrix - Encyclopedia of Mathematics". encyclopediaofmath.org. Retrieved 2020-09-07.


References

Hogben, Leslie, ed. (2006). Handbook of Linear Algebra. Chapman & Hall/CRC.