In mathematics, a generalized permutation matrix (or monomial matrix) is a matrix with the same nonzero pattern as a permutation matrix, i.e. there is exactly one nonzero entry in each row and each column. Unlike a permutation matrix, where the nonzero entry must be 1, in a generalized permutation matrix the nonzero entry can be any nonzero value. An example of a generalized permutation matrix is

[ 0   0  3   0 ]
[ 0  −7  0   0 ]
[ 1   0  0   0 ]
[ 0   0  0  √2 ].
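This defining pattern is easy to test numerically; below is a minimal NumPy sketch (the helper name is_generalized_permutation is ours):

```python
import numpy as np

def is_generalized_permutation(A, tol=1e-12):
    """True iff A is square with exactly one nonzero entry in each row and column."""
    A = np.asarray(A)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False
    nz = np.abs(A) > tol
    return bool(np.all(nz.sum(axis=0) == 1) and np.all(nz.sum(axis=1) == 1))

A = np.array([[0, 0, 3, 0],
              [0, -7, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 2**0.5]])
print(is_generalized_permutation(A))  # True
```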
An invertible matrix A is a generalized permutation matrix if and only if it can be written as a product of an invertible diagonal matrix D and an (implicitly invertible) permutation matrix P: i.e., A = DP.
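The factorization can be read off from the nonzero pattern: if the nonzero entry of row i sits in column σ(i), then D collects those entries and P is the permutation matrix of σ. A minimal NumPy sketch (the helper name monomial_to_DP is ours):

```python
import numpy as np

def monomial_to_DP(A, tol=1e-12):
    """Factor a generalized permutation matrix as A = D @ P,
    with D diagonal and P a permutation matrix."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    sigma = np.argmax(np.abs(A) > tol, axis=1)  # column index of each row's nonzero
    D = np.diag(A[np.arange(n), sigma])
    P = np.zeros((n, n))
    P[np.arange(n), sigma] = 1.0
    return D, P

A = np.array([[0., 0., 3.], [5., 0., 0.], [0., -2., 0.]])
D, P = monomial_to_DP(A)
assert np.allclose(A, D @ P)  # A = DP, as claimed
```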
The set of n × n generalized permutation matrices with entries in a field F forms a subgroup of the general linear group GL(n, F), in which the group of nonsingular diagonal matrices Δ(n, F) forms a normal subgroup. Indeed, over all fields except GF(2), the generalized permutation matrices are the normalizer of the diagonal matrices, meaning that the generalized permutation matrices are the largest subgroup of GL(n, F) in which diagonal matrices are normal.
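The normality of Δ(n, F) can be spot-checked numerically: conjugating an invertible diagonal matrix by a generalized permutation matrix yields another diagonal matrix. A small NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
perm = rng.permutation(n)
M = np.diag(rng.uniform(1, 2, n)) @ np.eye(n)[perm]  # a generalized permutation matrix

Delta = np.diag(rng.uniform(1, 2, n))                # an invertible diagonal matrix
conj = M @ Delta @ np.linalg.inv(M)                  # conjugation by M
assert np.allclose(conj, np.diag(np.diag(conj)))     # the conjugate is again diagonal
```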
The abstract group of generalized permutation matrices is the wreath product of F× and Sn. Concretely, this means that it is the semidirect product of Δ(n, F) by the symmetric group Sn: Δ(n, F) ⋊ Sn,
where Sn acts by permuting coordinates and the diagonal matrices Δ(n, F) are isomorphic to the n-fold product (F×)n.
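Concretely, under the (assumed) convention that the element (d, σ) corresponds to the matrix diag(d)·Pσ with Pσ ej = eσ(j), the semidirect-product law (d1, σ1)(d2, σ2) = (d1 · (σ1 · d2), σ1σ2) matches matrix multiplication. A NumPy sketch:

```python
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix P with P @ e_j = e_{sigma[j]}."""
    n = len(sigma)
    P = np.zeros((n, n))
    P[sigma, np.arange(n)] = 1.0
    return P

def to_matrix(d, sigma):
    """Monomial matrix of the wreath-product element (d, sigma)."""
    return np.diag(d) @ perm_matrix(sigma)

d1, s1 = np.array([2., 3., 5.]), np.array([1, 2, 0])
d2, s2 = np.array([7., 11., 13.]), np.array([2, 0, 1])

prod_d = d1 * d2[np.argsort(s1)]  # s1 acts on d2 by permuting its coordinates
prod_s = s1[s2]                   # composition: apply s2, then s1
assert np.allclose(to_matrix(d1, s1) @ to_matrix(d2, s2), to_matrix(prod_d, prod_s))
```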
To be precise, the generalized permutation matrices are a (faithful) linear representation of this abstract wreath product: a realization of the abstract group as a subgroup of matrices.
One can generalize further by allowing the entries to lie in a ring, rather than in a field. In that case if the non-zero entries are required to be units in the ring, one again obtains a group. On the other hand, if the non-zero entries are only required to be non-zero, but not necessarily invertible, this set of matrices forms a semigroup instead.
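For instance, over the ring Z the units are ±1, and a monomial matrix containing a nonzero non-unit entry such as 2 has an inverse with entries outside Z; a brief NumPy illustration:

```python
import numpy as np

signed = np.array([[0, -1], [1, 0]])     # nonzero entries are units of Z
non_unit = np.array([[0, 2], [1, 0]])    # 2 is nonzero but not a unit in Z
print(np.linalg.inv(signed))             # integer entries: the inverse stays in Z
print(np.linalg.inv(non_unit))           # contains 1/2: the inverse leaves Z
```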
One may also schematically allow the nonzero entries to lie in a group G, with the understanding that matrix multiplication will only involve multiplying a single pair of group elements, not "adding" group elements. This is an abuse of notation, since the entries of the matrices being multiplied must support both multiplication and addition, but it is a suggestive notation for the (formally correct) abstract group: the wreath product of the group G by the symmetric group.
A signed permutation matrix is a generalized permutation matrix whose nonzero entries are ±1; the signed permutation matrices are exactly the integer generalized permutation matrices with an integer inverse.
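Because its nonzero entries are ±1, a signed permutation matrix is orthogonal, so its inverse is simply its transpose and is again a signed permutation matrix; for example:

```python
import numpy as np

S = np.array([[0, 0, -1],
              [1, 0, 0],
              [0, -1, 0]])
assert np.allclose(np.linalg.inv(S), S.T)  # inverse = transpose, again integer
```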
Monomial matrices occur in representation theory in the context of monomial representations. A monomial representation of a group G is a linear representation ρ : G → GL(n, F) of G (here F is the defining field of the representation) such that the image ρ(G) is a subgroup of the group of monomial matrices.
In mathematics, an abelian group, also called a commutative group, is a group in which the result of applying the group operation to two group elements does not depend on the order in which they are written. That is, the group operation is commutative. With addition as an operation, the integers and the real numbers form abelian groups, and the concept of an abelian group may be viewed as a generalization of these examples. Abelian groups are named after Niels Henrik Abel.
In mathematics, the determinant is a scalar-valued function of the entries of a square matrix. The determinant of a matrix A is commonly denoted det(A), det A, or |A|. Its value characterizes some properties of the matrix and the linear map represented, on a given basis, by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the corresponding linear map is an isomorphism.
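A two-line NumPy illustration of this invertibility criterion (printed values are up to floating-point rounding):

```python
import numpy as np

M = np.array([[1., 2.],
              [3., 4.]])
print(np.linalg.det(M))                  # -2.0: nonzero, so M is invertible
singular = np.array([[1., 2.],
                     [2., 4.]])          # second row is twice the first
print(np.linalg.det(singular))           # 0.0 (up to rounding): no inverse exists
```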
In abstract algebra, the symmetric group defined over any set is the group whose elements are all the bijections from the set to itself, and whose group operation is the composition of functions. In particular, the finite symmetric group Sn defined over a finite set of n symbols consists of the permutations that can be performed on the n symbols. Since there are n! (n factorial) such permutation operations, the order of the symmetric group Sn is n!.
In linear algebra, the trace of a square matrix A, denoted tr(A), is the sum of the elements on its main diagonal, tr(A) = a11 + a22 + ⋯ + ann. It is only defined for a square matrix.
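A short NumPy example of the definition:

```python
import numpy as np

A = np.array([[3, 1, 4],
              [1, 5, 9],
              [2, 6, 5]])
print(np.trace(A))          # 13 = 3 + 5 + 5
print(A.diagonal().sum())   # same sum, taken explicitly from the main diagonal
```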
In linear algebra, an orthogonal matrix, or orthonormal matrix, is a real square matrix whose columns and rows are orthonormal vectors.
In mathematics, the general linear group of degree n is the set of n×n invertible matrices, together with the operation of ordinary matrix multiplication. This forms a group, because the product of two invertible matrices is again invertible, and the inverse of an invertible matrix is invertible, with the identity matrix as the identity element of the group. The group is so named because the columns of an invertible matrix are linearly independent, hence the vectors/points they define are in general linear position, and matrices in the general linear group take points in general linear position to points in general linear position.
In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order n. Any two square matrices of the same order can be added and multiplied.
In mathematics, the orthogonal group in dimension n, denoted O(n), is the group of distance-preserving transformations of a Euclidean space of dimension n that preserve a fixed point, where the group operation is given by composing transformations. The orthogonal group is sometimes called the general orthogonal group, by analogy with the general linear group. Equivalently, it is the group of n × n orthogonal matrices, where the group operation is given by matrix multiplication (an orthogonal matrix is a real matrix whose inverse equals its transpose). The orthogonal group is an algebraic group and a Lie group. It is compact.
In mathematics, a Hermitian matrix is a complex square matrix that is equal to its own conjugate transpose—that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j: aij = (aji)*, with the star denoting complex conjugation; in matrix form, A = A^H, where A^H is the conjugate transpose of A.
In linear algebra, an invertible matrix is a square matrix which has an inverse. In other words, if some other matrix is multiplied by the invertible matrix, the result can be multiplied by an inverse to undo the operation. Invertible matrices are the same size as their inverse.
In linear algebra, a square matrix A is called diagonalizable or non-defective if it is similar to a diagonal matrix. That is, if there exists an invertible matrix P and a diagonal matrix D such that P⁻¹AP = D. This is equivalent to A = PDP⁻¹. This property exists for any linear map: for a finite-dimensional vector space V, a linear map T : V → V is called diagonalizable if there exists an ordered basis of V consisting of eigenvectors of T. These definitions are equivalent: if T has a matrix representation A = PDP⁻¹ as above, then the column vectors of P form a basis consisting of eigenvectors of T, and the diagonal entries of D are the corresponding eigenvalues of T; with respect to this eigenvector basis, T is represented by D.
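A short NumPy sketch of this factorization (np.linalg.eig returns the eigenvalues together with a matrix whose columns are eigenvectors):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])
eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)            # diagonal entries are the eigenvalues
assert np.allclose(A, P @ D @ np.linalg.inv(P))  # A = P D P^-1
```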
In mathematics, the indefinite orthogonal group, O(p, q) is the Lie group of all linear transformations of an n-dimensional real vector space that leave invariant a nondegenerate, symmetric bilinear form of signature (p, q), where n = p + q. It is also called the pseudo-orthogonal group or generalized orthogonal group. The dimension of the group is n(n − 1)/2.
In mathematics, specifically linear algebra, the Cauchy–Binet formula, named after Augustin-Louis Cauchy and Jacques Philippe Marie Binet, is an identity for the determinant of the product of two rectangular matrices of transpose shapes. It generalizes the statement that the determinant of a product of square matrices is equal to the product of their determinants. The formula is valid for matrices with the entries from any commutative ring.
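The formula can be checked numerically for a 2 × 4 matrix A and a 4 × 2 matrix B by summing over all 2-element column subsets; a sketch using NumPy and itertools:

```python
import numpy as np
from itertools import combinations

m, n = 2, 4
rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(m, n)).astype(float)  # m x n
B = rng.integers(-3, 4, size=(n, m)).astype(float)  # n x m

# Cauchy-Binet: det(AB) = sum over m-element subsets S of {0,...,n-1}
# of det(A[:, S]) * det(B[S, :]).
rhs = sum(np.linalg.det(A[:, S]) * np.linalg.det(B[S, :])
          for S in map(list, combinations(range(n), m)))
assert np.isclose(np.linalg.det(A @ B), rhs)
```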
In mathematics, a triangular matrix is a special kind of square matrix. A square matrix is called lower triangular if all the entries above the main diagonal are zero. Similarly, a square matrix is called upper triangular if all the entries below the main diagonal are zero.
In linear algebra, a tridiagonal matrix is a band matrix that has nonzero elements only on the main diagonal, the subdiagonal/lower diagonal, and the supradiagonal/upper diagonal. For example, the following matrix is tridiagonal:

[ 1  4  0  0 ]
[ 3  4  1  0 ]
[ 0  2  3  4 ]
[ 0  0  1  3 ].
In mathematics, the determinant of an m-by-m skew-symmetric matrix can always be written as the square of a polynomial in the matrix entries, a polynomial with integer coefficients that only depends on m. When m is odd, the polynomial is zero, and when m is even, it is a nonzero polynomial of degree m/2, and is unique up to multiplication by ±1. The convention on skew-symmetric tridiagonal matrices, given below in the examples, then determines one specific polynomial, called the Pfaffian polynomial. The value of this polynomial, when applied to the entries of a skew-symmetric matrix, is called the Pfaffian of that matrix. The term Pfaffian was introduced by Cayley, who indirectly named it after Johann Friedrich Pfaff.
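For a 4 × 4 skew-symmetric matrix the Pfaffian is pf(A) = a12a34 − a13a24 + a14a23, and pf(A)² = det(A); a quick numerical check:

```python
import numpy as np

a12, a13, a14, a23, a24, a34 = 1., 2., 3., 4., 5., 6.
A = np.array([[0,    a12,  a13,  a14],
              [-a12, 0,    a23,  a24],
              [-a13, -a23, 0,    a34],
              [-a14, -a24, -a34, 0]])
pf = a12 * a34 - a13 * a24 + a14 * a23  # Pfaffian of a 4x4 skew-symmetric matrix
assert np.isclose(pf**2, np.linalg.det(A))  # pf(A)^2 = det(A)
```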
Sylvester's law of inertia is a theorem in matrix algebra about certain properties of the coefficient matrix of a real quadratic form that remain invariant under a change of basis. Namely, if A is a symmetric matrix, then for any invertible matrix S, the number of positive, negative and zero eigenvalues of D = SASᵀ is constant. This result is particularly useful when D is diagonal, as the inertia of a diagonal matrix can easily be obtained by looking at the sign of its diagonal elements.
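A numerical illustration (a sketch: eigvalsh assumes a symmetric input, and a random matrix S is invertible with probability 1):

```python
import numpy as np

def inertia(M, tol=1e-9):
    """(positive, negative, zero) eigenvalue counts of a symmetric matrix M."""
    w = np.linalg.eigvalsh(M)
    return int((w > tol).sum()), int((w < -tol).sum()), int((abs(w) <= tol).sum())

A = np.diag([2., 1., -3., 0.])   # inertia (2, 1, 1), read off the diagonal signs
rng = np.random.default_rng(2)
S = rng.normal(size=(4, 4))      # invertible with probability 1
print(inertia(A))                # (2, 1, 1)
print(inertia(S @ A @ S.T))      # same counts, as the law predicts
```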
In mathematics, the Bruhat decomposition G = BWB of certain algebraic groups G into cells can be regarded as a general expression of the principle of Gauss–Jordan elimination, which generically writes a matrix as a product of an upper triangular matrix and a lower triangular matrix, but with exceptional cases. It is related to the Schubert cell decomposition of flag varieties: see Weyl group for this.
In linear algebra, eigendecomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal or real symmetric matrix, the decomposition is called "spectral decomposition", derived from the spectral theorem.
In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions, with elements or entries arranged in rows and columns, which is used to represent a mathematical object or property of such an object.