Generalized permutation matrix

In mathematics, a generalized permutation matrix (or monomial matrix) is a matrix with the same nonzero pattern as a permutation matrix, i.e. there is exactly one nonzero entry in each row and each column. Unlike a permutation matrix, where the nonzero entry must be 1, in a generalized permutation matrix the nonzero entry can be any nonzero value. An example of a generalized permutation matrix is

$$\begin{bmatrix} 0 & 0 & 3 & 0 \\ 0 & -7 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & \sqrt{2} \end{bmatrix}.$$
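To make the defining pattern concrete, here is a minimal sketch in Python with NumPy (the function name is illustrative, not a library routine) that tests whether a square matrix has exactly one nonzero entry in each row and column:

```python
import numpy as np

def is_generalized_permutation(A) -> bool:
    """Return True if A is square with exactly one nonzero entry
    in each row and each column."""
    A = np.asarray(A)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False
    nonzero = A != 0
    return bool(np.all(nonzero.sum(axis=0) == 1) and
                np.all(nonzero.sum(axis=1) == 1))

example = np.array([
    [0, 0, 3, 0],
    [0, -7, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 2**0.5],
])
print(is_generalized_permutation(example))          # True
print(is_generalized_permutation(np.eye(3)))        # True (permutation matrices qualify)
print(is_generalized_permutation(np.ones((2, 2))))  # False
```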

Structure

An invertible matrix A is a generalized permutation matrix if and only if it can be written as a product of an invertible diagonal matrix D and an (implicitly invertible) permutation matrix P: that is,

$$A = DP.$$
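As an illustration of this factorization (a minimal sketch under the assumptions above; the function name is hypothetical), the following recovers D and P from a generalized permutation matrix:

```python
import numpy as np

def diagonal_permutation_factors(A):
    """Factor a generalized permutation matrix as A = D @ P,
    with D an invertible diagonal matrix and P a permutation matrix."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    P = np.zeros_like(A)
    d = np.zeros(n)
    for i in range(n):
        j = np.flatnonzero(A[i])  # the single nonzero column in row i
        assert j.size == 1, "not a generalized permutation matrix"
        P[i, j[0]] = 1.0
        d[i] = A[i, j[0]]
    return np.diag(d), P

A = np.array([[0.0, 0.0, 3.0],
              [-7.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])
D, P = diagonal_permutation_factors(A)
print(np.allclose(A, D @ P))  # True
```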

Group structure

The set of n×n generalized permutation matrices with entries in a field F forms a subgroup of the general linear group GL(n, F), in which the group of nonsingular diagonal matrices Δ(n, F) forms a normal subgroup. Indeed, the generalized permutation matrices are the normalizer of the diagonal matrices, meaning that the generalized permutation matrices are the largest subgroup of GL(n, F) in which diagonal matrices are normal.

The abstract group of generalized permutation matrices is the wreath product of F× and Sn. Concretely, this means that it is the semidirect product of Δ(n, F) by the symmetric group Sn:

Sn ⋉ Δ(n, F),

where Sn acts by permuting coordinates and the diagonal matrices Δ(n, F) are isomorphic to the n-fold product (F×)n.

To be precise, the generalized permutation matrices are a (faithful) linear representation of this abstract wreath product: a realization of the abstract group as a subgroup of matrices.
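The semidirect-product structure can be checked numerically: conjugating a diagonal matrix by a permutation matrix merely permutes its diagonal entries, so products of matrices of the form DP stay in that form. A short illustrative NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def perm_matrix(perm):
    """Permutation matrix P with P[i, perm[i]] = 1."""
    P = np.zeros((n, n))
    P[np.arange(n), perm] = 1.0
    return P

# Two random group elements written in the form D @ P.
D1, D2 = np.diag(rng.uniform(1, 5, n)), np.diag(rng.uniform(1, 5, n))
P1, P2 = perm_matrix(rng.permutation(n)), perm_matrix(rng.permutation(n))

# Conjugating D2 by P1 is again diagonal (with permuted entries) --
# exactly the semidirect-product multiplication rule.
conj = P1 @ D2 @ P1.T  # P1 is orthogonal, so P1.T = P1^{-1}
print(np.allclose(conj, np.diag(np.diag(conj))))                    # True: still diagonal
print(np.allclose((D1 @ P1) @ (D2 @ P2), (D1 @ conj) @ (P1 @ P2)))  # True
```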

Subgroups

Notable subgroups include the following:

- The subgroup of matrices whose nonzero entries all equal 1 is the group of permutation matrices, isomorphic to the symmetric group Sn.
- The subgroup of matrices whose nonzero entries are ±1 is the group of signed permutation matrices, discussed below.
- The subgroup of invertible diagonal matrices Δ(n, F) is abelian and normal, as noted above.

Properties

The determinant of a generalized permutation matrix A = DP is the product of the determinants of its factors:

$$\det(A) = \det(D)\det(P) = \operatorname{sgn}(\pi)\, d_{11} d_{22} \cdots d_{nn},$$

where sgn(π) is the sign of the permutation π associated with P and d11, ..., dnn are the diagonal elements of D.
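A quick numerical check of this formula (an illustrative sketch):

```python
import numpy as np

A = np.array([[0.0, 0.0, 3.0],
              [-7.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])
# Nonzero entries: 3, -7, 2. The underlying permutation is the
# 3-cycle (0 2 1), which is even, so sgn(pi) = +1.
print(np.linalg.det(A))  # -42.0 (up to floating-point rounding)
print(1 * 3 * (-7) * 2)  # -42, matching sgn(pi) times the product of entries
```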

Generalizations

One can generalize further by allowing the entries to lie in a ring rather than in a field. In that case, if the nonzero entries are required to be units in the ring, one again obtains a group. On the other hand, if the nonzero entries are only required to be nonzero, but not necessarily invertible, this set of matrices forms a semigroup instead.
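For instance, over the ring of integers the units are ±1, so a monomial matrix containing a non-unit entry such as 2 is invertible over the rationals but not over the integers. A small sketch illustrating this:

```python
from fractions import Fraction

# Over Z, the units are +1 and -1. This monomial matrix has a
# non-unit entry (2), so its inverse leaves Z: such matrices form
# only a semigroup under multiplication, not a group.
A = [[0, 2], [1, 0]]
# A maps (x, y) to (2y, x), so by hand A^{-1} = [[0, 1], [1/2, 0]].
A_inv = [[Fraction(0), Fraction(1)], [Fraction(1, 2), Fraction(0)]]
print(A_inv)  # the 1/2 entry shows the inverse is not an integer matrix
```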

One may also schematically allow the nonzero entries to lie in a group G, with the understanding that matrix multiplication will only involve multiplying a single pair of group elements, not "adding" group elements. This is an abuse of notation, since the entries of matrices being multiplied must support both multiplication and addition, but it is a suggestive notation for the (formally correct) abstract group: the wreath product of the group G by the symmetric group.

Signed permutation group

A signed permutation matrix is a generalized permutation matrix whose nonzero entries are ±1. The signed permutation matrices are exactly the integer generalized permutation matrices with integer inverse.
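One direction of this characterization is easy to see numerically: a signed permutation matrix is orthogonal, so its inverse is its transpose, which is again an integer matrix. A brief sketch:

```python
import numpy as np

S = np.array([[0, -1, 0],
              [1, 0, 0],
              [0, 0, -1]])
# Signed permutation matrices are orthogonal, so the inverse is the
# transpose -- in particular, it is again an integer matrix.
print(np.allclose(np.linalg.inv(S), S.T))  # True
```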

Properties

The group of n×n signed permutation matrices is the hyperoctahedral group: it is the symmetry group of the n-dimensional hypercube, and it has order 2^n · n!.

Applications

Monomial representations

Monomial matrices occur in representation theory in the context of monomial representations. A monomial representation of a group G is a linear representation ρ : G → GL(n, F) of G (here F is the defining field of the representation) such that the image ρ(G) is a subgroup of the group of monomial matrices.
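As a small illustrative example (not taken from the source): the cyclic group of order 4 has a monomial representation sending a generator to a signed permutation matrix, namely a 90-degree rotation.

```python
import numpy as np

# rho sends a generator g of the cyclic group C4 to a signed
# permutation matrix; rho(g)^4 = I, so this defines a representation,
# and every power of rho(g) is again a monomial matrix.
g = np.array([[0, -1],
              [1, 0]])
for k in range(1, 5):
    print(k, np.linalg.matrix_power(g, k).tolist())
# k = 4 gives the identity matrix [[1, 0], [0, 1]]
```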
