Polynomial matrix

In mathematics, a polynomial matrix or matrix of polynomials is a matrix whose elements are univariate or multivariate polynomials. Equivalently, a polynomial matrix is a polynomial whose coefficients are matrices.

A univariate polynomial matrix P of degree p is defined as:

    P = sum_{n=0}^{p} A(n) x^n = A(0) + A(1)x + A(2)x^2 + ... + A(p)x^p

where A(n) denotes a matrix of constant coefficients and A(p) is non-zero. An example 3×3 polynomial matrix, degree 2:

    P = ( 1       x^2      x )     ( 1  0  0 )   ( 0  0  1 )     ( 0  1  0 )
        ( 0       2x       2 )  =  ( 0  0  2 ) + ( 0  2  0 ) x + ( 0  0  0 ) x^2
        ( 3x+2    x^2-1    0 )     ( 2 -1  0 )   ( 3  0  0 )     ( 0  1  0 )

We can express this by saying that for a ring R, the rings M_n(R[X]) (the n × n matrices over the polynomial ring R[X]) and M_n(R)[X] (the polynomials with n × n matrix coefficients) are isomorphic.
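
To make the correspondence concrete, here is a small SymPy sketch (illustrative, not part of the original article) that builds the example matrix above both as a matrix of polynomials and as a polynomial with matrix coefficients, and checks that the two agree:

    import sympy as sp

    x = sp.symbols('x')
    # The example polynomial matrix, written directly as a matrix of polynomials.
    P_direct = sp.Matrix([[1, x**2, x],
                          [0, 2*x,  2],
                          [3*x + 2, x**2 - 1, 0]])
    # The same matrix written as a polynomial with matrix coefficients:
    # P = A0 + A1*x + A2*x**2.
    A0 = sp.Matrix([[1, 0, 0], [0, 0, 2], [2, -1, 0]])
    A1 = sp.Matrix([[0, 0, 1], [0, 2, 0], [3, 0, 0]])
    A2 = sp.Matrix([[0, 1, 0], [0, 0, 0], [0, 1, 0]])
    P_coeffs = A0 + A1*x + A2*x**2
    assert P_direct == P_coeffs   # the two representations coincide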

Properties

Note that polynomial matrices are not to be confused with monomial matrices, which are simply matrices with exactly one non-zero entry in each row and column.

If λ denotes any element of the field over which the matrix is constructed, I the identity matrix, and A a square matrix over that field, then the matrix λI − A is the characteristic matrix of A; it is itself a polynomial matrix in λ. Its determinant, |λI − A|, is the characteristic polynomial of the matrix A.
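
As an illustrative sketch (SymPy, with an arbitrarily chosen 2×2 matrix), the characteristic polynomial can be computed directly from the characteristic matrix:

    import sympy as sp

    lam = sp.symbols('lambda')
    A = sp.Matrix([[2, 1], [0, 3]])
    # Characteristic matrix: lambda*I - A, a polynomial matrix in lambda.
    char_matrix = lam * sp.eye(2) - A
    print(char_matrix.det().expand())     # lambda**2 - 5*lambda + 6
    print(A.charpoly(lam).as_expr())      # same result via SymPy's built-in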

Related Research Articles

In mathematics, the determinant is a scalar value that is a function of the entries of a square matrix. It characterizes some properties of the matrix and the linear map represented by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the linear map represented by the matrix is an isomorphism. The determinant of a product of matrices is the product of their determinants (the preceding property is a corollary of this one). The determinant of a matrix A is denoted det(A), det A, or |A|.
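
A minimal NumPy sketch (matrices chosen arbitrarily for illustration) verifying the multiplicative property and the link to invertibility:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.array([[0.0, 1.0], [1.0, 1.0]])
    # Multiplicativity: det(AB) = det(A) det(B).
    assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
    # A is invertible precisely because its determinant is nonzero.
    print(np.linalg.det(A))   # -2.0 (up to floating-point rounding)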

In mathematics, a symmetric matrix M with real entries is positive-definite if the real number x^T M x is positive for every nonzero real column vector x, where x^T is the transpose of x. More generally, a Hermitian matrix M is positive-definite if the real number z^* M z is positive for every nonzero complex column vector z, where z^* denotes the conjugate transpose of z.
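
For illustration, a small NumPy check on an arbitrarily chosen symmetric matrix, using the standard fact that a symmetric matrix is positive-definite exactly when all its eigenvalues are positive:

    import numpy as np

    M = np.array([[2.0, -1.0], [-1.0, 2.0]])   # symmetric
    print(np.linalg.eigvalsh(M))   # [1. 3.] -> all positive, so positive-definite
    x = np.array([1.0, -2.0])
    print(x @ M @ x)               # 14.0 > 0, as the definition requires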

In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order n. Any two square matrices of the same order can be added and multiplied.

In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A*: A*A = AA*.
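
A quick NumPy sanity check on an example normal matrix (a rotation matrix, which is orthogonal and hence normal):

    import numpy as np

    theta = 0.3
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Normality: A*A = AA*.
    assert np.allclose(A.conj().T @ A, A @ A.conj().T)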

In linear algebra, the Cayley–Hamilton theorem states that every square matrix over a commutative ring satisfies its own characteristic equation.
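
An illustrative SymPy verification for one arbitrarily chosen 2×2 matrix:

    import sympy as sp

    A = sp.Matrix([[1, 2], [3, 4]])
    lam = sp.symbols('lambda')
    print(A.charpoly(lam).as_expr())      # lambda**2 - 5*lambda - 2
    # Cayley-Hamilton: substituting A into its own characteristic polynomial
    # yields the zero matrix.
    assert A**2 - 5*A - 2*sp.eye(2) == sp.zeros(2, 2)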

In mathematics, the special unitary group of degree n, denoted SU(n), is the Lie group of n × n unitary matrices with determinant 1.
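
A small NumPy sketch (the particular entries are an arbitrary example) checking membership in SU(2):

    import numpy as np

    # A sample element of SU(2), parametrized by a, b with |a|^2 + |b|^2 = 1.
    a, b = 0.6 + 0.0j, 0.8j
    U = np.array([[a, -np.conj(b)], [b, np.conj(a)]])
    assert np.allclose(U.conj().T @ U, np.eye(2))   # unitary
    assert np.isclose(np.linalg.det(U), 1.0)        # determinant 1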

In linear algebra, an n-by-n square matrix A is called invertible if there exists an n-by-n square matrix B such that AB = BA = I_n, where I_n denotes the n-by-n identity matrix.
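
For illustration, a NumPy sketch with an arbitrary invertible matrix:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.linalg.inv(A)
    # B is the (unique) matrix with AB = BA = I.
    assert np.allclose(A @ B, np.eye(2))
    assert np.allclose(B @ A, np.eye(2))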

In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis. The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero.
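
A SymPy sketch of the 2×2 case, showing the trace and determinant appearing among the coefficients:

    import sympy as sp

    a, b, c, d, lam = sp.symbols('a b c d lambda')
    A = sp.Matrix([[a, b], [c, d]])
    p = (lam*sp.eye(2) - A).det().expand()
    # p == lambda**2 - (a + d)*lambda + (a*d - b*c):
    # the trace and the determinant are among the coefficients.
    assert sp.expand(p - (lam**2 - A.trace()*lam + A.det())) == 0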

In mathematics, a quadratic form is a polynomial with terms all of degree two. For example, 4x^2 + 2xy - 3y^2 is a quadratic form in the variables x and y.
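
A small SymPy sketch (illustrative) expressing this quadratic form as v^T M v for a symmetric matrix M:

    import sympy as sp

    x, y = sp.symbols('x y')
    # The quadratic form 4x^2 + 2xy - 3y^2 written as v^T M v, M symmetric.
    M = sp.Matrix([[4, 1], [1, -3]])
    v = sp.Matrix([x, y])
    q = (v.T * M * v)[0]
    assert sp.expand(q) == 4*x**2 + 2*x*y - 3*y**2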

In mathematics, and in particular linear algebra, the Moore–Penrose inverse of a matrix is the most widely known generalization of the inverse matrix. It was independently described by E. H. Moore in 1920, Arne Bjerhammar in 1951, and Roger Penrose in 1955. Earlier, Erik Ivar Fredholm had introduced the concept of a pseudoinverse of integral operators in 1903. When referring to a matrix, the term pseudoinverse, without further specification, is often used to indicate the Moore–Penrose inverse. The term generalized inverse is sometimes used as a synonym for pseudoinverse.
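
An illustrative NumPy sketch using the built-in pseudoinverse on an arbitrary non-square matrix:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # not square
    A_pinv = np.linalg.pinv(A)
    # Two of the four Penrose conditions, as a sanity check:
    assert np.allclose(A @ A_pinv @ A, A)
    assert np.allclose(A_pinv @ A @ A_pinv, A_pinv)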

In mathematics, the polar decomposition of a square real or complex matrix A is a factorization of the form A = UP, where U is a unitary matrix and P is a positive semi-definite Hermitian matrix, both square and of the same size.
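
For illustration, a NumPy sketch computing a polar decomposition from the singular value decomposition (an arbitrary example matrix; this is one standard construction, not the only one):

    import numpy as np

    A = np.array([[2.0, 1.0], [0.0, 3.0]])
    # From the SVD A = W S Vh:
    # U = W Vh is unitary, P = Vh* S Vh is positive semi-definite Hermitian.
    W, S, Vh = np.linalg.svd(A)
    U = W @ Vh
    P = Vh.conj().T @ np.diag(S) @ Vh
    assert np.allclose(U @ P, A)
    assert np.allclose(U.conj().T @ U, np.eye(2))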

In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a constant factor when that linear transformation is applied to it. The corresponding eigenvalue, often represented by λ, is the multiplying factor.
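
A minimal NumPy illustration on an arbitrary matrix:

    import numpy as np

    A = np.array([[2.0, 0.0], [0.0, 3.0]])
    eigenvalues, eigenvectors = np.linalg.eig(A)
    # Each column v of `eigenvectors` satisfies A v = lambda v.
    for lam, v in zip(eigenvalues, eigenvectors.T):
        assert np.allclose(A @ v, lam * v)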

In mathematics, a Sylvester matrix is a matrix associated to two univariate polynomials with coefficients in a field or a commutative ring. The entries of the Sylvester matrix of two polynomials are coefficients of the polynomials. The determinant of the Sylvester matrix of two polynomials is their resultant, which is zero when the two polynomials have a common root or a non-constant common divisor.
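
A SymPy sketch (the two polynomials are an arbitrary example sharing the root x = 1) showing that the determinant of their Sylvester matrix vanishes:

    import sympy as sp

    x = sp.symbols('x')
    f = x**2 - 3*x + 2        # roots 1, 2
    g = x**2 + 2*x - 3        # roots 1, -3  (common root x = 1)
    # Sylvester matrix of f and g (two degree-2 polynomials -> 4x4):
    S = sp.Matrix([[1, -3,  2,  0],
                   [0,  1, -3,  2],
                   [1,  2, -3,  0],
                   [0,  1,  2, -3]])
    assert S.det() == 0                   # zero because of the common root
    assert sp.resultant(f, g, x) == 0     # matches SymPy's resultant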

In mathematics, the square root of a matrix extends the notion of square root from numbers to matrices. A matrix B is said to be a square root of A if the matrix product BB is equal to A.
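
An illustrative NumPy sketch constructing a square root of a symmetric positive-definite matrix via its eigendecomposition (one common construction among several):

    import numpy as np

    A = np.array([[4.0, 0.0], [0.0, 9.0]])
    # For symmetric positive-definite A = Q diag(w) Q^T, take
    # B = Q diag(sqrt(w)) Q^T, so that B B = A.
    w, Q = np.linalg.eigh(A)
    B = Q @ np.diag(np.sqrt(w)) @ Q.T
    assert np.allclose(B @ B, A)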

In mathematics, a shear matrix or transvection is an elementary matrix that represents the addition of a multiple of one row or column to another. Such a matrix may be derived by taking the identity matrix and replacing one of the zero elements with a non-zero value.
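
A minimal NumPy illustration:

    import numpy as np

    # A shear matrix: the identity with one zero entry replaced by 2.
    S = np.array([[1.0, 2.0], [0.0, 1.0]])
    A = np.array([[1.0, 1.0], [3.0, 4.0]])
    # Left-multiplying by S adds 2 times row 2 of A to row 1;
    # row 2 is unchanged.
    print(S @ A)   # row 1 becomes [7. 9.], row 2 stays [3. 4.]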

In linear algebra, the Hermite normal form is an analogue of reduced echelon form for matrices over the integers Z. Just as reduced echelon form can be used to solve problems about the solution to the linear system Ax = b where x is in R^n, the Hermite normal form can solve problems about the solution to the linear system Ax = b where this time x is restricted to have integer coordinates only. Other applications of the Hermite normal form include integer programming, cryptography, and abstract algebra.
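
An illustrative SymPy sketch (this assumes a SymPy release that provides hermite_normal_form in sympy.matrices.normalforms; the example matrix is arbitrary):

    import sympy as sp
    from sympy.matrices.normalforms import hermite_normal_form

    # Hermite normal form of an integer matrix, computed with integer
    # row operations only.
    A = sp.Matrix([[2, 4, 4], [-6, 6, 12], [10, 4, 16]])
    print(hermite_normal_form(A))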

In linear algebra, eigendecomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal or real symmetric matrix, the decomposition is called "spectral decomposition", derived from the spectral theorem.
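
A short NumPy sketch of the spectral decomposition of an arbitrary real symmetric matrix:

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])   # real symmetric
    w, Q = np.linalg.eigh(A)
    # Spectral decomposition: A = Q diag(w) Q^T with orthogonal Q.
    assert np.allclose(Q @ np.diag(w) @ Q.T, A)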

In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object.

In algebra, linear equations and systems of linear equations over a field are widely studied. "Over a field" means that the coefficients of the equations and the solutions that one is looking for belong to a given field, commonly the real or the complex numbers. This article is devoted to the same problems where "field" is replaced by a "commutative ring" or, typically, a "Noetherian integral domain".

In mathematics, particularly in linear algebra and applications, matrix analysis is the study of matrices and their algebraic properties. Some particular topics out of many include: operations defined on matrices, functions of matrices, and the eigenvalues of matrices.
