Central polynomial

In algebra, a central polynomial for n-by-n matrices is a polynomial in non-commuting variables that is non-constant but yields a scalar matrix whenever it is evaluated at n-by-n matrices. That such polynomials exist for every n was discovered in 1970, independently by Formanek and Razmyslov. The term "central" reflects the fact that the image of a central polynomial, evaluated in the matrix ring over any commutative ring, lies in the center of that ring. The notion has applications in the theory of polynomial identity rings.

Example: (xy − yx)^2 is a central polynomial for 2-by-2 matrices. Indeed, by the Cayley–Hamilton theorem, one has (xy − yx)^2 = −det(xy − yx) I for any 2-by-2 matrices x and y, since the commutator xy − yx has trace zero.
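
This identity is easy to check numerically. The following sketch (an illustration, not part of the original article; the random seed and sample count are arbitrary choices) uses numpy to verify it on a few random matrices:

```python
# For random 2-by-2 matrices x and y, (xy - yx)^2 should equal the scalar
# matrix -det(xy - yx) * I, by the Cayley-Hamilton argument above.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal((2, 2))
    y = rng.standard_normal((2, 2))
    z = x @ y - y @ x            # the commutator, which has trace zero
    lhs = z @ z                  # (xy - yx)^2
    rhs = -np.linalg.det(z) * np.eye(2)
    assert np.allclose(lhs, rhs)
print("(xy - yx)^2 evaluated to a scalar matrix for all sampled pairs")
```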

Related Research Articles

In mathematics, the determinant is a scalar value that is a function of the entries of a square matrix. It characterizes some properties of the matrix and the linear map represented by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the linear map represented by the matrix is an isomorphism. The determinant of a product of matrices is the product of their determinants (the preceding property is a corollary of this one). The determinant of a matrix A is denoted det(A), det A, or |A|.
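
As a quick illustration of the product rule for determinants (a sketch; the matrices and seed are arbitrary choices, not from the source):

```python
# Numerically check det(AB) = det(A) det(B) for random square matrices.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```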

<span class="mw-page-title-main">Lie algebra</span> Algebraic structure used in analysis

In mathematics, a Lie algebra is a vector space g together with an operation called the Lie bracket, an alternating bilinear map g × g → g, (x, y) ↦ [x, y], that satisfies the Jacobi identity. In other words, a Lie algebra is an algebra over a field where the multiplication operation is called the Lie bracket and has two additional properties: it is alternating and satisfies the Jacobi identity. The Lie bracket of two vectors x and y is denoted [x, y]. The Lie bracket need not be associative, so a Lie algebra can be non-associative. Given an associative algebra, a Lie bracket can be, and often is, defined through the commutator: setting [x, y] = xy − yx correctly defines a Lie bracket in addition to the already existing multiplication operation.
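
For a concrete instance of the commutator construction, the sketch below (ours, not from the source) checks the alternating property and the Jacobi identity on random square matrices:

```python
# The commutator bracket [x, y] = xy - yx on square matrices is alternating
# and satisfies the Jacobi identity, as described above.
import numpy as np

def bracket(x, y):
    return x @ y - y @ x

rng = np.random.default_rng(2)
x, y, z = (rng.standard_normal((3, 3)) for _ in range(3))
assert np.allclose(bracket(x, x), 0)  # alternating: [x, x] = 0
jacobi = (bracket(x, bracket(y, z)) + bracket(y, bracket(z, x))
          + bracket(z, bracket(x, y)))
assert np.allclose(jacobi, 0)         # Jacobi identity
```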

In mathematics, a polynomial is an expression consisting of indeterminates and coefficients that involves only the operations of addition, subtraction, multiplication, and positive-integer powers of variables. An example of a polynomial of a single indeterminate x is x^2 − 4x + 7. An example with three indeterminates is x^3 + 2xyz^2 − yz + 1.

In mathematics, the concept of an inverse element generalises the concepts of opposite and reciprocal of numbers.

<span class="mw-page-title-main">Ring (mathematics)</span> Algebraic structure with addition and multiplication

In mathematics, rings are algebraic structures that generalize fields: multiplication need not be commutative and multiplicative inverses need not exist. In other words, a ring is a set equipped with two binary operations satisfying properties analogous to those of addition and multiplication of integers. Ring elements may be numbers such as integers or complex numbers, but they may also be non-numerical objects such as polynomials, square matrices, functions, and power series.

<span class="mw-page-title-main">Square matrix</span> Matrix with the same number of rows and columns

In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order n. Any two square matrices of the same order can be added and multiplied.

<span class="mw-page-title-main">Cayley–Hamilton theorem</span> Every square matrix over a commutative ring satisfies its own characteristic equation

In linear algebra, the Cayley–Hamilton theorem states that every square matrix over a commutative ring satisfies its own characteristic equation.

In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis. The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero.
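
The following sketch verifies the Cayley–Hamilton theorem numerically; it relies on numpy's np.poly, which returns the coefficients of the characteristic polynomial of a square matrix, highest degree first (the seed and size are arbitrary choices):

```python
# Substituting the matrix A for the variable in its own characteristic
# polynomial should give the zero matrix.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
coeffs = np.poly(A)   # characteristic polynomial of A, highest degree first
n = A.shape[0]
result = sum(c * np.linalg.matrix_power(A, n - k) for k, c in enumerate(coeffs))
assert np.allclose(result, 0)
```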

Ring theory is the branch of mathematics in which rings are studied: that is, structures supporting both an addition and a multiplication operation. This is a glossary of some terms of the subject.

In algebra, a unit or invertible element of a ring is an invertible element for the multiplication of the ring. That is, an element u of a ring R is a unit if there exists v in R such that vu = uv = 1, where 1 is the multiplicative identity of R.

In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function. It is used to solve systems of linear differential equations. In the theory of Lie groups, the matrix exponential gives the exponential map between a matrix Lie algebra and the corresponding Lie group.
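
As a sketch of the differential-equation application (the generator M below is an arbitrary illustrative choice), scipy's expm computes x(t) = exp(tM) x(0):

```python
# Solve x'(t) = M x(t) via the matrix exponential: x(t) = expm(t M) x(0).
import numpy as np
from scipy.linalg import expm

M = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # infinitesimal generator of plane rotations
x0 = np.array([1.0, 0.0])
xt = expm((np.pi / 2) * M) @ x0    # a quarter turn: (1, 0) maps to (0, 1)
assert np.allclose(xt, [0.0, 1.0])
```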

In linear algebra, a rotation matrix is a transformation matrix that is used to perform a rotation in Euclidean space. For example, using the convention below, the matrix R = [[cos θ, −sin θ], [sin θ, cos θ]] rotates points in the xy-plane counterclockwise through an angle θ about the origin of a two-dimensional Cartesian coordinate system.
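
A minimal sketch of this convention (the function name rotation is ours, for illustration):

```python
# R(theta) rotates points in the xy-plane counterclockwise through theta.
import numpy as np

def rotation(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

p = np.array([1.0, 0.0])
assert np.allclose(rotation(np.pi / 2) @ p, [0.0, 1.0])  # quarter turn
```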

In abstract algebra, a Jordan algebra is a nonassociative algebra over a field whose multiplication satisfies the following axioms:

  1. xy = yx (commutative law)
  2. (xy)(xx) = x(y(xx)) (Jordan identity).
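
A sketch: the symmetrized product (xy + yx)/2 on square matrices is the standard example of a Jordan product, and the check below (our own illustration, with arbitrary random matrices) verifies both axioms numerically:

```python
# The special Jordan product on matrices: jordan(x, y) = (xy + yx) / 2.
import numpy as np

def jordan(x, y):
    return (x @ y + y @ x) / 2

rng = np.random.default_rng(4)
x, y = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
assert np.allclose(jordan(x, y), jordan(y, x))                          # axiom 1
xx = jordan(x, x)
assert np.allclose(jordan(jordan(x, y), xx), jordan(x, jordan(y, xx)))  # axiom 2
```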

In mathematics, a zero element is one of several generalizations of the number zero to other algebraic structures. These alternate meanings may or may not reduce to the same thing, depending on the context.

In mathematics, a matrix polynomial is a polynomial with square matrices as variables. Given an ordinary, scalar-valued polynomial P(x) = a_0 + a_1 x + ... + a_n x^n, this polynomial evaluated at a square matrix A is P(A) = a_0 I + a_1 A + ... + a_n A^n, where I is the identity matrix.
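
For instance, evaluating the polynomial x^2 − 4x + 7 from the earlier paragraph at a 2-by-2 matrix (the matrix A below is an arbitrary choice for illustration):

```python
# Evaluate P(x) = x^2 - 4x + 7 at a matrix: replace x^k by A^k and the
# constant term by 7 I.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P_of_A = np.linalg.matrix_power(A, 2) - 4 * A + 7 * np.eye(2)
print(P_of_A)
```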

<span class="mw-page-title-main">Matrix (mathematics)</span> Array of numbers

In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object.

In ring theory, a branch of mathematics, a ring R is a polynomial identity ring if there is, for some N > 0, an element P ≠ 0 of the free algebra Z⟨X1, X2, ..., XN⟩ over the ring of integers in N variables X1, X2, ..., XN such that P(r1, r2, ..., rN) = 0 for all N-tuples r1, r2, ..., rN taken from R.

In algebra, the Amitsur–Levitzki theorem states that the algebra of n × n matrices over a commutative ring satisfies a certain identity of degree 2n. It was proved by Amitsur and Levitzki (1950). In particular, matrix rings are polynomial identity rings such that the smallest identity they satisfy has degree exactly 2n.
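
A sketch of the n = 2 case: the standard polynomial of degree 4, S_4(x1, ..., x4) = Σ_σ sgn(σ) x_{σ(1)} x_{σ(2)} x_{σ(3)} x_{σ(4)}, should vanish on all 2-by-2 matrices. The helper sign below is our own inversion-count parity function, not part of any cited source:

```python
# Check that the degree-4 standard polynomial vanishes on 2x2 matrices.
from itertools import permutations
import numpy as np

def sign(perm):
    # parity of a permutation via its inversion count
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

rng = np.random.default_rng(5)
xs = [rng.standard_normal((2, 2)) for _ in range(4)]
S4 = sum(sign(p) * xs[p[0]] @ xs[p[1]] @ xs[p[2]] @ xs[p[3]]
         for p in permutations(range(4)))
assert np.allclose(S4, 0)
```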

In mathematics, Manin matrices, named after Yuri Manin who introduced them around 1987–88, are a class of matrices with elements in a not-necessarily commutative ring, which in a certain sense behave like matrices whose elements commute. In particular, there is a natural definition of the determinant for them, and most linear algebra theorems, such as Cramer's rule and the Cayley–Hamilton theorem, hold true for them. Any matrix with commuting elements is a Manin matrix. These matrices have applications in representation theory, in particular to Capelli's identity, Yangians, and quantum integrable systems.

In mathematics, the Hamiltonian cycle polynomial of an n×n-matrix A is a polynomial in its entries, defined as ham(A) = Σ_{σ∈H_n} Π_{i=1}^{n} a_{i,σ(i)}, where H_n is the set of n-permutations having exactly one cycle.
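
A brute-force sketch of this definition (exponential in n, for illustration only; the helper names is_single_cycle and ham are ours):

```python
# Sum, over all permutations consisting of a single n-cycle, the products
# a[0, sigma(0)] * ... * a[n-1, sigma(n-1)].
from itertools import permutations
import numpy as np

def is_single_cycle(perm):
    # sigma is one n-cycle iff the orbit of 0 has length n
    seen, i = set(), 0
    while i not in seen:
        seen.add(i)
        i = perm[i]
    return len(seen) == len(perm)

def ham(A):
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in permutations(range(n)) if is_single_cycle(p))

A = np.ones((3, 3))
print(ham(A))   # (3-1)! = 2 single 3-cycles, each contributing 1
```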
