In linear algebra and operator theory, a polynomial P is called an annihilating polynomial of a linear operator or matrix A if the polynomial, evaluated at the operator or matrix, gives zero, i.e., if P(A) = 0.
Note that the characteristic polynomial and the minimal polynomial of A are both annihilating polynomials. In fact, every annihilating polynomial of A is a multiple of the minimal polynomial of A.[1][2]
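As a concrete check of the claim that the characteristic polynomial annihilates A (the Cayley–Hamilton theorem), here is a small numerical sketch using NumPy; the matrix is an arbitrary example chosen for illustration.

```python
import numpy as np

# Example matrix with eigenvalues 2 and 3; its characteristic
# polynomial is x^2 - 5x + 6.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.poly on a square matrix returns the characteristic polynomial's
# coefficients, highest degree first: [1, -5, 6] here.
coeffs = np.poly(A)

# Evaluate the polynomial at the matrix: sum_i c_i * A^(n-i),
# with A^0 taken as the identity matrix.
P_of_A = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - i)
             for i, c in enumerate(coeffs))

print(np.allclose(P_of_A, np.zeros_like(A)))  # True: P(A) = 0
```

The same check works for any square matrix, since Cayley–Hamilton guarantees the characteristic polynomial is always annihilating.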
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis. The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero.
In mathematics, an element x of a ring is called nilpotent if there exists some positive integer n, called the index, such that x^n = 0.
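A standard example of a nilpotent element is a strictly upper triangular matrix; a quick sketch:

```python
import numpy as np

# A strictly upper triangular matrix is nilpotent.
N = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

N2 = np.linalg.matrix_power(N, 2)  # still nonzero
N3 = np.linalg.matrix_power(N, 3)  # the zero matrix

print(np.any(N2))         # True: N^2 != 0
print(np.allclose(N3, 0)) # True: N^3 = 0, so the index of N is 3
```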
In mathematics, a differential operator is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function.
In linear algebra, a Jordan normal form, also known as a Jordan canonical form (JCF), is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has every non-zero off-diagonal entry equal to 1, located immediately above the main diagonal, with identical diagonal entries to the left and below it.
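SymPy can compute the Jordan normal form directly via `Matrix.jordan_form()`; a small sketch with a matrix whose repeated eigenvalue 2 produces a 2×2 Jordan block:

```python
import sympy as sp

# Eigenvalue 2 with algebraic multiplicity 2 but only one eigenvector,
# plus a simple eigenvalue 3.
M = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])

P, J = M.jordan_form()  # M = P * J * P**-1, J is the Jordan matrix
print(J)
```

The returned J is upper triangular with the off-diagonal 1 sitting immediately above the repeated diagonal entry 2, exactly as described above.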
In linear algebra, two n-by-n matrices A and B are called similar if there exists an invertible n-by-n matrix P such that B = P⁻¹AP.
Invariant theory is a branch of abstract algebra dealing with actions of groups on algebraic varieties, such as vector spaces, from the point of view of their effect on functions. Classically, the theory dealt with the question of explicit description of polynomial functions that do not change, or are invariant, under the transformations from a given linear group. For example, if we consider the action of the special linear group SLn on the space of n by n matrices by left multiplication, then the determinant is an invariant of this action because the determinant of AX equals the determinant of X, when A is in SLn.
In mathematics, especially in the field of algebra, a polynomial ring or polynomial algebra is a ring formed from the set of polynomials in one or more indeterminates with coefficients in another ring, often a field.
In abstract algebra, a representation of an associative algebra is a module for that algebra. Here an associative algebra is a ring. If the algebra is not unital, it may be made so in a standard way; there is no essential difference between modules for the resulting unital ring, in which the identity acts by the identity mapping, and representations of the algebra.
In mathematics, a functional calculus is a theory allowing one to apply mathematical functions to mathematical operators. It is now a branch of the field of functional analysis, connected with spectral theory.
In linear algebra, the Frobenius normal form or rational canonical form of a square matrix A with entries in a field F is a canonical form for matrices obtained by conjugation by invertible matrices over F. The form reflects a minimal decomposition of the vector space into subspaces that are cyclic for A. Since only one normal form can be reached from a given matrix, a matrix B is similar to A if and only if it has the same rational canonical form as A. Since this form can be found without any operations that might change when extending the field F, notably without factoring polynomials, this shows that whether two matrices are similar does not change upon field extensions. The form is named after German mathematician Ferdinand Georg Frobenius.
In linear algebra, similarity invariance is a property exhibited by a function whose value is unchanged under similarities of its domain. That is, f is invariant under similarities if f(A) = f(B) whenever B = P⁻¹AP is a matrix similar to A. Examples of such functions include the trace, determinant, characteristic polynomial, and the minimal polynomial.
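The invariance of the trace, determinant, and eigenvalues (the roots of the characteristic polynomial) under similarity can be verified numerically; the matrices below are arbitrary examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # any invertible matrix works
B = np.linalg.inv(P) @ A @ P    # B is similar to A

print(np.isclose(np.trace(A), np.trace(B)))            # True
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))      # True
```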
In mathematics, a D-module is a module over a ring D of differential operators. The major interest of such D-modules is as an approach to the theory of linear partial differential equations. Since around 1970, D-module theory has been built up, mainly as a response to the ideas of Mikio Sato on algebraic analysis, and expanding on the work of Sato and Joseph Bernstein on the Bernstein–Sato polynomial.
In mathematics, and more specifically in analysis, a holonomic function is a smooth function of several variables that is a solution of a system of linear homogeneous differential equations with polynomial coefficients and satisfies a suitable dimension condition in terms of D-modules theory. More precisely, a holonomic function is an element of a holonomic module of smooth functions. Holonomic functions can also be described as differentiably finite functions, also known as D-finite functions. When a power series in the variables is the Taylor expansion of a holonomic function, the sequence of its coefficients, in one or several indices, is also called holonomic. Holonomic sequences are also called P-recursive sequences: they are defined recursively by multivariate recurrences satisfied by the whole sequence and by suitable specializations of it. The situation simplifies in the univariate case: any univariate sequence that satisfies a linear homogeneous recurrence relation with polynomial coefficients, or equivalently a linear homogeneous difference equation with polynomial coefficients, is holonomic.
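A classic univariate example of a P-recursive (holonomic) sequence is the factorial, which satisfies a linear recurrence with a polynomial coefficient; a quick check:

```python
import math

# The factorial sequence a(n) = n! satisfies the linear recurrence
# a(n+1) = (n+1) * a(n), whose coefficient (n+1) is a polynomial in n,
# so the sequence is P-recursive (holonomic).
a = [1]                       # a(0) = 0! = 1
for n in range(10):
    a.append((n + 1) * a[n])  # a(n+1) = (n+1) * a(n)

print(a[:6] == [math.factorial(k) for k in range(6)])  # True
```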
In mathematics, a linear operator T : V → V on a vector space V is semisimple if every T-invariant subspace has a complementary T-invariant subspace. If T is a semisimple linear operator on V, then V is a semisimple representation of T. Equivalently, a linear operator is semisimple if its minimal polynomial is a product of distinct irreducible polynomials.
In linear algebra, the minimal polynomial μA of an n × n matrix A over a field F is the monic polynomial P over F of least degree such that P(A) = 0. Any other polynomial Q with Q(A) = 0 is a (polynomial) multiple of μA.
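The minimal polynomial can be strictly smaller in degree than the characteristic polynomial; a sketch with a diagonal example where the characteristic polynomial is (x − 2)²(x − 3) but the minimal polynomial is only (x − 2)(x − 3):

```python
import numpy as np

A = np.diag([2.0, 2.0, 3.0])
I = np.eye(3)

# (A - 2I)(A - 3I) = 0, so (x - 2)(x - 3) annihilates A ...
print(np.allclose((A - 2 * I) @ (A - 3 * I), 0))  # True

# ... and no monic degree-1 polynomial can, since A - cI is never zero.
print(any(np.allclose(A - c * I, 0) for c in (2.0, 3.0)))  # False
```

So μA(x) = (x − 2)(x − 3) here, and every annihilating polynomial of A, including the characteristic polynomial, is a multiple of it.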
In mathematics, specifically linear algebra, the Jordan–Chevalley decomposition, named after Camille Jordan and Claude Chevalley, expresses a linear operator in a unique way as the sum of two other linear operators which are simpler to understand. Specifically, one part is potentially diagonalisable and the other is nilpotent. The two parts are polynomials in the operator, which makes them behave nicely in algebraic manipulations.
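For a single Jordan block the decomposition is easy to see: splitting off the diagonal gives a diagonalisable part and a nilpotent part that commute. (In general the two parts must be computed as polynomials in the operator; the split below is only valid because the matrix is already a Jordan block.)

```python
import numpy as np

# Jordan block with eigenvalue 5: A = D + N with D = 5I diagonalisable,
# N strictly upper triangular (hence nilpotent), and D, N commuting.
A = np.array([[5.0, 1.0],
              [0.0, 5.0]])
D = np.diag(np.diag(A))  # diagonalisable part (here 5I)
N = A - D                # nilpotent part

print(np.allclose(N @ N, 0))      # True: N^2 = 0
print(np.allclose(D @ N, N @ D))  # True: the two parts commute
```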
In mathematics, a matrix polynomial is a polynomial with square matrices as variables. Given an ordinary, scalar-valued polynomial P(x) = a_n x^n + ... + a_1 x + a_0, this polynomial evaluated at a square matrix A is P(A) = a_n A^n + ... + a_1 A + a_0 I, where I is the identity matrix.
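Evaluating a scalar polynomial at a matrix can be sketched with Horner's rule, replacing x⁰ by the identity matrix; the helper `matrix_poly` and the example matrix are illustrative choices, not from the article:

```python
import numpy as np

def matrix_poly(coeffs, A):
    """Evaluate sum_i coeffs[i] * A**i by Horner's rule.

    coeffs are given in ascending order: coeffs[i] multiplies A**i.
    """
    n = A.shape[0]
    result = np.zeros_like(A)
    for c in reversed(coeffs):
        result = result @ A + c * np.eye(n)
    return result

# p(x) = x^2 + 1 evaluated at the 90-degree rotation matrix A,
# which satisfies A^2 = -I, so p(A) = 0.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.allclose(matrix_poly([1.0, 0.0, 1.0], A), 0))  # True
```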
Julius Bogdan Borcea was a Romanian-Swedish mathematician. His scientific work included vertex operator algebras and the zero distribution of polynomials and entire functions, via correlation inequalities and statistical mechanics.