Formal derivative

In mathematics, the formal derivative is an operation on elements of a polynomial ring or a ring of formal power series that mimics the form of the derivative from calculus. Though they appear similar, the algebraic advantage of a formal derivative is that it does not rely on the notion of a limit, which is in general impossible to define for a ring. Many of the properties of the derivative are true of the formal derivative, but some, especially those that make numerical statements, are not.

Formal differentiation is used in algebra to test for multiple roots of a polynomial.

Definition

Fix a ring R (not necessarily commutative) and let A = R[x] be the ring of polynomials over R. (If R is not commutative, this is the free algebra over a single indeterminate variable.)

Then the formal derivative is an operation on elements of A, where if

    f = a_n x^n + a_{n-1} x^{n-1} + ... + a_1 x + a_0,

then its formal derivative is

    f' = D(f) = n a_n x^{n-1} + (n - 1) a_{n-1} x^{n-2} + ... + 2 a_2 x + a_1.

In the above definition, for any nonnegative integer i and any a in R, the product i·a is defined as usual in a ring: i·a = a + a + ... + a (i terms), with i·a = 0 if i = 0. [1]

This definition also works even if R does not have a multiplicative identity (that is, if R is a rng).
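The definition above translates directly into code. Here is a minimal Python sketch (the coefficient-list representation and function name are illustrative, not standard API), emphasizing that i·a_i is an i-fold ring sum rather than a limit:

```python
# Minimal sketch: polynomials are coefficient lists [a_0, a_1, ..., a_n].
# The product i * a stands for the i-fold ring sum a + a + ... + a,
# so no notion of limit is needed.

def formal_derivative(coeffs):
    """Coefficient list of f' given the coefficient list of f."""
    # The coefficient of x^(i-1) in f' is i * a_i, for i = 1, ..., n.
    return [i * a for i, a in enumerate(coeffs)][1:]

# f(x) = 5 + 3x + 2x^3  ->  f'(x) = 3 + 6x^2
print(formal_derivative([5, 3, 0, 2]))  # [3, 0, 6]
```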

Alternative axiomatic definition

One may also define the formal derivative axiomatically as the map D : R[x] → R[x] satisfying the following properties.

  1. D(a) = 0 for all a in R.
  2. The normalization axiom, D(x) = 1.
  3. The map D commutes with the addition operation in the polynomial ring: D(f + g) = D(f) + D(g).
  4. The map D satisfies Leibniz's law with respect to the polynomial ring's multiplication operation: D(f·g) = D(f)·g + f·D(g).

One may prove that this axiomatic definition yields a well-defined map respecting all of the usual ring axioms.

The formula above (i.e. the definition of the formal derivative when the coefficient ring is commutative) is a direct consequence of the aforementioned axioms: writing f = a_0 + a_1 x + ... + a_n x^n,

    D(f) = Σ_i D(a_i x^i) = Σ_i (D(a_i) x^i + a_i D(x^i)) = Σ_i i a_i x^{i-1},

since D(a_i) = 0 by axiom 1 and D(x^i) = i x^{i-1} follows from Leibniz's law by induction.

Properties

It can be verified that:

  1. Formal differentiation is linear: for any two polynomials f(x), g(x) in R[x] and elements r, s of R, we have

         (r·f(x) + s·g(x))' = r·f'(x) + s·g'(x).

     When R is not commutative there is another, different linearity property in which r and s appear on the right rather than the left.
  2. The formal derivative satisfies the Leibniz rule, or product rule:

         (f(x)·g(x))' = f'(x)·g(x) + f(x)·g'(x).

Note the order of the factors; when R is not commutative this is important.

These two properties make D a derivation on A (see module of relative differential forms for a discussion of a generalization).

Note that the formal derivative is not a ring homomorphism, because the product rule (f·g)' = f'·g + f·g' is different from saying (and it is not the case) that (f·g)' = f'·g'. However, it is a homomorphism (linear map) of R-modules, by the above rules.
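Both properties can be checked numerically. The following Python sketch (helper names are illustrative) verifies linearity and the Leibniz rule on small examples over the integers:

```python
# Numerical check of linearity and the Leibniz rule on integer
# coefficient lists (helper names are illustrative).

def deriv(f):
    return [i * a for i, a in enumerate(f)][1:]

def poly_mul(f, g):
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

def poly_add(f, g):
    n = max(len(f), len(g))
    return [(f[i] if i < len(f) else 0) + (g[i] if i < len(g) else 0)
            for i in range(n)]

def scale(r, f):
    return [r * a for a in f]

f = [1, 2, 3]   # f(x) = 1 + 2x + 3x^2
g = [4, 0, 5]   # g(x) = 4 + 5x^2

# Linearity: (2f + 3g)' == 2f' + 3g'
print(deriv(poly_add(scale(2, f), scale(3, g))) ==
      poly_add(scale(2, deriv(f)), scale(3, deriv(g))))  # True

# Leibniz rule: (fg)' == f'g + fg'
lhs = deriv(poly_mul(f, g))
rhs = poly_add(poly_mul(deriv(f), g), poly_mul(f, deriv(g)))
print(lhs == rhs)  # True
```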

Application to finding repeated factors

As in calculus, the derivative detects multiple roots. If R is a field then R[x] is a Euclidean domain, and in this situation we can define the multiplicity of roots; for every polynomial f(x) in R[x] and every element r of R, there exists a nonnegative integer m_r and a polynomial g(x) such that

    f(x) = (x - r)^{m_r} g(x),

where g(r) ≠ 0. The integer m_r is the multiplicity of r as a root of f. It follows from the Leibniz rule that in this situation, m_r is also the number of differentiations that must be performed on f(x) before r is no longer a root of the resulting polynomial. The utility of this observation is that although in general not every polynomial of degree n in R[x] has n roots counting multiplicity (this is the maximum, by the above theorem), we may pass to field extensions in which this is true (namely, algebraic closures). Once we do, we may uncover a multiple root that was not a root at all simply over R. For example, if R is the finite field with three elements, the polynomial

    f(x) = x^6 + 1

has no roots in R; however, its formal derivative (6x^5) is identically zero, since 6 = 0 in R (because 3 = 0) and in any extension of R, so when we pass to the algebraic closure it has a multiple root that could not have been detected by factorization in R itself. Thus, formal differentiation allows an effective notion of multiplicity. This is important in Galois theory, where the distinction is made between separable field extensions (defined by polynomials with no multiple roots) and inseparable ones.
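This phenomenon over the field with three elements can be checked by direct computation. The sketch below (illustrative helpers, coefficients reduced mod 3) uses f(x) = x^6 + 1, whose formal derivative 6x^5 vanishes mod 3, and confirms that f has no roots in F_3 yet factors as (x^2 + 1)^3 over F_3, so each of its roots in the algebraic closure is a triple root:

```python
# Sketch over F_3 (coefficients reduced mod 3; helper names illustrative).

P = 3

def deriv_mod(f, p=P):
    return [(i * a) % p for i, a in enumerate(f)][1:]

def mul_mod(f, g, p=P):
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] = (out[i + j] + a * b) % p
    return out

f = [1, 0, 0, 0, 0, 0, 1]   # f(x) = x^6 + 1

# f has no roots in F_3 = {0, 1, 2}: every value is nonzero ...
print([(pow(x, 6, P) + 1) % P for x in range(P)])   # [1, 2, 2]
# ... its formal derivative 6x^5 is identically zero, since 6 = 0 in F_3 ...
print(deriv_mod(f))                                 # [0, 0, 0, 0, 0, 0]
# ... and f = (x^2 + 1)^3 over F_3, so each root of f in the
# algebraic closure is a triple root.
h = [1, 0, 1]                                       # h(x) = x^2 + 1
print(mul_mod(mul_mod(h, h), h) == f)               # True
```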

Correspondence to analytic derivative

When the ring R of scalars is commutative, there is an alternative and equivalent definition of the formal derivative, which resembles the one seen in differential calculus. The element Y − X of the ring R[X, Y] divides Y^n − X^n for any nonnegative integer n, and therefore divides f(Y) − f(X) for any polynomial f in one indeterminate. If the quotient in R[X, Y] is denoted by g, then

    f(Y) − f(X) = g(X, Y)·(Y − X).

It is then not hard to verify that g(X,X) (in R[X]) coincides with the formal derivative of f as it was defined above.

This formulation of the derivative works equally well for a formal power series, as long as the ring of coefficients is commutative.
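The quotient g can be written down explicitly via the telescoping identity Y^n − X^n = (Y − X)(Y^{n-1} + X Y^{n-2} + ... + X^{n-1}). The Python sketch below (names illustrative) evaluates this exact quotient and checks that it agrees with the difference quotient off the diagonal and with the formal derivative on it:

```python
# Exact divided difference g(X, Y) = (f(Y) - f(X)) / (Y - X), computed via the
# telescoping identity Y^n - X^n = (Y - X)(Y^(n-1) + X*Y^(n-2) + ... + X^(n-1)).
# Names are illustrative; coefficients are a_0, ..., a_n as before.

def f_eval(coeffs, x):
    return sum(a * x**i for i, a in enumerate(coeffs))

def g_eval(coeffs, x, y):
    """Evaluate the polynomial quotient g(x, y); well-defined even when y == x."""
    return sum(a * sum(x**k * y**(n - 1 - k) for k in range(n))
               for n, a in enumerate(coeffs) if n >= 1)

coeffs = [7, -2, 0, 4]   # f(x) = 7 - 2x + 4x^3, so f'(x) = -2 + 12x^2

# Off the diagonal, g is the ordinary difference quotient:
x, y = 2, 5
print(g_eval(coeffs, x, y) * (y - x) == f_eval(coeffs, y) - f_eval(coeffs, x))  # True
# On the diagonal, g(x, x) recovers the formal derivative: f'(3) = -2 + 12*9
print(g_eval(coeffs, 3, 3))  # 106
```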

Actually, if the division in this definition is carried out in the class of functions of Y continuous at X, it will recapture the classical definition of the derivative. If it is carried out in the class of functions continuous in both X and Y, we get uniform differentiability, and the function f will be continuously differentiable. Likewise, by choosing different classes of functions (say, the Lipschitz class), we get different flavors of differentiability. In this way, differentiation becomes part of the algebra of functions.


References

  1. John B. Fraleigh; Victor J. Katz (2002). A First Course in Abstract Algebra. Pearson. p. 443.
