
In mathematics, **Descartes' rule of signs**, first described by René Descartes in his work *La Géométrie*, is a technique for obtaining information on the number of positive real roots of a polynomial. It asserts that the number of positive roots is at most the number of sign changes in the sequence of the polynomial's coefficients (omitting the zero coefficients), and that the difference between these two numbers is always even. This implies in particular that, if this difference is zero or one, then there is exactly zero or one positive root, respectively.



By a homographic transformation of the variable, one may use Descartes' rule of signs to obtain similar information on the number of roots in any interval. This is the basic idea of Budan's theorem and the Budan–Fourier theorem. By repeatedly dividing an interval in two, one eventually obtains a list of disjoint intervals that together contain all the real roots of the polynomial, each containing exactly one real root. Descartes' rule of signs and homographic transformations of the variable are, nowadays, the basis of the fastest algorithms for computing the real roots of polynomials (see Real-root isolation).
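A minimal sketch of this interval-counting idea in Python (illustrative code, not part of the original article; the function names are ad hoc). The change of variable *x* = (*a* + *bt*)/(1 + *t*) maps roots of *p* in (*a*, *b*) to positive roots of a transformed polynomial, to which Descartes' rule then applies:

```python
def polymul(p, q):
    """Multiply two polynomials given as coefficient lists (constant term first)."""
    r = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

def mobius_transform(coeffs, a, b):
    """Coefficients of (1+t)^n * p((a + b*t)/(1 + t)).  The roots of p in
    (a, b) correspond one-to-one to the roots of this polynomial in (0, oo)."""
    n = len(coeffs) - 1
    out = [0] * (n + 1)
    for i, c in enumerate(coeffs):
        term = [c]
        for _ in range(i):
            term = polymul(term, [a, b])   # factor (a + b*t)
        for _ in range(n - i):
            term = polymul(term, [1, 1])   # factor (1 + t)
        for k, v in enumerate(term):
            out[k] += v
    return out

def sign_changes(coeffs):
    signs = [c for c in coeffs if c != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s * t < 0)

# p(x) = x^2 - 3x + 2 = (x - 1)(x - 2), constant term first:
p = [2, -3, 1]
print(sign_changes(mobius_transform(p, 0.5, 1.5)))  # 1 -> exactly one root in (0.5, 1.5)
print(sign_changes(mobius_transform(p, 3.0, 4.0)))  # 0 -> no root in (3, 4)
```

A sign-change count of 0 or 1 decides the interval outright; larger counts trigger further bisection in the real-root-isolation algorithms mentioned above.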


Descartes himself used the transformation *x* → −*x* to apply his rule to obtain information on the number of negative roots.

The rule states that if the terms of a single-variable polynomial with real coefficients are ordered by descending variable exponent, then the number of positive roots of the polynomial is either equal to the number of sign differences between consecutive nonzero coefficients, or is less than it by an even number. Multiple roots of the same value are counted separately.
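The counting procedure described by the rule is easy to mechanize. A minimal illustrative sketch (not part of the original article):

```python
def sign_changes(coefficients):
    """Number of sign changes between consecutive nonzero coefficients."""
    signs = [c for c in coefficients if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

def possible_positive_roots(coefficients):
    """Descartes' rule: the number of positive roots (with multiplicity) is
    the number of sign changes, or less than it by an even number."""
    v = sign_changes(coefficients)
    return list(range(v, -1, -2))

# x^3 + x^2 - x - 1, coefficients in descending order of exponent:
print(possible_positive_roots([1, 1, -1, -1]))  # [1] -> exactly one positive root
```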


As a corollary of the rule, the number of negative roots is the number of sign changes after multiplying the coefficients of odd-power terms by −1, or fewer than it by an even number. This procedure is equivalent to substituting the negation of the variable for the variable itself. For example, to find the number of negative roots of *f*(*x*) = *ax*^{3} + *bx*^{2} + *cx* + *d*, we equivalently ask how many positive roots there are for −*x* in

*g*(*x*) = *f*(−*x*) = −*ax*^{3} + *bx*^{2} − *cx* + *d*.


Using Descartes' rule of signs on *g*(*x*) gives the number of positive roots of *g*, and since *g*(*x*) = *f*(−*x*), it gives the number of negative roots of *f*.
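Negating the odd-degree coefficients can likewise be sketched (illustrative helper with an ad hoc name):

```python
def negate_variable(coefficients):
    """Coefficients of f(-x), with `coefficients` listed by descending
    exponent: flip the sign of every odd-degree term."""
    n = len(coefficients) - 1  # degree of f
    return [c if (n - i) % 2 == 0 else -c for i, c in enumerate(coefficients)]

# f(x) = x^3 + x^2 - x - 1  ->  f(-x) = -x^3 + x^2 + x - 1
print(negate_variable([1, 1, -1, -1]))  # [-1, 1, 1, -1]
```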

The polynomial

*f*(*x*) = *x*^{3} + *x*^{2} − *x* − 1

has one sign change between the second and third terms (the sequence of pairs of successive signs is + → +, + → −, − → −). Therefore it has exactly one positive root. Note that the sign of the leading coefficient needs to be considered. To find the number of negative roots, change the signs of the coefficients of the terms with odd exponents, i.e., apply Descartes' rule of signs to the polynomial *f*(−*x*), to obtain a second polynomial

*g*(*x*) = −*x*^{3} + *x*^{2} + *x* − 1.

This polynomial has two sign changes (the sequence of pairs of successive signs is − → +, + → +, + → − ), meaning that this second polynomial has two or zero positive roots; thus the original polynomial has two or zero negative roots.

In fact, the factorization of the first polynomial is

*f*(*x*) = (*x* + 1)^{2}(*x* − 1),

so the roots are −1 (twice) and +1 (once).

The factorization of the second polynomial is

*g*(*x*) = −(*x* − 1)^{2}(*x* + 1),

so here the roots are +1 (twice) and −1 (once), the negation of the roots of the original polynomial.
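The two sign counts for this example can be checked directly (illustrative sketch):

```python
def sign_changes(coefficients):
    signs = [c for c in coefficients if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

f = [1, 1, -1, -1]   # x^3 + x^2 - x - 1 = (x + 1)^2 (x - 1)
g = [-1, 1, 1, -1]   # f(-x) = -x^3 + x^2 + x - 1 = -(x - 1)^2 (x + 1)
print(sign_changes(f))  # 1 -> exactly one positive root: +1
print(sign_changes(g))  # 2 -> two or zero negative roots; here two: -1 (twice)
```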

Any *n*th degree polynomial has exactly *n* roots in the complex plane, if counted according to multiplicity. So if *f*(*x*) is a polynomial which does not have a root at 0 (which can be determined by inspection) then the __minimum__ number of nonreal roots is equal to

*n* − (*p* + *q*),

where *p* denotes the maximum number of positive roots, *q* denotes the maximum number of negative roots (both of which can be found using Descartes' rule of signs), and *n* denotes the degree of the equation.

The polynomial

*f*(*x*) = *x*^{3} − 1

has one sign change, so the maximum number of positive real roots is 1. From

*f*(−*x*) = −*x*^{3} − 1

we can tell that the polynomial has no negative real roots. So the minimum number of nonreal roots is

3 − (1 + 0) = 2.

Since nonreal roots of a polynomial with real coefficients must occur in conjugate pairs, we can see that *x*^{3} − 1 has exactly 2 nonreal roots and 1 real (and positive) root.
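Combining the two counts gives the bound *n* − (*p* + *q*); an illustrative sketch:

```python
def sign_changes(coefficients):
    signs = [c for c in coefficients if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

def min_nonreal_roots(coefficients):
    """Lower bound n - (p + q) on the number of nonreal roots of a real
    polynomial with nonzero constant term (coefficients by descending exponent)."""
    n = len(coefficients) - 1
    p = sign_changes(coefficients)                   # bound on positive roots
    negated = [c if (n - i) % 2 == 0 else -c
               for i, c in enumerate(coefficients)]  # coefficients of f(-x)
    q = sign_changes(negated)                        # bound on negative roots
    return n - (p + q)

print(min_nonreal_roots([1, 0, 0, -1]))  # x^3 - 1 -> at least 2 nonreal roots
```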

The subtraction of only multiples of 2 from the maximal number of positive roots occurs because the polynomial may have nonreal roots, which always come in pairs since the rule applies to polynomials whose coefficients are real. Thus if the polynomial is known to have all real roots, this rule allows one to find the exact number of positive and negative roots. Since it is easy to determine the multiplicity of zero as a root, the sign of all roots can be determined in this case.

If the real polynomial *P* has *k* real positive roots counted with multiplicity, then for every *a* > 0 there are at least *k* changes of sign in the sequence of coefficients of the Taylor series of the function *e*^{ax}*P*(*x*). For *a* sufficiently large, there are exactly *k* such changes of sign.^{[1]}^{[2]}
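This statement can be checked numerically by truncating the Taylor series of *e*^{ax}*P*(*x*) (illustrative sketch; in this example a 30-term truncation suffices to exhibit exactly *k* sign changes for large *a*):

```python
from math import factorial

def taylor_of_exp_times_poly(p_ascending, a, terms):
    """First `terms` Taylor coefficients (around 0) of e^(a*x) * P(x),
    where p_ascending lists P's coefficients from the constant term up."""
    return [sum(p * a ** (m - j) / factorial(m - j)
                for j, p in enumerate(p_ascending) if j <= m)
            for m in range(terms)]

def sign_changes(seq):
    signs = [c for c in seq if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

# P(x) = (x - 1)(x - 2) = 2 - 3x + x^2 has k = 2 positive roots.
taylor = taylor_of_exp_times_poly([2, -3, 1], a=10.0, terms=30)
print(sign_changes(taylor))  # 2, matching k for sufficiently large a
```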

In the 1970s Askold Georgevich Khovanskiǐ developed the theory of *fewnomials* that generalises Descartes' rule.^{[3]} The rule of signs can be thought of as stating that the number of real roots of a polynomial is dependent on the polynomial's complexity, and that this complexity is proportional to the number of monomials it has, not its degree. Khovanskiǐ showed that this holds true not just for polynomials but for algebraic combinations of many transcendental functions, the so-called Pfaffian functions.

- ↑ D. R. Curtiss, *Recent extensions of Descartes' rule of signs*, Annals of Mathematics, Vol. 19, No. 4, 1918, pp. 251–278.
- ↑ Vladimir P. Kostov, *A mapping defined by the Schur–Szegő composition*, Comptes Rendus Acad. Bulg. Sci., tome 63, No. 7, 2010, pp. 943–952.
- ↑ Khovanskiǐ, A.G. (1991). *Fewnomials*. Translations of Mathematical Monographs. Translated from the Russian by Smilka Zdravkovska. Providence, RI: American Mathematical Society. p. 88. ISBN 0-8218-4547-0. Zbl 0728.12002.

*This article incorporates material from Descartes' rule of signs on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.*

- Descartes' Rule of Signs – Proof of the rule
- Descartes' Rule of Signs – Basic explanation


This page is based on the Wikipedia article *Descartes' rule of signs*.

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
