In mathematical analysis, the alternating series test is the method used to show that an alternating series is convergent when its terms (1) decrease in absolute value, and (2) approach zero in the limit. The test was used by Gottfried Leibniz and is sometimes known as Leibniz's test, Leibniz's rule, or the Leibniz criterion. The test is only sufficient, not necessary, so some convergent alternating series may fail the first part of the test. [1] [2] [3]
For a generalization, see Dirichlet's test. [4] [5] [6]
A series of the form
$$\sum_{n=1}^{\infty} (-1)^{n+1} a_n = a_1 - a_2 + a_3 - a_4 + \cdots,$$
where either all $a_n$ are positive or all $a_n$ are negative, is called an alternating series.
The alternating series test guarantees that an alternating series converges if the following two conditions are met: [1] [2] [3]
1. $|a_n|$ decreases monotonically, i.e. $|a_{n+1}| \le |a_n|$, and
2. $\lim_{n \to \infty} a_n = 0$.
Moreover, let L denote the sum of the series; then the partial sum
$$S_k = \sum_{n=1}^{k} (-1)^{n+1} a_n$$
approximates L with error bounded by the next omitted term:
$$\left| S_k - L \right| \le \left| S_k - S_{k+1} \right| = \left| a_{k+1} \right|.$$
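For illustration, the following Python sketch (an added numerical check, not part of the formal statement) verifies both conditions and the error bound on the alternating harmonic series, whose sum is known to be ln 2; the helper name a is purely illustrative.

```python
import math

# Terms of the alternating harmonic series: a_n = 1/n, with signs (-1)^(n+1).
def a(n: int) -> float:
    return 1.0 / n

L = math.log(2)  # known sum of the alternating harmonic series

# Condition 1: |a_n| decreases monotonically (checked on a finite prefix).
assert all(a(n + 1) <= a(n) for n in range(1, 1000))

# Condition 2: a_n -> 0 (sampled; a limit cannot be fully verified numerically).
assert a(10**6) < 1e-5

# Error bound: |S_k - L| <= a_{k+1} for each partial sum S_k.
S = 0.0
for k in range(1, 50):
    S += (-1) ** (k + 1) * a(k)
    assert abs(S - L) <= a(k + 1)
print("Both conditions and the error bound hold for the first 49 partial sums.")
```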
Suppose we are given a series of the form $\sum_{n=1}^{\infty} (-1)^{n+1} a_n$, where $\lim_{n \to \infty} a_n = 0$ and $a_n \ge a_{n+1}$ for all natural numbers n. (The case of negative $a_n$ follows by taking the negative.) [8]
We will prove that both the partial sums $S_{2m+1}$ with an odd number of terms and the partial sums $S_{2m}$ with an even number of terms converge to the same number L. Thus the usual partial sum $S_k$ also converges to L.
The odd partial sums decrease monotonically:
$$S_{2(m+1)+1} = S_{2m+1} - a_{2m+2} + a_{2m+3} \le S_{2m+1},$$
while the even partial sums increase monotonically:
$$S_{2(m+1)} = S_{2m} + a_{2m+1} - a_{2m+2} \ge S_{2m},$$
both because $a_n$ decreases monotonically with n.
Moreover, since the $a_n$ are positive, $S_{2m+1} - S_{2m} = a_{2m+1} \ge 0$. Thus we can collect these facts to form the following suggestive inequality:
$$a_1 - a_2 = S_2 \le S_{2m} \le S_{2m+1} \le S_1 = a_1.$$
Now, note that $a_1 - a_2$ is a lower bound for the monotonically decreasing sequence $S_{2m+1}$; the monotone convergence theorem then implies that this sequence converges as m approaches infinity. Similarly, the sequence of even partial sums converges too.
Finally, they must converge to the same number because
$$\lim_{m \to \infty} (S_{2m+1} - S_{2m}) = \lim_{m \to \infty} a_{2m+1} = 0.$$
Call the limit L; then the monotone convergence theorem also tells us the extra information that
$$S_{2m} \le L \le S_{2m+1}$$
for any m. This means the partial sums of an alternating series also "alternate" above and below the final limit. More precisely, when there is an odd (even) number of terms, i.e. when the last term is a plus (minus) term, the partial sum is above (below) the final limit.
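To make this bracketing concrete, here is a short Python sketch (an added illustration, again using the alternating harmonic series with limit ln 2) showing the odd partial sums landing above the limit and the even partial sums below it:

```python
import math

L = math.log(2)  # limit of the alternating harmonic series
S = 0.0
for k in range(1, 11):
    S += (-1) ** (k + 1) / k
    assert (S > L) == (k % 2 == 1)  # odd partial sums above L, even ones below
    side = "above" if k % 2 == 1 else "below"
    print(f"S_{k} = {S:.6f} ({side} L = {L:.6f})")
```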
This understanding leads immediately to an error bound on the partial sums, shown below.
We would like to show $|S_k - L| \le a_{k+1}$ by splitting into two cases.
When k = 2m+1, i.e. odd, then
$$|S_{2m+1} - L| = S_{2m+1} - L \le S_{2m+1} - S_{2m+2} = a_{2m+2}.$$
When k = 2m, i.e. even, then
$$|S_{2m} - L| = L - S_{2m} \le S_{2m+1} - S_{2m} = a_{2m+1},$$
as desired.
Both cases rely essentially on the last inequality derived in the previous proof.
The alternating harmonic series
$$\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots$$
meets both conditions for the alternating series test and converges.
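For illustration (an added numerical check, using the known sum $\ln 2$ of this series), taking the first four terms gives
$$S_4 = 1 - \tfrac{1}{2} + \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{7}{12} \approx 0.5833, \qquad |S_4 - \ln 2| \approx 0.1098 \le a_5 = \tfrac{1}{5},$$
in agreement with the error bound above.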
All of the conditions in the test, namely convergence to zero and monotonicity, should be met in order for the conclusion to be true. For example, take the series
$$\frac{1}{\sqrt{2}-1} - \frac{1}{\sqrt{2}+1} + \frac{1}{\sqrt{3}-1} - \frac{1}{\sqrt{3}+1} + \cdots + \frac{1}{\sqrt{n}-1} - \frac{1}{\sqrt{n}+1} + \cdots$$
The signs are alternating and the terms tend to zero. However, monotonicity is not present and we cannot apply the test. Actually, the series is divergent. Indeed, since each pair of terms simplifies as $\frac{1}{\sqrt{n}-1} - \frac{1}{\sqrt{n}+1} = \frac{2}{n-1}$, for the partial sum $S_{2n}$ we have
$$S_{2n} = \frac{2}{1} + \frac{2}{2} + \cdots + \frac{2}{n},$$
which is twice the partial sum of the harmonic series, which is divergent. Hence the original series is divergent.
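A short Python sketch (an added illustration, assuming the form of the counterexample reconstructed above) confirms numerically that the even partial sums track twice the harmonic partial sums and grow without bound:

```python
from math import sqrt

S = 0.0  # partial sums S_{2n} of the counterexample, taken pair by pair
H = 0.0  # harmonic partial sum 1/1 + 1/2 + ...
for n in range(2, 10002):
    S += 1 / (sqrt(n) - 1) - 1 / (sqrt(n) + 1)  # each pair equals 2/(n - 1)
    H += 1 / (n - 1)
    if n % 2000 == 1:
        print(f"n = {n}: S = {S:.4f}, 2H = {2 * H:.4f}")  # S matches 2H and diverges
```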
The monotonicity in the Leibniz test is not a necessary condition, so the test itself is only sufficient, but not necessary. (The second part of the test is the well-known necessary condition for convergence of all series: the terms must tend to zero.)
An example of a nonmonotonic series that converges is
$$\sum_{n=2}^{\infty} \frac{(-1)^n}{n + (-1)^n}.$$
In fact, for every monotonic series it is possible to obtain an infinite number of nonmonotonic series that converge to the same sum by permuting its terms with permutations satisfying the condition in Agnew's theorem. [9]
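For illustration, a brief Python check (an added sketch, tied to the example given above) shows that the absolute values of the terms are not monotone while the partial sums still settle toward a finite value:

```python
# Terms of sum_{n>=2} (-1)^n / (n + (-1)^n); absolute values are nonmonotone.
terms = [(-1) ** n / (n + (-1) ** n) for n in range(2, 200001)]

# |a_2| = 1/3 is smaller than |a_3| = 1/2, so monotonicity fails.
assert abs(terms[0]) < abs(terms[1])

# The partial sums nevertheless stabilize near a finite value.
print(f"sum of the first 199,999 terms: {sum(terms):.6f}")
```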