Convergence proof techniques

Convergence proof techniques are canonical components of mathematical proofs that sequences or functions converge to a finite limit when the argument tends to infinity.

There are many types of series and modes of convergence requiring different techniques. Below are some of the more common examples. This article is intended as an introduction to help practitioners explore appropriate techniques. The links below give details of necessary conditions and generalizations to more abstract settings. The convergence of series is already covered in the article on convergence tests.

Convergence in R^n

It is common to want to prove convergence of a sequence f : N → R^n or a function f : R → R^n, where N and R refer to the natural numbers and the real numbers, and convergence is with respect to the Euclidean norm, ||·||_2.

Useful approaches for this are as follows.

First principles

The analytic definition of convergence of f to a limit L is that[1] for all ε > 0 there exists an n_0 such that for all n > n_0, ||f(n) - L|| < ε. The most basic proof technique is to find such an n_0 and prove the required inequality. If the value of L is not known in advance, the techniques below may be useful.
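The ε–n_0 definition can be spot-checked numerically for a concrete sequence. A minimal sketch follows; the sequence f(n) = (n + 1)/n and the choice n_0 = ⌈1/ε⌉ + 1 are illustrative, not part of the article:

```python
import math

def f(n):
    # Illustrative sequence f(n) = (n + 1) / n, which converges to L = 1.
    return (n + 1) / n

L = 1.0

def n0_for(eps):
    # Since |f(n) - L| = 1/n, any n0 with 1/n0 <= eps works; take ceil(1/eps) + 1.
    return math.ceil(1 / eps) + 1

# Spot-check the definition for a few tolerances.
for eps in (0.5, 1e-3, 1e-6):
    n0 = n0_for(eps)
    assert all(abs(f(n) - L) < eps for n in range(n0, n0 + 1000))
```

Producing such a closed-form n_0(ε) and proving the inequality for all n > n_0 is exactly the first-principles proof.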

Contraction mappings

In many cases, the function f whose convergence is of interest has the form f(n + 1) = T(f(n)) for some transformation T. For example, T could map f(n) to A f(n) for some conformable matrix A. Alternatively, T may be an element-wise operation, such as replacing each element of f(n) by the square root of its magnitude.

In such cases, if the problem satisfies the conditions of the Banach fixed-point theorem (the domain is a non-empty complete metric space), then it is sufficient to prove that ||T(x) - T(y)|| ≤ k ||x - y|| for some constant k < 1 which is fixed for all x and y. Such a T is called a contraction mapping. The composition of two contraction mappings is a contraction mapping, so if T = T_1 ∘ T_2, then it is sufficient to show that T_1 and T_2 are both contraction mappings.

Example

Famous examples of the use of this approach include:

  • If f(n) has the form f(n + 1) = A f(n) + b for some matrix A and vector b, then f(n) converges to (I - A)^{-1} b if the magnitudes of all eigenvalues of A are less than 1.[citation needed]
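This bullet can be checked numerically. The 2×2 matrix A and vector b below are an arbitrary illustrative choice with all eigenvalue magnitudes below 1:

```python
import numpy as np

# Illustrative 2x2 matrix whose eigenvalues both have magnitude < 1.
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
b = np.array([1.0, 2.0])
assert np.all(np.abs(np.linalg.eigvals(A)) < 1)

# Iterate f(n + 1) = A f(n) + b from an arbitrary starting point.
x = np.zeros(2)
for _ in range(200):
    x = A @ x + b

# The limit solves x* = A x* + b, i.e. x* = (I - A)^{-1} b.
x_star = np.linalg.solve(np.eye(2) - A, b)
```

After 200 iterations the iterate agrees with the fixed point to machine precision, since the error contracts by roughly the spectral radius at each step.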

Non-expansion mappings

If the above inequality holds only weakly (k = 1, i.e. ||T(x) - T(y)|| ≤ ||x - y||), the mapping is a non-expansion mapping. It is not sufficient for T to be a non-expansion mapping in order for the sequence to converge. For example, T(x) = -x is a non-expansion mapping, but the sequence x, -x, x, -x, ... does not converge for any x ≠ 0. However, the composition of a contraction mapping and a non-expansion mapping (or vice versa) is a contraction mapping.

Contraction mappings on limited domains

If T is not a contraction mapping on its entire domain, but it is on its codomain (the image of the domain), that is also sufficient for convergence. This also applies for decompositions. For example, consider T(x) = cos(sin(x)). The function cos is not a contraction mapping on all of R, but it is on the restricted domain [-1, 1], which is the codomain of sin for real arguments. Since sin is a non-expansion mapping, this implies that T is a contraction mapping.
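A short sketch of this decomposition in action; the starting point is arbitrary, and the fixed point (near 0.768) is found by the iteration itself:

```python
import math

def T(x):
    # cos is a contraction on [-1, 1] (the image of sin), and sin is a
    # non-expansion on R, so the composition T is a contraction.
    return math.cos(math.sin(x))

x = 10.0  # arbitrary starting point
for _ in range(100):
    x = T(x)
# x is now (numerically) the unique fixed point x = cos(sin(x)).
```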

Convergent subsequences

Every bounded sequence in R^n has a convergent subsequence, by the Bolzano–Weierstrass theorem. If all of these subsequences have the same limit, then the original sequence converges to that limit. If it can be shown that all of the convergent subsequences of f(n) have the same limit, such as by showing that there is a unique fixed point of the transformation T, then the initial sequence must also converge to that limit.
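The failure mode this technique guards against can be seen in a bounded sequence with two subsequential limits. The sequence a(n) = (-1)^n (1 - 1/n) is an illustrative choice, not from the article:

```python
def a(n):
    # Bounded but divergent: subsequential limits are +1 (even n) and -1 (odd n).
    return (-1) ** n * (1 - 1 / n)

even_tail = [a(n) for n in range(1000, 1010, 2)]  # subsequence along even n, near +1
odd_tail = [a(n) for n in range(1001, 1011, 2)]   # subsequence along odd n, near -1
```

Because the two subsequential limits disagree, a(n) diverges; the technique applies only when every convergent subsequence can be shown to share a single limit.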

Monotonicity (Lyapunov functions)

Every bounded monotonic sequence in R^n converges to a limit.

This approach can also be applied to sequences that are not monotonic. Instead, it is possible to define a function V : R^n → R such that the sequence V(f(n)) is monotonic in n. If V satisfies the conditions to be a Lyapunov function, then f(n) is convergent. Lyapunov's theorem is normally stated for ordinary differential equations, but can also be applied to sequences of iterates by replacing derivatives with discrete differences.

The basic requirements on V are that

  1. V(x) > 0 for x ≠ x* and V(x*) = 0 (or, more generally, V(x) > V(x*) for x ≠ x*)
  2. V(f(n + 1)) < V(f(n)) for all n with f(n) ≠ x*
  3. V be "radially unbounded", so that V(x_k) goes to infinity for any sequence x_k whose norm ||x_k|| tends to infinity.

In many cases, a Lyapunov function of the form V(x) = x^T A x for a positive-definite matrix A can be found, although more complex forms are also used.
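A sketch with the simplest quadratic choice V(x) = x^T x (i.e. A = I); the update map T below is an illustrative contraction, not one from the article:

```python
import numpy as np

def T(x):
    # Element-wise update with |T(x)_i| <= 0.5 |x_i|, so the iterates shrink to 0.
    return 0.5 * np.tanh(x)

def V(x):
    # Quadratic Lyapunov function x^T x: positive definite and radially unbounded.
    return float(x @ x)

x = np.array([2.0, -3.0])
values = [V(x)]
for _ in range(50):
    x = T(x)
    values.append(V(x))

# V decreases strictly along the trajectory, certifying convergence to 0.
decreasing = all(v2 < v1 for v1, v2 in zip(values, values[1:]) if v1 > 0)
```

Note that x itself need not approach its limit monotonically; only V(f(n)) must be monotone.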

For delay differential equations, a similar approach applies with Lyapunov functions replaced by Lyapunov functionals, also called Lyapunov–Krasovskii functionals.

If the inequality in condition 2 is weak (V(f(n + 1)) ≤ V(f(n))), LaSalle's invariance principle may be used.

Convergence of sequences of functions

To consider the convergence of sequences of functions,[2] it is necessary to define a distance between functions to replace the Euclidean norm. Common choices include the supremum norm, sup_x |f(x) - g(x)|, which corresponds to uniform convergence, and the L^p norms, (∫ |f(x) - g(x)|^p dx)^{1/p}.
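For instance, the supremum distance can be approximated on a grid to watch f_n(x) = x/n converge uniformly to 0 on [0, 1]; the grid resolution and the example family are illustrative choices:

```python
import numpy as np

xs = np.linspace(0.0, 1.0, 1001)  # grid on [0, 1]

def sup_distance(f, g):
    # Approximate sup_x |f(x) - g(x)| by a maximum over the grid.
    return float(np.max(np.abs(f(xs) - g(xs))))

limit = lambda x: np.zeros_like(x)

# f_n(x) = x / n: the sup distance to the zero function is exactly 1/n.
dists = [sup_distance(lambda x, n=n: x / n, limit) for n in (1, 10, 100)]
```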

Convergence of random variables

Random variables[3] are more complicated than simple elements of R^n. (Formally, a random variable is a mapping x : Ω → X from an event space Ω to a value space X. The value space may be R^n, such as the roll of a die, and such a random variable is often spoken of informally as being in R^n, but convergence of a sequence of random variables corresponds to convergence of the sequence of functions, or of their distributions, rather than of a sequence of values.)

There are multiple types of convergence, depending on how the distance between functions is measured, including convergence in distribution, convergence in probability, and almost sure convergence.

Each has its own proof techniques, which are beyond the current scope of this article.

Topological convergence

For all of the above techniques, some form of the basic analytic definition of convergence above applies. However, topology has its own definition of convergence. For example, in a non-Hausdorff space, it is possible for a sequence to converge to multiple different limits.

References

  1. Ross, Kenneth. Elementary Analysis: The Theory of Calculus. Springer.
  2. Haase, Markus. Functional Analysis: An Elementary Introduction. American Mathematical Society.
  3. Billingsley, Patrick (1995). Probability and Measure. John Wiley & Sons.