Uniform convergence

[Figure: Uniform convergence.svg] A sequence of functions (f_n) converges uniformly to f when for arbitrarily small ε there is an index N such that the graph of f_n is in the ε-tube around f whenever n ≥ N.
[Figure: Drini-nonuniformconvergence.png] The limit of a sequence of continuous functions does not have to be continuous: the sequence of functions f_n(x) = sin^n(x) (marked in green and blue) converges pointwise over the entire domain, but the limit function is discontinuous (marked in red).

In the mathematical field of analysis, uniform convergence is a mode of convergence of functions stronger than pointwise convergence. A sequence of functions (f_n) converges uniformly to a limiting function f on a set E as the function domain if, given any arbitrarily small positive number ε, a number N can be found such that each of the functions f_N, f_{N+1}, f_{N+2}, ... differs from f by no more than ε at every point x in E. Described in an informal way, if f_n converges to f uniformly, then how quickly the functions f_n approach f is "uniform" throughout E in the following sense: in order to guarantee that f_n(x) differs from f(x) by less than a chosen distance ε, we only need to make sure that n is larger than or equal to a certain N, which we can find without knowing the value of x ∈ E in advance. In other words, there exists a number N = N(ε) that could depend on ε but is independent of x, such that choosing n ≥ N will ensure that |f_n(x) - f(x)| < ε for all x ∈ E. In contrast, pointwise convergence of f_n to f merely guarantees that for any x ∈ E given in advance, we can find N = N(ε, x) (i.e., N could depend on the values of both ε and x) such that, for that particular x, f_n(x) falls within ε of f(x) whenever n ≥ N (and a different x may require a different, larger N for n ≥ N to guarantee that |f_n(x) - f(x)| < ε).


The difference between uniform convergence and pointwise convergence was not fully appreciated early in the history of calculus, leading to instances of faulty reasoning. The concept, which was first formalized by Karl Weierstrass, is important because several properties of the functions f_n, such as continuity, Riemann integrability, and, with additional hypotheses, differentiability, are transferred to the limit f if the convergence is uniform, but not necessarily if the convergence is not uniform.

History

In 1821 Augustin-Louis Cauchy published a proof that a convergent sum of continuous functions is always continuous, to which Niels Henrik Abel in 1826 found purported counterexamples in the context of Fourier series, arguing that Cauchy's proof had to be incorrect. Completely standard notions of convergence did not exist at the time, and Cauchy handled convergence using infinitesimal methods. When put into the modern language, what Cauchy proved is that a uniformly convergent sequence of continuous functions has a continuous limit. The failure of a merely pointwise-convergent limit of continuous functions to converge to a continuous function illustrates the importance of distinguishing between different types of convergence when handling sequences of functions. [1]

The term uniform convergence was probably first used by Christoph Gudermann, in an 1838 paper on elliptic functions, where he employed the phrase "convergence in a uniform way" when the "mode of convergence" of a series is independent of the variables involved. While he thought it a "remarkable fact" when a series converged in this way, he did not give a formal definition, nor use the property in any of his proofs. [2]

Later Gudermann's pupil Karl Weierstrass, who attended his course on elliptic functions in 1839–1840, coined the term gleichmäßig konvergent (German : uniformly convergent) which he used in his 1841 paper Zur Theorie der Potenzreihen, published in 1894. Independently, similar concepts were articulated by Philipp Ludwig von Seidel [3] and George Gabriel Stokes. G. H. Hardy compares the three definitions in his paper "Sir George Stokes and the concept of uniform convergence" and remarks: "Weierstrass's discovery was the earliest, and he alone fully realized its far-reaching importance as one of the fundamental ideas of analysis."

Under the influence of Weierstrass and Bernhard Riemann this concept and related questions were intensely studied at the end of the 19th century by Hermann Hankel, Paul du Bois-Reymond, Ulisse Dini, Cesare Arzelà and others.

Definition

We first define uniform convergence for real-valued functions, although the concept is readily generalized to functions mapping to metric spaces and, more generally, uniform spaces (see below).

Suppose E is a set and (f_n) is a sequence of real-valued functions on it. We say the sequence (f_n) is uniformly convergent on E with limit f : E → ℝ if for every ε > 0, there exists a natural number N such that |f_n(x) - f(x)| < ε for all n ≥ N and for all x ∈ E.

The notation for uniform convergence of f_n to f is not quite standardized and different authors have used a variety of symbols, including (in roughly decreasing order of popularity):

f_n ⇉ f,  unif lim_{n→∞} f_n = f,  f_n → f (uniformly).

Frequently, no special symbol is used, and authors simply write

f_n → f uniformly on E

to indicate that convergence is uniform. (In contrast, the expression f_n → f on E without an adverb is taken to mean pointwise convergence on E: for all x ∈ E, f_n(x) → f(x) as n → ∞.)

Since ℝ is a complete metric space, the Cauchy criterion can be used to give an equivalent alternative formulation for uniform convergence: (f_n) converges uniformly on E (in the previous sense) if and only if for every ε > 0, there exists a natural number N such that

|f_n(x) - f_m(x)| < ε for all x ∈ E and all m, n ≥ N.

In yet another equivalent formulation, if we define

a_n = sup_{x ∈ E} |f_n(x) - f(x)|,

then f_n converges to f uniformly if and only if a_n → 0 as n → ∞. Thus, we can characterize uniform convergence of (f_n) on E as (simple) convergence of (f_n) in the function space with respect to the uniform metric (also called the supremum metric), defined by

d(f, g) = sup_{x ∈ E} |f(x) - g(x)|.

Symbolically,

f_n ⇉ f  if and only if  d(f_n, f) → 0.

The sequence (f_n) is said to be locally uniformly convergent with limit f if E is a metric space and for every x ∈ E, there exists an r > 0 such that (f_n) converges uniformly on B(x, r) ∩ E. It is clear that uniform convergence implies local uniform convergence, which implies pointwise convergence.
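The supremum-metric characterization lends itself to a quick numerical check. The sketch below approximates d(f_n, f) on a sample grid (the grid, and the two example sequences x/n and x^n, are chosen here purely for illustration; grid sampling can only suggest, not prove, the behaviour of the supremum):

```python
def sup_dist(f, g, xs):
    """Approximate d(f, g) = sup_x |f(x) - g(x)| over sample points xs."""
    return max(abs(f(x) - g(x)) for x in xs)

xs = [i / 1000 for i in range(1000)]   # sample of E = [0, 1)
f = lambda x: 0.0                      # candidate limit function

# f_n(x) = x / n converges uniformly: the sup distance is 1/n -> 0.
for n in (1, 10, 100):
    print(n, round(sup_dist(lambda x: x / n, f, xs), 4))

# g_n(x) = x**n converges only pointwise: the sup distance stays near 1.
for n in (1, 10, 100):
    print(n, round(sup_dist(lambda x: x ** n, f, xs), 4))
```

The first sequence's distance to the limit shrinks like 1/n independently of x, while the second keeps sample points near 1 at distance close to 1 for every n.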

Notes

Intuitively, a sequence of functions f_n converges uniformly to f if, given an arbitrarily small ε > 0, we can find an N so that the functions f_n with n > N all fall within a "tube" of width 2ε centered around f (i.e., between f(x) - ε and f(x) + ε) for the entire domain of the function.

Note that interchanging the order of quantifiers in the definition of uniform convergence by moving "for all x ∈ E" in front of "there exists a natural number N" results in a definition of pointwise convergence of the sequence. To make this difference explicit, in the case of uniform convergence, N = N(ε) can only depend on ε, and the choice of N has to work for all x ∈ E, for a specific value of ε that is given. In contrast, in the case of pointwise convergence, N = N(ε, x) may depend on both ε and x, and the choice of N only has to work for the specific values of ε and x that are given. Thus uniform convergence implies pointwise convergence, however the converse is not true, as the example in the section below illustrates.

Generalizations

One may straightforwardly extend the concept to functions E → M, where (M, d) is a metric space, by replacing |f_n(x) - f(x)| < ε with d(f_n(x), f(x)) < ε.

The most general setting is the uniform convergence of nets of functions E → X, where X is a uniform space. We say that the net (f_α) converges uniformly with limit f : E → X if and only if for every entourage V in X, there exists an α_0, such that for every x in E and every α ≥ α_0, (f_α(x), f(x)) is in V. In this situation, the uniform limit of continuous functions remains continuous.

Definition in a hyperreal setting

Uniform convergence admits a simplified definition in a hyperreal setting. Thus, a sequence f_n converges to f uniformly if for all hyperreal x in the domain of f* and all infinite n, f_n*(x) is infinitely close to f*(x) (see microcontinuity for a similar definition of uniform continuity). In contrast, pointwise convergence requires this only for real x.

Examples

For x ∈ [0, 1), a basic example of uniform convergence can be illustrated as follows: the sequence (1/2)^(x+n) converges uniformly, while x^n does not. Specifically, assume ε = 1/4. Each function (1/2)^(x+n) is less than or equal to 1/4 when n ≥ 2, regardless of the value of x. On the other hand, x^n is only less than or equal to 1/4 at ever increasing values of n when values of x are selected closer and closer to 1 (explained more in depth further below).
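This comparison can be sketched numerically. The snippet below checks, on a sample grid of [0, 1), that for (1/2)^(x+n) the single choice N = 2 already confines every sampled point within ε = 1/4, while for x^n sampled points near 1 still exceed ε however large n is (grid size chosen for illustration):

```python
eps = 0.25
xs = [i / 1000 for i in range(1000)]   # sample points of [0, 1)

# (1/2)**(x + n): for n = 2 every sampled value is already <= eps,
# so one N works for all x simultaneously (uniform convergence).
ok_everywhere = all(0.5 ** (x + 2) <= eps for x in xs)
print(ok_everywhere)

# x**n: even for n = 1000, sampled points near 1 still exceed eps,
# so no single N can serve the whole interval (not uniform).
print(max(x ** 1000 for x in xs))
```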

Given a topological space X, we can equip the space of bounded real or complex-valued functions over X with the uniform norm topology, with the uniform metric defined by

d(f, g) = ||f - g||_∞ = sup_{x ∈ X} |f(x) - g(x)|.

Then uniform convergence simply means convergence in the uniform norm topology:

lim_{n→∞} ||f_n - f||_∞ = 0.

The sequence of functions (f_n) defined by

f_n(x) = x^n on the interval [0, 1]

is a classic example of a sequence of functions that converges to a function f pointwise but not uniformly. To show this, we first observe that the pointwise limit of f_n(x) = x^n as n → ∞ is the function f, given by

f(x) = 0 for 0 ≤ x < 1, and f(1) = 1.

Pointwise convergence: Convergence is trivial for x = 0 and x = 1, since f_n(0) = f(0) = 0 and f_n(1) = f(1) = 1, for all n. For x ∈ (0, 1) and given ε > 0, we can ensure that |f_n(x) - f(x)| < ε whenever n ≥ N by choosing N = ⌈log ε / log x⌉, which is the minimum integer exponent of x that allows it to reach or dip below ε (here the upper square brackets indicate rounding up, see ceiling function). Hence, f_n → f pointwise for all x ∈ [0, 1]. Note that the choice of N depends on the value of ε and x. Moreover, for a fixed choice of ε, N (which cannot be defined to be smaller) grows without bound as x approaches 1. These observations preclude the possibility of uniform convergence.
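For f_n(x) = x^n, the least n with x^n ≤ ε is N = ⌈log ε / log x⌉; the short sketch below (sample values of x chosen for illustration) checks that this N is indeed minimal and watches it blow up as x → 1, which is exactly what rules out uniform convergence:

```python
import math

# N(eps, x) = ceil(log eps / log x) is the least n with x**n <= eps.
eps = 0.1
for x in (0.5, 0.9, 0.99, 0.999):
    N = math.ceil(math.log(eps) / math.log(x))
    # Verify minimality: x**N dips below eps, but x**(N-1) does not.
    print(x, N, x ** N <= eps, x ** (N - 1) > eps)
```

The printed N values grow without bound as x approaches 1, so no single N can work for every x at once.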

Non-uniformity of convergence: The convergence is not uniform, because we can find an ε > 0 so that no matter how large we choose N, there will be values of x ∈ [0, 1) and n ≥ N such that |f_n(x) - f(x)| ≥ ε. To see this, first observe that regardless of how large n becomes, there is always an x_0 ∈ [0, 1) such that f_n(x_0) = 1/2. Thus, if we choose ε = 1/4, we can never find an N such that |f_n(x) - f(x)| < ε for all x ∈ [0, 1] and n ≥ N. Explicitly, whatever candidate we choose for N, consider the value of f_N at x_0 = (1/2)^(1/N). Since

|f_N(x_0) - f(x_0)| = |((1/2)^(1/N))^N - 0| = 1/2 > 1/4 = ε,

the candidate fails because we have found an example of an x ∈ [0, 1) that "escaped" our attempt to "confine" each f_n (n ≥ N) to within ε of f for all x. In fact, it is easy to see that

lim_{n→∞} ||f_n - f||_∞ = 1,

contrary to the requirement that ||f_n - f||_∞ → 0 if f_n ⇉ f.

In this example one can easily see that pointwise convergence does not preserve differentiability or continuity. While each function of the sequence is smooth, that is to say that for all n, f_n ∈ C^∞([0, 1]), the limit f is not even continuous.

Exponential function

The series expansion of the exponential function can be shown to be uniformly convergent on any bounded subset S of the complex plane using the Weierstrass M-test.

Theorem (Weierstrass M-test). Let (f_n) be a sequence of functions f_n : E → ℂ and let (M_n) be a sequence of positive real numbers such that |f_n(x)| ≤ M_n for all x ∈ E and all n. If ∑ M_n converges, then ∑ f_n converges absolutely and uniformly on E.

The complex exponential function can be expressed as the series:

e^z = ∑_{n=0}^∞ z^n / n!

Any bounded subset is a subset of some disc D_R of radius R, centered on the origin in the complex plane. The Weierstrass M-test requires us to find an upper bound M_n on the terms of the series, with M_n independent of the position in the disc:

|z^n / n!| ≤ M_n for all z ∈ D_R.

To do this, we notice

|z^n / n!| = |z|^n / n! ≤ R^n / n!

and take

M_n = R^n / n!.

If ∑_{n=0}^∞ M_n is convergent, then the M-test asserts that the original series is uniformly convergent.

The ratio test can be used here:

lim_{n→∞} M_{n+1} / M_n = lim_{n→∞} R / (n + 1) = 0,

which means the series over the M_n is convergent. Thus the original series converges uniformly for all z ∈ D_R, and since S ⊆ D_R for our bounded subset S, the series is also uniformly convergent on S.

Properties

Applications

To continuity

[Figure: Drini nonuniformconvergence SVG.svg] Counterexample to a strengthening of the uniform convergence theorem, in which pointwise convergence, rather than uniform convergence, is assumed. The continuous green functions sin^n(x) converge to the non-continuous red function. This can happen only if convergence is not uniform.
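The figure's counterexample can be probed numerically. The sketch below samples f_n(x) = sin^n(x) on a grid of [0, π] that avoids x = π/2 exactly (grid size chosen for illustration) and shows that the sup distance to the pointwise limit stays near 1, so the convergence is not uniform and the discontinuous limit does not contradict the uniform limit theorem:

```python
import math

def limit(x):
    """Pointwise limit of sin(x)**n on [0, pi]: 1 at pi/2, else 0."""
    return 1.0 if abs(x - math.pi / 2) < 1e-12 else 0.0

# Grid of [0, pi] whose points never hit pi/2 exactly.
xs = [i * math.pi / 1001 for i in range(1002)]

for n in (1, 10, 1000):
    sup = max(abs(math.sin(x) ** n - limit(x)) for x in xs)
    print(n, sup)   # stays close to 1 for every n
```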

If E and M are topological spaces, then it makes sense to talk about the continuity of the functions f_n, f : E → M. If we further assume that M is a metric space, then (uniform) convergence of the f_n to f is also well defined. The following result states that continuity is preserved by uniform convergence:

Uniform limit theorem. Suppose E is a topological space, M is a metric space, and (f_n) is a sequence of continuous functions f_n : E → M. If f_n ⇉ f on E, then f is also continuous.

This theorem is proved by the "ε/3 trick", and is the archetypal example of this trick: to prove a given inequality (ε), one uses the definitions of continuity and uniform convergence to produce 3 inequalities (ε/3), and then combines them via the triangle inequality to produce the desired inequality.

Proof

Let x_0 ∈ E be an arbitrary point. We will prove that f is continuous at x_0. Let ε > 0. By uniform convergence, there exists a natural number N such that

|f_N(x) - f(x)| < ε/3 for all x ∈ E

(uniform convergence shows that the above statement is true for all n ≥ N, but we will only use it for one function of the sequence, namely f_N).

It follows from the continuity of f_N at x_0 that there exists an open set U containing x_0 such that

|f_N(x) - f_N(x_0)| < ε/3 for all x ∈ U.

Hence, using the triangle inequality,

|f(x) - f(x_0)| ≤ |f(x) - f_N(x)| + |f_N(x) - f_N(x_0)| + |f_N(x_0) - f(x_0)| < ε for all x ∈ U,

which gives us the continuity of f at x_0.

This theorem is an important one in the history of real and Fourier analysis, since many 18th century mathematicians had the intuitive understanding that a sequence of continuous functions always converges to a continuous function. The image above shows a counterexample, and many discontinuous functions could, in fact, be written as a Fourier series of continuous functions. The erroneous claim that the pointwise limit of a sequence of continuous functions is continuous (originally stated in terms of convergent series of continuous functions) is infamously known as "Cauchy's wrong theorem". The uniform limit theorem shows that a stronger form of convergence, uniform convergence, is needed to ensure the preservation of continuity in the limit function.

More precisely, this theorem states that the uniform limit of uniformly continuous functions is uniformly continuous; for a locally compact space, continuity is equivalent to local uniform continuity, and thus the uniform limit of continuous functions is continuous.

To differentiability

If S is an interval and all the functions f_n are differentiable and converge to a limit f, it is often desirable to determine the derivative function f′ by taking the limit of the sequence f_n′. This is however in general not possible: even if the convergence is uniform, the limit function need not be differentiable (not even if the sequence consists of everywhere-analytic functions, see Weierstrass function), and even if it is differentiable, the derivative of the limit function need not be equal to the limit of the derivatives. Consider for instance f_n(x) = n^(-1/2) sin(nx) with uniform limit f ≡ 0. Clearly, f′ is also identically zero. However, the derivatives of the sequence of functions are given by f_n′(x) = n^(1/2) cos(nx), and the sequence f_n′ does not converge to f′, or even to any function at all. In order to ensure a connection between the limit of a sequence of differentiable functions and the limit of the sequence of derivatives, the uniform convergence of the sequence of derivatives plus the convergence of the sequence of functions at at least one point is required: [4]

If (f_n) is a sequence of differentiable functions on [a, b] such that lim_{n→∞} f_n(x_0) exists (and is finite) for some x_0 ∈ [a, b] and the sequence (f_n′) converges uniformly on [a, b], then f_n converges uniformly to a function f on [a, b], and f′(x) = lim_{n→∞} f_n′(x) for x ∈ [a, b].
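The standard counterexample f_n(x) = sin(nx)/√n can be sketched numerically: its sup distance to 0 is exactly 1/√n (attained wherever sin(nx) = ±1), so it converges uniformly, while the derivative √n·cos(nx) blows up at x = 0:

```python
import math

def f(n, x):
    """f_n(x) = sin(n x) / sqrt(n), which tends to 0 uniformly."""
    return math.sin(n * x) / math.sqrt(n)

def f_prime(n, x):
    """f_n'(x) = sqrt(n) * cos(n x), which does not converge."""
    return math.sqrt(n) * math.cos(n * x)

for n in (1, 100, 10000):
    sup_f = 1 / math.sqrt(n)            # exact sup of |f_n| over the reals
    print(n, sup_f, f_prime(n, 0.0))    # sup -> 0 while f_n'(0) -> infinity
```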

To integrability

Similarly, one often wants to exchange integrals and limit processes. For the Riemann integral, this can be done if uniform convergence is assumed:

If (f_n) is a sequence of Riemann integrable functions defined on a compact interval I which converge uniformly with limit f, then f is Riemann integrable and its integral can be computed as the limit of the integrals of the f_n:

∫_I f = lim_{n→∞} ∫_I f_n.

In fact, for a uniformly convergent family of bounded functions on an interval, the upper and lower Riemann integrals converge to the upper and lower Riemann integrals of the limit function. This follows because, for n sufficiently large, the graph of f_n is within ε of the graph of f, and so the upper sum and lower sum of f_n are each within ε|I| of the value of the upper and lower sums of f, respectively.
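The underlying bound |∫_I f_n - ∫_I f| ≤ |I| · sup|f_n - f| can be sketched numerically. The example sequence f_n(x) = x^n/n (chosen for illustration) converges uniformly to 0 on [0, 1] with sup|f_n| = 1/n, and its Riemann sums shrink accordingly:

```python
def riemann(f, a, b, steps=10000):
    """Left Riemann sum approximation of the integral of f on [a, b]."""
    h = (b - a) / steps
    return sum(f(a + i * h) for i in range(steps)) * h

# f_n(x) = x**n / n converges uniformly to 0 with sup |f_n| = 1/n,
# so each integral is bounded by (b - a) * (1/n) = 1/n.
for n in (1, 10, 100):
    integral = riemann(lambda x: x ** n / n, 0.0, 1.0)
    print(n, integral, 1.0 / n)   # integral -> 0, dominated by 1/n
```

(The exact value is 1/(n(n+1)), comfortably inside the uniform bound 1/n.)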

Much stronger theorems in this respect, which require not much more than pointwise convergence, can be obtained if one abandons the Riemann integral and uses the Lebesgue integral instead.

To analyticity

Using Morera's Theorem, one can show that if a sequence of analytic functions converges uniformly in a region S of the complex plane, then the limit is analytic in S. This example demonstrates that complex functions are more well-behaved than real functions, since the uniform limit of analytic functions on a real interval need not even be differentiable (see Weierstrass function).

To series

We say that ∑_{n=1}^∞ f_n converges:

  1. pointwise on E if and only if the sequence of partial sums s_n(x) = ∑_{k=1}^n f_k(x) converges for every x ∈ E.
  2. uniformly on E if and only if s_n converges uniformly as n → ∞.
  3. absolutely on E if and only if ∑_{n=1}^∞ |f_n(x)| converges for every x ∈ E.

With this definition comes the following result:

Let x_0 be contained in the set E and each f_n be continuous at x_0. If f = ∑_{n=1}^∞ f_n converges uniformly on E then f is continuous at x_0 in E. Suppose that E = [a, b] and each f_n is integrable on E. If ∑_{n=1}^∞ f_n converges uniformly on E then f is integrable on E and the series of integrals of the f_n is equal to the integral of the series of the f_n.

Almost uniform convergence

If the domain of the functions is a measure space E then the related notion of almost uniform convergence can be defined. We say a sequence of functions (f_n) converges almost uniformly on E if for every δ > 0 there exists a measurable set E_δ with measure less than δ such that the sequence of functions (f_n) converges uniformly on E \ E_δ. In other words, almost uniform convergence means there are sets of arbitrarily small measure for which the sequence of functions converges uniformly on their complement.
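As an illustration (example chosen here, not from the text): f_n(x) = x^n on [0, 1] with Lebesgue measure converges almost uniformly, since deleting the small set (1 - δ, 1] of measure δ leaves the interval [0, 1 - δ], on which sup|f_n| = (1 - δ)^n → 0:

```python
# On [0, 1 - delta], x**n is increasing, so its sup is exactly
# (1 - delta)**n, which tends to 0 for every fixed delta > 0.
for delta in (0.1, 0.01):
    sups = [(1 - delta) ** n for n in (10, 100, 1000)]
    print(delta, sups)   # -> 0, so convergence is uniform off a small set
```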

Note that almost uniform convergence of a sequence does not mean that the sequence converges uniformly almost everywhere as might be inferred from the name. However, Egorov's theorem does guarantee that on a finite measure space, a sequence of functions that converges almost everywhere also converges almost uniformly on the same set.

Almost uniform convergence implies almost everywhere convergence and convergence in measure.


Notes

  1. Sørensen, Henrik Kragh (2005). "Exceptions and counterexamples: Understanding Abel's comment on Cauchy's Theorem". Historia Mathematica. 32 (4): 453–480. doi:10.1016/j.hm.2004.11.010.
  2. Jahnke, Hans Niels (2003). "6.7 The Foundation of Analysis in the 19th Century: Weierstrass". A History of Analysis. AMS Bookstore. p. 184. ISBN 978-0-8218-2623-2.
  3. Lakatos, Imre (1976). Proofs and Refutations. Cambridge University Press. p. 141. ISBN 978-0-521-21078-2.
  4. Rudin, Walter (1976). Principles of Mathematical Analysis (3rd ed.), Theorem 7.17. New York: McGraw-Hill.


References

"Uniform convergence", Encyclopedia of Mathematics , EMS Press, 2001 [1994]