Dini's theorem

In the mathematical field of analysis, Dini's theorem says that if a monotone sequence of continuous functions converges pointwise on a compact space and if the limit function is also continuous, then the convergence is uniform.[1]

Formal statement

If $X$ is a compact topological space, and $(f_n)_{n\in\mathbb{N}}$ is a monotonically increasing sequence (meaning $f_n(x) \le f_{n+1}(x)$ for all $n \in \mathbb{N}$ and $x \in X$) of continuous real-valued functions on $X$ which converges pointwise to a continuous function $f \colon X \to \mathbb{R}$, then the convergence is uniform. The same conclusion holds if $(f_n)_{n\in\mathbb{N}}$ is monotonically decreasing instead of increasing. The theorem is named after Ulisse Dini.[2]
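In symbols, with the notation above (a restatement added here for clarity, not quoted from the cited sources), the hypotheses and conclusion read:

$$
f_1 \le f_2 \le f_3 \le \cdots \quad\text{and}\quad \lim_{n\to\infty} f_n(x) = f(x) \ \text{ for every } x \in X
\quad\Longrightarrow\quad
\lim_{n\to\infty} \, \sup_{x \in X} \bigl| f_n(x) - f(x) \bigr| = 0 .
$$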

This is one of the few situations in mathematics where pointwise convergence implies uniform convergence; the key is the greater control implied by the monotonicity. The limit function must be continuous, since a uniform limit of continuous functions is necessarily continuous. The continuity of the limit function cannot be inferred from the other hypotheses (consider $x^n$ in $[0,1]$).
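To spell that example out (a standard illustration, added here for concreteness): the functions $f_n(x) = x^n$ are continuous and monotonically decreasing on the compact interval $[0,1]$, yet their pointwise limit

$$
f(x) = \lim_{n\to\infty} x^n =
\begin{cases}
0, & 0 \le x < 1, \\
1, & x = 1,
\end{cases}
$$

is not continuous, and the convergence is indeed not uniform, since $\sup_{x\in[0,1]} |x^n - f(x)| = 1$ for every $n$.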

Proof

Let $\varepsilon > 0$ be given. For each $n \in \mathbb{N}$, let $g_n = f - f_n$, and let $E_n$ be the set of those $x \in X$ such that $g_n(x) < \varepsilon$. Each $g_n$ is continuous, and so each $E_n$ is open (because each $E_n$ is the preimage of the open set $(-\infty, \varepsilon)$ under $g_n$, a continuous function). Since $(f_n)_{n\in\mathbb{N}}$ is monotonically increasing, $(g_n)_{n\in\mathbb{N}}$ is monotonically decreasing, so the sequence $(E_n)_{n\in\mathbb{N}}$ is ascending (i.e. $E_n \subseteq E_{n+1}$ for all $n \in \mathbb{N}$). Since $(f_n)_{n\in\mathbb{N}}$ converges pointwise to $f$, every point of $X$ lies in some $E_n$, so the collection $\{E_n : n \in \mathbb{N}\}$ is an open cover of $X$. By compactness, there is a finite subcover, and since the $E_n$ are ascending, the largest member of that subcover covers $X$ as well. Thus there is some positive integer $N$ such that $E_N = X$. That is, if $n \ge N$ and $x$ is a point in $X$, then $|f(x) - f_n(x)| < \varepsilon$, as desired.
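The conclusion is easy to observe numerically. The following sketch (an illustration added here, not drawn from the cited sources; the example and all names in it are chosen for this snippet) takes the partial sums of the exponential series, which increase monotonically to the continuous function $e^x$ on the compact interval $[0,1]$, and prints their sup-norm distance from the limit, which Dini's theorem guarantees tends to $0$:

```python
# Illustration of Dini's theorem on a concrete example (chosen for this sketch):
# f_n(x) = sum_{k=0}^{n} x^k / k! increases monotonically to f(x) = exp(x)
# on the compact interval [0, 1], and f is continuous, so the convergence
# is uniform, i.e. the maximum of |f(x) - f_n(x)| over x tends to 0.
import numpy as np

x = np.linspace(0.0, 1.0, 1001)        # grid approximating the compact interval [0, 1]
f = np.exp(x)                          # continuous pointwise limit

partial_sum = np.zeros_like(x)         # f_n, built up term by term
term = np.ones_like(x)                 # current term x^k / k!, starting at k = 0
for n in range(8):
    partial_sum = partial_sum + term   # now partial_sum = sum_{k=0}^{n} x^k / k!
    term = term * x / (n + 1)          # next term, x^(n+1) / (n+1)!
    sup_error = np.max(np.abs(f - partial_sum))   # sup-norm error on the grid
    print(f"n = {n}: sup |f - f_n| = {sup_error:.2e}")
```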

Notes

  1. Edwards 1994, p. 165. Friedman 2007, p. 199. Graves 2009, p. 121. Thomson, Bruckner & Bruckner 2008, p. 385.
  2. According to Edwards 1994, p. 165, "[This theorem] is called Dini's theorem because Ulisse Dini (1845–1918) presented the original version of it in his book on the theory of functions of a real variable, published in Pisa in 1878".

Related Research Articles

In mathematics, a continuous function is a function such that a small variation of the argument induces a small variation of the value of the function. This implies there are no abrupt changes in value, known as discontinuities. More precisely, a function is continuous if arbitrarily small changes in its value can be assured by restricting to sufficiently small changes of its argument. A discontinuous function is a function that is not continuous. Until the 19th century, mathematicians largely relied on intuitive notions of continuity and considered only continuous functions. The epsilon–delta definition of a limit was introduced to formalize the definition of continuity.

In mathematics, the branch of real analysis studies the behavior of real numbers, sequences and series of real numbers, and real functions. Some particular properties of real-valued sequences and functions that real analysis studies include convergence, limits, continuity, smoothness, differentiability and integrability.

Uniform convergence: mode of convergence of a function sequence

In the mathematical field of analysis, uniform convergence is a mode of convergence of functions stronger than pointwise convergence, in the sense that the convergence is uniform over the domain. A sequence of functions $(f_n)$ converges uniformly to a limiting function $f$ on a set $E$ (the function domain) if, given any arbitrarily small positive number $\varepsilon$, a number $N$ can be found such that each of the functions $f_N, f_{N+1}, f_{N+2}, \ldots$ differs from $f$ by no more than $\varepsilon$ at every point $x$ in $E$. Described in an informal way, if $f_n$ converges to $f$ uniformly, then how quickly the functions $f_n$ approach $f$ is "uniform" throughout $E$ in the following sense: in order to guarantee that $f_n(x)$ differs from $f(x)$ by less than a chosen distance $\varepsilon$, we only need to make sure that $n$ is larger than or equal to a certain $N$, which we can find without knowing the value of $x \in E$ in advance. In other words, there exists a number $N = N(\varepsilon)$ that could depend on $\varepsilon$ but is independent of $x$, such that choosing $n \ge N$ will ensure that $|f_n(x) - f(x)| < \varepsilon$ for all $x \in E$. In contrast, pointwise convergence of $f_n$ to $f$ merely guarantees that for any $x \in E$ given in advance, we can find $N = N(\varepsilon, x)$ such that, for that particular $x$, $f_n(x)$ falls within $\varepsilon$ of $f(x)$ whenever $n \ge N$.
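Written out with quantifiers (a standard restatement in the notation just used), the two modes of convergence differ only in the order of the quantifiers:

$$
\text{uniform:}\quad \forall \varepsilon > 0\ \ \exists N\ \ \forall n \ge N\ \ \forall x \in E:\ \ |f_n(x) - f(x)| < \varepsilon,
$$

$$
\text{pointwise:}\quad \forall x \in E\ \ \forall \varepsilon > 0\ \ \exists N\ \ \forall n \ge N:\ \ |f_n(x) - f(x)| < \varepsilon.
$$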

In probability theory, there exist several different notions of convergence of sequences of random variables. The different notions of convergence capture different properties about the sequence, with some notions of convergence being stronger than others. For example, convergence in distribution tells us about the limit distribution of a sequence of random variables. This is a weaker notion than convergence in probability, which tells us about the value a random variable will take, rather than just the distribution.

In mathematics, the uniform boundedness principle or Banach–Steinhaus theorem is one of the fundamental results in functional analysis. Together with the Hahn–Banach theorem and the open mapping theorem, it is considered one of the cornerstones of the field. In its basic form, it asserts that for a family of continuous linear operators whose domain is a Banach space, pointwise boundedness is equivalent to uniform boundedness in operator norm.
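In symbols (one common formulation, with notation chosen here): if $X$ is a Banach space, $Y$ a normed vector space, and $\{T_\alpha\}_{\alpha \in A}$ a family of bounded linear operators from $X$ to $Y$, then

$$
\sup_{\alpha \in A} \|T_\alpha x\|_Y < \infty \ \text{ for every } x \in X
\quad\Longrightarrow\quad
\sup_{\alpha \in A} \|T_\alpha\| < \infty .
$$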

Limit of a sequence: value to which an infinite sequence tends

In mathematics, the limit of a sequence is the value that the terms of a sequence "tend to", and is often denoted using the $\lim$ symbol (e.g., $\lim_{n\to\infty} a_n$). If such a limit exists, the sequence is called convergent. A sequence that does not converge is said to be divergent. The limit of a sequence is said to be the fundamental notion on which the whole of mathematical analysis ultimately rests.

In mathematics, Fatou's lemma establishes an inequality relating the Lebesgue integral of the limit inferior of a sequence of functions to the limit inferior of integrals of these functions. The lemma is named after Pierre Fatou.
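Stated as a formula (for a sequence of nonnegative measurable functions $f_n$ on a measure space $(X, \mu)$, notation chosen here):

$$
\int_X \liminf_{n\to\infty} f_n \, d\mu \;\le\; \liminf_{n\to\infty} \int_X f_n \, d\mu .
$$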

In measure theory, Lebesgue's dominated convergence theorem provides sufficient conditions under which almost everywhere convergence of a sequence of functions implies convergence in the L1 norm. Its power and utility are two of the primary theoretical advantages of Lebesgue integration over Riemann integration.
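In symbols (one standard formulation, with notation chosen here): if $f_n \to f$ pointwise almost everywhere and $|f_n| \le g$ for some integrable function $g$, then $f$ is integrable and

$$
\lim_{n\to\infty} \int_X |f_n - f| \, d\mu = 0, \qquad \text{in particular} \qquad \lim_{n\to\infty} \int_X f_n \, d\mu = \int_X f \, d\mu .
$$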

In mathematical analysis, a family of functions is equicontinuous if all the functions are continuous and they have equal variation over a given neighbourhood, in a precise sense described herein. In particular, the concept applies to countable families, and thus sequences of functions.
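One common precise version (for a family $F$ of functions between metric spaces, with notation chosen here): $F$ is equicontinuous at a point $x_0$ if

$$
\forall \varepsilon > 0\ \ \exists \delta > 0\ \ \forall f \in F\ \ \forall x:\ \ d(x, x_0) < \delta \implies d\bigl(f(x), f(x_0)\bigr) < \varepsilon,
$$

and equicontinuous if this holds at every point of the domain.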

The Arzelà–Ascoli theorem is a fundamental result of mathematical analysis giving necessary and sufficient conditions to decide whether every sequence of a given family of real-valued continuous functions defined on a closed and bounded interval has a uniformly convergent subsequence. The main condition is the equicontinuity of the family of functions. The theorem is the basis of many proofs in mathematics, including that of the Peano existence theorem in the theory of ordinary differential equations, Montel's theorem in complex analysis, the Peter–Weyl theorem in harmonic analysis, and various results concerning compactness of integral operators.

In mathematics, the Dirichlet–Jordan test gives sufficient conditions for a real-valued, periodic function f to be equal to the sum of its Fourier series at a point of continuity. Moreover, the behavior of the Fourier series at points of discontinuity is determined as well. It is one of many conditions for the convergence of Fourier series.

In measure theory, an area of mathematics, Egorov's theorem establishes a condition for the uniform convergence of a pointwise convergent sequence of measurable functions. It is also named Severini–Egoroff theorem or Severini–Egorov theorem, after Carlo Severini, an Italian mathematician, and Dmitri Egorov, a Russian physicist and geometer, who published independent proofs respectively in 1910 and 1911.
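In symbols (for a measure space $(X, \mu)$, notation chosen here): if $f_n \to f$ pointwise almost everywhere on a measurable set $A$ with $\mu(A) < \infty$, then for every $\varepsilon > 0$ there is a measurable set $B \subseteq A$ with $\mu(A \setminus B) < \varepsilon$ such that

$$
\sup_{x \in B} |f_n(x) - f(x)| \longrightarrow 0 \quad \text{as } n \to \infty .
$$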

In the mathematical field of real analysis, Lusin's theorem or Lusin's criterion states that an almost-everywhere finite function is measurable if and only if it is a continuous function on nearly all its domain. In the informal formulation of J. E. Littlewood, "every measurable function is nearly continuous".

In the theory of probability, the Glivenko–Cantelli theorem, named after Valery Ivanovich Glivenko and Francesco Paolo Cantelli, describes the asymptotic behaviour of the empirical distribution function as the number of independent and identically distributed observations grows. Specifically, the empirical distribution function converges uniformly to the true distribution function almost surely.
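In symbols (with $F_n$ the empirical distribution function of the first $n$ observations and $F$ the true distribution function):

$$
\sup_{x \in \mathbb{R}} \bigl| F_n(x) - F(x) \bigr| \longrightarrow 0 \quad \text{almost surely as } n \to \infty .
$$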

In mathematics, a sequence of functions $(f_n)$ from a set $S$ to a metric space $(M, d)$ is said to be uniformly Cauchy if, for every $\varepsilon > 0$, there exists $N$ such that $d\bigl(f_m(x), f_n(x)\bigr) < \varepsilon$ for all $x \in S$ whenever $m, n \ge N$.

Convergence in measure is either of two distinct mathematical concepts both of which generalize the concept of convergence in probability.

Dvoretzky–Kiefer–Wolfowitz inequality: statistical inequality

In the theory of probability and statistics, the Dvoretzky–Kiefer–Wolfowitz–Massart inequality provides a bound on the worst case distance of an empirically determined distribution function from its associated population distribution function. It is named after Aryeh Dvoretzky, Jack Kiefer, and Jacob Wolfowitz, who in 1956 proved the inequality

$$
\Pr\Bigl(\sup_{x \in \mathbb{R}} \bigl| F_n(x) - F(x) \bigr| > \varepsilon\Bigr) \le C e^{-2n\varepsilon^2}
$$

with an unspecified multiplicative constant $C$; Pascal Massart later established it with the sharp constant $C = 2$.

In real analysis and measure theory, the Vitali convergence theorem, named after the Italian mathematician Giuseppe Vitali, is a generalization of the better-known dominated convergence theorem of Henri Lebesgue. It is a characterization of convergence in $L^p$ in terms of convergence in measure and a condition related to uniform integrability.

Uniform limit theorem

In mathematics, the uniform limit theorem states that the uniform limit of any sequence of continuous functions is continuous.

Convergence proof techniques are canonical components of mathematical proofs that sequences or functions converge to a finite limit when the argument tends to infinity.

References