In mathematics, a **limit** is the value that a function (or sequence) approaches as the input (or index) approaches some value.^{ [1] } Limits are essential to calculus and mathematical analysis, and are used to define continuity, derivatives, and integrals.

- Limit of a function
- Limit of a sequence
- Limit as "standard part"
- Convergence and fixed point
- Computability of the limit
- See also
- Notes
- References
- External links

The concept of a limit of a sequence is further generalized to the concept of a limit of a topological net, and is closely related to limit and direct limit in category theory.

In formulas, a limit of a function is usually written as

$$\lim_{x \to c} f(x) = L,$$

and is read as "the limit of *f* of x as x approaches c equals *L*". The fact that a function *f* approaches the limit *L* as x approaches c is sometimes denoted by a right arrow (→), as in

$$f(x) \to L \text{ as } x \to c,$$

which reads "*f* of x tends to *L* as x tends to c".^{ [2] }

Suppose *f* is a real-valued function and c is a real number. Intuitively speaking, the expression

$$\lim_{x \to c} f(x) = L$$

means that *f*(*x*) can be made to be as close to *L* as desired, by making x sufficiently close to c.^{ [3] } In that case, the above equation can be read as "the limit of *f* of x, as x approaches c, is *L*".

Augustin-Louis Cauchy in 1821,^{ [4] } followed by Karl Weierstrass, formalized the definition of the limit of a function which became known as the (ε, δ)-definition of limit. The definition uses ε (the lowercase Greek letter *epsilon*)^{ [2] } to represent any small positive number, so that "*f*(*x*) becomes arbitrarily close to *L*" means that *f*(*x*) eventually lies in the interval (*L* − *ε*, *L* + *ε*), which can also be written using the absolute value sign as |*f*(*x*) − *L*| < *ε*.^{ [4] } The phrase "as x approaches c" then indicates that we refer to values of x, whose distance from c is less than some positive number δ (the lower case Greek letter *delta*)—that is, values of x within either (*c* − *δ*, *c*) or (*c*, *c* + *δ*), which can be expressed with 0 < |*x* − *c*| < *δ*. The first inequality means that the distance between x and c is greater than 0 and that *x* ≠ *c*, while the second indicates that x is within distance δ of c.^{ [4] }
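The verbal description above can be condensed into quantifier form; the following display is a compact restatement of the (ε, δ)-definition just described:

```latex
\lim_{x \to c} f(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\, \exists \delta > 0 \;\, \forall x \in \mathbb{R} :
\; 0 < |x - c| < \delta \implies |f(x) - L| < \varepsilon .
```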

The above definition of a limit is true even if *f*(*c*) ≠ *L*. Indeed, the function *f* need not even be defined at c.

For example, if

$$f(x) = \frac{x^2 - 1}{x - 1},$$

then *f*(1) is not defined (see indeterminate forms), yet as x moves arbitrarily close to 1, *f*(*x*) correspondingly approaches 2:^{ [5] }

| *x* | 0.9 | 0.99 | 0.999 | 1.0 | 1.001 | 1.01 | 1.1 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| *f*(*x*) | 1.900 | 1.990 | 1.999 | undefined | 2.001 | 2.010 | 2.100 |

Thus, *f*(*x*) can be made arbitrarily close to the limit of 2—just by making x sufficiently close to 1.

In other words,

$$\lim_{x \to 1} \frac{x^2 - 1}{x - 1} = 2.$$

This can also be calculated algebraically, as $\frac{x^2 - 1}{x - 1} = \frac{(x + 1)(x - 1)}{x - 1} = x + 1$ for all real numbers *x* ≠ 1.

Now, since *x* + 1 is continuous in x at 1, we can plug in 1 for x, leading to the equation

$$\lim_{x \to 1} \frac{x^2 - 1}{x - 1} = 1 + 1 = 2.$$
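As a quick numerical sketch (not part of the original article), one can tabulate *f* near 1 and confirm that the values approach 2 from both sides:

```python
def f(x):
    # f(x) = (x^2 - 1) / (x - 1); undefined at x = 1 (0/0 form)
    return (x**2 - 1) / (x - 1)

# Sample x on both sides of 1; the values approach the limit 2
for x in [0.9, 0.99, 0.999, 1.001, 1.01, 1.1]:
    print(f"f({x}) = {f(x):.3f}")
```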

In addition to limits at finite values, functions can also have limits at infinity. For example, consider the function

$$f(x) = \frac{2x - 1}{x},$$

where:

- *f*(100) = 1.9900
- *f*(1000) = 1.9990
- *f*(10000) = 1.9999

As x becomes extremely large, the value of *f*(*x*) approaches 2, and the value of *f*(*x*) can be made as close to 2 as one could wish—by making x sufficiently large. So in this case, the limit of *f*(*x*) as x approaches infinity is 2, or in mathematical notation,

$$\lim_{x \to \infty} \frac{2x - 1}{x} = 2.$$
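A small numerical sketch (our own, not from the article) makes the behavior at infinity concrete: since (2x − 1)/x = 2 − 1/x, the values creep up toward 2 as x grows.

```python
def f(x):
    # f(x) = (2x - 1) / x = 2 - 1/x, which tends to 2 as x grows
    return (2 * x - 1) / x

# The gap to the limit shrinks like 1/x
for x in [100, 1000, 10000]:
    print(f"f({x}) = {f(x)}")
```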

Consider the following sequence: 1.79, 1.799, 1.7999, … It can be observed that the numbers are "approaching" 1.8, the limit of the sequence.

Formally, suppose *a*_{1}, *a*_{2}, … is a sequence of real numbers. One can state that the real number *L* is the *limit* of this sequence, namely:

$$\lim_{n \to \infty} a_n = L,$$
which is read as

- "The limit of *a*_{n} as *n* approaches infinity equals *L*"

if and only if

- For every real number *ε* > 0, there exists a natural number *N* such that for all *n* > *N*, we have |*a*_{n} − *L*| < *ε*.^{ [6] }

Intuitively, this means that eventually, all elements of the sequence get arbitrarily close to the limit, since the absolute value |*a*_{n} − *L*| is the distance between *a*_{n} and *L*. Not every sequence has a limit; if it does, then it is called *convergent*, and if it does not, then it is *divergent*. One can show that a convergent sequence has only one limit.
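The ε–N definition can be checked mechanically for the sequence 1.79, 1.799, 1.7999, … from above. In this sketch (our own illustration), we model the sequence as a_n = 1.8 − 10^−(n+1) and search for the smallest N that works for a given ε:

```python
def a(n):
    # The sequence 1.79, 1.799, 1.7999, ... from the text: a_n = 1.8 - 10^-(n+1)
    return 1.8 - 10 ** -(n + 1)

def find_N(eps, limit=1.8, max_n=100):
    # Smallest N such that |a_n - limit| < eps for all n > N
    # (verified only up to max_n, since we cannot check infinitely many terms)
    for N in range(max_n):
        if all(abs(a(n) - limit) < eps for n in range(N + 1, max_n)):
            return N
    return None

# For eps = 0.005, every term after a_1 is within 0.005 of the limit 1.8
print(find_N(0.005))
```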

The limit of a sequence and the limit of a function are closely related. On one hand, the limit as n approaches infinity of a sequence {*a*_{n}} is simply the limit at infinity of a function *a*(*n*) defined on the natural numbers. On the other hand, if *X* is the domain of a function *f*(*x*) and if the limit as n approaches infinity of *f*(*x*_{n}) is *L* for *every* sequence of points {*x*_{n}} in *X* − {*x*_{0}} which converges to *x*_{0}, then the limit of the function *f*(*x*) as *x* approaches *x*_{0} is *L*.^{ [7] } One such sequence would be {*x*_{0} + 1/*n*}.

In non-standard analysis (which involves a hyperreal enlargement of the number system), the limit of a sequence (*a*_{n}) can be expressed as the standard part of the value *a*_{H} of the natural extension of the sequence at an infinite hypernatural index *n* = *H*. Thus,

$$\lim_{n \to \infty} a_n = \operatorname{st}(a_H).$$

Here, the standard part function "st" rounds off each finite hyperreal number to the nearest real number (the difference between them is infinitesimal). This formalizes the natural intuition that for "very large" values of the index, the terms in the sequence are "very close" to the limit value of the sequence. Conversely, the standard part of a hyperreal represented in the ultrapower construction by a Cauchy sequence (*a*_{n}) is simply the limit of that sequence:

$$\operatorname{st}([a_n]) = \lim_{n \to \infty} a_n.$$

In this sense, taking the limit and taking the standard part are equivalent procedures.

A formal definition of convergence can be stated as follows. Suppose {*p*_{n}} as *n* goes from 0 to ∞ is a sequence that converges to *p*, with *p*_{n} ≠ *p* for all *n*. If positive constants *λ* and *α* exist with

$$\lim_{n \to \infty} \frac{|p_{n+1} - p|}{|p_n - p|^{\alpha}} = \lambda,$$

then {*p*_{n}} as *n* goes from 0 to ∞ converges to *p* of order *α*, with asymptotic error constant *λ*.
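The order α can be estimated numerically from three successive errors, since the definition implies log(e_{n+1}/e_n) ≈ α · log(e_n/e_{n−1}) once the errors are small. As an illustrative sketch (the Newton iteration for √2 is our own example, not from the article):

```python
import math

# Newton iteration for sqrt(2): x_{n+1} = (x_n + 2/x_n) / 2,
# a classic example of quadratic convergence (order alpha = 2)
p = math.sqrt(2)
xs = [1.0]
for _ in range(5):
    x = xs[-1]
    xs.append((x + 2 / x) / 2)

errors = [abs(x - p) for x in xs]

# Estimate the order alpha from three successive errors:
# alpha ~ log(e_{n+1}/e_n) / log(e_n/e_{n-1})
alpha = math.log(errors[3] / errors[2]) / math.log(errors[2] / errors[1])
print(f"estimated order = {alpha:.2f}")  # close to 2 for quadratic convergence
```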

Given a function *f* with a fixed point *p*, there is a useful checklist for checking the convergence of the sequence *p*_{n} = *f*(*p*_{n−1}).

- First check that *p* is indeed a fixed point: *f*(*p*) = *p*.
- Check for linear convergence. Start by finding \|*f*′(*p*)\|. If…

| If… | Then… |
| --- | --- |
| \|*f*′(*p*)\| ∈ (0, 1) | there is linear convergence |
| \|*f*′(*p*)\| > 1 | the sequence diverges |
| \|*f*′(*p*)\| = 0 | there is at least linear convergence and maybe something better; the expression should be checked for quadratic convergence |

- If it is found that there is something better than linear, check for quadratic convergence. Start by finding \|*f*″(*p*)\|. If…

| If… | Then… |
| --- | --- |
| \|*f*″(*p*)\| ≠ 0 | there is quadratic convergence, provided that *f*″ is continuous |
| \|*f*″(*p*)\| = 0 | there is something even better than quadratic convergence |
| \|*f*″(*p*)\| does not exist | there is convergence that is better than linear but still not quadratic |

^{ [8] }
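The checklist above can be walked through numerically. In this sketch (the map f(x) = cos x is our own illustrative choice, not from the article), the iteration converges to the fixed point p ≈ 0.739, and since 0 < |f′(p)| = |−sin(p)| < 1, the first table predicts linear convergence:

```python
import math

def f(x):
    # Illustrative fixed-point map: f(x) = cos(x)
    return math.cos(x)

# Locate the fixed point p with f(p) = p by iterating p_{n} = f(p_{n-1})
p = 1.0
for _ in range(200):
    p = f(p)

assert abs(f(p) - p) < 1e-12  # step 1: p is (numerically) a fixed point

# step 2: |f'(p)| = |-sin(p)| decides the convergence type
deriv = abs(-math.sin(p))
print(f"p = {p:.7f}, |f'(p)| = {deriv:.4f}")
# Since |f'(p)| lies in (0, 1), the iteration converges linearly
```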

Limits can be difficult to compute. There exist limit expressions whose modulus of convergence is undecidable. In recursion theory, the limit lemma proves that it is possible to encode undecidable problems using limits.^{ [9] }


- Asymptotic analysis: a method of describing limiting behavior
- Big O notation: used to describe the limiting behavior of a function when the argument tends towards a particular value or infinity

- Banach limit defined on the Banach space that extends the usual limits.
- Cauchy sequence
- Convergence of random variables
- Convergent matrix
- Limit in category theory
- Limit of a function
- One-sided limit: either of the two limits of functions of a real variable *x*, as *x* approaches a point from above or below
- List of limits: list of limits for common functions
- Squeeze theorem: finds a limit of a function via comparison with two other functions

- Limit point
- Limit set
- Limit superior and limit inferior
- Modes of convergence
- Rate of convergence: the rate at which a convergent sequence approaches its limit

1. Stewart, James (2008). *Calculus: Early Transcendentals* (6th ed.). Brooks/Cole. ISBN 978-0-495-01166-8.
2. "List of Calculus and Analysis Symbols". *Math Vault*. 2020-05-11. Retrieved 2020-08-18.
3. Weisstein, Eric W. "Epsilon-Delta Definition". *mathworld.wolfram.com*. Retrieved 2020-08-18.
4. Larson, Ron; Edwards, Bruce H. (2010). *Calculus of a Single Variable* (9th ed.). Brooks/Cole, Cengage Learning. ISBN 978-0-547-20998-2.
5. "limit | Definition, Example, & Facts". *Encyclopedia Britannica*. Retrieved 2020-08-18.
6. Weisstein, Eric W. "Limit". *mathworld.wolfram.com*. Retrieved 2020-08-18.
7. Apostol (1974), pp. 75–76.
8. Burden, Richard L.; Faires, J. Douglas. *Numerical Analysis* (8th ed.), Section 2.4: Error Analysis for Iterative Methods.
9. Soare, Robert I. *Recursively Enumerable Sets and Degrees*.


- Apostol, Tom M. (1974). *Mathematical Analysis* (2nd ed.). Menlo Park: Addison-Wesley. LCCN 72011473.


This page is based on this Wikipedia article

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
