Slowly varying function

In real analysis, a branch of mathematics, a slowly varying function is a function of a real variable whose behaviour at infinity is in some sense similar to the behaviour of a function converging at infinity. Similarly, a regularly varying function is a function of a real variable whose behaviour at infinity is similar to the behaviour of a power law function (like a polynomial) near infinity. These classes of functions were both introduced by Jovan Karamata, [1] [2] and have found several important applications, for example in probability theory.

Basic definitions

Definition 1. A measurable function L : (0, +∞) → (0, +∞) is called slowly varying (at infinity) if for all a > 0,

\lim_{x \to \infty} \frac{L(ax)}{L(x)} = 1 .
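For instance, the natural logarithm L(x) = ln x is slowly varying, since for every fixed a > 0,

\lim_{x \to \infty} \frac{\ln(ax)}{\ln x} = \lim_{x \to \infty} \left( 1 + \frac{\ln a}{\ln x} \right) = 1 .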

Definition 2. Let L : (0, +∞) → (0, +∞). Then L is a regularly varying function if and only if, for every a > 0, the limit

g(a) = \lim_{x \to \infty} \frac{L(ax)}{L(x)}

exists. In particular, the limit must be finite and positive.
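For example, every power function is regularly varying: if f(x) = x^{2}, then f(ax)/f(x) = a^{2} for every x, so g(a) = a^{2}. By contrast, the exponential function f(x) = e^{x} is not regularly varying, since e^{ax}/e^{x} = e^{(a-1)x} diverges as x → ∞ whenever a > 1.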

These definitions are due to Jovan Karamata. [1] [2]

Note. The sum of two slowly varying functions is again a slowly varying function, and likewise the sum of two regularly varying functions with the same index is again regularly varying with that index.

Basic properties

Regularly varying functions have some important properties: [1] a partial list of them is reported below. More extensive analyses of the properties characterizing regular variation are presented in the monograph by Bingham, Goldie & Teugels (1987).

Uniformity of the limiting behaviour

Theorem 1. The limit in definitions 1 and 2 is uniform if a is restricted to a compact interval.
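In the slowly varying case, for instance, this means that for every compact interval [c, d] ⊂ (0, +∞),

\sup_{a \in [c, d]} \left| \frac{L(ax)}{L(x)} - 1 \right| \to 0 \qquad \text{as } x \to \infty .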

Karamata's characterization theorem

Theorem 2. Every regularly varying function f : (0, +∞) → (0, +∞) is of the form

f(x) = x^{\rho} L(x) ,

where ρ is a real number and L is a slowly varying function.
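For example, f(x) = \sqrt{x} \, \ln x is regularly varying with index 1/2: it factors as x^{1/2} L(x) with the slowly varying factor L(x) = \ln x, and indeed f(ax)/f(x) \to a^{1/2} for every a > 0.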

Note. This implies that the function g(a) in definition 2 is necessarily of the form

g(a) = a^{\rho} ,

where the real number ρ is called the index of regular variation.
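The reason is that g satisfies the multiplicative Cauchy functional equation: for all a, b > 0,

g(ab) = \lim_{x \to \infty} \frac{f(abx)}{f(bx)} \cdot \frac{f(bx)}{f(x)} = g(a) \, g(b) ,

and the only measurable solutions of this equation with values in (0, +∞) are the power functions g(a) = a^{\rho}.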

Karamata representation theorem

Theorem 3. A function L is slowly varying if and only if there exists B > 0 such that for all x ≥ B the function can be written in the form

L(x) = \exp\left( \eta(x) + \int_{B}^{x} \frac{\varepsilon(t)}{t} \, dt \right) ,

where η(x) is a bounded measurable function converging to a finite number as x tends to infinity, and ε(x) is a bounded measurable function converging to zero as x tends to infinity.
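For instance, the slowly varying function L(x) = \ln x admits such a representation with B = e, η ≡ 0 and ε(t) = 1/\ln t, since

\exp\left( \int_{e}^{x} \frac{dt}{t \ln t} \right) = \exp\left( \ln \ln x - \ln \ln e \right) = \ln x .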

Examples

  - If L has a limit \lim_{x \to \infty} L(x) = b \in (0, +\infty), then L is a slowly varying function.
  - For any real β, the function L(x) = (\ln x)^{\beta} is slowly varying.
  - The function L(x) = x is not slowly varying, and neither is L(x) = x^{\beta} for any real β ≠ 0.

Notes

  1. See Galambos & Seneta (1973).
  2. See Bingham, Goldie & Teugels (1987).


References

  - Bingham, N. H.; Goldie, C. M.; Teugels, J. L. (1987). Regular Variation. Encyclopedia of Mathematics and its Applications. Cambridge University Press.
  - Galambos, J.; Seneta, E. (1973). "Regularly varying sequences". Proceedings of the American Mathematical Society. 41: 110–116.