Chebyshev's sum inequality


In mathematics, Chebyshev's sum inequality, named after Pafnuty Chebyshev, states that if

$$a_1 \geq a_2 \geq \cdots \geq a_n \quad \text{and} \quad b_1 \geq b_2 \geq \cdots \geq b_n,$$

then

$$\frac{1}{n} \sum_{k=1}^{n} a_k b_k \geq \left( \frac{1}{n} \sum_{k=1}^{n} a_k \right) \left( \frac{1}{n} \sum_{k=1}^{n} b_k \right).$$

Similarly, if

$$a_1 \leq a_2 \leq \cdots \leq a_n \quad \text{and} \quad b_1 \geq b_2 \geq \cdots \geq b_n,$$

then

$$\frac{1}{n} \sum_{k=1}^{n} a_k b_k \leq \left( \frac{1}{n} \sum_{k=1}^{n} a_k \right) \left( \frac{1}{n} \sum_{k=1}^{n} b_k \right).$$[1]
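As an illustrative check, take $n = 3$ with the non-increasing sequences $a = (3, 2, 1)$ and $b = (6, 4, 2)$:

$$\frac{1}{3}\left(3 \cdot 6 + 2 \cdot 4 + 1 \cdot 2\right) = \frac{28}{3} \geq 8 = \left(\frac{3+2+1}{3}\right)\left(\frac{6+4+2}{3}\right).$$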

Proof

Consider the sum

$$S = \sum_{j=1}^{n} \sum_{k=1}^{n} (a_j - a_k)(b_j - b_k).$$

The two sequences are non-increasing, therefore $a_j - a_k$ and $b_j - b_k$ have the same sign for any $j, k$. Hence $S \geq 0$.
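For instance, when $n = 2$ the sum reduces to

$$S = 2(a_1 - a_2)(b_1 - b_2) \geq 0,$$

where the product is non-negative precisely because the two factors share a sign.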

Opening the brackets, we deduce:

$$0 \leq S = 2n \sum_{k=1}^{n} a_k b_k - 2 \sum_{j=1}^{n} a_j \sum_{k=1}^{n} b_k,$$

hence

$$\frac{1}{n} \sum_{k=1}^{n} a_k b_k \geq \left( \frac{1}{n} \sum_{k=1}^{n} a_k \right) \left( \frac{1}{n} \sum_{k=1}^{n} b_k \right).$$
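The expansion, and the inequality itself, can be checked numerically. Below is a minimal sketch in Python (the function name `chebyshev_sum_gap` and the random test setup are illustrative, not from the source): it draws random similarly ordered sequences and asserts both that the gap between the two sides of the inequality is non-negative and that the double sum $S$ equals $2n^2$ times that gap, matching the expansion above.

```python
import random

def chebyshev_sum_gap(a, b):
    """Difference between the two sides of Chebyshev's sum inequality:
    (1/n) * sum(a_k * b_k)  -  ((1/n) * sum(a_k)) * ((1/n) * sum(b_k)).
    Non-negative when a and b are similarly ordered.
    """
    n = len(a)
    return sum(x * y for x, y in zip(a, b)) / n - (sum(a) / n) * (sum(b) / n)

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 20)
    # Both sequences sorted non-decreasing, i.e. similarly ordered.
    a = sorted(random.uniform(-10, 10) for _ in range(n))
    b = sorted(random.uniform(-10, 10) for _ in range(n))
    gap = chebyshev_sum_gap(a, b)
    # The double sum S from the proof, expanded term by term.
    S = sum((a[j] - a[k]) * (b[j] - b[k]) for j in range(n) for k in range(n))
    assert gap >= -1e-9                     # Chebyshev's sum inequality
    assert abs(S - 2 * n * n * gap) < 1e-6  # S = 2n^2 * gap, as derived above
print("all checks passed")
```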

An alternative proof is simply obtained with the rearrangement inequality, writing that

$$\sum_{i=0}^{n-1} a_i \sum_{j=0}^{n-1} b_j = \sum_{k=0}^{n-1} \sum_{i=0}^{n-1} a_i b_{(i+k) \bmod n} \leq \sum_{k=0}^{n-1} \sum_{i=0}^{n-1} a_i b_i = n \sum_{i=0}^{n-1} a_i b_i,$$

since, by the rearrangement inequality, each inner sum $\sum_i a_i b_{(i+k) \bmod n}$ is at most the similarly ordered sum $\sum_i a_i b_i$.

Continuous version

There is also a continuous version of Chebyshev's sum inequality:

If $f$ and $g$ are real-valued, integrable functions over $[a, b]$, both non-increasing or both non-decreasing, then

$$\frac{1}{b-a} \int_a^b f(x) g(x) \, dx \geq \left( \frac{1}{b-a} \int_a^b f(x) \, dx \right) \left( \frac{1}{b-a} \int_a^b g(x) \, dx \right),$$

with the inequality reversed if one is non-increasing and the other is non-decreasing.
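For example, with $f(x) = x$ and $g(x) = x^2$, both non-decreasing on $[0, 1]$:

$$\int_0^1 x \cdot x^2 \, dx = \frac{1}{4} \geq \frac{1}{6} = \left( \int_0^1 x \, dx \right) \left( \int_0^1 x^2 \, dx \right).$$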

Notes

  1. Hardy, G. H.; Littlewood, J. E.; Pólya, G. (1988). Inequalities. Cambridge Mathematical Library. Cambridge: Cambridge University Press. ISBN 0-521-35880-9. MR 0944909.
