Etemadi's inequality

In probability theory, Etemadi's inequality is a so-called "maximal inequality", an inequality that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The result is due to Nasrollah Etemadi.

Statement of the inequality

Let X1, ..., Xn be independent real-valued random variables defined on some common probability space, and let α ≥ 0. Let Sk denote the partial sum

    S_k = X_1 + X_2 + \cdots + X_k, \qquad k = 1, \ldots, n.

Then

    \Pr\Bigl( \max_{1 \le k \le n} |S_k| \ge 3\alpha \Bigr) \le 3 \max_{1 \le k \le n} \Pr\bigl( |S_k| \ge \alpha \bigr).

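As an illustration only, the following minimal sketch (not part of the original article) checks the inequality empirically for one arbitrary choice of distribution and parameters: i.i.d. standard normal X_k, with n, α and the number of trials chosen purely for demonstration.

```python
# Minimal Monte Carlo check of Etemadi's inequality for one illustrative setup:
# i.i.d. standard normal X_1, ..., X_n (an arbitrary choice of distribution).
import numpy as np

rng = np.random.default_rng(0)
n, alpha, trials = 20, 2.0, 100_000

# Rows are independent experiments; cumulative sums give S_1, ..., S_n per row.
X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)

# Empirical left-hand side: P( max_k |S_k| >= 3*alpha ).
lhs = np.mean(np.max(np.abs(S), axis=1) >= 3 * alpha)

# Empirical right-hand side: 3 * max_k P( |S_k| >= alpha ).
rhs = 3 * np.max(np.mean(np.abs(S) >= alpha, axis=0))

print(f"P(max|S_k| >= 3*alpha) ~ {lhs:.4f}  <=  3*max_k P(|S_k| >= alpha) ~ {rhs:.4f}")
```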
Remark

Suppose that the random variables Xk have common expected value zero. Apply Chebyshev's inequality to the right-hand side of Etemadi's inequality and replace α by α / 3. The result is Kolmogorov's inequality with an extra factor of 27 on the right-hand side:

    \Pr\Bigl( \max_{1 \le k \le n} |S_k| \ge \alpha \Bigr) \le \frac{27}{\alpha^{2}} \operatorname{Var}(S_n).

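Written out, the derivation sketched above runs as follows (Var denotes the variance; Chebyshev's inequality is used in the form Pr(|S_k| ≥ α/3) ≤ 9 Var(S_k)/α², and the final equality uses that, for independent mean-zero summands, Var(S_k) is nondecreasing in k):

    \Pr\Bigl( \max_{1 \le k \le n} |S_k| \ge \alpha \Bigr)
    \le 3 \max_{1 \le k \le n} \Pr\Bigl( |S_k| \ge \tfrac{\alpha}{3} \Bigr)
    \le 3 \max_{1 \le k \le n} \frac{9 \operatorname{Var}(S_k)}{\alpha^{2}}
    = \frac{27}{\alpha^{2}} \operatorname{Var}(S_n).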
Related Research Articles

In probability theory, the expected value of a random variable is a key aspect of its probability distribution. Intuitively, a random variable's expected value represents the average of a large number of independent realizations of the random variable. For example, the expected value of rolling a six-sided die is 3.5, because the average of all the numbers that come up converges to 3.5 as the number of rolls approaches infinity. The expected value is also known as the expectation, mathematical expectation, mean, or first moment.
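For the die example, the computation is simply

    \operatorname{E}[X] = \tfrac{1}{6}\,(1 + 2 + 3 + 4 + 5 + 6) = \tfrac{21}{6} = 3.5.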

In probability theory, Chebyshev's inequality guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean. In statistics, the rule is often called Chebyshev's theorem, as it concerns the range of standard deviations around the mean. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
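In symbols, one standard form of the inequality (for a random variable X with mean μ, finite non-zero standard deviation σ, and any k > 0) is

    \Pr\bigl( |X - \mu| \ge k\sigma \bigr) \le \frac{1}{k^{2}}.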

In mathematical analysis, Hölder's inequality, named after Otto Hölder, is a fundamental inequality between integrals and an indispensable tool for the study of Lp spaces.
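One standard statement, for measurable functions f and g and exponents p, q ∈ [1, ∞] with 1/p + 1/q = 1, is

    \| f g \|_{1} \le \| f \|_{p} \, \| g \|_{q}.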

In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant. It is named after the Russian mathematician Andrey Markov, although it appeared earlier in the work of Pafnuty Chebyshev, and many sources, especially in analysis, refer to it as Chebyshev's inequality or Bienaymé's inequality.
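In symbols, a common statement for a non-negative random variable X and any a > 0 is

    \Pr( X \ge a ) \le \frac{\operatorname{E}[X]}{a}.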

In mathematics, Fatou's lemma establishes an inequality relating the Lebesgue integral of the limit inferior of a sequence of functions to the limit inferior of integrals of these functions. The lemma is named after Pierre Fatou.
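In symbols, for a sequence of non-negative measurable functions f_n on a measure space with measure μ, the lemma reads

    \int \liminf_{n \to \infty} f_n \, d\mu \le \liminf_{n \to \infty} \int f_n \, d\mu.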

In probability theory, the factorial moment is a mathematical quantity defined as the expectation or average of the falling factorial of a random variable. Factorial moments are useful for studying non-negative integer-valued random variables, and arise in the use of probability-generating functions to derive the moments of discrete random variables.
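For a random variable X and a positive integer r, the r-th factorial moment is commonly written

    \operatorname{E}\bigl[ (X)_r \bigr] = \operatorname{E}\bigl[ X (X-1) (X-2) \cdots (X-r+1) \bigr].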

Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
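For two discrete random variables X and Y with joint probability mass function P(x, y), a standard formula (with the logarithm taken to base 2 when entropy is measured in bits) is

    H(X, Y) = - \sum_{x} \sum_{y} P(x, y) \, \log_2 P(x, y).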

In probability theory, a compound Poisson distribution is the probability distribution of the sum of a number of independent identically-distributed random variables, where the number of terms to be added is itself a Poisson-distributed variable. In the simplest cases, the result can be either a continuous or a discrete distribution.
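In symbols, such a variable can be written as

    Y = \sum_{i=1}^{N} X_i,

where N is Poisson-distributed and independent of the i.i.d. summands X_1, X_2, ....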

In probability theory, Hoeffding's inequality provides an upper bound on the probability that the sum of bounded independent random variables deviates from its expected value by more than a certain amount. Hoeffding's inequality was proven by Wassily Hoeffding in 1963.
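One common form, for independent random variables X_i with a_i ≤ X_i ≤ b_i, S_n = X_1 + ... + X_n, and any t > 0, is

    \Pr\bigl( S_n - \operatorname{E}[S_n] \ge t \bigr) \le \exp\!\Bigl( - \frac{2 t^{2}}{\sum_{i=1}^{n} (b_i - a_i)^{2}} \Bigr).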

In probability theory, Wald's equation, Wald's identity or Wald's lemma is an important identity that simplifies the calculation of the expected value of the sum of a random number of random quantities. In its simplest form, it relates the expectation of a sum of randomly many finite-mean, independent and identically distributed random variables to the expected number of terms in the sum and the random variables' common expectation under the condition that the number of terms in the sum is independent of the summands.
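In that simplest form, with N a non-negative integer-valued random variable of finite mean, independent of the i.i.d. integrable variables X_1, X_2, ..., the identity reads

    \operatorname{E}\Bigl[ \sum_{i=1}^{N} X_i \Bigr] = \operatorname{E}[N] \, \operatorname{E}[X_1].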

In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The entropy is named after Alfréd Rényi. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions.
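For a discrete distribution with probabilities p_1, ..., p_n, the Rényi entropy of order α (α ≥ 0, α ≠ 1) is usually defined as

    H_{\alpha}(X) = \frac{1}{1 - \alpha} \, \log\Bigl( \sum_{i=1}^{n} p_i^{\alpha} \Bigr).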

In mathematics, Grönwall's inequality allows one to bound a function that is known to satisfy a certain differential or integral inequality by the solution of the corresponding differential or integral equation. There are two forms of the lemma, a differential form and an integral form. For the latter there are several variants.
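A simple special case of the integral form: if u is continuous, β is continuous and non-negative on [a, b], C is a constant, and u(t) ≤ C + ∫_a^t β(s) u(s) ds for all t in [a, b], then

    u(t) \le C \exp\!\Bigl( \int_a^t \beta(s) \, ds \Bigr).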

In probability theory and statistics, the Rademacher distribution is a discrete probability distribution where a random variate X has a 50% chance of being +1 and a 50% chance of being -1.

In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The inequality is named after the Russian mathematician Andrey Kolmogorov.
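In symbols, for independent mean-zero random variables X_1, ..., X_n with finite variances, partial sums S_k = X_1 + ... + X_k, and any λ > 0, the inequality states

    \Pr\Bigl( \max_{1 \le k \le n} |S_k| \ge \lambda \Bigr) \le \frac{1}{\lambda^{2}} \operatorname{Var}(S_n).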

Dvoretzky–Kiefer–Wolfowitz inequality

In the theory of probability and statistics, the Dvoretzky–Kiefer–Wolfowitz inequality bounds how close an empirically determined distribution function will be to the distribution function from which the empirical samples are drawn. It is named after Aryeh Dvoretzky, Jack Kiefer, and Jacob Wolfowitz, who in 1956 proved the inequality with an unspecified multiplicative constant C in front of the exponent on the right-hand side. In 1990, Pascal Massart proved the inequality with the sharp constant C = 2, confirming a conjecture due to Birnbaum and McCarty.
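With Massart's constant, for the empirical distribution function F_n built from n i.i.d. samples drawn from F, and any ε > 0, the bound reads

    \Pr\Bigl( \sup_{x \in \mathbb{R}} \bigl| F_n(x) - F(x) \bigr| > \varepsilon \Bigr) \le 2 e^{-2 n \varepsilon^{2}}.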

In probability theory and statistics, a stochastic order quantifies the concept of one random variable being "bigger" than another. These are usually partial orders, so that one random variable may be neither stochastically greater than, less than, nor equal to another random variable. Many different orders exist, which have different applications.

In mathematics, Khabibullin's conjecture, named after B. N. Khabibullin, is related to Paley's problem for plurisubharmonic functions and to various extremal problems in the theory of entire functions of several variables.

In probability theory, concentration inequalities provide bounds on how a random variable deviates from some value. The law of large numbers of classical probability theory states that sums of independent random variables are, under very mild conditions, close to their expectation with a large probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables.

For certain applications in linear algebra, it is useful to know properties of the probability distribution of the largest eigenvalue of a finite sum of random matrices. Suppose (X_k) is a finite sequence of random matrices. Analogous to the well-known Chernoff bound for sums of scalars, a bound on the following is sought for a given parameter t:

    \Pr\Bigl( \lambda_{\max}\Bigl( \sum_{k} X_k \Bigr) \ge t \Bigr).

In probability theory, Cantelli's inequality is a generalization of Chebyshev's inequality in the case of a single "tail". The inequality states that, for a real-valued random variable X with mean μ and finite variance σ², and for every λ > 0,

    \Pr( X - \mu \ge \lambda ) \le \frac{\sigma^{2}}{\sigma^{2} + \lambda^{2}}.
