List of inequalities

This article lists Wikipedia articles about named mathematical inequalities.

Contents

  Inequalities in pure mathematics
    Analysis
      Inequalities relating to means
    Combinatorics
    Differential equations
    Geometry
    Information theory
    Algebra
      Linear algebra
        Eigenvalue inequalities
    Number theory
    Probability theory and statistics
    Topology
  Inequalities particular to physics
  See also

Related Research Articles

In probability theory, Chebyshev's inequality guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be k or more standard deviations away from the mean. In statistics, the rule is often called Chebyshev's theorem, concerning the range of standard deviations around the mean. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
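As an illustrative statement of the bound just described (notation is supplied here: X is a random variable with finite mean μ and nonzero variance σ²):

\[ \Pr\big(|X - \mu| \ge k\sigma\big) \ \le\ \frac{1}{k^2}, \qquad k > 0. \]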

In mathematics, the isoperimetric inequality is a geometric inequality involving the perimeter of a set and its volume. In n-dimensional space \(\mathbb{R}^n\) the inequality lower bounds the surface area or perimeter \(\operatorname{per}(S)\) of a set \(S \subset \mathbb{R}^n\) by its volume \(\operatorname{vol}(S)\),

\[ \operatorname{per}(S) \ \ge\ n \, \operatorname{vol}(S)^{\frac{n-1}{n}} \, \operatorname{vol}(B_1)^{\frac{1}{n}}, \]

where \(B_1 \subset \mathbb{R}^n\) is a unit ball, with equality when S is a ball.

In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms the Chernoff or Chernoff–Cramér bound, which may decay faster than exponentially. It is especially useful for sums of independent random variables, such as sums of Bernoulli random variables.
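As a sketch of the generic bound (notation supplied here: X is a real random variable whose moment generating function \(\mathbb{E}[e^{tX}]\) is finite for the relevant t), Markov's inequality applied to \(e^{tX}\) and optimized over t gives

\[ \Pr(X \ge a) \ \le\ \inf_{t > 0} \, e^{-ta}\, \mathbb{E}\!\left[e^{tX}\right]. \]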

In mathematics, the Marcinkiewicz interpolation theorem, discovered by Józef Marcinkiewicz (1939), is a result bounding the norms of non-linear operators acting on Lp spaces.
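A minimal sketch of the diagonal case, under the assumption that T is a sublinear operator of weak type (p₀, p₀) and weak type (p₁, p₁) with 1 ≤ p₀ < p₁ ≤ ∞: T is then bounded on L^p for every p₀ < p < p₁, that is,

\[ \|Tf\|_{L^p} \ \le\ C_p \, \|f\|_{L^p}. \]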

Elliott Hershel Lieb is an American mathematical physicist and professor of mathematics and physics at Princeton University who specializes in statistical mechanics, condensed matter theory, and functional analysis.

Raymond Edward Alan Christopher Paley was an English mathematician who made significant contributions to mathematical analysis before dying young in a skiing accident.

In mathematics, the Brunn–Minkowski theorem is an inequality relating the volumes of compact subsets of Euclidean space. The original version of the Brunn–Minkowski theorem applied to convex sets; the generalization to compact nonconvex sets stated here is due to Lazar Lyusternik (1935).
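As an illustrative statement (notation supplied here: A and B are nonempty compact subsets of \(\mathbb{R}^n\) and \(A + B = \{a + b : a \in A,\ b \in B\}\) is their Minkowski sum):

\[ \operatorname{vol}(A + B)^{1/n} \ \ge\ \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}. \]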

In mathematics, the Gaussian isoperimetric inequality, proved by Boris Tsirelson and Vladimir Sudakov, and later independently by Christer Borell, states that among all sets of given Gaussian measure in the n-dimensional Euclidean space, half-spaces have the minimal Gaussian boundary measure.
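One common equivalent formulation, given here as a sketch (notation supplied: \(\gamma_n\) is the standard Gaussian measure on \(\mathbb{R}^n\), \(\Phi\) the standard normal distribution function, and \(A_\varepsilon\) the \(\varepsilon\)-neighbourhood of a measurable set A):

\[ \Phi^{-1}\!\big(\gamma_n(A_\varepsilon)\big) \ \ge\ \Phi^{-1}\!\big(\gamma_n(A)\big) + \varepsilon, \qquad \varepsilon > 0. \]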

In mathematics, the Hardy–Littlewood maximal operator M is a significant non-linear operator used in real analysis and harmonic analysis.
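As an illustration of the standard (centred) definition, for a locally integrable function f on \(\mathbb{R}^n\):

\[ Mf(x) \ =\ \sup_{r > 0} \, \frac{1}{|B(x,r)|} \int_{B(x,r)} |f(y)| \, dy, \]

where \(|B(x,r)|\) denotes the volume of the ball of radius r centred at x.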

In mathematics, the Prékopa–Leindler inequality is an integral inequality closely related to the reverse Young's inequality, the Brunn–Minkowski inequality and a number of other important and classical inequalities in analysis. The result is named after the Hungarian mathematicians András Prékopa and László Leindler.
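A minimal sketch of the statement (notation supplied here: 0 < λ < 1 and f, g, h are nonnegative measurable functions on \(\mathbb{R}^n\) satisfying \(h((1-\lambda)x + \lambda y) \ge f(x)^{1-\lambda} g(y)^{\lambda}\) for all x, y):

\[ \int_{\mathbb{R}^n} h \ \ge\ \left( \int_{\mathbb{R}^n} f \right)^{1-\lambda} \left( \int_{\mathbb{R}^n} g \right)^{\lambda}. \]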

In mathematics, particularly, in asymptotic convex geometry, Milman's reverse Brunn–Minkowski inequality is a result due to Vitali Milman that provides a reverse inequality to the famous Brunn–Minkowski inequality for convex bodies in n-dimensional Euclidean space Rn. Namely, it bounds the volume of the Minkowski sum of two bodies from above in terms of the volumes of the bodies.
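A sketch of one standard formulation: there is a universal constant C, independent of the dimension n, such that for any convex bodies \(K, L \subset \mathbb{R}^n\) one can find a volume-preserving linear map T with

\[ \operatorname{vol}(K + T L)^{1/n} \ \le\ C \left( \operatorname{vol}(K)^{1/n} + \operatorname{vol}(L)^{1/n} \right). \]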

In mathematics, the Borell–Brascamp–Lieb inequality is an integral inequality due to many different mathematicians but named after Christer Borell, Herm Jan Brascamp and Elliott Lieb.

In mathematics, the Brascamp–Lieb inequality is either of two inequalities. The first is a result in geometry concerning integrable functions on n-dimensional Euclidean space \(\mathbb{R}^n\). It generalizes the Loomis–Whitney inequality and Hölder's inequality. The second is a result in probability theory which gives a concentration inequality for log-concave probability distributions. Both are named after Herm Jan Brascamp and Elliott H. Lieb.
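As a sketch of the geometric form of the first inequality (all notation supplied here and only illustrative: \(B_i : \mathbb{R}^n \to \mathbb{R}^{n_i}\) are linear surjections and \(c_i > 0\) satisfy \(\sum_i c_i B_i^{*} B_i = I_n\)), nonnegative integrable functions \(f_i\) satisfy

\[ \int_{\mathbb{R}^n} \prod_i f_i(B_i x)^{c_i} \, dx \ \le\ \prod_i \left( \int_{\mathbb{R}^{n_i}} f_i \right)^{c_i}. \]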

This page lists articles related to probability theory. In particular, it lists many articles corresponding to specific probability distributions. Such articles are marked here by a code of the form (X:Y), which refers to the number of random variables involved and the type of the distribution. For example, (2:DC) indicates a distribution with two random variables, discrete or continuous. Other codes are just abbreviations for topics. The list of codes can be found in the table of contents.

In mathematics, the symmetric decreasing rearrangement of a function is a function which is symmetric and decreasing, and whose level sets are of the same size as those of the original function.
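As an illustrative layer-cake definition, given as a sketch (notation supplied here: f is a measurable nonnegative function on \(\mathbb{R}^n\) whose level sets have finite measure, and \(A^*\) denotes the open ball centred at the origin with the same volume as A):

\[ f^*(x) \ =\ \int_0^\infty \mathbb{1}_{\{ |f| > t \}^*}(x) \, dt. \]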

In probability theory, Cantelli's inequality is an improved version of Chebyshev's inequality for one-sided tail bounds. The inequality states that, for a real-valued random variable X with mean μ and finite variance σ², and for any λ > 0,

\[ \Pr(X - \mu \ge \lambda) \ \le\ \frac{\sigma^2}{\sigma^2 + \lambda^2}. \]

In mathematics, the Riesz rearrangement inequality, sometimes called the Riesz–Sobolev inequality, states that any three non-negative functions f, g and h on \(\mathbb{R}^n\) satisfy the inequality

\[ \int_{\mathbb{R}^n} \int_{\mathbb{R}^n} f(x)\, g(x-y)\, h(y)\, dx\, dy \ \le\ \int_{\mathbb{R}^n} \int_{\mathbb{R}^n} f^*(x)\, g^*(x-y)\, h^*(y)\, dx\, dy, \]

where \(f^*\), \(g^*\) and \(h^*\) denote their symmetric decreasing rearrangements.

In mathematics, the Cartan–Hadamard conjecture is a fundamental problem in Riemannian geometry and geometric measure theory which states that the classical isoperimetric inequality may be generalized to spaces of nonpositive sectional curvature, known as Cartan–Hadamard manifolds. The conjecture, which is named after the French mathematicians Élie Cartan and Jacques Hadamard, may be traced back to the work of André Weil in 1926.