Mahler's inequality

In mathematics, Mahler's inequality, named after Kurt Mahler, states that the geometric mean of the term-by-term sum of two finite sequences of positive numbers is greater than or equal to the sum of their two separate geometric means:

$$\left(\prod_{k=1}^n (x_k + y_k)\right)^{1/n} \ge \left(\prod_{k=1}^n x_k\right)^{1/n} + \left(\prod_{k=1}^n y_k\right)^{1/n}$$

when $x_k, y_k > 0$ for all $k$.
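
As an illustrative numerical check (an example added here, not taken from the original article): for $n = 2$ with $x = (1, 4)$ and $y = (9, 16)$, the left-hand side is $\sqrt{(1+9)(4+16)} = \sqrt{200} \approx 14.14$, while the right-hand side is $\sqrt{1 \cdot 4} + \sqrt{9 \cdot 16} = 2 + 12 = 14$, so the inequality holds.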

Proof

By the inequality of arithmetic and geometric means, we have:

$$\left(\prod_{k=1}^n \frac{x_k}{x_k + y_k}\right)^{1/n} \le \frac{1}{n}\sum_{k=1}^n \frac{x_k}{x_k + y_k}$$

and

$$\left(\prod_{k=1}^n \frac{y_k}{x_k + y_k}\right)^{1/n} \le \frac{1}{n}\sum_{k=1}^n \frac{y_k}{x_k + y_k}.$$

Hence,

$$\left(\prod_{k=1}^n \frac{x_k}{x_k + y_k}\right)^{1/n} + \left(\prod_{k=1}^n \frac{y_k}{x_k + y_k}\right)^{1/n} \le \frac{1}{n}\sum_{k=1}^n \frac{x_k + y_k}{x_k + y_k} = 1.$$

Clearing denominators, that is, multiplying both sides by $\left(\prod_{k=1}^n (x_k + y_k)\right)^{1/n}$, then gives the desired result.
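
The proof is purely algebraic; as an independent sanity check, the following Python sketch (illustrative only; the helper names geometric_mean and mahler_holds are chosen here and are not part of the article) evaluates both sides of the inequality on random positive sequences:

    import math
    import random

    def geometric_mean(values):
        # n-th root of the product, computed via logarithms for numerical stability
        return math.exp(sum(math.log(v) for v in values) / len(values))

    def mahler_holds(xs, ys, tol=1e-9):
        # Geometric mean of the term-by-term sums ...
        lhs = geometric_mean([x + y for x, y in zip(xs, ys)])
        # ... versus the sum of the two separate geometric means
        rhs = geometric_mean(xs) + geometric_mean(ys)
        # A small tolerance guards against floating-point rounding near equality
        return lhs >= rhs - tol

    # Mahler's inequality should hold for every random positive sample
    for _ in range(1000):
        n = random.randint(1, 10)
        xs = [random.uniform(0.1, 100.0) for _ in range(n)]
        ys = [random.uniform(0.1, 100.0) for _ in range(n)]
        assert mahler_holds(xs, ys)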

See also

Related Research Articles

Absolute value – Magnitude of a possibly negative number

In mathematics, the absolute value or modulus of a real number x, denoted |x|, is the non-negative value of x without regard to its sign. Namely, |x| = x if x is positive, |x| = −x if x is negative, and |0| = 0. For example, the absolute value of 3 is 3, and the absolute value of −3 is also 3. The absolute value of a number may be thought of as its distance from zero.

Convex set – In geometry, a set whose intersection with every line is a single line segment

In geometry, a subset of a Euclidean space, or more generally an affine space over the reals, is convex if, given any two points in the subset, the subset contains the whole line segment that joins them. Equivalently, a convex set or a convex region is a subset that intersects every line in a single line segment (possibly empty). For example, a solid cube is a convex set, but anything that is hollow or has an indent, for example, a crescent shape, is not convex.

Geometric mean – N-th root of the product of n numbers

In mathematics, the geometric mean is a mean or average, which indicates the central tendency or typical value of a set of numbers by using the product of their values. The geometric mean is defined as the nth root of the product of n numbers, i.e., for a set of numbers $x_1, x_2, \dots, x_n$, the geometric mean is defined as $\sqrt[n]{x_1 x_2 \cdots x_n}$.

In mathematics, generalized means are a family of functions for aggregating sets of numbers. These include as special cases the Pythagorean means.

In mathematics, the harmonic mean is one of several kinds of average, and in particular, one of the Pythagorean means. Typically, it is appropriate for situations when the average rate is desired.

Triangle inequality – Property of geometry, also used to generalize the notion of "distance" in metric spaces

In mathematics, the triangle inequality states that for any triangle, the sum of the lengths of any two sides must be greater than or equal to the length of the remaining side. This statement permits the inclusion of degenerate triangles, but some authors, especially those writing about elementary geometry, will exclude this possibility, thus leaving out the possibility of equality. If x, y, and z are the lengths of the sides of the triangle, with no side being greater than z, then the triangle inequality states that $z \le x + y$.

In mathematical analysis, Hölder's inequality, named after Otto Hölder, is a fundamental inequality between integrals and an indispensable tool for the study of Lp spaces.

In mathematical analysis, the Minkowski inequality establishes that the Lp spaces are normed vector spaces. Let S be a measure space, let 1 ≤ p < ∞ and let f and g be elements of Lp(S). Then f + g is in Lp(S), and we have the triangle inequality $\|f + g\|_p \le \|f\|_p + \|g\|_p$.

Squeeze theorem – On calculating limits by bounding a function between two other functions

In calculus, the squeeze theorem, also known as the pinching theorem, the sandwich theorem, the sandwich rule, the police theorem, the between theorem and sometimes the squeeze lemma, is a theorem regarding the limit of a function. In Italy, the theorem is also known as the theorem of carabinieri.

In probability theory, the Azuma–Hoeffding inequality gives a concentration result for the values of martingales that have bounded differences.

Minkowski addition – Sums vector sets A and B by adding each vector in A to each vector in B

In geometry, the Minkowski sum of two sets of position vectors A and B in Euclidean space is formed by adding each vector in A to each vector in B, i.e., the set $A + B = \{\,a + b \mid a \in A,\ b \in B\,\}$.

Inequality of arithmetic and geometric means

In mathematics, the inequality of arithmetic and geometric means, or more briefly the AM–GM inequality, states that the arithmetic mean of a list of non-negative real numbers is greater than or equal to the geometric mean of the same list; and further, that the two means are equal if and only if every number in the list is the same.
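
For reference, the symbolic form (using placeholder names $a_1, \dots, a_n$ for the list, since no notation is given here) is

$$\frac{a_1 + a_2 + \cdots + a_n}{n} \ge \sqrt[n]{a_1 a_2 \cdots a_n}, \qquad a_1, \dots, a_n \ge 0,$$

with equality if and only if $a_1 = a_2 = \cdots = a_n$. This is the form applied to the ratios $x_k/(x_k + y_k)$ and $y_k/(x_k + y_k)$ in the proof above.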

In mathematics, Hadamard's inequality is a result first published by Jacques Hadamard in 1893. It is a bound on the determinant of a matrix whose entries are complex numbers in terms of the lengths of its column vectors. In geometrical terms, when restricted to real numbers, it bounds the volume in Euclidean space of n dimensions marked out by n vectors $v_i$ for $1 \le i \le n$ in terms of the lengths of these vectors $\|v_i\|$.

Singular value

In mathematics, in particular functional analysis, the singular values, or s-numbers of a compact operator T : X → Y acting between Hilbert spaces X and Y, are the square roots of non-negative eigenvalues of the self-adjoint operator T*T.

Real coordinate space – Space formed by the n-tuples of real numbers

In mathematics, a real coordinate space of dimension n, written $\mathbf{R}^n$ or $\mathbb{R}^n$, is a coordinate space over the real numbers. This means that it is the set of the n-tuples of real numbers. With component-wise addition and scalar multiplication, it is a real vector space.

In mathematics, the Riesz–Fischer theorem in real analysis is any of a number of closely related results concerning the properties of the space L2 of square integrable functions. The theorem was proven independently in 1907 by Frigyes Riesz and Ernst Sigismund Fischer.

In mathematics, there are two different results that share the common name of the Ky Fan inequality. One is an inequality involving the geometric mean and arithmetic mean of two sets of real numbers of the unit interval. The result was published on page 5 of the book Inequalities by Edwin F. Beckenbach and Richard E. Bellman (1961), who refer to an unpublished result of Ky Fan. They mention the result in connection with the inequality of arithmetic and geometric means and Augustin Louis Cauchy's proof of this inequality by forward-backward-induction; a method which can also be used to prove the Ky Fan inequality.

Carleman's inequality is an inequality in mathematics, named after Torsten Carleman, who proved it in 1923 and used it to prove the Denjoy–Carleman theorem on quasi-analytic classes.

In mathematics, the Brascamp–Lieb inequality is either of two inequalities. The first is a result in geometry concerning integrable functions on n-dimensional Euclidean space $\mathbb{R}^n$. It generalizes the Loomis–Whitney inequality and Hölder's inequality. The second is a result of probability theory which gives a concentration inequality for log-concave probability distributions. Both are named after Herm Jan Brascamp and Elliott H. Lieb.

In mathematics, the Hermite–Hadamard inequality, named after Charles Hermite and Jacques Hadamard and sometimes also called Hadamard's inequality, states that if a function ƒ : [a, b] → R is convex, then the following chain of inequalities holds:

$$f\!\left(\frac{a+b}{2}\right) \le \frac{1}{b-a}\int_a^b f(x)\,dx \le \frac{f(a)+f(b)}{2}.$$
