Mixed volume

In mathematics, more specifically, in convex geometry, the mixed volume is a way to associate a non-negative number to a tuple of convex bodies in $\mathbb{R}^n$. This number depends on the size and shape of the bodies and on their relative orientation to each other.

Definition

Let $K_1, K_2, \dots, K_r$ be convex bodies in $\mathbb{R}^n$ and consider the function

$$f(\lambda_1, \ldots, \lambda_r) = \mathrm{Vol}_n(\lambda_1 K_1 + \cdots + \lambda_r K_r), \qquad \lambda_i \geq 0,$$

where $\mathrm{Vol}_n$ stands for the $n$-dimensional volume and its argument is the Minkowski sum of the scaled convex bodies $\lambda_i K_i$. One can show that $f$ is a homogeneous polynomial of degree $n$, so it can be written as

$$f(\lambda_1, \ldots, \lambda_r) = \sum_{j_1, \ldots, j_n = 1}^{r} V(K_{j_1}, \ldots, K_{j_n})\, \lambda_{j_1} \cdots \lambda_{j_n},$$

where the functions $V$ are symmetric. For a particular index function $j \colon \{1, \ldots, n\} \to \{1, \ldots, r\}$, the coefficient $V(K_{j_1}, \ldots, K_{j_n})$ is called the mixed volume of $K_{j_1}, \ldots, K_{j_n}$.
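
For instance, take $n = r = 2$, let $K_1 = [0,1]^2$ be the unit square and let $K_2 = B$ be the closed unit disc. The set $\lambda_1 K_1 + \lambda_2 K_2$ is a square of side $\lambda_1$ with corners rounded to radius $\lambda_2$, so

$$\mathrm{Vol}_2(\lambda_1 K_1 + \lambda_2 K_2) = \lambda_1^2 + 4\lambda_1\lambda_2 + \pi\lambda_2^2,$$

and comparing coefficients with the expansion above gives $V(K_1, K_1) = 1$, $V(K_1, K_2) = V(K_2, K_1) = 2$ and $V(K_2, K_2) = \pi$.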

Properties

The mixed volume is uniquely determined by the following three properties:

  1. $V(K, \ldots, K) = \mathrm{Vol}_n(K)$;
  2. $V$ is symmetric in its arguments;
  3. $V$ is multilinear: $V(\lambda K + \lambda' K', K_2, \ldots, K_n) = \lambda\, V(K, K_2, \ldots, K_n) + \lambda'\, V(K', K_2, \ldots, K_n)$ for $\lambda, \lambda' \geq 0$.

The mixed volume is non-negative, and the Alexandrov–Fenchel inequality states that

$$V(K_1, K_2, K_3, \ldots, K_n) \geq \sqrt{V(K_1, K_1, K_3, \ldots, K_n)\, V(K_2, K_2, K_3, \ldots, K_n)}.$$

Numerous geometric inequalities, such as the Brunn–Minkowski inequality for convex bodies and Minkowski's first inequality, are special cases of the Alexandrov–Fenchel inequality.
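
One such special case is Minkowski's first inequality: for convex bodies $K, L \subset \mathbb{R}^n$,

$$V(K, \ldots, K, L)^n \geq \mathrm{Vol}_n(K)^{n-1}\, \mathrm{Vol}_n(L),$$

where $K$ appears $n - 1$ times in the mixed volume on the left-hand side.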

Quermassintegrals

Let $K \subset \mathbb{R}^n$ be a convex body and let $B \subset \mathbb{R}^n$ be the Euclidean ball of unit radius. The mixed volume

$$W_j(K) = V(\underbrace{K, \ldots, K}_{n-j \text{ times}}, \underbrace{B, \ldots, B}_{j \text{ times}})$$

is called the $j$-th quermassintegral of $K$.[1]

The definition of mixed volume yields the Steiner formula (named after Jakob Steiner):

$$\mathrm{Vol}_n(K + tB) = \sum_{j=0}^{n} \binom{n}{j} W_j(K)\, t^j.$$
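
As a concrete illustration in the plane ($n = 2$), the Steiner formula reads

$$\mathrm{Vol}_2(K + tB) = A(K) + L(K)\, t + \pi t^2,$$

where $A(K)$ is the area and $L(K)$ the perimeter of $K$; comparing coefficients gives $W_0(K) = A(K)$, $W_1(K) = L(K)/2$ and $W_2(K) = \pi$.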

Intrinsic volumes

The $j$-th intrinsic volume of $K$ is a different normalization of the quermassintegral, defined by

$$V_j(K) = \binom{n}{j} \frac{W_{n-j}(K)}{\kappa_{n-j}},$$

or in other words

$$\mathrm{Vol}_n(K + tB) = \sum_{j=0}^{n} V_j(K)\, \kappa_{n-j}\, t^{n-j},$$

where $\kappa_{n-j}$ is the volume of the $(n-j)$-dimensional unit ball.
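
In particular, $V_n(K) = \mathrm{Vol}_n(K)$ is the ordinary volume, $V_{n-1}(K)$ is half the surface area of $K$ (for a full-dimensional body), and $V_0(K) = 1$ for every non-empty convex body. Unlike the quermassintegrals, the intrinsic volumes do not depend on the dimension of the ambient space in which $K$ is embedded.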

Hadwiger's characterization theorem

Hadwiger's theorem asserts that every valuation on convex bodies in $\mathbb{R}^n$ that is continuous and invariant under rigid motions of $\mathbb{R}^n$ is a linear combination of the quermassintegrals (or, equivalently, of the intrinsic volumes).[2]
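
In other words, every such valuation $\varphi$ can be written as

$$\varphi(K) = \sum_{j=0}^{n} c_j\, V_j(K)$$

for suitable constants $c_0, \ldots, c_n \in \mathbb{R}$.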

Notes

  1. McMullen, Peter (1991). "Inequalities between intrinsic volumes". Monatshefte für Mathematik. 111 (1): 47–53. doi:10.1007/bf01299276. MR 1089383.
  2. Klain, Daniel A. (1995). "A short proof of Hadwiger's characterization theorem". Mathematika. 42 (2): 329–339. doi:10.1112/s0025579300014625. MR 1376731.

References

Burago, Yu. D. (2001) [1994], "Mixed-volume theory", Encyclopedia of Mathematics, EMS Press.
