SOS-convexity

A multivariate polynomial is SOS-convex (or sum-of-squares convex) if its Hessian matrix H can be factored as H(x) = S^T(x)S(x), where S is a (possibly rectangular) matrix whose entries are polynomials in x. [1] In other words, the Hessian matrix is an SOS matrix polynomial.
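As an illustration (the polynomial here is chosen for this sketch and does not appear in the cited references), the factorization H(x) = S^T(x)S(x) can be verified symbolically, for example with SymPy:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**4 + x2**4  # a simple SOS-convex polynomial, chosen for illustration

# Hessian of f: diag(12*x1**2, 12*x2**2)
H = sp.hessian(f, (x1, x2))

# Candidate polynomial factor S(x) such that H(x) = S(x)^T S(x)
S = sp.Matrix([[2*sp.sqrt(3)*x1, 0],
               [0, 2*sp.sqrt(3)*x2]])

# The difference simplifies to the zero matrix, confirming the factorization
print(sp.simplify(S.T * S - H))
```

Here the Hessian is diagonal, so S can be taken diagonal as well; in general S may be rectangular.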

An equivalent definition is that the form defined as g(x, y) = y^T H(x) y is a sum of squares of forms. [2]
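For the same illustrative polynomial as above (an example chosen for this sketch, not taken from the cited references), the biform y^T H(x) y can be expanded and its sum-of-squares decomposition exhibited explicitly:

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')
f = x1**4 + x2**4  # illustrative SOS-convex polynomial

H = sp.hessian(f, (x1, x2))
y = sp.Matrix([y1, y2])
g = sp.expand((y.T * H * y)[0])  # the biform y^T H(x) y

# An explicit sum of squares of forms equal to g
sos = (2*sp.sqrt(3)*x1*y1)**2 + (2*sp.sqrt(3)*x2*y2)**2

# The difference simplifies to 0, so g is a sum of squares of forms
print(sp.simplify(g - sos))
```

Each square is a form of degree 2 in (x, y), matching the requirement that g be a sum of squares of forms.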

Connection with convexity

If a polynomial is SOS-convex, then it is also convex: an SOS matrix polynomial is positive semidefinite for every value of x, so the Hessian of an SOS-convex polynomial is positive semidefinite everywhere. Since establishing whether a polynomial is SOS-convex amounts to solving a semidefinite programming problem, SOS-convexity can be used as a tractable proxy for establishing whether a polynomial is convex. In contrast, deciding whether a generic polynomial of degree four (or higher even degree) is convex is an NP-hard problem. [3]
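The semidefinite-programming reduction can be sketched numerically for the toy polynomial f(x1, x2) = x1^4 + x2^4 (an assumption of this sketch, not an example from the article). In the monomial basis z = [x1·y1, x2·y2], the biform y^T H(x) y equals z^T Q z, and the SOS question reduces to whether a valid Gram matrix Q is positive semidefinite; a minimal check with NumPy:

```python
import numpy as np

# For f(x1, x2) = x1**4 + x2**4, the biform y^T H(x) y equals
# 12*x1**2*y1**2 + 12*x2**2*y2**2 = z^T Q z with z = [x1*y1, x2*y2].
Q = np.array([[12.0, 0.0],
              [0.0, 12.0]])

# In general the valid Gram matrices form an affine set, and the SDP
# searches that set for a positive semidefinite member.  Here Q is
# unique, so the check is a single eigenvalue test.
eigvals = np.linalg.eigvalsh(Q)
print(eigvals.min() >= 0)  # True: Q is PSD, so the biform is SOS

# A Cholesky factor of Q recovers the squares: z^T Q z = |L^T z|^2
L = np.linalg.cholesky(Q)
print(L)
```

In higher-degree or less structured examples, the Gram matrix is not unique and a semidefinite programming solver is needed to search for a positive semidefinite representative.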

The first example of a polynomial that is convex but not SOS-convex was constructed by Amir Ali Ahmadi and Pablo Parrilo in 2009. [4] It is a homogeneous polynomial that is itself a sum of squares; the explicit polynomial is given in the original paper. [4]

In the same year, Grigoriy Blekherman proved non-constructively that there exist convex forms that are not representable as sums of squares. [5] An explicit example of a convex form (of degree 4 in 272 variables) that is not a sum of squares was given by James Saunderson in 2021. [6]

Connection with non-negativity and sum-of-squares

In 2013, Amir Ali Ahmadi and Pablo Parrilo showed that every convex homogeneous polynomial in n variables of degree 2d is SOS-convex if and only if (a) n = 2, or (b) 2d = 2, or (c) n = 3 and 2d = 4. [7] Remarkably, these are exactly the same cases in which every non-negative homogeneous polynomial in n variables of degree 2d can be represented as a sum of squares of polynomials (see Hilbert's seventeenth problem).
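The case analysis above can be codified as a small predicate (the function name is a hypothetical choice for illustration):

```python
def convexity_equals_sos_convexity(n: int, two_d: int) -> bool:
    """Return True exactly when every convex homogeneous polynomial in
    n variables of degree 2d is SOS-convex, per the Ahmadi-Parrilo
    characterization: (a) n == 2, (b) 2d == 2, or (c) (n, 2d) == (3, 4)."""
    return n == 2 or two_d == 2 or (n, two_d) == (3, 4)

print(convexity_equals_sos_convexity(3, 4))  # True: ternary quartics
print(convexity_equals_sos_convexity(3, 6))  # False: a gap exists
```

By the remark above, the same predicate also decides when every non-negative form in n variables of degree 2d is a sum of squares.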

References

  1. Helton, J. William; Nie, Jiawang (2010). "Semidefinite representation of convex sets". Mathematical Programming. 122 (1): 21–64. arXiv: 0705.4068 . doi:10.1007/s10107-008-0240-y. ISSN   0025-5610. S2CID   1352703.
  2. Ahmadi, Amir Ali; Parrilo, Pablo A. (2010). "On the equivalence of algebraic conditions for convexity and quasiconvexity of polynomials". 49th IEEE Conference on Decision and Control (CDC). pp. 3343–3348. doi:10.1109/CDC.2010.5717510. hdl: 1721.1/74151 . ISBN   978-1-4244-7745-6. S2CID   11904851.
  3. Ahmadi, Amir Ali; Olshevsky, Alex; Parrilo, Pablo A.; Tsitsiklis, John N. (2013). "NP-hardness of deciding convexity of quartic polynomials and related problems". Mathematical Programming. 137 (1–2): 453–476. arXiv: 1012.1908 . doi:10.1007/s10107-011-0499-2. ISSN   0025-5610. S2CID   2319461.
  4. Ahmadi, Amir Ali; Parrilo, Pablo A. (2009). "A positive definite polynomial Hessian that does not factor". Proceedings of the 48th IEEE Conference on Decision and Control (CDC) held jointly with 2009 28th Chinese Control Conference. pp. 1195–1200. doi:10.1109/CDC.2009.5400519. hdl: 1721.1/58871 . ISBN   978-1-4244-3871-6. S2CID   5344338.
  5. Blekherman, Grigoriy (2009-10-04). "Convex Forms that are not Sums of Squares". arXiv: 0910.0656 [math.AG].
  6. Saunderson, James (2022). "A Convex Form That is Not a Sum of Squares". Mathematics of Operations Research . 48: 569–582. arXiv: 2105.08432 . doi:10.1287/moor.2022.1273. S2CID   234763204.
  7. Ahmadi, Amir Ali; Parrilo, Pablo A. (2013). "A Complete Characterization of the Gap between Convexity and SOS-Convexity". SIAM Journal on Optimization. 23 (2): 811–833. arXiv: 1111.4587 . doi:10.1137/110856010. ISSN   1052-6234. S2CID   16801871.