Geometric programming

A geometric program (GP) is an optimization problem of the form

$$\begin{aligned}
\text{minimize} \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 1, \quad i = 1, \ldots, m, \\
& g_i(x) = 1, \quad i = 1, \ldots, p,
\end{aligned}$$

where $f_0, \dots, f_m$ are posynomials and $g_1, \dots, g_p$ are monomials. In the context of geometric programming (unlike standard mathematics), a monomial is a function from $\mathbb{R}_{>0}^n$ to $\mathbb{R}$ defined as

$$x \mapsto c\, x_1^{a_1} x_2^{a_2} \cdots x_n^{a_n},$$

where $c > 0$ and $a_i \in \mathbb{R}$. A posynomial is any sum of monomials. [1] [2]
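
To make these definitions concrete, a small illustration (ours, not from the source): with $n = 2$, the function $3\, x_1^{1/2} x_2^{-2}$ is a monomial (here $c = 3$ and $a = (1/2, -2)$), and $3\, x_1^{1/2} x_2^{-2} + x_1 x_2$ is a posynomial, being a sum of two monomials. By contrast, $x_1 - x_2$ is not a posynomial, because the coefficient $c$ of every constituent monomial must be positive.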

Geometric programming is closely related to convex optimization: any GP can be made convex by means of a change of variables. [2] GPs have numerous applications, including component sizing in IC design, [3] [4] aircraft design, [5] maximum likelihood estimation for logistic regression in statistics, and parameter tuning of positive linear systems in control theory. [6]

Convex form

Geometric programs are not in general convex optimization problems, but they can be transformed to convex problems by a change of variables and a transformation of the objective and constraint functions. In particular, after performing the change of variables $y_i = \log x_i$ (so that $x_i = e^{y_i}$) and taking the log of the objective and constraint functions, the functions $f_i$, i.e., the posynomials, are transformed into log-sum-exp functions, which are convex, and the functions $g_i$, i.e., the monomials, become affine. Hence, this transformation transforms every GP into an equivalent convex program. [2] In fact, this log-log transformation can be used to convert a larger class of problems, known as log-log convex programming (LLCP), into an equivalent convex form. [7]
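
The mechanics of this transformation can be spelled out (a standard derivation, stated here in the notation above). For a monomial $g(x) = c\, x_1^{a_1} \cdots x_n^{a_n}$, substituting $x_i = e^{y_i}$ and taking the logarithm gives

$$\log g(e^{y}) = \log c + a_1 y_1 + \cdots + a_n y_n = a^{T} y + b, \qquad b = \log c,$$

which is affine in $y$. For a posynomial $f(x) = \sum_{k=1}^{K} c_k\, x_1^{a_{1k}} \cdots x_n^{a_{nk}}$, the same substitution gives

$$\log f(e^{y}) = \log \sum_{k=1}^{K} e^{a_k^{T} y + b_k}, \qquad b_k = \log c_k,$$

a log-sum-exp function, which is convex in $y$. A posynomial constraint $f(x) \le 1$ therefore becomes the convex constraint $\log f(e^{y}) \le 0$, and a monomial constraint $g(x) = 1$ becomes the affine equality $a^{T} y + b = 0$.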

Software

Several software packages exist to assist with formulating and solving geometric programs.
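
As one illustration of such a workflow: the Python package CVXPY, which implements the disciplined geometric programming framework of [7], can solve GPs directly when variables are declared positive and the solve call is passed gp=True. The toy problem below is our own sketch, not an example from the source.

    import cvxpy as cp

    # GP variables must be strictly positive; declare them as such.
    x = cp.Variable(pos=True)
    y = cp.Variable(pos=True)
    z = cp.Variable(pos=True)

    # Maximize a monomial objective subject to posynomial and
    # monomial constraints (all DGP-compliant expressions).
    objective = cp.Maximize(x * y * z)
    constraints = [
        4 * x * y * z + 2 * x * z <= 10,  # posynomial <= constant
        x <= 2 * y,                       # monomial <= monomial
        y <= 2 * x,
        z >= 1,                           # monomial lower bound
    ]

    problem = cp.Problem(objective, constraints)
    problem.solve(gp=True)  # gp=True applies the log-log convex transformation

    print("optimal value:", problem.value)
    print("x, y, z:", x.value, y.value, z.value)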

References

  1. Richard J. Duffin; Elmor L. Peterson; Clarence Zener (1967). Geometric Programming. John Wiley and Sons. p. 278. ISBN 0-471-22370-0.
  2. S. Boyd, S. J. Kim, L. Vandenberghe, and A. Hassibi. A Tutorial on Geometric Programming. Retrieved 20 October 2019.
  3. M. Hershenson, S. Boyd, and T. Lee. Optimal Design of a CMOS Op-amp via Geometric Programming. Retrieved 8 January 2019.
  4. S. Boyd, S. J. Kim, D. Patil, and M. Horowitz. Digital Circuit Optimization via Geometric Programming. Retrieved 20 October 2019.
  5. W. Hoburg and P. Abbeel. Geometric programming for aircraft design optimization. AIAA Journal 52.11 (2014): 2414–2426.
  6. Ogura, Masaki; Kishida, Masako; Lam, James (2020). "Geometric Programming for Optimal Positive Linear Systems". IEEE Transactions on Automatic Control. 65 (11): 4648–4663. arXiv:1904.12976. doi:10.1109/TAC.2019.2960697. ISSN 0018-9286.
  7. A. Agrawal, S. Diamond, and S. Boyd. Disciplined Geometric Programming. Retrieved 8 January 2019.