Parametric programming

Parametric programming is a type of mathematical optimization in which the optimization problem is solved as a function of one or more parameters. [1] Developed in parallel with sensitivity analysis, its earliest mention can be found in a thesis from 1952. [2] Since then, there have been considerable developments for the cases of multiple parameters, the presence of integer variables, and nonlinearities.

Notation

In general, the following optimization problem is considered:

J^*(\theta) = \min_{x \in \mathbb{R}^n} f(x, \theta)
subject to g(x, \theta) \le 0
\theta \in \Theta \subseteq \mathbb{R}^m

where x \in \mathbb{R}^n is the optimization variable, \theta \in \mathbb{R}^m are the parameters, f(x, \theta) is the objective function and g(x, \theta) denotes the constraints. J^*(\theta) denotes the function whose output is the optimal value of the objective function f. The set \Theta is generally referred to as the parameter space.

The optimal value J^*(\theta) (i.e. the result of solving the optimization problem for a given parameter value) is obtained by evaluating the function J^* at an argument \theta \in \Theta.
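To make the notation concrete, the following minimal sketch (assuming NumPy and SciPy are available; the objective, constraint, and parameter grid are illustrative choices, not taken from the cited sources) evaluates J^*(\theta) pointwise for the toy problem min_x (x - \theta)^2 subject to x >= 1:

```python
import numpy as np
from scipy.optimize import minimize

def J_star(theta):
    """Optimal value of  min_x (x - theta)^2  subject to  x >= 1,
    solved numerically for one fixed parameter value theta."""
    res = minimize(
        fun=lambda x: (x[0] - theta) ** 2,                              # objective f(x, theta)
        x0=np.array([1.5]),                                             # initial guess for x
        constraints=[{"type": "ineq", "fun": lambda x: x[0] - 1.0}],    # constraint written as x - 1 >= 0
    )
    return res.fun

# Evaluate the optimal value pointwise over a grid of parameter values.
for theta in np.linspace(0.0, 2.0, 5):
    print(f"theta = {theta:.2f}   J*(theta) = {J_star(theta):.4f}")
```

For this toy problem the parametric solution can also be written in closed form: J^*(\theta) = (1 - \theta)^2 for \theta < 1 and J^*(\theta) = 0 otherwise. Multiparametric programming algorithms aim to recover such explicit, piecewise expressions in the parameters rather than re-solving the problem for each parameter value.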

Classification

Depending on the nature of the objective function f(x, \theta) and the constraints g(x, \theta), and on whether the optimization problem features integer variables, parametric programming problems are classified into different sub-classes:

If only one parameter is present (m = 1), the problem is referred to as a parametric programming problem, while for more than one parameter (m > 1) it is commonly called a multiparametric programming problem. [3]
If integer variables are present, the problem is referred to as a (multi)parametric mixed-integer programming problem. [4]
Depending on whether the objective function and the constraints are linear, quadratic or nonlinear, the problem is further classified as a (multi)parametric (mixed-integer) linear, quadratic or nonlinear programming problem. [5]

Applications

In control theory generally and in process industries

The connection between parametric programming and model predictive control, established in 2000, has contributed to increased interest in the topic. [6] [7] Parametric programming supplies the idea that an optimization problem can be parametrized as a function that can simply be evaluated (similar to a lookup table). This in turn allows the optimization algorithms in optimal controllers to be implemented as pre-computed (off-line) mathematical functions, which may in some cases be simpler and faster to evaluate than solving a full optimization problem on-line. This also opens up the possibility of creating optimal controllers on chips (MPC on chip [8] ). However, the off-line parametrization of optimal solutions runs into the curse of dimensionality, as the number of possible solutions grows with the dimensionality and the number of constraints of the problem.[ citation needed ]
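As a sketch of the lookup-table evaluation (the regions and gains below are hypothetical placeholders, not the output of any actual multiparametric solver): an explicit MPC law computed off-line is piecewise affine over polyhedral critical regions of the state space, so the on-line work reduces to locating the region that contains the current state and evaluating one affine function.

```python
import numpy as np

# Hypothetical off-line solution of a multiparametric QP: each critical region
# is a polyhedron {x : A x <= b} with an associated affine control law u = K x + k.
regions = [
    {"A": np.array([[ 1.0], [-1.0]]), "b": np.array([1.0, 0.0]),
     "K": np.array([[-0.5]]), "k": np.array([0.0])},      # region 0 <= x <= 1
    {"A": np.array([[ 1.0], [-1.0]]), "b": np.array([3.0, -1.0]),
     "K": np.array([[ 0.0]]), "k": np.array([-0.5])},     # region 1 <= x <= 3
]

def explicit_mpc(x):
    """Evaluate the pre-computed piecewise-affine control law at state x."""
    for r in regions:
        if np.all(r["A"] @ x <= r["b"] + 1e-9):   # point location: find the active region
            return r["K"] @ x + r["k"]            # evaluate the affine law for that region
    raise ValueError("state outside the explored parameter space")

print(explicit_mpc(np.array([0.4])))   # falls in region 1: u = -0.5 * 0.4
print(explicit_mpc(np.array([2.0])))   # falls in region 2: u = -0.5
```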

In CNC programming

Parametric programming in the context of CNC (computer numerical control) machining means defining part-cutting cycles in terms of variables with reassignable values rather than via hardcoded/hardwired instances. An archetypically simple example is writing a program to machine a family of washers: there is often no need to write 15 programs for 15 members of the family with various hole diameters, outer diameters, thicknesses, and materials, when it is practical instead to write one program that calls various variables and reads their current values from a table of assignments. The program then instructs the machine slides and spindles to move to various positions at various velocities, accordingly, addressing not only the sizes of the part (i.e., OD, ID, thickness) but also the speeds and feeds needed for any given material (e.g., low-carbon steel, high-carbon steel; stainless steel of whichever grade; bronze, brass, or aluminum of whichever grade; polymer of whichever type).
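A rough sketch of that idea in a general-purpose language (the part table, cutting-speed values, and function names are illustrative assumptions, not taken from any particular CNC controller's macro dialect): one routine reads a row of part parameters and derives the machine settings from them.

```python
import math

# One row per washer variant: outer diameter, hole diameter, thickness (mm) and material.
WASHERS = {
    "M8-steel":  {"od": 16.0, "id": 8.4,  "thickness": 1.6, "material": "low-carbon steel"},
    "M8-brass":  {"od": 16.0, "id": 8.4,  "thickness": 1.6, "material": "brass"},
    "M10-steel": {"od": 20.0, "id": 10.5, "thickness": 2.0, "material": "low-carbon steel"},
}

# Illustrative cutting speeds (m/min) per material; real values depend on tooling.
CUTTING_SPEED = {"low-carbon steel": 90.0, "brass": 180.0}

def machine_settings(part_name):
    """Derive spindle speed and cut dimensions from the parameter table
    instead of hard-coding a separate program per variant."""
    p = WASHERS[part_name]
    vc = CUTTING_SPEED[p["material"]]
    # Spindle speed (rev/min) from cutting speed and outer diameter: n = 1000*vc / (pi*D).
    rpm = 1000.0 * vc / (math.pi * p["od"])
    return {"spindle_rpm": round(rpm), "drill_dia": p["id"], "cut_depth": p["thickness"]}

print(machine_settings("M8-brass"))
```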

Related Research Articles

In mathematics, an equation is a formula that expresses the equality of two expressions, by connecting them with the equals sign =. The word equation and its cognates in other languages may have subtly different meanings; for example, in French an équation is defined as containing one or more variables, while in English, any well-formed formula consisting of two expressions related with an equals sign is an equation.

Quadratic programming (QP) is the process of solving certain mathematical optimization problems involving quadratic functions. Specifically, one seeks to optimize a multivariate quadratic function subject to linear constraints on the variables. Quadratic programming is a type of nonlinear programming.

<span class="mw-page-title-main">Mathematical optimization</span> Study of mathematical algorithms for optimization problems

Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.

In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite, in which case it is to be maximized.

In mathematics, nonlinear programming (NLP) is the process of solving an optimization problem where some of the constraints or the objective function are nonlinear. An optimization problem is one of calculation of the extrema of an objective function over a set of unknown real variables and conditional to the satisfaction of a system of equalities and inequalities, collectively termed constraints. It is the sub-field of mathematical optimization that deals with problems that are not linear.

Model predictive control (MPC) is an advanced method of process control that is used to control a process while satisfying a set of constraints. It has been in use in the process industries in chemical plants and oil refineries since the 1980s. In recent years it has also been used in power system balancing models and in power electronics. Model predictive controllers rely on dynamic models of the process, most often linear empirical models obtained by system identification. The main advantage of MPC is that it allows the current timeslot to be optimized while taking future timeslots into account. This is achieved by optimizing over a finite time horizon, but implementing only the current timeslot and then optimizing again, repeatedly, thus differing from a linear–quadratic regulator (LQR). MPC also has the ability to anticipate future events and can take control actions accordingly; PID controllers do not have this predictive ability. MPC is nearly universally implemented as a digital control, although there is research into achieving faster response times with specially designed analog circuitry.
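A minimal receding-horizon sketch (SciPy-based; the scalar plant, horizon length, and weights are illustrative assumptions): at every step the whole input sequence over the horizon is optimized, but only the first move is applied before re-optimizing.

```python
import numpy as np
from scipy.optimize import minimize

# Simple scalar plant x[k+1] = a*x[k] + b*u[k] (illustrative values).
a, b, N = 1.2, 1.0, 5            # mildly unstable plant, horizon of 5 steps

def horizon_cost(u, x0):
    """Sum of stage costs x^2 + u^2 over the prediction horizon."""
    x, cost = x0, 0.0
    for uk in u:
        cost += x**2 + uk**2
        x = a * x + b * uk
    return cost + x**2           # terminal cost

x = 3.0
for k in range(8):
    # Optimize the whole input sequence, subject to |u| <= 1 ...
    res = minimize(horizon_cost, np.zeros(N), args=(x,),
                   bounds=[(-1.0, 1.0)] * N)
    u0 = res.x[0]                # ... but apply only the first move,
    x = a * x + b * u0           # then shift the horizon and repeat.
    print(f"step {k}: u = {u0:+.3f}, x = {x:+.3f}")
```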

Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets. Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.

In Bayesian statistics, a maximum a posteriori probability (MAP) estimate is an estimate of an unknown quantity that equals the mode of the posterior distribution. The MAP can be used to obtain a point estimate of an unobserved quantity on the basis of empirical data. It is closely related to the method of maximum likelihood (ML) estimation, but employs an augmented optimization objective which incorporates a prior distribution over the quantity one wants to estimate. MAP estimation can therefore be seen as a regularization of maximum likelihood estimation.
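A minimal numerical sketch (conjugate Gaussian model with illustrative numbers): with a Gaussian prior on an unknown mean and Gaussian observations of known variance, the posterior is Gaussian, so the MAP estimate equals the posterior mean and shrinks the sample mean toward the prior.

```python
import numpy as np

data = np.array([2.1, 1.9, 2.4, 2.0])    # observations with known noise std sigma
sigma, mu0, tau = 0.5, 0.0, 1.0           # likelihood std, prior mean, prior std

n, xbar = len(data), data.mean()
# The posterior of the mean is Gaussian, so its mode (the MAP estimate) equals its mean:
map_estimate = (mu0 / tau**2 + n * xbar / sigma**2) / (1 / tau**2 + n / sigma**2)
ml_estimate = xbar                        # maximum likelihood estimate, for comparison
print(map_estimate, ml_estimate)          # the MAP value is pulled slightly toward mu0
```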

In mathematical optimization, constrained optimization is the process of optimizing an objective function with respect to some variables in the presence of constraints on those variables. The objective function is either a cost function or energy function, which is to be minimized, or a reward function or utility function, which is to be maximized. Constraints can be either hard constraints, which set conditions for the variables that are required to be satisfied, or soft constraints, which have some variable values that are penalized in the objective function if, and based on the extent that, the conditions on the variables are not satisfied.

In statistics, M-estimators are a broad class of extremum estimators for which the objective function is a sample average. Both non-linear least squares and maximum likelihood estimation are special cases of M-estimators. The definition of M-estimators was motivated by robust statistics, which contributed new types of M-estimators. The statistical procedure of evaluating an M-estimator on a data set is called M-estimation. Forty-eight examples of robust M-estimators can be found in a recent review study.

The cross-entropy (CE) method is a Monte Carlo method for importance sampling and optimization. It is applicable to both combinatorial and continuous problems, with either a static or noisy objective.
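A minimal sketch of the CE method for a one-dimensional continuous minimization (the objective, sample sizes, and iteration count are illustrative assumptions): sample candidates from a Gaussian, keep the best ("elite") fraction, and refit the sampling distribution to the elites.

```python
import numpy as np

def f(x):
    return (x - 2.0) ** 2                          # objective to minimize (illustrative)

mu, sigma = 0.0, 5.0                               # initial sampling distribution
n_samples, n_elite = 100, 10
rng = np.random.default_rng(0)

for _ in range(20):
    xs = rng.normal(mu, sigma, n_samples)          # sample candidate solutions
    elite = xs[np.argsort(f(xs))[:n_elite]]        # keep the lowest-cost samples
    mu, sigma = elite.mean(), elite.std() + 1e-12  # refit the sampling distribution
print(mu)                                          # converges near the minimizer x = 2
```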

Multi-objective optimization is an area of multiple criteria decision making that is concerned with mathematical optimization problems involving more than one objective function to be optimized simultaneously. Multi-objective optimization has been applied in many fields of science, including engineering, economics and logistics, where optimal decisions need to be taken in the presence of trade-offs between two or more conflicting objectives. Minimizing cost while maximizing comfort when buying a car, and maximizing performance while minimizing fuel consumption and emission of pollutants of a vehicle are examples of multi-objective optimization problems involving two and three objectives, respectively. In practical problems, there can be more than three objectives.

Advanced process monitor (APMonitor) is a modeling language for differential-algebraic equations (DAE). It is a free web-service or local server for solving representations of physical systems in the form of implicit DAE models. APMonitor is suited for large-scale problems and solves linear programming, integer programming, nonlinear programming, nonlinear mixed integer programming, dynamic simulation, moving horizon estimation, and nonlinear model predictive control. APMonitor does not solve the problems directly, but calls nonlinear programming solvers such as APOPT, BPOPT, IPOPT, MINOS, and SNOPT. The APMonitor API provides exact first and second derivatives of continuous functions to the solvers through automatic differentiation and in sparse matrix form.

In mathematics and its applications, a parametric family or a parameterized family is a family of objects whose differences depend only on the chosen values for a set of parameters.

In applied mathematics, Graver bases enable iterative solutions of linear and various nonlinear integer programming problems in polynomial time. They were introduced by Jack E. Graver. Their connection to the theory of Gröbner bases was discussed by Bernd Sturmfels. The algorithmic theory of Graver bases and its application to integer programming is described by Shmuel Onn.

Efstratios N. (Stratos) Pistikopoulos is a distinguished research professor in the Department of Chemical Engineering at Texas A&M University, as well as the director of the Texas A&M Energy Institute. From 1991 to 2015 he was a professor of chemical engineering at Imperial College, where he pioneered multi-parametric programming and invented the concept of explicit or multi-parametric model predictive control. He has authored or co-authored more than 350 peer-reviewed journal articles, has authored or edited 9 books, and has been an invited speaker at many academic conferences and lectures, including the 21st Professor Roger W. H. Sargent lecture at Imperial College London, entitled "Multi-Parametric Programming & Control 25 years later: what is next?". Pistikopoulos was elected a fellow of the Royal Academy of Engineering in 2013.

<span class="mw-page-title-main">Simulation-based optimization</span> Form of optimization

Simulation-based optimization integrates optimization techniques into simulation modeling and analysis. Because of the complexity of the simulation, the objective function may become difficult and expensive to evaluate. Usually, the underlying simulation model is stochastic, so that the objective function must be estimated using statistical estimation techniques.

The GEKKO Python package solves large-scale mixed-integer and differential algebraic equations with nonlinear programming solvers. Modes of operation include machine learning, data reconciliation, real-time optimization, dynamic simulation, and nonlinear model predictive control. In addition, the package solves linear programming (LP), quadratic programming (QP), quadratically constrained quadratic programming (QCQP), nonlinear programming (NLP), mixed-integer programming (MIP), and mixed-integer linear programming (MILP) problems. GEKKO is available in Python and installed with pip from PyPI of the Python Software Foundation.
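A minimal usage sketch (assuming GEKKO is installed via pip; the tiny equality-constrained quadratic problem is illustrative only):

```python
from gekko import GEKKO   # pip install gekko

m = GEKKO(remote=False)            # solve locally rather than on the public server
x = m.Var(value=1.0, lb=0.0)       # decision variables with lower bounds
y = m.Var(value=1.0, lb=0.0)
m.Equation(x + y == 5)             # equality constraint
m.Minimize(x**2 + y**2)            # quadratic objective
m.solve(disp=False)
print(x.value[0], y.value[0])      # expect 2.5, 2.5
```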

In dual decomposition, a problem is broken into smaller subproblems and a solution to the relaxed problem is found. This method can be employed for Markov random field (MRF) optimization. Dual decomposition is applied to Markov logic programs as an inference technique.

References

  1. Gal, Tomas (1995). Postoptimal Analyses, Parametric Programming, and Related Topics: Degeneracy, Multicriteria Decision Making, Redundancy (2nd ed.). Berlin: W. de Gruyter. ISBN 978-3-11-087120-3.
  2. Gal, Tomas; Greenberg, Harvey J. (1997). Advances in Sensitivity Analysis and Parametric Programming. International Series in Operations Research & Management Science. Vol. 6. Boston: Kluwer Academic Publishers. doi:10.1007/978-1-4615-6103-3. ISBN 978-0-7923-9917-9.
  3. Gal, Tomas; Nedoma, Josef (1972). "Multiparametric Linear Programming". Management Science. 18 (7): 406–422. doi:10.1287/mnsc.18.7.406. JSTOR 2629358.
  4. Dua, Vivek; Pistikopoulos, Efstratios N. (October 1999). "Algorithms for the Solution of Multiparametric Mixed-Integer Nonlinear Optimization Problems". Industrial & Engineering Chemistry Research. 38 (10): 3976–3987. doi:10.1021/ie980792u.
  5. Pistikopoulos, Efstratios N.; Georgiadis, Michael C.; Dua, Vivek (2007). Multi-parametric Programming: Theory, Algorithms and Applications. Weinheim: Wiley-VCH. doi:10.1002/9783527631216. ISBN 9783527316915.
  6. Bemporad, Alberto; Morari, Manfred; Dua, Vivek; Pistikopoulos, Efstratios N. (2000). "The explicit solution of model predictive control via multiparametric quadratic programming". Proceedings of the 2000 American Control Conference. p. 872. doi:10.1109/ACC.2000.876624. ISBN 0-7803-5519-9. S2CID 1068816.
  7. Bemporad, Alberto; Morari, Manfred; Dua, Vivek; Pistikopoulos, Efstratios N. (January 2002). "The explicit linear quadratic regulator for constrained systems". Automatica. 38 (1): 3–20. CiteSeerX 10.1.1.67.2946. doi:10.1016/S0005-1098(01)00174-1.
  8. "MPC on a chip—Recent advances on the application of multi-parametric model-based control".