In discrete optimization, a special ordered set (SOS) is an ordered set of variables used as an additional way to specify integrality conditions in an optimization model. Special ordered sets are a device used in branch and bound methods to branch on sets of variables, rather than on individual variables as in ordinary mixed integer programming. Knowing that a variable belongs to a set, and where it sits in the set's ordering, gives the branch and bound algorithm a more intelligent way to attack the problem and helps to speed up the search. The members of a special ordered set may individually be continuous or discrete variables in any combination. However, even when all the members are themselves continuous, a model containing one or more special ordered sets becomes a discrete optimization problem requiring a mixed integer optimizer for its solution.
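The branching idea can be illustrated with a small sketch. The following Python fragment is an illustration only and is not taken from any particular solver; the function name sos1_branch, the tolerance, and the splitting rule are invented for the example. It shows how an SOS1 branch splits one ordered set into two subproblems, fixing part of the set to zero on each side, instead of branching on a single fractional variable.

```python
# Illustrative sketch of SOS1 branching (hypothetical helper, not a real solver API).
# An SOS1 set allows at most one member to be nonzero. If an LP relaxation leaves
# several members nonzero, pick a split point r from the members' reference weights
# and create two subproblems: one fixing the "left" part of the set to zero and one
# fixing the "right" part to zero.

def sos1_branch(values, weights):
    """values[i]  - value of the i-th set member in the LP relaxation
    weights[i] - its reference weight (the ordering of the set)
    Returns two lists of indices to fix to zero, one per branch,
    or None if the SOS1 condition already holds."""
    support = [i for i, v in enumerate(values) if abs(v) > 1e-9]
    if len(support) <= 1:
        return None  # at most one nonzero member: SOS1 is already satisfied

    # Weighted average of the nonzero members' positions picks the split point.
    total = sum(values[i] for i in support)
    r = sum(weights[i] * values[i] for i in support) / total

    left = [i for i in range(len(values)) if weights[i] <= r]
    right = [i for i in range(len(values)) if weights[i] > r]
    return left, right  # branch 1 fixes `left` to zero, branch 2 fixes `right`


# Example: four members with weights 1..4; the relaxation spreads value over two of them.
branches = sos1_branch(values=[0.0, 0.6, 0.4, 0.0], weights=[1, 2, 3, 4])
print(branches)  # ([0, 1], [2, 3]) -> two subproblems covering the whole set
```

Because the two branches each eliminate a contiguous block of the ordered set, the tree stays much smaller than it would if every member were branched on separately.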
The 'only' benefit of using special ordered sets, compared with expressing the same conditions purely with constraints, is that the search procedure will generally be noticeably faster. [1] According to J. A. Tomlin, special ordered sets provide a powerful means of modeling nonconvex functions and discrete requirements, though there has been a tendency to think of them only in terms of multiple-choice zero-one programming. [2]
The concept originated in Beale's paper "Two transportation problems" (1963), [3] in which he presented a bid evaluation model. However, the term was first explicitly introduced by Beale and Tomlin (1970). [4] Special ordered sets were first implemented in Scicon's UMPIRE mathematical programming system. [5]
Special ordered sets were an important and recurring theme in Martin Beale's work, and their value came to be recognized to the point where nearly all production mathematical programming systems (MPSs) implement some version, or subset, of SOS. [6]
There are two sorts of Special Ordered Sets:
Special Ordered Sets of type 1 (SOS1): a set of variables of which at most one may take a nonzero value, all the others being zero. They are most frequently used where a choice must be made among a set of mutually exclusive alternatives.
Special Ordered Sets of type 2 (SOS2): an ordered set of variables of which at most two may be nonzero, and if two are nonzero they must be consecutive in the ordering. SOS2 sets are typically used to model piecewise linear approximations of nonlinear functions of a single variable.
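A common use of SOS2 is the piecewise linear approximation just mentioned. The sketch below assumes the Gurobi Python interface (gurobipy) is installed and licensed; the model name, breakpoints, and variable names are chosen purely for illustration, and any solver with SOS2 support could be substituted. It approximates y = x^2 on [0, 3] with interpolation weights declared as an SOS2 set.

```python
# Sketch: piecewise-linear approximation of y = x**2 on [0, 3] using an SOS2 set.
# Assumes gurobipy is available; the formulation itself is solver-independent.
import gurobipy as gp
from gurobipy import GRB

breakpoints = [0.0, 1.0, 2.0, 3.0]          # grid for x
values      = [b * b for b in breakpoints]  # f(x) = x**2 at the breakpoints
n = len(breakpoints)

m = gp.Model("sos2_pwl")
lam = m.addVars(n, lb=0.0, ub=1.0, name="lam")   # interpolation weights
x = m.addVar(lb=0.0, ub=3.0, name="x")
y = m.addVar(lb=0.0, name="y")

m.addConstr(lam.sum() == 1.0, name="convexity")
m.addConstr(x == gp.quicksum(breakpoints[i] * lam[i] for i in range(n)))
m.addConstr(y == gp.quicksum(values[i] * lam[i] for i in range(n)))

# The SOS2 declaration: at most two lam[i] may be nonzero, and they must be
# adjacent, so (x, y) is forced onto one segment of the piecewise-linear curve.
m.addSOS(GRB.SOS_TYPE2, [lam[i] for i in range(n)], [i + 1.0 for i in range(n)])

m.addConstr(x >= 1.5)        # example side constraint
m.setObjective(y, GRB.MINIMIZE)
m.optimize()
print(x.X, y.X)              # roughly x = 1.5, y = 2.5 (on the chord between 1 and 2)
```

In an ordinary mixed integer formulation the same restriction would need extra binary variables and linking constraints; declaring the weights as an SOS2 set lets the solver enforce the condition directly by branching on the set.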