In mathematical optimization, Zadeh's rule (also known as the least-entered rule) is an algorithmic refinement of the simplex method for linear optimization.
The rule was proposed around 1980 by Norman Zadeh (son of Lotfi A. Zadeh) and has since entered the folklore of convex optimization. [1]
Zadeh offered a reward of $1,000 to anyone who could either show that the rule admits polynomially many iterations or prove that there is a family of linear programs on which the pivoting rule requires superpolynomially many iterations to find the optimum. [2]
Zadeh's rule belongs to the family of history-based improvement rules which, during a run of the simplex algorithm, retain supplementary data in addition to the current basis of the linear program.
In particular, the rule chooses among all improving variables one that has entered the basis least often, intuitively ensuring that variables which might yield a substantial improvement in the long run, but only a small improvement in a single step, will be applied after at most a linear number of steps.
The supplementary data structure in Zadeh's algorithm can therefore be modeled as an occurrence record, mapping all variables to natural numbers, monitoring how often a particular variable has entered the basis. In every iteration, the algorithm then selects an improving variable that is minimal with respect to the retained occurrence record.
Note that the rule does not explicitly specify which particular improving variable should enter the basis in case of a tie.
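As a concrete illustration, the selection step can be sketched in a few lines of Python. This is a minimal sketch, not Zadeh's own implementation: the names `occurrence` and `improving` are assumptions, the surrounding simplex machinery is elided, and the smallest-index tie-break is just one admissible choice.

```python
from collections import defaultdict

def least_entered(improving, occurrence):
    # Zadeh's rule: among the improving variables, select one that has
    # entered the basis least often so far. Ties are broken here by
    # smallest index, one admissible choice since the rule leaves
    # tie-breaking unspecified.
    return min(improving, key=lambda j: (occurrence[j], j))

occurrence = defaultdict(int)   # occurrence record: variable -> times entered
improving = {2, 5, 7}           # hypothetical set of improving variables
entering = least_entered(improving, occurrence)
occurrence[entering] += 1       # update the record after the pivot
print(entering)                 # -> 2 (all counts equal, smallest index wins)
```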
Zadeh's rule has been shown to have at least superpolynomial time complexity in the worst case, by constructing a family of Markov decision processes on which the policy iteration algorithm requires a superpolynomial number of steps.
Running the simplex algorithm with Zadeh's rule on the induced linear program then yields a superpolynomial lower bound.
The result was presented at the IPAM workshop "Efficiency of the Simplex Method: Quo vadis Hirsch conjecture?" in 2011 by Oliver Friedmann. [3] Zadeh, although no longer working in academia at the time, attended the workshop and honored his original proposal. [4]
Linear programming is a method to achieve the best outcome in a mathematical model whose requirements are represented by linear relationships. Linear programming is a special case of mathematical programming.
Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. Optimization problems arise in all quantitative disciplines, from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.
In mathematical optimization, Dantzig's simplex algorithm is a popular algorithm for linear programming.
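For illustration only, a small linear program can be solved with SciPy's `linprog` (its HiGHS backend, not Dantzig's original tableau method); the objective and constraints below are made up for the example.

```python
from scipy.optimize import linprog

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-3, -2]
A_ub = [[1, 1], [1, 3]]
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, -res.fun)   # optimal vertex [4, 0] with objective value 12
```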
An integer programming problem is a mathematical optimization or feasibility program in which some or all of the variables are restricted to be integers. In many settings the term refers to integer linear programming (ILP), in which the objective function and the constraints are linear.
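A hedged sketch of the difference integrality makes, using SciPy's `milp` (assuming SciPy 1.9 or later): the tiny model below is invented for the example, and its LP relaxation has optimum 7/3 while the integer optimum is 2.

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# Maximize x + y subject to 2x + y <= 4, x + 2y <= 3, x, y >= 0 integer.
# milp minimizes, so the objective is negated.
c = np.array([-1.0, -1.0])
constraints = LinearConstraint([[2, 1], [1, 2]], ub=[4, 3])
res = milp(c, constraints=constraints, integrality=np.ones(2),
           bounds=Bounds(0, np.inf))
print(res.x, -res.fun)   # an integer optimum such as [2. 0.] with value 2;
                         # the LP relaxation attains 7/3 at (5/3, 2/3)
```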
The Fulkerson Prize for outstanding papers in the area of discrete mathematics is sponsored jointly by the Mathematical Optimization Society (MOS) and the American Mathematical Society (AMS). Up to three awards of $1,500 each are presented at each (triennial) International Symposium of the MOS. Originally, the prizes were paid out of a memorial fund administered by the AMS that was established by friends of the late Delbert Ray Fulkerson to encourage mathematical excellence in the fields of research exemplified by his work. The prizes are now funded by an endowment administered by the MOS.
Interior-point methods are a certain class of algorithms that solve linear and nonlinear convex optimization problems.
Karmarkar's algorithm is an algorithm introduced by Narendra Karmarkar in 1984 for solving linear programming problems. It was the first reasonably efficient algorithm that solves these problems in polynomial time. The ellipsoid method is also polynomial time but proved to be inefficient in practice.
In mathematical programming and polyhedral combinatorics, the Hirsch conjecture is the statement that the edge-vertex graph of an n-facet polytope in d-dimensional Euclidean space has diameter no more than n − d. That is, any two vertices of the polytope must be connected to each other by a path of length at most n − d. The conjecture was first put forth in a letter by Warren M. Hirsch to George B. Dantzig in 1957 and was motivated by the analysis of the simplex method in linear programming, as the diameter of a polytope provides a lower bound on the number of steps needed by the simplex method. The conjecture is now known to be false in general.
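As a worked check of the bound (an added example, not part of the original statement), the d-dimensional cube attains it with equality:

```latex
% The d-cube C_d has n = 2d facets, and two opposite vertices differ in
% all d coordinates, so its graph diameter is exactly d, which meets
% the Hirsch bound with equality:
\operatorname{diam}(C_d) = d = 2d - d = n - d .
```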
In mathematical optimization, Bland's rule is an algorithmic refinement of the simplex method for linear optimization.
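For contrast with Zadeh's least-entered rule sketched above, Bland's entering-variable choice can be written as follows. This is a minimal sketch assuming a minimization problem whose reduced costs are already computed; `blands_entering` is a hypothetical helper name.

```python
def blands_entering(reduced_costs, tol=1e-9):
    # Bland's rule: scan the reduced costs in index order and return the
    # first (smallest-index) improving variable; for a minimization
    # problem these are the variables with negative reduced cost.
    # This anti-cycling choice guarantees finite termination.
    for j, rc in enumerate(reduced_costs):
        if rc < -tol:
            return j
    return None  # no improving variable: the current basis is optimal

print(blands_entering([0.0, 1.5, -2.0, -0.5]))  # -> 2
```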
Dantzig–Wolfe decomposition is an algorithm for solving linear programming problems with special structure. It was originally developed by George Dantzig and Philip Wolfe and initially published in 1960. Many texts on linear programming have sections dedicated to discussing this decomposition algorithm.
In theoretical computer science, smoothed analysis is a way of measuring the complexity of an algorithm. Since its introduction in 2001, smoothed analysis has been used as a basis for considerable research, on problems ranging from mathematical programming and numerical analysis to machine learning and data mining. It can give a more realistic analysis of the practical performance of an algorithm, such as its running time, than worst-case or average-case analysis.
Polyhedral combinatorics is a branch of mathematics, within combinatorics and discrete geometry, that studies the problems of counting and describing the faces of convex polyhedra and higher-dimensional convex polytopes.
In mathematical optimization, the criss-cross algorithm is any of a family of algorithms for linear programming. Variants of the criss-cross algorithm also solve more general problems with linear inequality constraints and nonlinear objective functions; there are criss-cross algorithms for linear-fractional programming problems, quadratic-programming problems, and linear complementarity problems.
The Klee–Minty cube or Klee–Minty polytope is a unit hypercube of variable dimension whose corners have been perturbed. Klee and Minty demonstrated that George Dantzig's simplex algorithm has poor worst-case performance when initialized at one corner of their "squashed cube". On the three-dimensional version, the simplex algorithm and the criss-cross algorithm visit all 8 corners in the worst case.
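One common algebraic form of the construction, with the caveat that the coefficients vary across presentations in the literature, is the following linear program:

```latex
% A d-dimensional Klee-Minty linear program: starting the simplex
% method at x = 0 with a worst-case pivot rule visits all 2^d vertices.
\max \; \sum_{j=1}^{d} 2^{\,d-j} x_j
\quad \text{subject to} \quad
2 \sum_{j=1}^{i-1} 2^{\,i-j} x_j + x_i \le 5^{\,i}
\quad (i = 1, \dots, d), \qquad x \ge 0 .
```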
In applied mathematics, Graver bases enable iterative solutions of linear and various nonlinear integer programming problems in polynomial time. They were introduced by Jack E. Graver. Their connection to the theory of Gröbner bases was discussed by Bernd Sturmfels. The algorithmic theory of Graver bases and its application to integer programming is described by Shmuel Onn.
In mathematical optimization, the network simplex algorithm is a graph-theoretic specialization of the simplex algorithm. The algorithm is usually formulated in terms of a minimum-cost flow problem. The network simplex method works very well in practice, typically 200 to 300 times faster than the simplex method applied to a general linear program of the same dimensions.
In mathematical optimization, Cunningham's rule is an algorithmic refinement of the simplex method for linear optimization.
Oliver Friedmann is a German computer scientist and mathematician known for his work on parity games and the simplex algorithm. He is CTO and co-founder of Ziggeo, a cloud-based video technology company.