Test functions for optimization

In applied mathematics, test functions, also known as artificial landscapes, are used to evaluate characteristics of optimization algorithms, such as convergence rate, precision, robustness and general performance.

Some test functions are presented here to illustrate the different situations that optimization algorithms face when coping with these kinds of problems. The first part presents objective functions for single-objective optimization; the second part gives test functions, with their respective Pareto fronts, for multi-objective optimization problems (MOP).

The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck, [1] Haupt et al. [2] and from Rody Oldenhuis software. [3] Given the number of problems (55 in total), just a few are presented here.

The test functions used to evaluate the algorithms for MOP were taken from Deb, [4] Binh et al. [5] and Binh. [6] The software developed by Deb, which implements the NSGA-II procedure with GAs, can be downloaded, [7] as can a program posted on the Internet that implements the NSGA-II procedure with ES. [8]

Only the general form of the equation, a plot of the objective function, the boundaries of the object variables, and the coordinates of the global minima are given herein.

Test functions for single-objective optimization

Each entry below gives the formula, the global minimum, and the search domain.

Rastrigin function: f(x) = A n + Σ_{i=1}^{n} [x_i² − A cos(2π x_i)], where A = 10. Global minimum: f(0, …, 0) = 0. Search domain: −5.12 ≤ x_i ≤ 5.12.

Ackley function: f(x, y) = −20 exp[−0.2 √(0.5 (x² + y²))] − exp[0.5 (cos 2πx + cos 2πy)] + e + 20. Global minimum: f(0, 0) = 0. Search domain: −5 ≤ x, y ≤ 5.

Sphere function: f(x) = Σ_{i=1}^{n} x_i². Global minimum: f(0, …, 0) = 0. Search domain: −∞ ≤ x_i ≤ ∞.

Rosenbrock function: f(x) = Σ_{i=1}^{n−1} [100 (x_{i+1} − x_i²)² + (1 − x_i)²]. Global minimum: f(1, …, 1) = 0. Search domain: −∞ ≤ x_i ≤ ∞.

Beale function: f(x, y) = (1.5 − x + xy)² + (2.25 − x + xy²)² + (2.625 − x + xy³)². Global minimum: f(3, 0.5) = 0. Search domain: −4.5 ≤ x, y ≤ 4.5.

Goldstein–Price function: f(x, y) = [1 + (x + y + 1)² (19 − 14x + 3x² − 14y + 6xy + 3y²)] [30 + (2x − 3y)² (18 − 32x + 12x² + 48y − 36xy + 27y²)]. Global minimum: f(0, −1) = 3. Search domain: −2 ≤ x, y ≤ 2.

Booth function: f(x, y) = (x + 2y − 7)² + (2x + y − 5)². Global minimum: f(1, 3) = 0. Search domain: −10 ≤ x, y ≤ 10.

Bukin function N.6: f(x, y) = 100 √|y − 0.01x²| + 0.01 |x + 10|. Global minimum: f(−10, 1) = 0. Search domain: −15 ≤ x ≤ −5, −3 ≤ y ≤ 3.

Matyas function: f(x, y) = 0.26 (x² + y²) − 0.48 xy. Global minimum: f(0, 0) = 0. Search domain: −10 ≤ x, y ≤ 10.

Lévi function N.13: f(x, y) = sin²(3πx) + (x − 1)² (1 + sin²(3πy)) + (y − 1)² (1 + sin²(2πy)). Global minimum: f(1, 1) = 0. Search domain: −10 ≤ x, y ≤ 10.

Himmelblau's function: f(x, y) = (x² + y − 11)² + (x + y² − 7)². Global minima: f(3, 2) = f(−2.805118, 3.131312) = f(−3.779310, −3.283186) = f(3.584428, −1.848126) = 0. Search domain: −5 ≤ x, y ≤ 5.

Three-hump camel function: f(x, y) = 2x² − 1.05x⁴ + x⁶/6 + xy + y². Global minimum: f(0, 0) = 0. Search domain: −5 ≤ x, y ≤ 5.

Easom function: f(x, y) = −cos(x) cos(y) exp[−((x − π)² + (y − π)²)]. Global minimum: f(π, π) = −1. Search domain: −100 ≤ x, y ≤ 100.

Cross-in-tray function: f(x, y) = −0.0001 [|sin(x) sin(y) exp(|100 − √(x² + y²)/π|)| + 1]^0.1. Global minima: f(±1.34941, ±1.34941) = −2.06261. Search domain: −10 ≤ x, y ≤ 10.

Eggholder function: [9] [10] f(x, y) = −(y + 47) sin √|x/2 + (y + 47)| − x sin √|x − (y + 47)|. Global minimum: f(512, 404.2319) = −959.6407. Search domain: −512 ≤ x, y ≤ 512.

Hölder table function: f(x, y) = −|sin(x) cos(y) exp(|1 − √(x² + y²)/π|)|. Global minima: f(±8.05502, ±9.66459) = −19.2085. Search domain: −10 ≤ x, y ≤ 10.

McCormick function: f(x, y) = sin(x + y) + (x − y)² − 1.5x + 2.5y + 1. Global minimum: f(−0.54719, −1.54719) = −1.9133. Search domain: −1.5 ≤ x ≤ 4, −3 ≤ y ≤ 4.

Schaffer function N. 2: f(x, y) = 0.5 + [sin²(x² − y²) − 0.5] / [1 + 0.001 (x² + y²)]². Global minimum: f(0, 0) = 0. Search domain: −100 ≤ x, y ≤ 100.

Schaffer function N. 4: f(x, y) = 0.5 + [cos²(sin |x² − y²|) − 0.5] / [1 + 0.001 (x² + y²)]². Global minimum: f(0, 1.25313) = 0.292579. Search domain: −100 ≤ x, y ≤ 100.

Styblinski–Tang function: f(x) = (1/2) Σ_{i=1}^{n} (x_i⁴ − 16x_i² + 5x_i). Global minimum: −39.16617 n < f(−2.903534, …, −2.903534) < −39.16616 n. Search domain: −5 ≤ x_i ≤ 5.

Shekel function: f(x) = Σ_{i=1}^{m} [c_i + Σ_{j=1}^{n} (x_j − a_{ji})²]^{−1}, or, similarly, f(x) = Σ_{i=1}^{m} [c_i + (x − a_i)ᵀ (x − a_i)]^{−1}, for given constants c_i and vectors a_i.
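
As a quick illustration, two of the functions above extend to arbitrary dimension and can be sketched in a few lines of Python (a minimal sketch; the function and argument names are mine, not from the sources cited above):

```python
import math

def rastrigin(x, A=10.0):
    # Rastrigin: A*n + sum(x_i^2 - A*cos(2*pi*x_i)); highly multimodal,
    # with the global minimum 0 at the origin.
    return A * len(x) + sum(xi * xi - A * math.cos(2 * math.pi * xi) for xi in x)

def rosenbrock(x):
    # Rosenbrock: sum of 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2; a narrow
    # curved valley, with the global minimum 0 at (1, ..., 1).
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

print(rastrigin([0.0] * 5))   # prints 0.0 (global minimum)
print(rosenbrock([1.0] * 5))  # prints 0.0 (global minimum)
```

Evaluating a candidate solver on such functions amounts to calling these evaluators from its objective callback and recording how quickly the returned values approach the known minima.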

Test functions for constrained optimization

Each entry below gives the formula, the constraints, the global minimum, and the search domain.

Rosenbrock function constrained with a cubic and a line: [11] f(x, y) = (1 − x)² + 100 (y − x²)², subjected to: (x − 1)³ − y + 1 ≤ 0 and x + y − 2 ≤ 0. Global minimum: f(1.0, 1.0) = 0. Search domain: −1.5 ≤ x ≤ 1.5, −0.5 ≤ y ≤ 2.5.

Rosenbrock function constrained to a disk: [12] f(x, y) = (1 − x)² + 100 (y − x²)², subjected to: x² + y² ≤ 2. Global minimum: f(1.0, 1.0) = 0. Search domain: −1.5 ≤ x, y ≤ 1.5.

Mishra's Bird function (constrained): [13] [14] f(x, y) = sin(y) exp[(1 − cos x)²] + cos(x) exp[(1 − sin y)²] + (x − y)², subjected to: (x + 5)² + (y + 5)² < 25. Global minimum: f(−3.1302468, −1.5821422) = −106.7645367. Search domain: −10 ≤ x ≤ 0, −6.5 ≤ y ≤ 0.

Townsend function (modified): [15] f(x, y) = −[cos((x − 0.1) y)]² − x sin(3x + y), subjected to: x² + y² < [2 cos t − 0.5 cos 2t − 0.25 cos 3t − 0.125 cos 4t]² + [2 sin t]², where t = Atan2(x, y). Global minimum: f(2.0052938, 1.1944509) = −2.0239884. Search domain: −2.25 ≤ x ≤ 2.25, −2.5 ≤ y ≤ 1.75.

Gomez and Levy function (modified): [16] f(x, y) = 4x² − 2.1x⁴ + x⁶/3 + xy − 4y² + 4y⁴, subjected to: −sin(4πx) + 2 sin²(2πy) ≤ 1.5. Global minimum: f(0.08984201, −0.7126564) = −1.031628453. Search domain: −1 ≤ x ≤ 0.75, −1 ≤ y ≤ 1.

Simionescu function: [17] f(x, y) = 0.1 x y, subjected to: x² + y² ≤ [r_T + r_S cos(n arctan(x/y))]², where r_T = 1, r_S = 0.2 and n = 8. Global minima: f(±0.84852813, ∓0.84852813) = −0.072. Search domain: −1.25 ≤ x, y ≤ 1.25.
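
One simple way to handle such constraints is rejection: sample the search box and discard infeasible points. The sketch below does this for the disk-constrained Rosenbrock problem (a minimal illustration under stated assumptions; naive random search is used only to keep the example short, and penalty or repair methods are common alternatives):

```python
import random

def rosenbrock2d(x, y):
    # f(x, y) = (1 - x)^2 + 100*(y - x^2)^2; constrained minimum at (1, 1).
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def feasible(x, y):
    # Disk constraint from the table above: x^2 + y^2 <= 2.
    return x * x + y * y <= 2.0

def random_search(n=20000, seed=0):
    # Naive rejection-based random search over the box [-1.5, 1.5]^2;
    # infeasible samples are simply discarded.
    rng = random.Random(seed)
    best = None
    for _ in range(n):
        x = rng.uniform(-1.5, 1.5)
        y = rng.uniform(-1.5, 1.5)
        if feasible(x, y):
            f = rosenbrock2d(x, y)
            if best is None or f < best[0]:
                best = (f, x, y)
    return best

f, x, y = random_search()
print(f, x, y)  # the best value found approaches 0 near (1, 1)
```

Rejection works here because the feasible region covers a large share of the box; for problems like the Townsend or Simionescu functions, where the feasible set is thin or disconnected, penalty terms are usually preferred.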

Test functions for multi-objective optimization

Each entry below gives the objective functions to be minimized, the constraints, and the search domain.

Binh and Korn function: [5] Minimize f_1(x, y) = 4x² + 4y² and f_2(x, y) = (x − 5)² + (y − 5)², subjected to: (x − 5)² + y² ≤ 25 and (x − 8)² + (y + 3)² ≥ 7.7. Search domain: 0 ≤ x ≤ 5, 0 ≤ y ≤ 3.

Chankong and Haimes function: [18] Minimize f_1(x, y) = 2 + (x − 2)² + (y − 1)² and f_2(x, y) = 9x − (y − 1)², subjected to: x² + y² ≤ 225 and x − 3y + 10 ≤ 0. Search domain: −20 ≤ x, y ≤ 20.

Fonseca–Fleming function: [19] Minimize f_1(x) = 1 − exp[−Σ_{i=1}^{n} (x_i − 1/√n)²] and f_2(x) = 1 − exp[−Σ_{i=1}^{n} (x_i + 1/√n)²]. Search domain: −4 ≤ x_i ≤ 4, 1 ≤ i ≤ n.

Test function 4: [6] Minimize f_1(x, y) = x² − y and f_2(x, y) = −0.5x − y − 1, subjected to: 6.5 − x/6 − y ≥ 0, 7.5 − 0.5x − y ≥ 0 and 30 − 5x − y ≥ 0. Search domain: −7 ≤ x, y ≤ 4.

Kursawe function: [20] Minimize f_1(x) = Σ_{i=1}^{2} [−10 exp(−0.2 √(x_i² + x_{i+1}²))] and f_2(x) = Σ_{i=1}^{3} [|x_i|^0.8 + 5 sin(x_i³)]. Search domain: −5 ≤ x_i ≤ 5, 1 ≤ i ≤ 3.

Schaffer function N. 1: [21] Minimize f_1(x) = x² and f_2(x) = (x − 2)². Search domain: −A ≤ x ≤ A. Values of A from 10 to 10⁵ have been used successfully; higher values of A increase the difficulty of the problem.

Schaffer function N. 2: Minimize f_1(x) = −x for x ≤ 1; x − 2 for 1 < x ≤ 3; 4 − x for 3 < x ≤ 4; x − 4 for x > 4; and f_2(x) = (x − 5)². Search domain: −5 ≤ x ≤ 10.

Poloni's two objective function: Minimize f_1(x, y) = 1 + [A_1 − B_1(x, y)]² + [A_2 − B_2(x, y)]² and f_2(x, y) = (x + 3)² + (y + 1)², where A_1 = 0.5 sin 1 − 2 cos 1 + sin 2 − 1.5 cos 2, A_2 = 1.5 sin 1 − cos 1 + 2 sin 2 − 0.5 cos 2, B_1(x, y) = 0.5 sin x − 2 cos x + sin y − 1.5 cos y and B_2(x, y) = 1.5 sin x − cos x + 2 sin y − 0.5 cos y. Search domain: −π ≤ x, y ≤ π.

Zitzler–Deb–Thiele's function N. 1: [22] Minimize f_1(x) = x_1 and f_2(x) = g(x) h(f_1(x), g(x)), where g(x) = 1 + (9/(n − 1)) Σ_{i=2}^{n} x_i and h(f_1, g) = 1 − √(f_1/g). Search domain: 0 ≤ x_i ≤ 1, 1 ≤ i ≤ n, n = 30.

Zitzler–Deb–Thiele's function N. 2: [22] As N. 1, but with h(f_1, g) = 1 − (f_1/g)². Search domain: 0 ≤ x_i ≤ 1, n = 30.

Zitzler–Deb–Thiele's function N. 3: [22] As N. 1, but with h(f_1, g) = 1 − √(f_1/g) − (f_1/g) sin(10π f_1). Search domain: 0 ≤ x_i ≤ 1, n = 30.

Zitzler–Deb–Thiele's function N. 4: [22] Minimize f_1(x) = x_1 and f_2(x) = g(x) [1 − √(f_1/g)], where g(x) = 1 + 10(n − 1) + Σ_{i=2}^{n} [x_i² − 10 cos(4π x_i)]. Search domain: 0 ≤ x_1 ≤ 1, −5 ≤ x_i ≤ 5 for 2 ≤ i ≤ n, n = 10.

Zitzler–Deb–Thiele's function N. 6: [22] Minimize f_1(x) = 1 − exp(−4 x_1) sin⁶(6π x_1) and f_2(x) = g(x) [1 − (f_1/g)²], where g(x) = 1 + 9 [(Σ_{i=2}^{n} x_i)/9]^0.25. Search domain: 0 ≤ x_i ≤ 1, n = 10.

Osyczka and Kundu function: [23] Minimize f_1(x) = −25 (x_1 − 2)² − (x_2 − 2)² − (x_3 − 1)² − (x_4 − 4)² − (x_5 − 1)² and f_2(x) = Σ_{i=1}^{6} x_i², subjected to: x_1 + x_2 − 2 ≥ 0, 6 − x_1 − x_2 ≥ 0, 2 − x_2 + x_1 ≥ 0, 2 − x_1 + 3x_2 ≥ 0, 4 − (x_3 − 3)² − x_4 ≥ 0 and (x_5 − 3)² + x_6 − 4 ≥ 0. Search domain: 0 ≤ x_1, x_2, x_6 ≤ 10, 1 ≤ x_3, x_5 ≤ 5, 0 ≤ x_4 ≤ 6.

CTP1 function (2 variables): [4] [24] Minimize f_1(x, y) = x and f_2(x, y) = (1 + y) exp[−x/(1 + y)], subjected to: f_2/(0.858 exp(−0.541 f_1)) ≥ 1 and f_2/(0.728 exp(−0.295 f_1)) ≥ 1. Search domain: 0 ≤ x, y ≤ 1.

Constr-Ex problem: [4] Minimize f_1(x, y) = x and f_2(x, y) = (1 + y)/x, subjected to: y + 9x ≥ 6 and −y + 9x ≥ 1. Search domain: 0.1 ≤ x ≤ 1, 0 ≤ y ≤ 5.

Viennet function: Minimize f_1(x, y) = 0.5 (x² + y²) + sin(x² + y²), f_2(x, y) = (3x − 2y + 4)²/8 + (x − y + 1)²/27 + 15 and f_3(x, y) = 1/(x² + y² + 1) − 1.1 exp[−(x² + y²)]. Search domain: −3 ≤ x, y ≤ 3.
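
In the multi-objective setting, a benchmark is evaluated by comparing the non-dominated set an algorithm finds against the known Pareto front. The sketch below computes a coarse non-dominated set for the Binh and Korn function (a minimal illustration; the helper names and the grid resolution are mine, not from the sources above):

```python
def binh_korn(x, y):
    # Objectives from the Binh and Korn entry above (both minimized).
    f1 = 4 * x ** 2 + 4 * y ** 2
    f2 = (x - 5) ** 2 + (y - 5) ** 2
    return (f1, f2)

def feasible(x, y):
    # Constraints: (x-5)^2 + y^2 <= 25 and (x-8)^2 + (y+3)^2 >= 7.7.
    return (x - 5) ** 2 + y ** 2 <= 25 and (x - 8) ** 2 + (y + 3) ** 2 >= 7.7

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in one.
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def pareto_front(points):
    # Keep only the non-dominated objective vectors (brute force, O(n^2)).
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Sample the search domain 0 <= x <= 5, 0 <= y <= 3 on a coarse grid.
pts = [binh_korn(x * 0.5, y * 0.5) for x in range(11) for y in range(7)
       if feasible(x * 0.5, y * 0.5)]
front = pareto_front(pts)
print(len(front))
```

The same `dominates` predicate underlies the non-dominated sorting used by NSGA-II; production implementations replace the brute-force filter with faster sorting schemes.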


References

  1. Bäck, Thomas (1995). Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms. Oxford: Oxford University Press. p. 328. ISBN   978-0-19-509971-3.
  2. Haupt, Randy L.; Haupt, Sue Ellen (2004). Practical Genetic Algorithms with CD-ROM (2nd ed.). New York: J. Wiley. ISBN 978-0-471-45565-3.
  3. Oldenhuis, Rody. "Many test functions for global optimizers". Mathworks. Retrieved 1 November 2012.
  4. Deb, Kalyanmoy (2002). Multiobjective Optimization Using Evolutionary Algorithms (Repr. ed.). Chichester [u.a.]: Wiley. ISBN 0-471-87339-X.
  5. Binh, T.; Korn, U. (1997). "MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems". In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic. pp. 176–182.
  6. Binh, T. (1999). A multiobjective evolutionary algorithm. The study cases. Technical report. Institute for Automation and Communication, Barleben, Germany.
  7. Deb K. (2011) Software for multi-objective NSGA-II code in C. Available at URL: https://www.iitk.ac.in/kangal/codes.shtml
  8. Ortiz, Gilberto A. "Multi-objective optimization using ES as Evolutionary Algorithm". Mathworks. Retrieved 1 November 2012.
  9. Whitley, Darrell; Rana, Soraya; Dzubera, John; Mathias, Keith E. (1996). "Evaluating evolutionary algorithms". Artificial Intelligence. Elsevier BV. 85 (1–2): 264. doi: 10.1016/0004-3702(95)00124-7 . ISSN   0004-3702.
  10. Vanaret C. (2015) Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems. PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.
  11. Simionescu, P.A.; Beale, D. (September 29 – October 2, 2002). New Concepts in Graphic Visualization of Objective Functions (PDF). ASME 2002 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. Montreal, Canada. pp. 891–897. Retrieved 7 January 2017.
  12. "Solve a Constrained Nonlinear Problem - MATLAB & Simulink". www.mathworks.com. Retrieved 2017-08-29.
  13. "Bird Problem (Constrained) | Phoenix Integration". Archived from the original on 2016-12-29. Retrieved 2017-08-29.
  14. Mishra, Sudhanshu (2006). "Some new test functions for global optimization and performance of repulsive particle swarm method". MPRA Paper.
  15. Townsend, Alex (January 2014). "Constrained optimization in Chebfun". chebfun.org. Retrieved 2017-08-29.
  16. Simionescu, P.A. (2020). "A collection of bivariate nonlinear optimisation test problems with graphical representations". International Journal of Mathematical Modelling and Numerical Optimisation. 10 (4): 365–398. doi:10.1504/IJMMNO.2020.110704.
  17. Simionescu, P.A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD Users (1st ed.). Boca Raton, FL: CRC Press. ISBN   978-1-4822-5290-3.
  18. Chankong, Vira; Haimes, Yacov Y. (1983). Multiobjective decision making. Theory and methodology. North Holland. ISBN   0-444-00710-5.
  19. Fonseca, C. M.; Fleming, P. J. (1995). "An Overview of Evolutionary Algorithms in Multiobjective Optimization". Evol Comput . 3 (1): 1–16. CiteSeerX   10.1.1.50.7779 . doi:10.1162/evco.1995.3.1.1. S2CID   8530790.
  20. F. Kursawe, “A variant of evolution strategies for vector optimization,” in PPSN I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp. 193–197.
  21. Schaffer, J. David (1984). "Multiple Objective Optimization with Vector Evaluated Genetic Algorithms". In G.J.E Grefensette; J.J. Lawrence Erlbraum (eds.). Proceedings of the First International Conference on Genetic Algorithms. OCLC   20004572.
  22. Deb, Kalyan; Thiele, L.; Laumanns, Marco; Zitzler, Eckart (2002). "Scalable multi-objective optimization test problems". Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600). Vol. 1. pp. 825–830. doi:10.1109/CEC.2002.1007032. ISBN 0-7803-7282-4. S2CID 61001583.
  23. Osyczka, A.; Kundu, S. (1 October 1995). "A new method to solve generalized multicriteria optimization problems using the simple genetic algorithm". Structural Optimization. 10 (2): 94–99. doi:10.1007/BF01743536. ISSN   1615-1488. S2CID   123433499.
  24. Jimenez, F.; Gomez-Skarmeta, A. F.; Sanchez, G.; Deb, K. (May 2002). "An evolutionary algorithm for constrained multi-objective optimization". Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600). Vol. 2. pp. 1133–1138. doi:10.1109/CEC.2002.1004402. ISBN   0-7803-7282-4. S2CID   56563996.