Kalyanmoy Deb

Born: Tripura, India

Academic background
Alma mater: IIT Kharagpur, University of Alabama
Thesis: Binary and Floating-Point Function Optimization using Messy Genetic Algorithms (1991)
Doctoral advisor: David E. Goldberg

Notes

  1. Previous nondominated-sorting genetic algorithms had been introduced by Carlos M. Fonseca and Peter J. Fleming (Genetic algorithms for multiobjective optimization: formulation, discussion and generalization, 1993) and Jeffrey Horn (Northern Michigan University), Nicholas Nafpliotis, and David E. Goldberg (A niched Pareto genetic algorithm for multiobjective optimization, 1994).
  2. Faster compared to Srinivas and Deb's implementation in NSGA (1994).
  3. See also the section Elitist selection in the Selection (genetic algorithm) page.
  4. Incorporating the faster implementation of nondominated sorting and elitist selection made the algorithm faster. Incorporating crowding distance and elitist selection made the algorithm more reliable (a small sketch of the crowding-distance computation appears after this list).
  5. Many-objective optimization is a subfield of multiobjective optimization focusing on problems that have a large number of objectives (four or more).
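The crowding distance mentioned in note 4 can be computed for one nondominated front by sorting the front along each objective and summing the normalized gaps between each solution's neighbours. The following Python sketch is illustrative only; the function name and data layout are assumptions, not Deb's reference implementation.

```python
# Minimal crowding-distance sketch for one nondominated front.
# `front` is a list of objective vectors, all objectives to be minimized.
def crowding_distance(front):
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])              # number of objectives
    distance = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        f_min = front[order[0]][obj]
        f_max = front[order[-1]][obj]
        # Boundary solutions are always kept: assign them infinite distance.
        distance[order[0]] = distance[order[-1]] = float("inf")
        if f_max == f_min:
            continue
        for k in range(1, n - 1):
            i = order[k]
            distance[i] += (front[order[k + 1]][obj] - front[order[k - 1]][obj]) / (f_max - f_min)
    return distance

# Example: three solutions of a bi-objective minimization problem.
print(crowding_distance([(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]))
```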

Related Research Articles

Genetic algorithm: Competitive algorithm for searching a problem space

In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover and selection. Some examples of GA applications include optimizing decision trees for better performance, solving Sudoku puzzles, and hyperparameter optimization.
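As a rough illustration of the operators named above (selection, crossover, mutation), the following Python sketch evolves bit strings toward the classic OneMax objective. All operator choices and parameter values here are generic textbook defaults, not taken from any algorithm described in this article.

```python
import random

def onemax(bits):
    # Toy fitness function: count of 1s in the bit string (to be maximized).
    return sum(bits)

def tournament(pop, k=2):
    # Tournament selection: pick k individuals at random, keep the fittest.
    return max(random.sample(pop, k), key=onemax)

def crossover(a, b):
    # One-point crossover between two parents.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    # Bit-flip mutation with a small per-bit probability.
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(50)] for _ in range(30)]
for _ in range(100):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in pop]

best = max(pop, key=onemax)
print("best fitness:", onemax(best))
```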

Evolutionary algorithm: Subset of evolutionary computation

In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.

Evolutionary computation: Trial and error problem solvers with a metaheuristic or stochastic optimization character

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.

Particle swarm optimization: Iterative simulation method

In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
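The velocity-and-position update described above can be written in a few lines. The sketch below minimizes a simple sphere function; the inertia and acceleration coefficients (w, c1, c2) are common textbook values and are assumptions, not parameters of any specific PSO variant.

```python
import random

def sphere(x):
    # Simple quality measure to minimize: sum of squares, optimum at the origin.
    return sum(v * v for v in x)

dim, n_particles, iters = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social coefficients

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]              # each particle's best known position
gbest = min(pbest, key=sphere)[:]        # best known position of the swarm

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            # Velocity update: pulled toward personal best and global best.
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=sphere)[:]

print(gbest, sphere(gbest))
```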

A fitness function is a particular type of objective function that is used to summarise, as a single figure of merit, how close a given design solution is to achieving the set aims. Fitness functions are used in evolutionary algorithms (EA), such as genetic programming and genetic algorithms to guide simulations towards optimal design solutions.

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, tune, or select a heuristic that may provide a sufficiently good solution to an optimization problem or a machine learning problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a subset of a solution space that would otherwise be too large to be completely enumerated or explored. Metaheuristics may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems.

Multiple-criteria decision analysis: Operations research that evaluates multiple conflicting criteria in decision making

Multiple-criteria decision-making (MCDM) or multiple-criteria decision analysis (MCDA) is a sub-discipline of operations research that explicitly evaluates multiple conflicting criteria in decision making. Conflicting criteria are typical in evaluating options: cost or price is usually one of the main criteria, and some measure of quality is typically another criterion, easily in conflict with the cost. In purchasing a car, cost, comfort, safety, and fuel economy may be some of the main criteria we consider – it is unusual that the cheapest car is the most comfortable and the safest one. In portfolio management, managers are interested in getting high returns while simultaneously reducing risks; however, the stocks that have the potential of bringing high returns typically carry high risk of losing money. In a service industry, customer satisfaction and the cost of providing service are fundamental conflicting criteria.

Selection is the stage of a genetic algorithm or more general evolutionary algorithm in which individual genomes are chosen from a population for later breeding.
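One of the simplest such schemes is fitness-proportionate (roulette-wheel) selection, sketched below in Python; the helper name and the example fitness values are purely illustrative.

```python
import random

def roulette_select(population, fitness):
    # Each individual is selected with probability proportional to its fitness.
    total = sum(fitness)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, f in zip(population, fitness):
        running += f
        if running >= pick:
            return individual
    return population[-1]   # guard against floating-point rounding

# "c" has fitness 6 of a total of 10, so it is picked ~60% of the time.
parents = [roulette_select(["a", "b", "c"], [1.0, 3.0, 6.0]) for _ in range(5)]
print(parents)
```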

In computer science and operations research, genetic fuzzy systems are fuzzy systems constructed using genetic algorithms or genetic programming, which mimic the process of natural evolution, to identify their structure and parameters.

Pareto front: Set of all Pareto efficient situations

In multi-objective optimization, the Pareto front is the set of all Pareto efficient solutions. The concept is widely used in engineering. It allows the designer to restrict attention to the set of efficient choices, and to make tradeoffs within this set, rather than considering the full range of every parameter.
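For a finite set of candidate solutions, the Pareto front can be extracted with a pairwise dominance check, as in the following Python sketch (minimization assumed; the helper names are illustrative).

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep the points that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

points = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(pareto_front(points))   # (1, 5), (2, 3) and (4, 1) are nondominated
```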

David Edward Goldberg is an American computer scientist, civil engineer, and former professor. Until 2010, he was a professor in the department of Industrial and Enterprise Systems Engineering (IESE) at the University of Illinois at Urbana-Champaign and was noted for his work in the field of genetic algorithms. He was the director of the Illinois Genetic Algorithms Laboratory and the co-founder & chief scientist of Nextumi, which later changed its name to ShareThis. He is the author of Genetic Algorithms in Search, Optimization and Machine Learning, one of the most cited books in computer science.

Multi-objective optimization or Pareto optimization is an area of multiple-criteria decision making that is concerned with mathematical optimization problems involving more than one objective function to be optimized simultaneously. Multi-objective optimization is a type of vector optimization that has been applied in many fields of science, including engineering, economics and logistics, where optimal decisions need to be taken in the presence of trade-offs between two or more conflicting objectives. Minimizing cost while maximizing comfort when buying a car, and maximizing performance while minimizing fuel consumption and pollutant emissions of a vehicle, are examples of multi-objective optimization problems involving two and three objectives, respectively. In practical problems, there can be more than three objectives.
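A classical single-variable illustration is Schaffer's bi-objective test problem, sketched below in Python: minimizing x² pulls the decision toward 0 while minimizing (x − 2)² pulls it toward 2, so every x in [0, 2] is Pareto-optimal. The problem itself is a standard benchmark; the code framing around it is an illustrative assumption.

```python
def objectives(x):
    # Two conflicting objectives, both to be minimized.
    return (x ** 2, (x - 2) ** 2)

# Along [0, 2], reducing one objective necessarily increases the other.
for x in (0.0, 0.5, 1.0, 1.5, 2.0):
    print(x, objectives(x))
```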

ParadisEO is a white-box object-oriented framework dedicated to the flexible design of metaheuristics. It uses EO, a template-based, ANSI-C++ compliant computation library. ParadisEO is portable across both Windows systems and sequential platforms. ParadisEO is distributed under the CeCILL license and can be used in several environments.

In applied mathematics, multimodal optimization deals with optimization tasks that involve finding all or most of the multiple solutions of a problem, as opposed to a single best solution. Evolutionary multimodal optimization is a branch of evolutionary computation that is closely related to machine learning. Wong provides a short survey, while the chapter by Shir and the book by Preuss cover the topic in more detail.

Reward-based selection is a technique used in evolutionary algorithms for selecting potentially useful solutions for recombination. The probability of an individual being selected is proportional to the cumulative reward obtained by that individual. The cumulative reward can be computed as the sum of the individual reward and the reward inherited from its parents.

MCACEA is a general framework that uses a separate evolutionary algorithm (EA) per agent, with the agents sharing their optimal solutions in order to coordinate the evolution of the EAs' populations through cooperation objectives. This framework can be used to optimize some characteristics of multiple cooperating agents in mathematical optimization problems. More specifically, because both individual and cooperation objectives are optimized, MCACEA is used in multi-objective optimization problems.

Peter John Fleming is a Professor of Industrial Systems and Control in the Department of Automatic Control and Systems Engineering at the University of Sheffield, and until June 2012 he was the director of the Rolls-Royce University Technology Centre for Control and Systems Engineering. He works in the field of control and systems engineering and is known for his work on evolutionary computation applied to systems engineering. Fleming is Editor-in-Chief of the International Journal of Systems Science.

In applied mathematics, test functions, known as artificial landscapes, are useful for evaluating characteristics of optimization algorithms such as convergence rate, precision, robustness, and general performance.
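A common example of such an artificial landscape is the Rastrigin function, shown below as a Python sketch; it is highly multimodal with a known global minimum of 0 at the origin, which makes an optimizer's convergence and robustness easy to measure. The example is a generic illustration, not a function taken from this article's sources.

```python
import math

def rastrigin(x):
    # Rastrigin function: many local minima, global minimum 0 at the origin.
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

print(rastrigin([0.0, 0.0]))   # 0.0 at the global optimum
print(rastrigin([1.0, 2.0]))   # larger value away from it
```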

The MOEA Framework is an open-source evolutionary computation library for Java that specializes in multi-objective optimization. It supports a variety of multiobjective evolutionary algorithms (MOEAs), including genetic algorithms, genetic programming, grammatical evolution, differential evolution, and particle swarm optimization. As a result, it has been used to conduct numerous comparative studies to assess the efficiency, reliability, and controllability of state-of-the-art MOEAs.

The Interactive Decision Maps technique of multi-objective optimization is based on approximating the Edgeworth-Pareto Hull (EPH) of the feasible objective set, that is, the feasible objective set broadened by the objective points dominated by it. Alternatively, this set is known as the Free Disposal Hull. It is important that the EPH has the same Pareto front as the feasible objective set, but the bi-objective slices of the EPH look much simpler. The frontiers of bi-objective slices of the EPH contain the slices of the Pareto front. It is also important that, in contrast to the Pareto front itself, the EPH is usually stable with respect to disturbances of the data. The IDM technique relies on fast on-line display of bi-objective slices of an EPH approximated in advance.

References

  1. "Kalyanmoy Deb". Honored Faculty - Michigan State University. Retrieved 17 January 2022.
  2. "Kalyanmoy Deb named Koenig endowed chair at Michigan State University". Michigan State University, College of Engineering. Michigan State University. 21 August 2013. Retrieved 17 January 2022.
  3. "Kanpur Genetic Algorithms Laboratory". Kanpur Genetic Algorithms Laboratory. Indian Institute of Technology, Kanpur. 2005. Archived from the original on 30 September 2014. Retrieved 13 August 2015.
  4. "Computational Optimization and Innovation Laboratory (COIN Lab)". Michigan State University, College of Engineering. Archived from the original on 9 March 2015. Retrieved 11 August 2015.
  5. Smith, Alice E. (October 2002). "Book Reviews: Multi-Objective Optimization Using Evolutionary Algorithms" (PDF). IEEE Transactions on Evolutionary Computation. IEEE. 6 (5): 526. doi:10.1109/TEVC.2002.804322. ISSN 1089-778X. S2CID 2867089. Archived (PDF) from the original on 13 August 2015. Retrieved 13 August 2015.
  6. Cotta, Carlos; Merelo, Juan-Julián (3 December 2013). "The Complex Network of Evolutionary Computation Authors: an Initial Study". arXiv:physics/0507196.
  7. "Infosys Prize - Laureates 2011 - Prof. Kalyanmoy Deb". Infosys Science Foundation. Retrieved 17 January 2022.
  8. "Prizes and Awards". The World Academy of Sciences. 2016.
  9. "Kalyanmoy Deb's Resume" (PDF). Kalyanmoy Deb's webpage. Retrieved 17 January 2022.
  10. Goldberg, David E. "Curriculum Vitae" (PDF). ThreeJoy. Retrieved 17 January 2022.
  11. Deb, Kalyanmoy (1991). Binary and Floating-Point Function Optimization using Messy Genetic Algorithms. ProQuest 303943729.
  12. Goldberg, David E. (1989). Genetic Algorithms in Search, Optimization & Machine Learning. Addison-Wesley. p. 201. ISBN 0-201-15767-5.
  13. Srinivas, N.; Deb, Kalyanmoy (1994). "Multiobjective optimization using nondominated sorting in genetic algorithms". Evolutionary Computation. 2 (3): 221–248. doi:10.1162/evco.1994.2.3.221. S2CID 13997318.
  14. Seshadri, Aravind. "NSGA - II: A multi-objective optimization algorithm". MathWorks File Exchange. Retrieved 17 January 2022.
  15. Kumar, Nitin; Panwar, Yatish; Mahesh, G. (10 May 2015). "Indian paper crosses 5000+ citations mark" (PDF). Current Science. Current Science Association. 108 (9): 1580. ISSN 0011-3891. Archived (PDF) from the original on 11 August 2015. Retrieved 11 August 2015.
  16. Mudur, G.S. (11 May 2015). "6000-citation feat by 4 Indian researchers". The Telegraph. Calcutta, India. Archived from the original on 22 July 2015. Retrieved 11 August 2015.
  17. Deb, Kalyanmoy; Jain, Himanshu (2013). "An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: solving problems with box constraints". IEEE Transactions on Evolutionary Computation. 18 (4): 577–601. doi:10.1109/TEVC.2013.2281535. S2CID 206682597.
  18. "Global computing association names 57 fellows for outstanding contributions that propel technology today". Association for Computing Machinery. 18 January 2023. Retrieved 18 January 2023.