The IEEE Congress on Evolutionary Computation (IEEE CEC) is one of the largest conferences within evolutionary computation (EC), the other conferences of similar importance being Genetic and Evolutionary Computation Conference (GECCO), Parallel Problem Solving from Nature (PPSN) and EvoStar (which comprises EuroGP, EvoApplications, EvoCOP, and EvoMUSART).
IEEE CEC, which is organized by the IEEE Computational Intelligence Society, began in 1994 as the IEEE Symposium on Evolutionary Computation, held in Orlando, Florida. In 1995 the conference was renamed the IEEE International Conference on Evolutionary Computation (IEEE ICEC) and was held under this title through 1998. In 1999, the IEEE Computational Intelligence Society joined with the Evolutionary Programming Society, which had operated the annual Evolutionary Programming Conference (1992-1999), and the IET (IEE), which had operated the International Conference on Genetic Algorithms in Engineering Systems, Innovations and Applications (1995-1999), to co-sponsor the newly named IEEE Congress on Evolutionary Computation, the title it has carried to the present.[1] In even-numbered years it is held in conjunction with the IEEE World Congress on Computational Intelligence (IEEE WCCI), the flagship conference of the IEEE Computational Intelligence Society. The conference covers most subtopics of EC, such as evolutionary robotics, multiobjective optimization, evolvable hardware, theory of evolutionary computation, and evolutionary design. Papers can also be found that deal with topics related to, rather than part of, EC, such as ant colony optimization, swarm intelligence, and quantum computing.
The conference usually attracts several hundred attendees, as well as hundreds of papers.[2]
In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.
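A minimal sketch of this loop in Python, assuming bitstring individuals and the OneMax fitness (counting ones), both chosen here purely for illustration:

import random

def one_max(bits):
    # illustrative fitness function: the number of 1-bits in the genome
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=50, p_mut=0.05):
    # initial population of random bitstrings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # selection: keep the fitter half of the population as parents
        pop.sort(key=one_max, reverse=True)
        parents = pop[: pop_size // 2]
        # recombination and mutation produce the next generation
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)
            child = a[:cut] + b[cut:]                                    # one-point crossover
            child = [bit ^ (random.random() < p_mut) for bit in child]   # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=one_max)

print(one_max(evolve()))  # typically close to n_bits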
In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.
In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
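A compact Python sketch of the canonical velocity and position updates, minimizing the sphere function as an illustrative objective; the inertia weight w and acceleration coefficients c1, c2 below are typical textbook values, not values prescribed by any particular implementation:

import random

def sphere(x):
    # illustrative objective to minimize; global minimum at the origin
    return sum(v * v for v in x)

def pso(dim=5, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best known position
    gbest = min(pbest, key=sphere)           # swarm's best known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # pull toward own best
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # pull toward swarm best
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest

print(sphere(pso()))  # should be close to 0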
In computer programming, genetic representation is a way of representing solutions or individuals in evolutionary computation methods. The term encompasses both the concrete data structures and data types used to realize the genetic material of the candidate solutions in the form of a genome, and the relationships between the search space and the problem space. In the simplest case, the search space corresponds to the problem space. The choice of problem representation is tied to the choice of genetic operators, both of which have a decisive effect on the efficiency of the optimization. Genetic representation can encode appearance, behavior, and physical qualities of individuals. Differences in genetic representation are one of the major criteria drawing a line between the known classes of evolutionary computation.
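For illustration, the same kind of candidate solution can be encoded under different representations, each paired with matching operators; the two encodings below are generic textbook choices rather than any specific system:

import random

# Binary representation: the genome is a bitstring and mutation flips bits.
binary_genome = [1, 0, 1, 1, 0, 0, 1, 0]

def mutate_binary(genome, p=0.1):
    return [bit ^ (random.random() < p) for bit in genome]

# Real-valued representation: the genome is a vector of floats and
# mutation adds Gaussian noise instead of flipping bits.
real_genome = [0.42, -1.3, 2.7]

def mutate_real(genome, sigma=0.1):
    return [x + random.gauss(0.0, sigma) for x in genome]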
In computer science and operations research, a memetic algorithm (MA) is an extension of the traditional genetic algorithm (GA) or, more generally, of an evolutionary algorithm (EA). It may provide a sufficiently good solution to an optimization problem. It uses a suitable heuristic or local search technique to improve the quality of solutions generated by the EA and to reduce the likelihood of premature convergence.
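A minimal Python sketch of the idea, assuming a bitstring genome, the OneMax fitness, and a single pass of greedy bit-flip hill climbing as the local search step (all illustrative assumptions):

import random

def fitness(bits):
    # illustrative fitness: count of 1-bits
    return sum(bits)

def local_search(bits):
    # one pass of greedy hill climbing: flip each bit, keep the change only if it improves fitness
    bits = bits[:]
    for i in range(len(bits)):
        candidate = bits[:]
        candidate[i] ^= 1
        if fitness(candidate) > fitness(bits):
            bits = candidate
    return bits

def memetic(n_bits=20, pop_size=20, generations=30, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)
            child = [bit ^ (random.random() < p_mut) for bit in a[:cut] + b[cut:]]
            children.append(local_search(child))   # memetic refinement of each offspring
        pop = children
    return max(pop, key=fitness)

print(fitness(memetic()))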
David B. Fogel is a pioneer in evolutionary computation.
Search-based software engineering (SBSE) applies metaheuristic search techniques such as genetic algorithms, simulated annealing and tabu search to software engineering problems. Many activities in software engineering can be stated as optimization problems. Optimization techniques of operations research such as linear programming or dynamic programming are often impractical for large scale software engineering problems because of their computational complexity or their assumptions on the problem structure. Researchers and practitioners use metaheuristic search techniques, which impose few assumptions on the problem structure, to find near-optimal or "good-enough" solutions.
Evolutionary acquisition of neural topologies (EANT/EANT2) is an evolutionary reinforcement learning method that evolves both the topology and weights of artificial neural networks. It is closely related to the works of Angeline et al. and Stanley and Miikkulainen. Like the work of Angeline et al., the method uses a type of parametric mutation that comes from evolution strategies and evolutionary programming, in which adaptive step sizes are used for optimizing the weights of the neural networks. Similar to the work of Stanley (NEAT), the method starts with minimal structures which gain complexity along the evolution path.
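The ES-style self-adaptive mutation referred to above can be sketched generically: each weight carries its own step size, the step sizes are perturbed log-normally, and the weights are then perturbed using the new step sizes. The code below is a generic illustration of that principle from evolution strategies, not EANT's actual implementation:

import math, random

def self_adaptive_mutation(weights, sigmas):
    # each weight has its own step size sigma; step sizes mutate first,
    # then the weights are perturbed using the freshly mutated step sizes
    n = len(weights)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))
    tau_global = 1.0 / math.sqrt(2.0 * n)
    common = random.gauss(0, 1)
    new_sigmas = [s * math.exp(tau_global * common + tau * random.gauss(0, 1)) for s in sigmas]
    new_weights = [w + s * random.gauss(0, 1) for w, s in zip(weights, new_sigmas)]
    return new_weights, new_sigmas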
A hyper-heuristic is a heuristic search method that seeks to automate, often by the incorporation of machine learning techniques, the process of selecting, combining, generating or adapting several simpler heuristics to efficiently solve computational search problems. One of the motivations for studying hyper-heuristics is to build systems which can handle classes of problems rather than solving just one problem.
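A toy Python sketch of a selection hyper-heuristic: a pool of low-level move operators and a simple score-based rule for choosing among them; the operators, the objective and the scoring rule are all illustrative assumptions:

import random

def objective(x):
    # illustrative problem: minimize the sphere function
    return sum(v * v for v in x)

# low-level heuristics: simple moves over a real-valued vector
def small_step(x):
    return [v + random.gauss(0, 0.1) for v in x]

def large_step(x):
    return [v + random.gauss(0, 1.0) for v in x]

def reset_one(x):
    y = x[:]
    y[random.randrange(len(y))] = random.uniform(-5, 5)
    return y

heuristics = [small_step, large_step, reset_one]
scores = [1.0] * len(heuristics)            # the hyper-heuristic learns which operator helps

x = [random.uniform(-5, 5) for _ in range(5)]
for _ in range(500):
    i = random.choices(range(len(heuristics)), weights=scores)[0]  # pick an operator by score
    y = heuristics[i](x)
    if objective(y) < objective(x):          # accept improvements and reward the operator
        x = y
        scores[i] += 1.0

print(objective(x))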
In numerical optimization, meta-optimization is the use of one optimization method to tune another optimization method. Meta-optimization is reported to have been used as early as the late 1970s by Mercer and Sampson for finding optimal parameter settings of a genetic algorithm.
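A minimal sketch of the idea: an outer optimizer (plain random search here, chosen only for brevity) tunes a parameter of an inner optimizer, judging each setting by the quality of the solutions the inner method produces. Everything below is an illustrative assumption, including the use of a hill climber rather than a genetic algorithm as the inner method:

import random

def sphere(x):
    return sum(v * v for v in x)

def inner_optimizer(step_size, iters=200):
    # inner method: a simple hill climber whose behaviour depends on the tuned step_size
    x = [random.uniform(-5, 5) for _ in range(5)]
    for _ in range(iters):
        y = [v + random.gauss(0, step_size) for v in x]
        if sphere(y) < sphere(x):
            x = y
    return sphere(x)

# outer method (the meta-optimizer): random search over the inner method's parameter
best_step, best_score = None, float("inf")
for _ in range(30):
    step = random.uniform(0.01, 2.0)
    score = sum(inner_optimizer(step) for _ in range(5)) / 5   # average over repeated runs
    if score < best_score:
        best_step, best_score = step, score

print("tuned step size:", best_step)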
Enrique Alba is a professor of computer science at the University of Málaga, Spain.
In applied mathematics, test functions, known as artificial landscapes, are useful to evaluate characteristics of optimization algorithms, such as convergence rate, precision, robustness, and general performance.
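One widely used artificial landscape is the Rastrigin function, a highly multimodal benchmark with a single global minimum at the origin; a direct Python implementation:

import math

def rastrigin(x):
    # Rastrigin function: f(0, ..., 0) = 0 is the global minimum, surrounded by many local minima
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

print(rastrigin([0.0, 0.0]))   # 0.0
print(rastrigin([1.0, 2.0]))   # 5.0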
The Genetic and Evolutionary Computation Conference (GECCO) is the premier conference in the area of genetic and evolutionary computation. GECCO has been held every year since 1999, when it was first established as a recombination of the International Conference on Genetic Algorithms (ICGA) and the Annual Genetic Programming Conference (GP).
Yuhui Shi is a pioneer in particle swarm optimization algorithms and the developer of brain storm optimization algorithms. He was an electrical engineer at Xi'an Jiaotong-Liverpool University in Suzhou, China, and was named a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2016 for his contributions to particle swarm optimization algorithms. He earned his PhD in electrical engineering from Southeast University, Nanjing, China, in 1992, and completed postdoctoral training at Concordia University under a Canadian International Development Agency joint doctoral program initiated by Prof. Jeremiah F. Hayes et al. He organized the first IEEE Symposium on Swarm Intelligence in 2003 and established the IEEE CIS Task Force on Swarm Intelligence in 2002, around the time he co-authored a book with James Kennedy and Russell C. Eberhart. He is a Chair Professor in the Department of Computer Science and Engineering at the Southern University of Science and Technology (SUSTech), Shenzhen, China, where he invited Prof. Jun (Steed) Huang, from the Joint Institutes of Carleton University and the University of Ottawa, to collaborate on swarm intelligence robotics.
Professor Emma Hart, FRSE, is an English computer scientist known for her work in artificial immune systems (AIS), evolutionary computation and optimisation. She is a professor of computational intelligence at Edinburgh Napier University, editor-in-chief of the journal Evolutionary Computation, and coordinator of the Future & Emerging Technologies (FET) Proactive Initiative, Fundamentals of Collective Adaptive Systems.
EvoStar, or Evo*, is an international scientific event devoted to evolutionary computation held in Europe. Its structure has evolved over time and it currently comprises four conferences: EuroGP, the annual conference on Genetic Programming; EvoApplications, the International Conference on the Applications of Evolutionary Computation; EvoCOP, the European Conference on Evolutionary Computation in Combinatorial Optimisation; and EvoMUSART, the International Conference on Computational Intelligence in Music, Sound, Art and Design. According to a 2016 study, EvoApplications is a Q1 conference, while EuroGP and EvoCOP are both Q2. In 2021, EuroGP, EvoApplications and EvoCOP obtained a CORE rank of B.
Multi-task optimization is a paradigm in the optimization literature that focuses on solving multiple self-contained tasks simultaneously. The paradigm has been inspired by the well-established concepts of transfer learning and multi-task learning in predictive analytics.
This is a chronological table of metaheuristic algorithms that contains only fundamental computational intelligence algorithms. Hybrid algorithms and multi-objective algorithms are not listed in the table below.