Evolutionary computation

[Figure] Evolution of a population of random patches, using a fitness based on the error difference with an image of Charles Darwin.

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing that studies these algorithms. In technical terms, they are a family of population-based trial-and-error problem solvers with a metaheuristic or stochastic optimization character.


In evolutionary computation, an initial set of candidate solutions is generated and iteratively updated. Each new generation is produced by stochastically removing less desired solutions and introducing small random changes as well as, depending on the method, mixing parental information. In biological terminology, a population of solutions is subjected to natural selection (or artificial selection), mutation and possibly recombination. As a result, the population gradually evolves to increase in fitness, here measured by the algorithm's chosen fitness function.
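
The generational loop just described can be made concrete with a minimal sketch in Python; the toy objective, the tournament size, the mutation scale and the population size below are illustrative assumptions rather than part of any particular method.

```python
import random

# Toy objective (an assumption for illustration): maximize f(x) = -(x - 3)^2.
def fitness(x):
    return -(x - 3.0) ** 2

POP_SIZE, GENERATIONS, MUTATION_SCALE = 30, 100, 0.1

def tournament(pop, k=3):
    """Pick the fittest of k randomly chosen individuals (stochastic selection)."""
    return max(random.sample(pop, k), key=fitness)

# Initial set of candidate solutions, generated at random.
population = [random.uniform(-10.0, 10.0) for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    new_population = []
    for _ in range(POP_SIZE):
        # Mix parental information (recombination) ...
        a, b = tournament(population), tournament(population)
        child = (a + b) / 2.0
        # ... and introduce a small random change (mutation).
        child += random.gauss(0.0, MUTATION_SCALE)
        new_population.append(child)
    population = new_population  # the next generation replaces the old one

best = max(population, key=fitness)
print(f"best candidate = {best:.3f}, fitness = {fitness(best):.4f}")
```

Here selection is stochastic (tournament selection), recombination is a simple average of two parents, and mutation is a small Gaussian perturbation; real methods vary all three choices.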

Evolutionary computation techniques can produce highly optimized solutions in a wide range of problem settings, making them popular in computer science. Many variants and extensions exist, suited to more specific families of problems and data structures. Evolutionary computation is also sometimes used in evolutionary biology as an in silico experimental procedure to study common aspects of general evolutionary processes.

History

The concept of mimicking evolutionary processes to solve problems originates before the advent of computers, such as when Alan Turing proposed a method of genetic search in 1948. [1] Turing's B-type u-machines resemble primitive neural networks, with connections between neurons learnt via a sort of genetic algorithm. His P-type u-machines resemble a method for reinforcement learning, where pleasure and pain signals direct the machine to learn certain behaviors. However, Turing's paper went unpublished until 1968, and he died in 1954, so this early work had little to no effect on the field of evolutionary computation that was to develop. [2]

Evolutionary computing as a field began in earnest in the 1950s and 1960s. [1] There were several independent attempts to use the process of evolution in computing at this time, which developed separately for roughly 15 years. Three branches emerged in different places to attain this goal: evolution strategies, evolutionary programming, and genetic algorithms. A fourth branch, genetic programming, eventually emerged in the early 1990s. These approaches differ in the method of selection, the permitted mutations, and the representation of genetic data. By the 1990s, the distinctions between the historic branches had begun to blur, and the term 'evolutionary computing' was coined in 1991 to denote a field spanning all four paradigms. [3]

In 1962, Lawrence J. Fogel initiated research on evolutionary programming in the United States, where it was considered an artificial intelligence endeavor. In this approach, finite state machines were used to solve prediction problems: the machines were mutated (by adding or deleting states, or changing the state transition rules), and the best of the mutated machines were evolved further in future generations. The final finite state machine could then be used to generate predictions when needed. The evolutionary programming method was successfully applied to prediction problems, system identification, and automatic control. It was eventually extended to handle time series data and to model the evolution of gaming strategies. [3]

In 1964, Ingo Rechenberg and Hans-Paul Schwefel introduced the paradigm of evolution strategies in Germany. [3] Since traditional gradient descent techniques can get stuck in local minima, Rechenberg and Schwefel proposed that random mutations (applied to all parameters of some solution vector) may be used to escape these minima. Child solutions were generated from parent solutions, and the more successful of the two was kept for future generations. The two first used this technique to successfully solve optimization problems in fluid dynamics. [4] Initially, the optimization was performed without computers, relying instead on dice to determine random mutations. By 1965, the calculations were performed wholly by machine. [3]
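
The parent-versus-child scheme described here corresponds to what is now called a (1+1) evolution strategy. A minimal sketch, assuming a simple quadratic cost function as a stand-in for the original engineering problems:

```python
import random

# Stand-in objective (an assumption): minimize the sphere function sum(x_i^2).
def cost(x):
    return sum(v * v for v in x)

DIMENSIONS, ITERATIONS, STEP_SIZE = 5, 2000, 0.1

parent = [random.uniform(-5.0, 5.0) for _ in range(DIMENSIONS)]

for _ in range(ITERATIONS):
    # Apply a random mutation to every parameter of the solution vector.
    child = [v + random.gauss(0.0, STEP_SIZE) for v in parent]
    # Keep whichever of parent and child is more successful.
    if cost(child) <= cost(parent):
        parent = child

print("final cost:", cost(parent))
```

In modern evolution strategies the mutation step size is itself adapted during the run; the fixed STEP_SIZE here is a simplification.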

John Henry Holland introduced genetic algorithms in the 1960s, and the approach was further developed at the University of Michigan in the 1970s. [5] While the other approaches were focused on solving problems, Holland primarily aimed to use genetic algorithms to study adaptation and determine how it may be simulated. Populations of chromosomes, represented as bit strings, were transformed by an artificial selection process that selected for specific 'allele' bits in the bit string. Among other variation methods, interactions between chromosomes were used to simulate the recombination of DNA between different organisms. While previous methods only tracked a single optimal organism at a time (having children compete with parents), Holland's genetic algorithms tracked large populations (having many organisms compete each generation).
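
A small bit-string genetic algorithm in this spirit might look as follows; the "count the 1-bits" fitness (the OneMax problem), the mutation rate and the population size are illustrative assumptions, not Holland's original experiments.

```python
import random

# Assumed toy fitness: count of 1-bits, standing in for selection on 'allele' bits.
def fitness(chromosome):
    return sum(chromosome)

POP_SIZE, LENGTH, GENERATIONS, MUTATION_RATE = 40, 20, 60, 0.01

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    next_gen = []
    for _ in range(POP_SIZE):
        # Artificial selection: fitter bit strings are more likely to become parents.
        a, b = random.choices(population,
                              weights=[fitness(c) + 1 for c in population], k=2)
        # Recombination: one-point crossover mixes the two parent chromosomes.
        point = random.randrange(1, LENGTH)
        child = a[:point] + b[point:]
        # Mutation: occasionally flip individual bits.
        child = [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in child]
        next_gen.append(child)
    population = next_gen

print("best:", max(fitness(c) for c in population), "of", LENGTH)
```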

By the 1990s, a new approach to evolutionary computation that came to be called genetic programming emerged, advocated for by John Koza among others. [3] In this class of algorithms, the subject of evolution was itself a program written in a high-level programming language (there had been some previous attempts as early as 1958 to use machine code, but they met with little success). For Koza, the programs were Lisp S-expressions, which can be thought of as trees of sub-expressions. This representation permits programs to swap subtrees, representing a sort of genetic mixing. Programs are scored based on how well they complete a certain task, and the score is used for artificial selection. Sequence induction, pattern recognition, and planning were all successful applications of the genetic programming paradigm.
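
Koza's S-expression trees can be imitated with nested Python tuples; the sketch below swaps a randomly chosen subtree between two parent programs, which is the kind of genetic mixing described above. The particular expressions and helper functions are illustrative assumptions.

```python
import random

# Two parent programs as nested tuples standing in for Lisp S-expressions
# (operator, operand, operand); the expressions themselves are arbitrary examples.
parent_a = ('+', ('*', 'x', 'x'), ('-', 'x', 1))
parent_b = ('*', ('+', 'x', 2), ('-', 3, 'x'))

def subtrees(tree, path=()):
    """Yield (path, subtree) pairs for every node of an expression tree."""
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    """Return a copy of tree with the subtree at the given path replaced by new."""
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new),) + tree[i + 1:]

# Genetic mixing: pick a random subtree in each parent and swap them.
path_a, sub_a = random.choice(list(subtrees(parent_a)))
path_b, sub_b = random.choice(list(subtrees(parent_b)))
child_a = replace(parent_a, path_a, sub_b)
child_b = replace(parent_b, path_b, sub_a)

print(child_a)
print(child_b)
```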

Many other figures played a role in the history of evolutionary computing, although their work did not always fit into one of the major historical branches of the field. The earliest computational simulations of evolution using evolutionary algorithms and artificial life techniques were performed by Nils Aall Barricelli in 1953, with first results published in 1954. [6] Another pioneer in the 1950s was Alex Fraser, who published a series of papers on simulation of artificial selection. [7] As academic interest grew, dramatic increases in the power of computers allowed practical applications, including the automatic evolution of computer programs. [8] Evolutionary algorithms are now used to solve multi-dimensional problems more efficiently than software produced by human designers, and also to optimize the design of systems. [9] [10]

Techniques

Evolutionary computing techniques mostly involve metaheuristic optimization algorithms. Broadly speaking, the field includes evolutionary algorithms (among them genetic algorithms, genetic programming, evolutionary programming, and evolution strategies) as well as swarm-based and other nature-inspired metaheuristics.

A thorough catalogue of many other recently proposed algorithms has been published in the Evolutionary Computation Bestiary. [11] Many recent algorithms, however, have poor experimental validation. [12]

Evolutionary algorithms

Evolutionary algorithms form a subset of evolutionary computation in that they generally only involve techniques implementing mechanisms inspired by biological evolution such as reproduction, mutation, recombination, natural selection and survival of the fittest. Candidate solutions to the optimization problem play the role of individuals in a population, and the cost function determines the environment within which the solutions "live" (see also fitness function). Evolution of the population then takes place after the repeated application of the above operators.

In this process, there are two main forces that form the basis of evolutionary systems: Recombination (e.g. crossover) and mutation create the necessary diversity and thereby facilitate novelty, while selection acts as a force increasing quality.

Many aspects of such an evolutionary process are stochastic. The pieces of information changed by recombination and mutation are chosen at random. Selection operators, on the other hand, can be either deterministic or stochastic. In the latter case, individuals with a higher fitness have a higher chance of being selected than individuals with a lower fitness, but typically even the weak individuals have a chance to become a parent or to survive.
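
The distinction between deterministic and stochastic selection can be illustrated with two small operators; the population and fitness values below are placeholders.

```python
import random

# Placeholder population: (individual, fitness) pairs.
scored = [('a', 1.0), ('b', 4.0), ('c', 2.5), ('d', 0.5)]

def truncation_select(scored, n):
    """Deterministic selection: always keep the n fittest individuals."""
    return [ind for ind, _ in sorted(scored, key=lambda p: p[1], reverse=True)[:n]]

def roulette_select(scored, n):
    """Stochastic (fitness-proportionate) selection: fitter individuals are more
    likely to be chosen, but even the weakest has a nonzero chance."""
    individuals = [ind for ind, _ in scored]
    weights = [fit for _, fit in scored]
    return random.choices(individuals, weights=weights, k=n)

print(truncation_select(scored, 2))   # always ['b', 'c']
print(roulette_select(scored, 2))     # varies from run to run
```

Truncation selection always discards the same individuals, whereas fitness-proportionate (roulette-wheel) selection occasionally keeps weak ones, which helps preserve diversity.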

Evolutionary algorithms and biology

Genetic algorithms provide methods for modelling biological systems and systems biology that are linked to the theory of dynamical systems, since such models are used to predict the future states of the system. This is, however, just a vivid (but perhaps misleading) way of drawing attention to the orderly, well-controlled and highly structured character of development in biology.

However, the use of algorithms and informatics, and in particular of computational theory, beyond the analogy to dynamical systems, is also relevant to understanding evolution itself.

This view has the merit of recognizing that there is no central control of development; organisms develop as a result of local interactions within and between cells. The most promising ideas about program-development parallels are those that point to an apparently close analogy between processes within cells and the low-level operation of modern computers. [13] Thus, biological systems are like computational machines that process input information to compute next states, so that biological systems are closer to computation than to a classical dynamical system. [14]

Furthermore, following concepts from computational theory, micro processes in biological organisms are fundamentally incomplete and undecidable (in the sense of completeness in logic), implying that "there is more than a crude metaphor behind the analogy between cells and computers". [15]

The analogy to computation extends also to the relationship between inheritance systems and biological structure, which is often thought to reveal one of the most pressing problems in explaining the origins of life.

Evolutionary automata, [16] [17] [18] a generalization of evolutionary Turing machines, [19] [20] have been introduced in order to investigate more precisely the properties of biological and evolutionary computation. In particular, they make it possible to obtain new results on the expressiveness of evolutionary computation. [18] [21] This confirms the initial result about the undecidability of natural evolution and of evolutionary algorithms and processes. Evolutionary finite automata, the simplest subclass of evolutionary automata working in terminal mode, can accept arbitrary languages over a given alphabet, including languages that are not recursively enumerable (e.g., the diagonalization language) and languages that are recursively enumerable but not recursive (e.g., the language of the universal Turing machine). [22]

Notable practitioners

Any list of active researchers is naturally dynamic and non-exhaustive. A network analysis of the community was published in 2007. [23]

Journals

While articles on or using evolutionary computation permeate the literature, several journals are dedicated to evolutionary computation, including Evolutionary Computation (MIT Press) and IEEE Transactions on Evolutionary Computation.

Conferences

The main conferences in the evolutionary computation area include the ACM Genetic and Evolutionary Computation Conference (GECCO), the IEEE Congress on Evolutionary Computation (CEC), and Parallel Problem Solving from Nature (PPSN).




References

  1. Eiben, A. E.; Smith, J. E. (2015), Evolutionary Computing: The Origins, Natural Computing Series, Berlin, Heidelberg: Springer, pp. 13–24, doi:10.1007/978-3-662-44874-8_2, ISBN 978-3-662-44873-1. Retrieved May 6, 2022.
  2. Burgin, Mark; Eberbach, Eugene (April 12, 2013). "Evolutionary Turing in the Context of Evolutionary Machines". arXiv: 1304.3762 [cs.AI].
  3. Fogel, David B. (1998). Evolutionary Computation: The Fossil Record. New York: IEEE Press. ISBN 0-7803-3481-7. OCLC 38270557.
  4. Fischer, Thomas (1986), "Kybernetische Systemanalyse Einer Tuchfabrik zur Einführung Eines Computergestützten Dispositionssystems der Fertigung", DGOR, Berlin, Heidelberg: Springer Berlin Heidelberg, p. 120, doi:10.1007/978-3-642-71161-9_14, ISBN   978-3-642-71162-6 , retrieved May 6, 2022
  5. Mitchell, Melanie (1998). An Introduction to Genetic Algorithms. The MIT Press. doi:10.7551/mitpress/3927.001.0001. ISBN   978-0-262-28001-3.
  6. Barricelli, Nils Aall (1954). "Esempi Numerici di processi di evoluzione". Methodos: 45–68.
  7. Fraser AS (1958). "Monte Carlo analyses of genetic models". Nature. 181 (4603): 208–9. Bibcode:1958Natur.181..208F. doi:10.1038/181208a0. PMID   13504138. S2CID   4211563.
  8. Koza, John R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press. ISBN   978-0-262-11170-6.
  9. G. C. Onwubolu and B V Babu, Onwubolu, Godfrey C.; Babu, B. V. (January 21, 2004). New Optimization Techniques in Engineering. Springer. ISBN   9783540201670 . Retrieved September 17, 2016.
  10. Jamshidi M (2003). "Tools for intelligent control: fuzzy controllers, neural networks and genetic algorithms". Philosophical Transactions of the Royal Society A . 361 (1809): 1781–808. Bibcode:2003RSPTA.361.1781J. doi:10.1098/rsta.2003.1225. PMID   12952685. S2CID   34259612.
  11. Campelo, Felipe; Aranha, Claus (June 20, 2018). "EC Bestiary: A Bestiary of Evolutionary, Swarm and Other Metaphor-Based Algorithms". Zenodo. doi:10.5281/zenodo.1293035.
  12. Kudela, Jakub (December 12, 2022). "A critical problem in benchmarking and analysis of evolutionary computation methods". Nature Machine Intelligence. 4 (12): 1238–1245. arXiv: 2301.01984 . doi:10.1038/s42256-022-00579-0. ISSN   2522-5839. S2CID   254616518.
  13. "Biological Information". The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. 2016.
  14. J.G. Diaz Ochoa (2018). "Elastic Multi-scale Mechanisms: Computation and Biological Evolution". Journal of Molecular Evolution . 86 (1): 47–57. Bibcode:2018JMolE..86...47D. doi:10.1007/s00239-017-9823-7. PMID   29248946. S2CID   22624633.
  15. A. Danchin (2008). "Bacteria as computers making computers". FEMS Microbiol. Rev. 33 (1): 3–26. doi:10.1111/j.1574-6976.2008.00137.x. PMC   2704931 . PMID   19016882.
  16. Burgin, Mark; Eberbach, Eugene (2013). "Recursively Generated Evolutionary Turing Machines and Evolutionary Automata". In Xin-She Yang (ed.). Artificial Intelligence, Evolutionary Computing and Metaheuristics. Studies in Computational Intelligence. Vol. 427. Springer-Verlag. pp. 201–230. doi:10.1007/978-3-642-29694-9_9. ISBN   978-3-642-29693-2.
  17. Burgin, M. and Eberbach, E. (2010) Bounded and Periodic Evolutionary Machines, in Proc. 2010 Congress on Evolutionary Computation (CEC'2010), Barcelona, Spain, 2010, pp. 1379-1386
  18. Burgin, M.; Eberbach, E. (2012). "Evolutionary Automata: Expressiveness and Convergence of Evolutionary Computation". The Computer Journal. 55 (9): 1023–1029. doi:10.1093/comjnl/bxr099.
  19. Eberbach E. (2002) On Expressiveness of Evolutionary Computation: Is EC Algorithmic?, Proc. 2002 World Congress on Computational Intelligence WCCI’2002, Honolulu, HI, 2002, 564-569.
  20. Eberbach, E. (2005) Toward a theory of evolutionary computation, BioSystems, v. 82, pp. 1-19.
  21. Eberbach, Eugene; Burgin, Mark (2009). "Evolutionary automata as foundation of evolutionary computation: Larry Fogel was right". 2009 IEEE Congress on Evolutionary Computation. IEEE. pp. 2149–2156. doi:10.1109/CEC.2009.4983207. ISBN   978-1-4244-2958-5. S2CID   2869386.
  22. Hopcroft, J.E., R. Motwani, and J.D. Ullman (2001) Introduction to Automata Theory, Languages, and Computation, Addison Wesley, Boston/San Francisco/New York
  23. J.J. Merelo and C. Cotta (2007). "Who is the best connected EC researcher? Centrality analysis of the complex network of authors in evolutionary computation". arXiv: 0708.2021 [cs.CY].