Effective fitness

In natural evolution and artificial evolution (e.g. artificial life and evolutionary computation), the fitness (or performance, or objective measure) of a schema is rescaled to give its effective fitness, which takes crossover and mutation into account.

Effective fitness is used in evolutionary computation to understand population dynamics. [1] While a biological fitness function considers only reproductive success, an effective fitness function tries to encompass everything that must be fulfilled for survival at the population level. [2] In homogeneous populations, reproductive fitness and effective fitness are equal. [1] When a population moves away from homogeneity, the recessive genotype reaches a higher effective fitness; this advantage decreases as the population moves toward an equilibrium. [1] The deviation from this equilibrium shows how close the population is to a steady state, [1] and when the equilibrium is reached the population attains its maximum effective fitness. [3]

Problem solving with evolutionary computation is realized with a cost function. [4] When cost functions are applied to swarm optimization, they are called fitness functions. Strategies like reinforcement learning [5] and NEAT neuroevolution [6] create a fitness landscape which describes the reproductive success of cellular automata. [7] [8]
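The relationship between a cost function and a fitness function can be made concrete with a small sketch. The benchmark and the inversion formula below are illustrative choices, not taken from the cited papers: the Rastrigin function is a standard minimization benchmark, and wrapping its cost in 1/(1 + cost) yields a maximization-friendly fitness.

```python
import math

# Illustrative cost function: the Rastrigin benchmark, which an optimizer
# tries to minimize (global minimum 0 at the origin).
def cost(x):
    return 10 * len(x) + sum(xi**2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

# A fitness function derived from the cost: higher is better, so selection
# operators (which favor large fitness values) can use it directly.
def fitness(x):
    return 1.0 / (1.0 + cost(x))

print(fitness([0.0, 0.0]))  # 1.0 at the global optimum
```

Any monotone decreasing transform of the cost would serve; this particular one maps costs into (0, 1], which keeps selection probabilities bounded.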

The effective fitness function models the number of fit offspring [1] and is used in calculations that include evolutionary processes, such as mutation and crossover, that are important at the population level. [9]
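One simple way the operator rescaling can work, as a first-order sketch rather than the full formalism of the cited papers: under a per-bit mutation rate mu, a genotype of length L produces an exact copy with probability (1 - mu)^L, so the fitness effectively credited to faithful offspring of that genotype shrinks accordingly.

```python
# Illustrative first-order approximation (ignores back-mutation and crossover):
# a length-L genotype survives mutation intact with probability (1 - mu)**L,
# so its reproductive fitness is rescaled by that factor.
def effective_fitness(reproductive_fitness, mu, genome_length):
    return reproductive_fitness * (1.0 - mu) ** genome_length

f = 2.0
print(effective_fitness(f, 0.0, 100))   # no mutation: equals reproductive fitness
print(effective_fitness(f, 0.01, 100))  # noticeably smaller once mutation is counted
```

This already shows the homogeneity point made above: with the genetic operators switched off (mu = 0), effective and reproductive fitness coincide.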

The effective fitness model is superior to its predecessor, the standard reproductive fitness model. It advances the qualitative and quantitative understanding of evolutionary concepts like bloat, self-adaptation, and evolutionary robustness. [3] While reproductive fitness only looks at pure selection, effective fitness describes the flow of a population under natural selection by taking genetic operators into account. [1] [3]

A normal fitness function is fitted to a given problem, [10] while an effective fitness function assumes that the objective has been reached. [11] The difference is important for designing fitness functions with algorithms like novelty search, in which the objective of the agents is unknown. [12] [13] In the case of bacteria, effective fitness could include the production of toxins and the mutation rates of different plasmids, which are mostly stochastically determined. [14]

Applications

When the evolutionary equations of the studied population dynamics are available, one can algorithmically compute the effective fitness of a given population. Though the perfect effective fitness model has yet to be found, it is already known to be a good framework for better understanding the genotype-phenotype map, population dynamics, and the flow on fitness landscapes. [1] [3]
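Such a computation can be sketched in a few lines. The idea, consistent with the coarse-graining perspective of the cited work, is to define effective fitness so that a pure-selection equation reproduces the full selection-plus-mutation dynamics; the two-genotype model and its numbers below are illustrative assumptions.

```python
import numpy as np

# Two-genotype selection-mutation model. f holds reproductive fitnesses;
# M[i, j] is the probability that a parent of genotype j yields offspring
# of genotype i (values here are illustrative).
f = np.array([2.0, 1.0])
M = np.array([[0.9, 0.1],
              [0.1, 0.9]])

def step(p):
    """One generation: selection weighted by f, then mutation via M."""
    q = M @ (f * p)
    return q / q.sum()

# Effective fitness rescales f so that the pure-selection update
#   p'(i) = f_eff(i) * p(i) / mean_fitness
# reproduces the full dynamics: f_eff(i) = mean_fitness * p'(i) / p(i).
p = np.array([0.5, 0.5])
p_next = step(p)
f_eff = (f @ p) * p_next / p
print(f_eff)  # [1.9, 1.1]: mutation pulls both genotypes toward each other
```

Note that f_eff is generally time- and population-dependent: repeating the computation at a different p gives different values, which is exactly the "flow" interpretation described above.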

Models using a combination of Darwinian fitness functions and effective fitness functions are better at predicting population trends. Effective models could be used to determine the therapeutic outcomes of disease treatment. [15] Other models could guide protein engineering and work towards finding novel or heightened biochemistry. [16]

Related Research Articles

Genetic algorithm: competitive algorithm for searching a problem space

In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover and selection. Some examples of GA applications include optimizing decision trees for better performance, solving sudoku puzzles, hyperparameter optimization, causal inference, etc.
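A minimal GA showing the three biologically inspired operators named above (selection, crossover, mutation) can be written in a few lines. This is a toy sketch on the OneMax problem (maximize the number of 1-bits), using truncation selection for brevity rather than the roulette or tournament schemes more common in the literature.

```python
import random

random.seed(0)

# Toy objective: count of 1-bits; the global optimum is the all-ones string.
def fitness(bits):
    return sum(bits)

def evolve(pop_size=20, length=16, generations=40, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: the fitter half become parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # near the maximum of 16 after 40 generations
```

Even on this trivial landscape the interplay matters: crossover recombines good building blocks while mutation keeps supplying the diversity that selection consumes.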

In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.

Population genetics is a subfield of genetics that deals with genetic differences within and among populations, and is a part of evolutionary biology. Studies in this branch of biology examine such phenomena as adaptation, speciation, and population structure.

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.

In evolutionary biology, fitness landscapes or adaptive landscapes are used to visualize the relationship between genotypes and reproductive success. It is assumed that every genotype has a well-defined replication rate. This fitness is the "height" of the landscape. Genotypes which are similar are said to be "close" to each other, while those that are very different are "far" from each other. The set of all possible genotypes, their degree of similarity, and their related fitness values is then called a fitness landscape. The idea of a fitness landscape is a metaphor to help explain flawed forms in evolution by natural selection, including exploits and glitches in animals like their reactions to supernormal stimuli.

NeuroEvolution of Augmenting Topologies (NEAT) is a genetic algorithm (GA) for the generation of evolving artificial neural networks developed by Kenneth Stanley and Risto Miikkulainen in 2002 while at The University of Texas at Austin. It alters both the weighting parameters and structures of networks, attempting to find a balance between the fitness of evolved solutions and their diversity. It is based on applying three key techniques: tracking genes with history markers to allow crossover among topologies, applying speciation to preserve innovations, and developing topologies incrementally from simple initial structures ("complexifying").

Tournament selection is a method of selecting an individual from a population of individuals in a genetic algorithm. Tournament selection involves running several "tournaments" among a few individuals chosen at random from the population. The winner of each tournament is selected for crossover. Selection pressure, a probabilistic measure of a chromosome's likelihood of participation in the tournament based on the participant selection pool size, is easily adjusted by changing the tournament size: in a larger tournament, a weak individual has a smaller chance of being selected, because it is more likely that a stronger individual is also in that tournament.
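The effect of tournament size on selection pressure is easy to demonstrate empirically. The toy population below (integers with identity fitness) is an illustrative assumption; the selection routine itself is the standard one described above.

```python
import random

random.seed(1)

# Standard tournament selection: draw k random contestants, keep the fittest.
def tournament_select(population, fitness, k):
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)

population = list(range(100))   # toy individuals 0..99
fitness = lambda x: x           # identity fitness: higher index is fitter

def mean_winner(k, trials=2000):
    return sum(tournament_select(population, fitness, k) for _ in range(trials)) / trials

print(mean_winner(2), mean_winner(8))  # the mean winner rises with tournament size
```

For k contestants drawn uniformly, the expected winner is roughly 99·k/(k+1), so growing k pushes winners toward the top of the population: that shift is the selection pressure being tuned.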

Evolvability is the capacity of a system for adaptive evolution: the ability of a population of organisms not merely to generate genetic diversity, but to generate adaptive genetic diversity, and thereby evolve through natural selection.

In computer science, an evolution strategy (ES) is an optimization technique based on ideas of evolution. It belongs to the general class of evolutionary computation or artificial evolution methodologies.
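The simplest member of this family, the (1+1)-ES, fits in a few lines: one parent, one Gaussian-mutated child per generation, with elitist replacement. The fixed step size and the sphere objective below are illustrative simplifications; practical ES variants adapt sigma during the run.

```python
import random

random.seed(42)

# (1+1) evolution strategy sketch: the child replaces the parent
# only if it is no worse (elitist replacement).
def es_minimize(f, x0, sigma=0.5, iterations=500):
    x, fx = list(x0), f(x0)
    for _ in range(iterations):
        y = [xi + random.gauss(0, sigma) for xi in x]  # Gaussian mutation
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
    return x, fx

sphere = lambda x: sum(xi * xi for xi in x)  # toy objective, minimum at 0
x, fx = es_minimize(sphere, [5.0, -3.0])
print(fx)  # far below the starting value of 34.0
```

Because the fixed sigma limits final precision, real implementations use rules such as the 1/5th success rule or covariance matrix adaptation (CMA-ES) to shrink the mutation step as the search converges.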

In population genetics and population ecology, population size is a countable quantity representing the number of individual organisms in a population. Population size is directly associated with the amount of genetic drift, and is the underlying cause of effects like population bottlenecks and the founder effect. Genetic drift is the major source of decrease in genetic diversity within populations, which drives fixation and can potentially lead to speciation events.

In genetics, underdominance, also known as homozygote advantage, heterozygote disadvantage, or negative overdominance, is the opposite of overdominance. It is the selection against the heterozygote, causing disruptive selection and divergent genotypes. Underdominance exists in situations where the heterozygotic genotype is inferior in fitness to either the dominant or recessive homozygotic genotype. Compared to examples of overdominance in actual populations, underdominance is considered more unstable and may lead to the fixation of either allele.

Extremal optimization (EO) is an optimization heuristic inspired by the Bak–Sneppen model of self-organized criticality from the field of statistical physics. This heuristic was designed initially to address combinatorial optimization problems such as the travelling salesman problem and spin glasses, although the technique has been demonstrated to function in optimization domains.

The learnable evolution model (LEM) is a non-Darwinian methodology for evolutionary computation that employs machine learning to guide the generation of new individuals. Unlike standard, Darwinian-type evolutionary computation methods that use random or semi-random operators for generating new individuals, LEM employs hypothesis generation and instantiation operators.

In applied mathematics, multimodal optimization deals with optimization tasks that involve finding all or most of the multiple solutions of a problem, as opposed to a single best solution. Evolutionary multimodal optimization is a branch of evolutionary computation, which is closely related to machine learning. Wong provides a short survey; the chapter by Shir and the book by Preuss cover the topic in more detail.

Robustness (evolution): persistence of a biological trait under uncertain conditions

In evolutionary biology, robustness of a biological system is the persistence of a certain characteristic or trait in a system under perturbations or conditions of uncertainty. Robustness in development is known as canalization. According to the kind of perturbation involved, robustness can be classified as mutational, environmental, recombinational, or behavioral robustness, among others. Robustness is achieved through the combination of many genetic and molecular mechanisms and can evolve by either direct or indirect selection. Several model systems have been developed to experimentally study robustness and its evolutionary consequences.

A neutral network is a set of genes all related by point mutations that have equivalent function or fitness. Each node represents a gene sequence and each line represents the mutation connecting two sequences. Neutral networks can be thought of as high, flat plateaus in a fitness landscape. During neutral evolution, genes can randomly move through neutral networks and traverse regions of sequence space which may have consequences for robustness and evolvability.

Epistasis: dependence of a gene mutation's phenotype on mutations in other genes

Epistasis is a phenomenon in genetics in which the effect of a gene mutation is dependent on the presence or absence of mutations in one or more other genes, termed modifier genes. In other words, the effect of the mutation is dependent on the genetic background in which it appears. Epistatic mutations therefore have different effects on their own than when they occur together. Originally, the term epistasis specifically meant that the effect of a gene variant is masked by that of a different gene.

Emma Hart (computer scientist): English computer scientist

Professor Emma Hart, FRSE is an English computer scientist known for her work in artificial immune systems (AIS), evolutionary computation and optimisation. She is a professor of computational intelligence at Edinburgh Napier University, editor-in-chief of the Journal of Evolutionary Computation, and coordinator of the Future & Emerging Technologies (FET) Proactive Initiative, Fundamentals of Collective Adaptive Systems.

Eukaryote hybrid genomes result from interspecific hybridization, where closely related species mate and produce offspring with admixed genomes. The advent of large-scale genomic sequencing has shown that hybridization is common, and that it may represent an important source of novel variation. Although most interspecific hybrids are sterile or less fit than their parents, some may survive and reproduce, enabling the transfer of adaptive variants across the species boundary, and even result in the formation of novel evolutionary lineages. There are two main variants of hybrid species genomes: allopolyploid, which have one full chromosome set from each parent species, and homoploid, which are a mosaic of the parent species genomes with no increase in chromosome number.

Gabriela Ochoa: Venezuelan British computer scientist

Gabriela Ochoa is a Venezuelan British computer scientist and Professor at the University of Stirling. Her research considers evolutionary algorithms and heuristic search methods.

References

  1. Stephens CR (1999). ""Effective" fitness landscapes for evolutionary systems". Proceedings of the 1999 Congress on Evolutionary Computation (CEC99). pp. 703–714. arXiv:nlin/0006050. doi:10.1109/CEC.1999.782002. ISBN 0-7803-5536-9. S2CID 10062119.
  2. von Bronk B, Schaffer SA, Götz A, Opitz M (May 2017). Balaban N (ed.). "Effects of stochasticity and division of labor in toxin production on two-strain bacterial competition in Escherichia coli". PLOS Biology. 15 (5): e2001457. doi:10.1371/journal.pbio.2001457. PMC 5411026. PMID 28459803.
  3. Stephens CR, Vargas JM (2000). "Effective Fitness as an Alternative Paradigm for Evolutionary Computation I: General Formalism". Genetic Programming and Evolvable Machines. 1 (4): 363–378. doi:10.1023/A:1010017207202. S2CID 1511583.
  4. Schaffer JD, Sichtig HM, Laramee C (2009). "A series of failed and partially successful fitness functions for evolving spiking neural networks". Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '09). ACM Press. doi:10.1145/1570256.1570378.
  5. Afanasyeva A, Buzdalov M (2012). "Optimization with auxiliary criteria using evolutionary algorithms and reinforcement learning". Proceedings of the 18th International Conference on Soft Computing MENDEL 2012. pp. 58–63.
  6. Divband Soorati M, Hamann H (2015). "The Effect of Fitness Function Design on Performance in Evolutionary Robotics". Proceedings of the 2015 Genetic and Evolutionary Computation Conference (GECCO '15). ACM Press. doi:10.1145/2739480.2754676.
  7. Stadler PF, Stephens CR (2003). "Landscapes and Effective Fitness". Comments on Theoretical Biology. 8 (4–5): 389–431. doi:10.1080/08948550302439.
  8. Bagnoli F (1998). "Cellular automata". arXiv:cond-mat/9810012.
  9. Henry A, Hemery M, François P (June 2018). "φ-evo: A program to evolve phenotypic models of biological networks". PLOS Computational Biology. 14 (6): e1006244. Bibcode:2018PLSCB..14E6244H. doi:10.1371/journal.pcbi.1006244. PMC 6013240. PMID 29889886.
  10. Fernandez AC (2017). "Creating a fitness function that is the right fit for the problem at hand".
  11. Handa H (2006). "Fitness function for finding out robust solutions on time-varying functions". Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO '06). ACM Press. CiteSeerX 10.1.1.421.930. doi:10.1145/1143997.1144186.
  12. Lehman J, Stanley KO (2011). "Abandoning objectives: evolution through the search for novelty alone". Evolutionary Computation. 19 (2): 189–223. doi:10.1162/evco_a_00025. PMID 20868264. S2CID 12129661.
  13. Woolley BF, Stanley KO (2012). "Exploring promising stepping stones by combining novelty search with interactive evolution". arXiv:1207.6682 [cs.NE].
  14. Lehman J, Stanley KO (2010). "Abandoning objectives: evolution through the search for novelty alone". Evolutionary Computation. 19 (2): 189–223. doi:10.1162/EVCO_a_00025. PMID 20868264. S2CID 12129661.
  15. Mahdipour-Shirayeh A, Kaveh K, Kohandel M, Sivaloganathan S (2017). "Phenotypic heterogeneity in modeling cancer evolution". PLOS ONE. 12 (10): e0187000. arXiv:1610.08163. Bibcode:2017PLoSO..1287000M. doi:10.1371/journal.pone.0187000. PMC 5662227. PMID 29084232.
  16. Xu Y, Hu C, Dai Y, Liang J (2014). "On simplified global nonlinear function for fitness landscape: a case study of inverse protein folding". PLOS ONE. 9 (8): e104403. Bibcode:2014PLoSO...9j4403X. doi:10.1371/journal.pone.0104403. PMC 4128808. PMID 25110986.