Genetic programming

Genetic programming (GP) is an evolutionary algorithm, an artificial intelligence technique mimicking natural evolution, which operates on a population of programs. It applies the genetic operators of selection (according to a predefined fitness measure), crossover and mutation.

The crossover operation involves swapping specified parts of selected pairs (parents) to produce new and different offspring that become part of the new generation of programs. Some programs not selected for reproduction are copied from the current generation to the new generation. Mutation involves substitution of some random part of a program with some other random part of a program. Then the selection and other operations are recursively applied to the new generation of programs.

Typically, members of each new generation are on average more fit than the members of the previous generation, and the best-of-generation program is often better than the best-of-generation programs from previous generations. Termination of the evolution usually occurs when some individual program reaches a predefined proficiency or fitness level.

It may and often does happen that a particular run of the algorithm results in premature convergence to some local maximum which is not a globally optimal or even good solution. Multiple runs (dozens to hundreds) are usually necessary to produce a very good result. It may also be necessary to have a large starting population size and variability of the individuals to avoid pathologies.
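In outline, the procedure can be written as a single generational loop. The following Python sketch is illustrative only: the callables random_program, fitness, select, crossover and mutate are hypothetical, problem-specific functions, and the parameter values are arbitrary defaults, not prescribed by the sources cited here.

    import random

    def genetic_programming(random_program, fitness, select, crossover, mutate,
                            pop_size=500, generations=50,
                            p_crossover=0.9, p_mutation=0.05, target_fitness=None):
        """Generic generational GP loop; all five callables are supplied by the user."""
        population = [random_program() for _ in range(pop_size)]
        for _ in range(generations):
            scored = [(fitness(prog), prog) for prog in population]
            best_fitness, best = max(scored, key=lambda pair: pair[0])
            if target_fitness is not None and best_fitness >= target_fitness:
                break                              # predefined proficiency reached
            next_gen = [best]                      # elitism: copy the best unchanged
            while len(next_gen) < pop_size:
                if random.random() < p_crossover:  # crossover of two selected parents
                    child = crossover(select(scored), select(scored))
                else:                              # replication: copy a selected parent
                    child = select(scored)
                if random.random() < p_mutation:   # occasional mutation
                    child = mutate(child)
                next_gen.append(child)
            population = next_gen
        return max(population, key=fitness)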

History

The first record of the proposal to evolve programs is probably that of Alan Turing in 1950. [1] There was a gap of 25 years before the publication of John Holland's 'Adaptation in Natural and Artificial Systems' laid out the theoretical and empirical foundations of the science. In 1981, Richard Forsyth demonstrated the successful evolution of small programs, represented as trees, to perform classification of crime scene evidence for the UK Home Office. [2]

Although the idea of evolving programs, initially in the computer language Lisp, was current amongst John Holland's students, [3] it was not until they organised the first Genetic Algorithms (GA) conference in Pittsburgh that Nichael Cramer [4] published evolved programs in two specially designed languages, which included the first statement of modern "tree-based" Genetic Programming (that is, procedural languages organized in tree-based structures and operated on by suitably defined GA-operators). In 1988, John Koza (also a PhD student of John Holland) patented his invention of a GA for program evolution. [5] This was followed by publication in the International Joint Conference on Artificial Intelligence IJCAI-89. [6]

Koza followed this with 205 publications on "Genetic Programming" (GP), a name coined by David Goldberg, also a PhD student of John Holland. [7] However, it is the series of 4 books by Koza, starting in 1992 [8] with accompanying videos, [9] that really established GP. Subsequently, there was an enormous expansion of the number of publications, with the Genetic Programming Bibliography surpassing 10,000 entries. [10] In 2010, Koza [11] listed 77 results where Genetic Programming was human-competitive.

In 1996, Koza started the annual Genetic Programming conference [12] which was followed in 1998 by the annual EuroGP conference, [13] and the first book [14] in a GP series edited by Koza. 1998 also saw the first GP textbook. [15] GP continued to flourish, leading to the first specialist GP journal [16] and three years later (2003) the annual Genetic Programming Theory and Practice (GPTP) workshop was established by Rick Riolo. [17] [18] Genetic Programming papers continue to be published at a diversity of conferences and associated journals. Today there are nineteen GP books including several for students. [15]

Timeline of GP - selected algorithms [19]

Year | Description | Reference
1992 | Introduction of GP as genetically bred populations of computer programs | [20]
2000 | Cartesian genetic programming | [21]
2000 | Grammar-guided GP - dynamic grammar pruning applied in initialization | [22]
2001 | Gene expression programming | [23]
2012 | Multi-gene GP - combination of a classical method for parameter estimation and structure selection | [24]
2012 | Geometric semantic GP - direct search in the space of the underlying semantics of the programs | [25]
2015 | Surrogate GP | [26]
2015 | Memetic semantic GP | [27]
2017 | Statistical GP - statistical information used to generate well-structured subtrees | [28]
2018 | Multi-dimensional GP - novel program representation for multi-dimensional features | [29]

Foundational work in GP

Early work that set the stage for current genetic programming research topics and applications is diverse, and includes software synthesis and repair, predictive modeling, data mining, [30] financial modeling, [31] soft sensors, [32] design, [33] and image processing. [34] Applications in some areas, such as design, often make use of intermediate representations, [35] such as Fred Gruau's cellular encoding. [36] Industrial uptake has been significant in several areas including finance, the chemical industry, bioinformatics [37] [38] and the steel industry. [39]

Methods

Program representation

Figure: A function represented as a tree structure.

GP evolves computer programs, traditionally represented in memory as tree structures. [40] Trees can be easily evaluated in a recursive manner. Every internal node has an operator function and every terminal node has an operand, making mathematical expressions easy to evolve and evaluate. Thus traditionally GP favors the use of programming languages that naturally embody tree structures (for example, Lisp; other functional programming languages are also suitable).
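As a minimal, illustrative sketch (the class and function names here are our own, not taken from the cited sources), such a tree can be built from nodes that hold either an operator or an operand and evaluated with a single recursive function:

    import operator

    class Node:
        """Internal nodes hold an operator function; terminal nodes hold an operand
        (a numeric constant or a variable name looked up in `env`)."""
        def __init__(self, value, children=()):
            self.value = value
            self.children = list(children)

    def evaluate(node, env):
        """Evaluate the tree recursively."""
        if not node.children:                       # terminal node: return the operand
            return env.get(node.value, node.value)
        args = [evaluate(child, env) for child in node.children]
        return node.value(*args)                    # internal node: apply the operator

    # The expression x*x + 3 as a tree, evaluated at x = 2:
    tree = Node(operator.add, [Node(operator.mul, [Node('x'), Node('x')]), Node(3)])
    print(evaluate(tree, {'x': 2}))                 # prints 7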

Non-tree representations have been suggested and successfully implemented, such as linear genetic programming, which perhaps suits the more traditional imperative languages. [41] [42] The commercial GP software Discipulus uses automatic induction of binary machine code ("AIM") [43] to achieve better performance. μGP [44] uses directed multigraphs to generate programs that fully exploit the syntax of a given assembly language. Multi expression programming uses three-address code for encoding solutions. Other program representations on which significant research and development have been conducted include programs for stack-based virtual machines, [45] [46] [47] and sequences of integers that are mapped to arbitrary programming languages via grammars. [48] [49] Cartesian genetic programming is another form of GP, which uses a graph representation instead of the usual tree-based representation to encode computer programs.

Most representations have structurally noneffective code (introns). Such non-coding genes may seem to be useless because they have no effect on the performance of any one individual. However, they alter the probabilities of generating different offspring under the variation operators, and thus alter the individual's variational properties. Experiments seem to show faster convergence when using program representations that allow such non-coding genes, compared to program representations that do not have any non-coding genes. [50] [51] Instantiations may have both trees with introns and those without; the latter are called canonical trees. Special canonical crossover operators are introduced that maintain the canonical structure of parents in their children.

Initialisation

The methods for creation of the initial population include the "full" method, the "grow" method, and ramped half-and-half, which combines the two. [52]
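A sketch of these generators for the tree representation, using nested Python lists in prefix (Lisp-like) form and hypothetical FUNCTIONS and TERMINALS sets of our own choosing, might look as follows; a real system would use the problem's own primitive set.

    import random

    FUNCTIONS = ['+', '-', '*']        # hypothetical primitive set (all binary here)
    TERMINALS = ['x', 'y', 1.0]

    def full(depth):
        """'Full' method: every branch grows to the maximum depth."""
        if depth == 0:
            return random.choice(TERMINALS)
        return [random.choice(FUNCTIONS), full(depth - 1), full(depth - 1)]

    def grow(depth, p_terminal=0.3):
        """'Grow' method: branches may stop early, producing irregular shapes."""
        if depth == 0 or random.random() < p_terminal:
            return random.choice(TERMINALS)
        return [random.choice(FUNCTIONS), grow(depth - 1, p_terminal), grow(depth - 1, p_terminal)]

    def ramped_half_and_half(pop_size, max_depth=6):
        """Half the trees from each method, over a ramp of maximum depths."""
        population = []
        for i in range(pop_size):
            depth = 2 + i % (max_depth - 1)          # depths cycle through 2 .. max_depth
            population.append(full(depth) if i % 2 else grow(depth))
        return population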

Selection

Selection is the process whereby certain individuals are chosen from the current generation to serve as parents for the next generation. Individuals are selected probabilistically, such that better-performing individuals have a higher chance of being selected. [18] The most commonly used selection method in GP is tournament selection, although other methods such as fitness proportionate selection and lexicase selection [53] have been demonstrated to perform better for many GP problems.

Elitism, which involves seeding the next generation with the best individual (or best n individuals) from the current generation, is a technique sometimes employed to avoid regression.
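A hedged sketch of tournament selection and elitism, assuming the population has already been evaluated into a list of (fitness, program) pairs (a convention of ours, not a standard API):

    import random

    def tournament_select(scored, tournament_size=7):
        """Pick `tournament_size` individuals at random; return the fittest of them."""
        contestants = random.sample(scored, tournament_size)
        return max(contestants, key=lambda pair: pair[0])[1]

    def elite(scored, n_elite=1):
        """Elitism: return the best n_elite programs, to be copied unchanged."""
        ranked = sorted(scored, key=lambda pair: pair[0], reverse=True)
        return [prog for _, prog in ranked[:n_elite]]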

Crossover

Animation: Genetic programming subtree crossover.

In genetic programming, two fit individuals are chosen from the population to be parents for one or two children. In tree genetic programming, these parents are represented as inverted Lisp-like trees, with their root nodes at the top. In subtree crossover, a subtree is randomly chosen in each parent (highlighted in yellow in the animation). In the root-donating parent (on the left in the animation) the chosen subtree is removed and replaced with a copy of the randomly chosen subtree from the other parent, giving a new child tree.

Sometimes two-child crossover is used, in which case the removed subtree (from the parent on the left in the animation) is not simply deleted but is copied into a copy of the second parent (on the right), replacing (in the copy) its randomly chosen subtree. Thus this type of subtree crossover takes two fit trees and generates two child trees.
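Continuing the nested-list (Lisp-like) program form used in the initialisation sketch above, subtree crossover can be sketched as below; the helper names subtrees, replace and subtree_crossover are our own, and this is an illustration rather than a reference implementation.

    import copy, random

    # Programs are nested lists in prefix (Lisp-like) form, e.g. ['+', 'x', ['*', 'x', 'y']].

    def subtrees(tree, path=()):
        """Yield (path, subtree) for every node; a path is a tuple of child indices."""
        yield path, tree
        if isinstance(tree, list):
            for i, child in enumerate(tree[1:], start=1):
                yield from subtrees(child, path + (i,))

    def replace(tree, path, new_subtree):
        """Return a copy of `tree` with the node at `path` replaced by a copy of `new_subtree`."""
        if not path:
            return copy.deepcopy(new_subtree)
        tree = copy.deepcopy(tree)
        node = tree
        for i in path[:-1]:
            node = node[i]
        node[path[-1]] = copy.deepcopy(new_subtree)
        return tree

    def subtree_crossover(parent1, parent2, two_children=False):
        """Swap randomly chosen subtrees between the two parents."""
        path1, sub1 = random.choice(list(subtrees(parent1)))
        path2, sub2 = random.choice(list(subtrees(parent2)))
        child1 = replace(parent1, path1, sub2)     # root-donating parent receives sub2
        if not two_children:
            return child1
        child2 = replace(parent2, path2, sub1)     # second child for the two-child form
        return child1, child2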

Replication

Some individuals selected according to fitness criteria do not participate in crossover, but are copied into the next generation, akin to asexual reproduction in the natural world. They may be further subject to mutation.

Mutation

Animation: creating a genetic programming child by mutating the parent, removing a subtree and replacing it with random code.

There are many types of mutation in genetic programming. They start from a fit syntactically correct parent and aim to randomly create a syntactically correct child. In the animation a subtree is randomly chosen (highlighted by yellow). It is removed and replaced by a randomly generated subtree.

Other mutation operators select a leaf (external node) of the tree and replace it with a randomly chosen leaf. Another mutation selects at random a function (internal node) and replaces it with another function of the same arity (number of inputs). Hoist mutation randomly chooses a subtree and replaces it with a subtree within itself; thus hoist mutation is guaranteed to make the child smaller. Leaf and same-arity function replacement ensure the child is the same size as the parent, whereas subtree mutation (as in the animation) may, depending upon the function and terminal sets, have a bias to either increase or decrease the tree size. Other subtree-based mutations try to carefully control the size of the replacement subtree and thus the size of the child tree.
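For illustration, subtree, point and hoist mutation on the same nested-list representation might be sketched as follows; the helpers are repeated so the example stands alone, and all names (including the FUNCTIONS and TERMINALS sets) are our own assumptions.

    import copy, random

    FUNCTIONS = ['+', '-', '*']                     # hypothetical primitive set (all binary)
    TERMINALS = ['x', 'y', 1.0]

    def subtrees(tree, path=()):
        """Yield (path, subtree) for every node of a nested-list program."""
        yield path, tree
        if isinstance(tree, list):
            for i, child in enumerate(tree[1:], start=1):
                yield from subtrees(child, path + (i,))

    def replace(tree, path, new_subtree):
        """Return a copy of `tree` with the node at `path` replaced by `new_subtree`."""
        if not path:
            return copy.deepcopy(new_subtree)
        tree = copy.deepcopy(tree)
        node = tree
        for i in path[:-1]:
            node = node[i]
        node[path[-1]] = copy.deepcopy(new_subtree)
        return tree

    def grow(depth, p_terminal=0.3):
        """Random subtree generator (the 'grow' method from the initialisation sketch)."""
        if depth == 0 or random.random() < p_terminal:
            return random.choice(TERMINALS)
        return [random.choice(FUNCTIONS), grow(depth - 1), grow(depth - 1)]

    def subtree_mutation(parent, max_depth=3):
        """Replace a randomly chosen subtree with a freshly grown random subtree."""
        path, _ = random.choice(list(subtrees(parent)))
        return replace(parent, path, grow(max_depth))

    def point_mutation(parent):
        """Replace a random function node with another function of the same arity."""
        internal = [(p, s) for p, s in subtrees(parent) if isinstance(s, list)]
        if not internal:
            return copy.deepcopy(parent)            # a lone terminal: nothing to change
        path, node = random.choice(internal)
        return replace(parent, path, [random.choice(FUNCTIONS)] + node[1:])

    def hoist_mutation(parent):
        """Replace a random subtree with a subtree chosen from within it (never larger)."""
        path, sub = random.choice(list(subtrees(parent)))
        _, inner = random.choice(list(subtrees(sub)))
        return replace(parent, path, inner)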

Similarly there are many types of linear genetic programming mutation, each of which tries to ensure the mutated child is still syntactically correct.

Applications

GP has been successfully used as an automatic programming tool, a machine learning tool and an automatic problem-solving engine. [18] GP is especially useful in domains where the exact form of the solution is not known in advance or an approximate solution is acceptable (possibly because finding the exact solution is very difficult). Some applications of GP are curve fitting, data modeling, symbolic regression, feature selection, and classification. John R. Koza mentions 76 instances where Genetic Programming has been able to produce results that are competitive with human-produced results (called human-competitive results). [54] Since 2004, the annual Genetic and Evolutionary Computation Conference (GECCO) has held the Human-Competitive Awards (the "Humies") competition, [55] in which cash awards are presented to human-competitive results produced by any form of genetic and evolutionary computation. GP has won many awards in this competition over the years.
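In symbolic regression and curve fitting, for example, the fitness of a candidate program is typically just its error over a set of data points. A hedged sketch, reusing the nested-list program form of the earlier examples (the names and the operator set are our own):

    import operator

    OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

    def run(program, env):
        """Evaluate a nested-list program such as ['+', 'x', ['*', 'x', 'x']]."""
        if not isinstance(program, list):
            return env.get(program, program)        # variable name or numeric constant
        args = [run(arg, env) for arg in program[1:]]
        return OPS[program[0]](*args)

    def fitness(program, data):
        """Negative sum of squared errors over (variables, target) pairs: higher is better."""
        return -sum((run(program, env) - target) ** 2 for env, target in data)

    # Fitness cases sampled from the target behaviour y = x*x + x:
    data = [({'x': x}, x * x + x) for x in range(-3, 4)]
    print(fitness(['+', 'x', ['*', 'x', 'x']], data))   # an exact fit scores 0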

Meta-genetic programming

Meta-genetic programming is the proposed meta-learning technique of evolving a genetic programming system using genetic programming itself. It suggests that chromosomes, crossover, and mutation were themselves evolved and therefore, like their real-life counterparts, should be allowed to change on their own rather than being determined by a human programmer. Meta-GP was formally proposed by Jürgen Schmidhuber in 1987. [56] Doug Lenat's Eurisko is an earlier effort that may be the same technique. It is a recursive but terminating algorithm, allowing it to avoid infinite recursion. In the "autoconstructive evolution" approach to meta-genetic programming, the methods for the production and variation of offspring are encoded within the evolving programs themselves, and programs are executed to produce new programs to be added to the population. [46] [57]

Critics of this idea often say this approach is overly broad in scope. However, it might be possible to constrain the fitness criterion onto a general class of results, and so obtain an evolved GP that would more efficiently produce results for sub-classes. This might take the form of a meta evolved GP for producing human walking algorithms which is then used to evolve human running, jumping, etc. The fitness criterion applied to the meta GP would simply be one of efficiency.

Related Research Articles

<span class="mw-page-title-main">Genetic algorithm</span> Competitive algorithm for searching a problem space

In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems via biologically inspired operators such as selection, crossover, and mutation. Some examples of GA applications include optimizing decision trees for better performance, solving sudoku puzzles, hyperparameter optimization, and causal inference.

<span class="mw-page-title-main">Evolutionary algorithm</span> Subset of evolutionary computation

Evolutionary algorithms (EA) reproduce essential elements of biological evolution in a computer algorithm in order to solve "difficult" problems, at least approximately, for which no exact or satisfactory solution methods are known. They belong to the class of metaheuristics and are a subset of population-based bio-inspired algorithms and evolutionary computation, which themselves are part of the field of computational intelligence. The mechanisms of biological evolution that an EA mainly imitates are reproduction, mutation, recombination and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place after the repeated application of the above operators.

<span class="mw-page-title-main">Evolutionary computation</span> Trial and error problem solvers with a metaheuristic or stochastic optimization character

Evolutionary computation from computer science is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.

NeuroEvolution of Augmenting Topologies (NEAT) is a genetic algorithm (GA) for the generation of evolving artificial neural networks developed by Kenneth Stanley and Risto Miikkulainen in 2002 while at The University of Texas at Austin. It alters both the weighting parameters and structures of networks, attempting to find a balance between the fitness of evolved solutions and their diversity. It is based on applying three key techniques: tracking genes with history markers to allow crossover among topologies, applying speciation to preserve innovations, and developing topologies incrementally from simple initial structures ("complexifying").

<span class="mw-page-title-main">Genetic operator</span>

A genetic operator is an operator used in evolutionary algorithms (EA) to guide the algorithm towards a solution to a given problem. There are three main types of operators, which must work in conjunction with one another in order for the algorithm to be successful. Genetic operators are used to create and maintain genetic diversity, combine existing solutions into new solutions (crossover) and select between solutions (selection).

<span class="mw-page-title-main">Evolutionary programming</span> Evolutionary algorithm with a defined structure

Evolutionary programming is an evolutionary algorithm in which a share of the new population is created by mutation of the previous population, without crossover. It differs from the evolution strategy ES(μ+λ) in one detail: all individuals are selected for the new population, while in ES(μ+λ) every individual has the same probability to be selected. It is one of the four major evolutionary algorithm paradigms.

<span class="mw-page-title-main">Mutation (evolutionary algorithm)</span> Genetic operation used to add population diversity

Mutation is a genetic operator used to maintain genetic diversity of the chromosomes of a population of an evolutionary algorithm (EA), including genetic algorithms in particular. It is analogous to biological mutation.

<span class="mw-page-title-main">Chromosome (evolutionary algorithm)</span> Set of parameters for a genetic or evolutionary algorithm

A chromosome or genotype in evolutionary algorithms (EA) is a set of parameters which define a proposed solution of the problem that the evolutionary algorithm is trying to solve. The set of all solutions, also called individuals according to the biological model, is known as the population. The genome of an individual consists of one, more rarely of several, chromosomes and corresponds to the genetic representation of the task to be solved. A chromosome is composed of a set of genes, where a gene consists of one or more semantically connected parameters, which are often also called decision variables. They determine one or more phenotypic characteristics of the individual or at least have an influence on them. In the basic form of genetic algorithms, the chromosome is represented as a binary string, while in later variants and in EAs in general, a wide variety of other data structures are used.

In evolutionary computation, a human-based genetic algorithm (HBGA) is a genetic algorithm that allows humans to contribute solution suggestions to the evolutionary process. For this purpose, a HBGA has human interfaces for initialization, mutation, and recombinant crossover. As well, it may have interfaces for selective evaluation. In short, a HBGA outsources the operations of a typical genetic algorithm to humans.

Interactive evolutionary computation (IEC) or aesthetic selection is a general term for methods of evolutionary computation that use human evaluation. Usually human evaluation is necessary when the form of fitness function is not known or the result of optimization should fit a particular user preference.

<span class="mw-page-title-main">Evolution strategy</span>

Evolution strategy (ES) from computer science is a subclass of evolutionary algorithms, which serves as an optimization technique. It uses the major genetic operators mutation, recombination and selection of parents.

<span class="mw-page-title-main">Effective fitness</span> Reproductive success given genetic mutation

In natural evolution and artificial evolution the fitness of a schema is rescaled to give its effective fitness which takes into account crossover and mutation.

<span class="mw-page-title-main">Premature convergence</span>

Premature convergence is an unwanted effect in evolutionary algorithms (EA), a metaheuristic that mimics the basic principles of biological evolution as a computer algorithm for solving an optimization problem. The effect means that the population of an EA has converged too early, resulting in a suboptimal population. In this context, the parental solutions, through the aid of genetic operators, are not able to generate offspring that outperform their parents. Premature convergence is a common problem in evolutionary algorithms, as it leads to the loss, or convergence, of a large number of alleles, subsequently making it very difficult to search for a specific gene in which those alleles were present. An allele is considered lost if all individuals in the population share the same value for that particular gene, and, as defined by De Jong, it is considered converged when 95% of the population shares the same value for that gene.

<span class="mw-page-title-main">Genetic representation</span> Data structure and types for evolutionary computation

In computer programming, genetic representation is a way of presenting solutions/individuals in evolutionary computation methods. The term encompasses both the concrete data structures and data types used to realize the genetic material of the candidate solutions in the form of a genome, and the relationships between search space and problem space. In the simplest case, the search space corresponds to the problem space. The choice of problem representation is tied to the choice of genetic operators, both of which have a decisive effect on the efficiency of the optimization. Genetic representation can encode appearance, behavior, physical qualities of individuals. Difference in genetic representations is one of the major criteria drawing a line between known classes of evolutionary computation.

<span class="mw-page-title-main">Grammatical evolution</span>

Grammatical evolution (GE) is a genetic programming (GP) technique (or approach) from evolutionary computation pioneered by Conor Ryan, JJ Collins and Michael O'Neill in 1998 at the BDS Group in the University of Limerick.

Evolutionary music is the audio counterpart to evolutionary art, whereby algorithmic music is created using an evolutionary algorithm. The process begins with a population of individuals which by some means or other produce audio, which is either initialized randomly or based on human-generated music. Then through the repeated application of computational steps analogous to biological selection, recombination and mutation the aim is for the produced audio to become more musical. Evolutionary sound synthesis is a related technique for generating sounds or synthesizer instruments. Evolutionary music is typically generated using an interactive evolutionary algorithm where the fitness function is the user or audience, as it is difficult to capture the aesthetic qualities of music computationally. However, research into automated measures of musical quality is also active. Evolutionary computation techniques have also been applied to harmonization and accompaniment tasks. The most commonly used evolutionary computation techniques are genetic algorithms and genetic programming.

ECJ is a freeware evolutionary computation research system written in Java. It is a framework that supports a variety of evolutionary computation techniques, such as genetic algorithms, genetic programming, evolution strategies, coevolution, particle swarm optimization, and differential evolution. The framework models iterative evolutionary processes using a series of pipelines arranged to connect one or more subpopulations of individuals with selection and breeding operators (such as crossover and mutation) that produce new individuals. The framework is open source and is distributed under the Academic Free License. ECJ was created by Sean Luke, a computer science professor at George Mason University, and is maintained by Sean Luke and a variety of contributors.

Evolutionary biology, in particular the understanding of how organisms evolve through natural selection, is an area of science with many practical applications. Creationists often claim that the theory of evolution lacks any practical applications; however, this claim has been refuted by scientists.

<span class="mw-page-title-main">Fly algorithm</span>

The Fly Algorithm is a computational method within the field of evolutionary algorithms, designed for direct exploration of 3D spaces in applications such as computer stereo vision, robotics, and medical imaging. Unlike traditional image-based stereovision, which relies on matching features to construct 3D information, the Fly Algorithm operates by generating a 3D representation directly from random points, termed "flies." Each fly is a coordinate in 3D space, evaluated for its accuracy by comparing its projections in a scene. By iteratively refining the positions of flies based on fitness criteria, the algorithm can construct an optimized spatial representation. The Fly Algorithm has expanded into various fields, including applications in digital art, where it is used to generate complex visual patterns.

<span class="mw-page-title-main">Gabriela Ochoa</span> Venezuelan British computer scientist

Gabriela Ochoa is a Venezuelan British computer scientist and Professor at the University of Stirling. Her research considers evolutionary algorithms and heuristic search methods.

References

  1. "Computing Machinery and Intelligence". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  2. "BEAGLE A Darwinian Approach to Pattern Recognition". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  3. A personal communication with Tom Westerdale
  4. "A representation for the Adaptive Generation of Simple Sequential Programs". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  5. "Non-Linear Genetic Algorithms for Solving Problems". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  6. "Hierarchical genetic algorithms operating on populations of computer programs". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  7. Goldberg. D.E. (1983), Computer-aided gas pipeline operation using genetic algorithms and rule learning. Dissertation presented to the University of Michigan at Ann Arbor, Michigan, in partial fulfillment of the requirements for Ph.D.
  8. "Genetic Programming: On the Programming of Computers by Means of Natural Selection". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  9. "Genetic Programming:The Movie". gpbib.cs.ucl.ac.uk. Archived from the original on 2021-12-11. Retrieved 2021-05-20.
  10. "The effects of recombination on phenotypic exploration and robustness in evolution". gpbib.cs.ucl.ac.uk. Retrieved 2021-05-20.
  11. "Human-competitive results produced by genetic programming". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  12. "Genetic Programming 1996: Proceedings of the First Annual Conference". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  13. "Genetic Programming". www.cs.bham.ac.uk. Retrieved 2018-05-19.
  14. "Genetic Programming and Data Structures: Genetic Programming + Data Structures = Automatic Programming!". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  15. "Genetic Programming -- An Introduction; On the Automatic Evolution of Computer Programs and its Applications". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  16. Banzhaf, Wolfgang (2000-04-01). "Editorial Introduction". Genetic Programming and Evolvable Machines. 1 (1–2): 5–6. doi:10.1023/A:1010026829303. ISSN   1389-2576.
  17. "Genetic Programming Theory and Practice". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  18. "A Field Guide to Genetic Programming". www.gp-field-guide.org.uk. Retrieved 2018-05-20.
  19. Slowik, Adam; Kwasnicka, Halina (1 August 2020). "Evolutionary algorithms and their applications to engineering problems". Neural Computing and Applications. 32 (16): 12363–12379. doi: 10.1007/s00521-020-04832-8 . ISSN   1433-3058.
  20. Koza, J. R. G. P. (1992). "On the programming of computers by means of natural selection". Genetic Programming.
  21. Miller, Julian F. (2011). "Cartesian Genetic Programming". Natural Computing Series. Springer. pp. 17–34. doi:10.1007/978-3-642-17310-3_2. ISBN   978-3-642-17309-7.
  22. Ratle, Alain; Sebag, Michèle (2000). "Genetic Programming and Domain Knowledge: Beyond the Limitations of Grammar-Guided Machine Discovery". Parallel Problem Solving from Nature PPSN VI. Lecture Notes in Computer Science. Vol. 1917. Springer. pp. 211–220. doi:10.1007/3-540-45356-3_21. ISBN   978-3-540-41056-0.
  23. Ferreira, Candida (2001). "Gene Expression Programming: a New Adaptive Algorithm for Solving Problems". arXiv. doi:10.48550/arXiv.cs/0102027.
  24. Gandomi, Amir Hossein; Alavi, Amir Hossein (1 February 2012). "A new multi-gene genetic programming approach to nonlinear system modeling. Part I: materials and structural engineering problems". Neural Computing and Applications. 21 (1): 171–187. doi:10.1007/s00521-011-0734-z. ISSN   1433-3058.
  25. Moraglio, Alberto; Krawiec, Krzysztof; Johnson, Colin G. (2012). "Geometric Semantic Genetic Programming". Parallel Problem Solving from Nature - PPSN XII. Lecture Notes in Computer Science. Vol. 7491. Springer. pp. 21–31. doi:10.1007/978-3-642-32937-1_3. ISBN   978-3-642-32936-4.
  26. Kattan, Ahmed; Ong, Yew-Soon (1 March 2015). "Surrogate Genetic Programming: A semantic aware evolutionary search". Information Sciences. 296: 345–359. doi:10.1016/j.ins.2014.10.053. ISSN   0020-0255.
  27. Ffrancon, Robyn; Schoenauer, Marc (11 July 2015). "Memetic Semantic Genetic Programming". Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation (PDF). Association for Computing Machinery. pp. 1023–1030. doi:10.1145/2739480.2754697. ISBN   978-1-4503-3472-3.
  28. Amir Haeri, Maryam; Ebadzadeh, Mohammad Mehdi; Folino, Gianluigi (1 November 2017). "Statistical genetic programming for symbolic regression". Applied Soft Computing. 60: 447–469. doi:10.1016/j.asoc.2017.06.050. ISSN   1568-4946.
  29. La Cava, William; Silva, Sara; Danai, Kourosh; Spector, Lee; Vanneschi, Leonardo; Moore, Jason H. (1 February 2019). "Multidimensional genetic programming for multiclass classification". Swarm and Evolutionary Computation. 44: 260–272. doi:10.1016/j.swevo.2018.03.015. ISSN   2210-6502.
  30. "Data Mining and Knowledge Discovery with Evolutionary Algorithms". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  31. "EDDIE beats the bookies". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  32. "Applying Computational Intelligence How to Create Value". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  33. "Human-competitive machine invention by means of genetic programming". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  34. "Discovery of Human-Competitive Image Texture Feature Extraction Programs Using Genetic Programming". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  35. "Three Ways to Grow Designs: A Comparison of Embryogenies for an Evolutionary Design Problem". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  36. "Cellular encoding as a graph grammar - IET Conference Publication". IEEE : 17/1–1710. April 1993. Retrieved 2018-05-20.
  37. "Genetic Algorithm Decoding for the Interpretation of Infra-red Spectra in Analytical Biotechnology". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  38. "Genetic Programming for Mining DNA Chip data from Cancer Patients". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  39. "Genetic Programming and Jominy Test Modeling". www.cs.bham.ac.uk. Retrieved 2018-05-20.
  40. Nichael L. Cramer "A Representation for the Adaptive Generation of Simple Sequential Programs" Archived 2005-12-04 at the Wayback Machine .
  41. Genetic Programming: An Introduction, Wolfgang Banzhaf, Peter Nordin, Robert E. Keller, Frank D. Francone, Morgan Kaufmann, 1999. ISBN 978-1558605107
  42. Garnett Wilson and Wolfgang Banzhaf. "A Comparison of Cartesian Genetic Programming and Linear Genetic Programming"
  43. (Peter Nordin, 1997, Banzhaf et al., 1998, Section 11.6.2-11.6.3)
  44. Giovanni Squillero. "μGP (MicroGP)".
  45. "Stack-Based Genetic Programming". gpbib.cs.ucl.ac.uk. Retrieved 2021-05-20.
  46. Spector, Lee; Robinson, Alan (2002-03-01). "Genetic Programming and Autoconstructive Evolution with the Push Programming Language". Genetic Programming and Evolvable Machines. 3 (1): 7–40. doi:10.1023/A:1014538503543. ISSN   1389-2576. S2CID   5584377.
  47. Spector, Lee; Klein, Jon; Keijzer, Maarten (2005-06-25). "The Push3 execution stack and the evolution of control". Proceedings of the 7th annual conference on Genetic and evolutionary computation. ACM. pp. 1689–1696. CiteSeerX   10.1.1.153.384 . doi:10.1145/1068009.1068292. ISBN   978-1595930101. S2CID   11954638.
  48. Ryan, Conor; Collins, JJ; Neill, Michael O (1998). Lecture Notes in Computer Science. Berlin, Heidelberg: Springer Berlin Heidelberg. pp. 83–96. CiteSeerX   10.1.1.38.7697 . doi:10.1007/bfb0055930. ISBN   9783540643609.
  49. O'Neill, M.; Ryan, C. (2001). "Grammatical evolution". IEEE Transactions on Evolutionary Computation. 5 (4): 349–358. doi:10.1109/4235.942529. ISSN   1089-778X. S2CID   10391383.
  50. Julian F. Miller. "Cartesian Genetic Programming" Archived 2015-09-24 at the Wayback Machine . p. 19.
  51. Janet Clegg; James Alfred Walker; Julian Francis Miller. "A New Crossover Technique for Cartesian Genetic Programming". 2007.
  52. Walker, Matthew (2001). "Introduction to Genetic Programming". Massey University.
  53. Spector, Lee (2012). "Assessment of problem modality by differential performance of lexicase selection in genetic programming". Proceedings of the 14th annual conference companion on Genetic and evolutionary computation. Gecco '12. ACM. pp. 401–408. doi:10.1145/2330784.2330846. ISBN   9781450311786. S2CID   3258264.
  54. Koza, John R (2010). "Human-competitive results produced by genetic programming". Genetic Programming and Evolvable Machines. 11 (3–4): 251–284. doi: 10.1007/s10710-010-9112-3 .
  55. "Humies =Human-Competitive Awards".
  56. "1987 THESIS ON LEARNING HOW TO LEARN, METALEARNING, META GENETIC PROGRAMMING, CREDIT-CONSERVING MACHINE LEARNING ECONOMY".
  57. GECCO '16 Companion: Proceedings of the 2016 Genetic and Evolutionary Computation Conference, July 20-24, 2016, Denver, Colorado, USA. Neumann, Frank; Association for Computing Machinery SIGEVO. New York, New York. 20 July 2016. ISBN   9781450343237. OCLC   987011786.