Compositional pattern-producing network

Compositional pattern-producing networks (CPPNs) are a variation of artificial neural networks (ANNs) that have an architecture whose evolution is guided by genetic algorithms.[1]

While ANNs often contain only sigmoid functions and sometimes Gaussian functions, CPPNs can include both types of functions and many others. The choice of functions for the canonical set biases the network toward specific types of patterns and regularities. For example, periodic functions such as sine produce segmented patterns with repetitions, while symmetric functions such as Gaussian produce symmetric patterns. Linear functions can be employed to produce linear or fractal-like patterns. Thus, the architect of a CPPN-based genetic art system can bias the types of patterns it generates by deciding which canonical functions to include.
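
For illustration, the sketch below shows what such a canonical set and a single CPPN node might look like in code; the particular functions and names are an arbitrary example, not a fixed standard.

    import math

    # Illustrative canonical set; the actual choice is a design decision of the system's architect.
    CANONICAL_FUNCTIONS = {
        "sine":     math.sin,                              # periodic: segmented, repeating patterns
        "gaussian": lambda x: math.exp(-x * x),            # symmetric: mirrored patterns
        "sigmoid":  lambda x: 1.0 / (1.0 + math.exp(-x)),  # smooth thresholding, as in ordinary ANNs
        "identity": lambda x: x,                           # linear component
    }

    def node_output(activation, inputs, weights, bias=0.0):
        """One CPPN node: apply its own activation function to a weighted sum of its inputs."""
        return CANONICAL_FUNCTIONS[activation](bias + sum(w * x for w, x in zip(weights, inputs)))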

Furthermore, unlike typical ANNs, CPPNs are applied across the entire space of possible inputs so that they can represent a complete image. Since they are compositions of functions, CPPNs in effect encode images at infinite resolution and can be sampled for a particular display at whatever resolution is optimal.
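
Because the network is simply a function of continuous coordinates, rendering reduces to sampling it on a grid of any size. A minimal sketch, using a hand-written stand-in for an evolved CPPN:

    import math

    def cppn(x, y):
        """Toy stand-in for an evolved CPPN: a fixed composition of canonical functions
        over continuous coordinates in [-1, 1]."""
        d = math.sqrt(x * x + y * y)                  # a distance input encourages radial structure
        return math.sin(6.0 * x) * math.exp(-4.0 * d * d)

    def render(width, height):
        """Sample the same function on any grid; the resolution is chosen at display time."""
        image = []
        for row in range(height):
            y = 2.0 * row / (height - 1) - 1.0        # map pixel indices to [-1, 1]
            image.append([cppn(2.0 * col / (width - 1) - 1.0, y) for col in range(width)])
        return image

    thumbnail = render(32, 32)      # coarse preview
    detailed  = render(512, 512)    # the same pattern, sampled more finely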

CPPNs can be evolved through neuroevolution techniques such as neuroevolution of augmenting topologies (called CPPN-NEAT).
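
NEAT itself also relies on historical markings, speciation, and crossover; the sketch below shows only the structural mutation that distinguishes CPPN-NEAT, namely that a newly added node receives an activation function drawn from the canonical set rather than a fixed one. The genome representation is illustrative, not NEAT's actual data layout.

    import itertools
    import random

    CANONICAL = ("sine", "gaussian", "sigmoid", "identity")

    def mutate_add_node(genome, innovation_counter):
        """Split a random enabled connection A->B into A->new->B; the new node's
        activation is drawn from the canonical set."""
        conn = random.choice([c for c in genome["connections"] if c["enabled"]])
        conn["enabled"] = False
        new_id = max(genome["nodes"]) + 1
        genome["nodes"][new_id] = random.choice(CANONICAL)     # node id -> activation name
        genome["connections"].append({"in": conn["in"], "out": new_id, "weight": 1.0,
                                      "enabled": True, "innovation": next(innovation_counter)})
        genome["connections"].append({"in": new_id, "out": conn["out"], "weight": conn["weight"],
                                      "enabled": True, "innovation": next(innovation_counter)})

    # Example genome: two inputs feeding one output node.
    genome = {"nodes": {0: "identity", 1: "identity", 2: "sigmoid"},
              "connections": [{"in": 0, "out": 2, "weight": 0.5, "enabled": True, "innovation": 1},
                              {"in": 1, "out": 2, "weight": -0.3, "enabled": True, "innovation": 2}]}
    mutate_add_node(genome, itertools.count(3))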

CPPNs have been shown to be a powerful encoding when evolving two-dimensional images (as on Picbreeder.org), three-dimensional forms (as on EndlessForms.com), and the connectivity of large-scale artificial neural networks (as in HyperNEAT).

Related Research Articles

Neural network (machine learning)

In machine learning, a neural network is a model inspired by the structure and function of biological neural networks in animal brains.

In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.
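
As a concrete picture of that loop, a minimal generational sketch follows; the selection scheme, operators, and names are illustrative and left to the caller to supply.

    import random

    def evolve(population, fitness, crossover, mutate, generations=100, elite=2):
        """Generic generational EA: evaluate, select parents, recombine, mutate, repeat."""
        for _ in range(generations):
            ranked = sorted(population, key=fitness, reverse=True)
            parents = ranked[: max(2, len(ranked) // 2)]    # truncation selection (one scheme of many)
            children = ranked[:elite]                       # elitism: carry the best over unchanged
            while len(children) < len(population):
                a, b = random.sample(parents, 2)
                children.append(mutate(crossover(a, b)))
            population = children
        return max(population, key=fitness)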

Evolutionary art

Evolutionary art is a branch of generative art, in which the artist does not do the work of constructing the artwork, but rather lets a system do the construction. In evolutionary art, initially generated art is put through an iterated process of selection and modification to arrive at a final product, where it is the artist who is the selective agent.

Evolutionary computation

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.

NeuroEvolution of Augmenting Topologies (NEAT) is a genetic algorithm (GA) for the generation of evolving artificial neural networks developed by Kenneth Stanley and Risto Miikkulainen in 2002 while at The University of Texas at Austin. It alters both the weighting parameters and structures of networks, attempting to find a balance between the fitness of evolved solutions and their diversity. It is based on applying three key techniques: tracking genes with history markers to allow crossover among topologies, applying speciation to preserve innovations, and developing topologies incrementally from simple initial structures ("complexifying").
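
The historical markers give every new connection gene a global innovation number, which lets genomes with different topologies be aligned gene by gene during crossover. A minimal sketch of that alignment, with genomes represented as lists of gene dictionaries (an illustrative choice, not NEAT's actual data layout):

    import random

    def neat_crossover(fitter_parent, other_parent):
        """Align connection genes by innovation number: matching genes are inherited
        at random, while disjoint and excess genes come from the fitter parent."""
        other = {g["innovation"]: g for g in other_parent}
        child = []
        for gene in fitter_parent:
            match = other.get(gene["innovation"])
            child.append(random.choice([gene, match]) if match else gene)
        return child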

Neuroevolution, or neuro-evolution, is a form of artificial intelligence that uses evolutionary algorithms to generate artificial neural networks (ANN), parameters, and rules. It is most commonly applied in artificial life, general game playing and evolutionary robotics. The main benefit is that neuroevolution can be applied more widely than supervised learning algorithms, which require a syllabus of correct input-output pairs. In contrast, neuroevolution requires only a measure of a network's performance at a task. For example, the outcome of a game can be easily measured without providing labeled examples of desired strategies. Neuroevolution is commonly used as part of the reinforcement learning paradigm, and it can be contrasted with conventional deep learning techniques that use backpropagation with a fixed topology.
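
In code, the contrast with supervised learning is that the evaluation loop needs only a scalar outcome rather than labeled targets. A toy sketch, evolving a two-parameter policy on a made-up sign-guessing game:

    import random

    def play_game(policy, rounds=20):
        """Toy 'game': the policy sees a number and should respond with the same sign.
        Only the final score leaves this function; no labeled input-output pairs do."""
        score = 0
        for _ in range(rounds):
            x = random.uniform(-1, 1)
            if (policy(x) > 0) == (x > 0):
                score += 1
        return score

    def fitness(params):
        w, b = params
        return play_game(lambda x: w * x + b)    # fitness is simply the task outcome

    # Minimal (1+1)-style neuroevolution of the two parameters.
    best = (random.gauss(0, 1), random.gauss(0, 1))
    for _ in range(200):
        candidate = (best[0] + random.gauss(0, 0.1), best[1] + random.gauss(0, 0.1))
        if fitness(candidate) >= fitness(best):
            best = candidate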

In natural evolution and artificial evolution, the fitness of a schema is rescaled to give its effective fitness, which takes crossover and mutation into account.
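
One simple way to make this concrete uses the standard schema-theorem disruption terms (a sketch of the idea, not any one paper's exact definition):

    f_{\mathrm{eff}}(H) = f(H)\left[1 - p_c\,\frac{\delta(H)}{\ell - 1} - o(H)\,p_m\right]

where p_c and p_m are the crossover and mutation rates, delta(H) is the defining length and o(H) the order of schema H, and l is the string length; with this rescaling, the expected number of copies of H grows as it would under selection alone with f replaced by f_eff.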

Computational neurogenetic modeling (CNGM) is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology, as well as engineering.

NEAT Particles

NEAT Particles is an interactive evolutionary computation program that enables users to evolve particle systems intended for use as special effects in video games or movie graphics. Rather than being hand-coded like typical particle systems, the behaviors of NEAT Particle effects are evolved by user preference. Therefore, non-programmer, non-artist users may evolve complex and unique special effects in real time. NEAT Particles is meant to augment and assist the time-consuming computer graphics content generation process. NEAT is short for Neuroevolution of Augmenting Topologies.

Evolutionary acquisition of neural topologies (EANT/EANT2) is an evolutionary reinforcement learning method that evolves both the topology and weights of artificial neural networks. It is closely related to the works of Angeline et al. and Stanley and Miikkulainen. Like the work of Angeline et al., the method uses a type of parametric mutation that comes from evolution strategies and evolutionary programming, in which adaptive step sizes are used for optimizing the weights of the neural networks. Similar to the work of Stanley (NEAT), the method starts with minimal structures which gain complexity along the evolution path.
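
The adaptive step sizes borrowed from evolution strategies usually mean that each weight carries its own mutation strength, which is itself perturbed log-normally before the weight is moved. A minimal sketch of that parametric mutation (the exact scheme used by EANT/EANT2 may differ):

    import math
    import random

    def self_adaptive_mutation(weights, sigmas):
        """Perturb each step size log-normally, then perturb each weight with its new step size."""
        n = len(weights)
        tau_global = 1.0 / math.sqrt(2.0 * n)
        tau_local = 1.0 / math.sqrt(2.0 * math.sqrt(n))
        shared = random.gauss(0, 1)
        new_sigmas = [s * math.exp(tau_global * shared + tau_local * random.gauss(0, 1))
                      for s in sigmas]
        new_weights = [w + s * random.gauss(0, 1) for w, s in zip(weights, new_sigmas)]
        return new_weights, new_sigmas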

Artificial development, also known as artificial embryogeny or computational development, is an area of computer science and engineering concerned with computational models motivated by genotype–phenotype mappings in biological systems. Artificial development is often considered a sub-field of evolutionary computation, although the principles of artificial development have also been used within stand-alone computational models.

HyperNEAT

Hypercube-based NEAT, or HyperNEAT, is a generative encoding that evolves artificial neural networks (ANNs) with the principles of the widely used NeuroEvolution of Augmenting Topologies (NEAT) algorithm developed by Kenneth Stanley. It is a technique for evolving large-scale neural networks using the geometric regularities of the task domain. It uses Compositional Pattern Producing Networks (CPPNs), which are also used to generate the images for Picbreeder.org and the shapes for EndlessForms.com. HyperNEAT has been extended to evolve plastic ANNs and to evolve the location of every neuron in the network.
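
Concretely, the evolved CPPN is queried with the coordinates of pairs of substrate neurons and its output is used as the connection weight between them, so the network's connectivity inherits the geometry of the task. A minimal sketch, with a hand-written toy CPPN standing in for an evolved one:

    import math

    def cppn(x1, y1, x2, y2):
        """Stand-in for an evolved CPPN: maps the coordinates of two substrate
        neurons to a connection weight."""
        return math.sin(3.0 * (x1 - x2)) * math.exp(-((y1 - y2) ** 2))

    def build_substrate_weights(input_coords, output_coords, threshold=0.2):
        """Query the CPPN for every input/output pair; outputs near zero are treated
        as 'no connection' (a common HyperNEAT convention)."""
        weights = {}
        for i, (x1, y1) in enumerate(input_coords):
            for j, (x2, y2) in enumerate(output_coords):
                w = cppn(x1, y1, x2, y2)
                if abs(w) > threshold:
                    weights[(i, j)] = w
        return weights

    # Example: a row of input neurons at y = -1 connected to a row of outputs at y = +1.
    inputs = [(x / 4.0, -1.0) for x in range(-4, 5)]
    outputs = [(x / 4.0, 1.0) for x in range(-4, 5)]
    substrate = build_substrate_weights(inputs, outputs)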

The promoter based genetic algorithm (PBGA) is a genetic algorithm for neuroevolution developed by F. Bellas and R.J. Duro in the Integrated Group for Engineering Research (GII) at the University of A Coruña, in Spain. It evolves variable-size feedforward artificial neural networks (ANNs) that are encoded into sequences of genes for constructing a basic ANN unit. Each of these units is preceded by a gene promoter acting as an on/off switch that determines whether that particular unit is expressed.
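
The promoter mechanism can be pictured as a flag in front of each encoded unit that decides whether the unit appears in the expressed network. A minimal decoding sketch (the representation is illustrative, not the PBGA's actual genotype format):

    def decode(genome):
        """Each gene is (promoter_on, unit_parameters); only units whose promoter
        is switched on contribute to the expressed network."""
        return [unit for promoter_on, unit in genome if promoter_on]

    # Example: three encoded units, the middle one currently switched off.
    genome = [(True,  {"weights": [0.4, -0.2], "bias": 0.1}),
              (False, {"weights": [0.9,  0.3], "bias": -0.5}),
              (True,  {"weights": [-0.7, 0.6], "bias": 0.0})]
    expressed_units = decode(genome)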

Encog

Encog is a machine learning framework available for Java and .Net. Encog supports different learning algorithms such as Bayesian Networks, Hidden Markov Models and Support Vector Machines. However, its main strength lies in its neural network algorithms. Encog contains classes to create a wide variety of networks, as well as support classes to normalize and process data for these neural networks. Encog trains using many different techniques. Multithreading is used to allow optimal training performance on multicore machines.

There are many types of artificial neural networks (ANN).

The following outline is provided as an overview of, and topical guide to, machine learning:

Artificial intelligence and machine learning techniques are used in video games for a wide variety of applications such as non-player character (NPC) control and procedural content generation (PCG). Machine learning is a subset of artificial intelligence that uses historical data to build predictive and analytical models. This is in sharp contrast to traditional methods of artificial intelligence such as search trees and expert systems.

Kenneth Owen Stanley is an artificial intelligence researcher, author, and former professor of computer science at the University of Central Florida known for creating the NeuroEvolution of Augmenting Topologies (NEAT) algorithm. He coauthored Why Greatness Cannot Be Planned: The Myth of the Objective with Joel Lehman, which argues for the existence of the "objective paradox", a paradox which states that "as soon as you create an objective, you ruin your ability to reach it". While a professor at the University of Central Florida, he was the director of the Evolutionary Complexity Research Group (EPlex), which led the development of Galactic Arms Race. He also developed the HyperNEAT, CPPN, and novelty search algorithms, and co-founded Geometric Intelligence, an AI research firm, in 2015.

References

  1. Stanley, Kenneth O. "Compositional pattern producing networks: A novel abstraction of development." Genetic Programming and Evolvable Machines 8.2 (2007): 131–162.