Artificial life

Artificial life (often abbreviated ALife or A-Life) is a field of study wherein researchers examine systems related to natural life, its processes, and its evolution, through the use of simulations with computer models, robotics, and biochemistry.[1] The discipline was named by Christopher Langton, an American theoretical biologist, in 1986.[2] In 1987 Langton organized the first conference on the field, in Los Alamos, New Mexico.[3] There are three main kinds of alife,[4] named for their approaches: soft,[5] from software; hard,[6] from hardware; and wet, from biochemistry. Artificial life researchers study traditional biology by trying to recreate aspects of biological phenomena.[7][8]


A Braitenberg vehicle simulation, programmed in breve, an artificial life simulator


Artificial life studies the fundamental processes of living systems in artificial environments in order to gain a deeper understanding of the complex information processing that defines such systems. These topics are broad, but often include evolutionary dynamics, emergent properties of collective systems, biomimicry, and related questions about the philosophy of the nature of life and the use of lifelike properties in artistic works.


The modeling philosophy of artificial life strongly differs from traditional modeling by studying not only "life-as-we-know-it" but also "life-as-it-might-be". [9]

A traditional model of a biological system focuses on capturing its most important parameters. In contrast, an alife modeling approach generally seeks to decipher the simplest and most general principles underlying life and implement them in a simulation. The simulation then makes it possible to analyse new and different lifelike systems.

Vladimir Georgievich Red'ko proposed to generalize this distinction to the modeling of any process, leading to the more general distinction of "processes-as-we-know-them" and "processes-as-they-could-be". [10]

At present, the commonly accepted definition of life does not consider any current alife simulations or software to be alive, and they do not constitute part of the evolutionary process of any ecosystem. However, differing opinions about artificial life's potential have arisen.

Software-based ("soft")


Notable simulators

This is a list of artificial life/digital organism simulators, organized by the method of creature definition.

Name                          | Driven by                         | Started | Ended
ApeSDK (formerly Noble Ape)   | language/social simulation        | 1996    | ongoing
Avida                         | executable DNA                    | 1993    | ongoing
Biogenesis                    | executable DNA                    | 2006    | ongoing
Creatures                     | neural net/simulated biochemistry | 1996    | 2001 (fandom still active to this day; some abortive attempts at new products)
Critterding                   | neural net                        | 2005    | ongoing
Darwinbots                    | executable DNA                    | 2003    | ongoing
DigiHive                      | executable DNA                    | 2006    | ongoing
DOSE                          | executable DNA                    | 2012    | ongoing
EcoSim                        | fuzzy cognitive map               | 2009    | ongoing
Framsticks                    | executable DNA                    | 1996    | ongoing
Geb                           | neural net                        | 1997    | ongoing
OpenWorm                      | Geppetto                          | 2011    | ongoing
Polyworld                     | neural net                        | 1990    | ongoing
Primordial Life               | executable DNA                    | 1994    | 2003
ScriptBots                    | executable DNA                    | 2010    | ongoing
TechnoSphere                  | modules                           | 1995    |
3D Virtual Creature Evolution | neural net                        | 2008    | N/A


Program-based

Program-based simulations contain organisms with a complex DNA language, usually Turing complete. This language is more often in the form of a computer program than actual biological DNA. Assembly derivatives are the most common languages used. An organism "lives" when its code is executed, and there are usually various methods allowing self-replication. Mutations are generally implemented as random changes to the code. Use of cellular automata is common but not required. Artificial intelligence and multi-agent systems/programs are further examples of this approach.
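The core loop of a program-based simulator can be sketched in a few lines. The instruction set below is hypothetical and purely illustrative (none of the simulators listed above uses it); it only shows the shared pattern of execution, self-replication, and point mutation:

```python
import random

# A toy "executable DNA": each organism is a list of instructions drawn
# from a tiny, hypothetical assembly-like alphabet.
INSTRUCTIONS = ["nop", "inc", "dec", "copy"]

def replicate(genome, mutation_rate=0.05, rng=random):
    """Copy a genome, replacing each instruction with a random one
    with probability `mutation_rate` (a point mutation)."""
    return [rng.choice(INSTRUCTIONS) if rng.random() < mutation_rate else op
            for op in genome]

def step(population, rng=random):
    """One generation: every organism whose code contains a "copy"
    instruction "executes" it and produces one (possibly mutated) child."""
    children = [replicate(g, rng=rng) for g in population if "copy" in g]
    return population + children
```

Real systems such as Avida or Tierra execute far richer virtual machine code, but the same ingredients appear: execution confers "life", replication is a program behavior, and mutation is random change to the code.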


Module-based

Individual modules are added to a creature. These modules modify the creature's behaviors and characteristics either directly, by hard coding into the simulation (leg type A increases speed and metabolism), or indirectly, through the emergent interactions between a creature's modules (leg type A moves up and down with a frequency of X, which interacts with other legs to create motion). Generally these are simulators which emphasize user creation and accessibility over mutation and evolution.


Parameter-based

Organisms are generally constructed with pre-defined and fixed behaviors that are controlled by various parameters that mutate. That is, each organism contains a collection of numbers or other finite parameters. Each parameter controls one or several aspects of an organism in a well-defined way.
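A minimal sketch of parameter-based mutation, with hypothetical trait names chosen for illustration: behaviors stay fixed, and evolution acts only on the numbers.

```python
import random

def mutate(params, sigma=0.1, rng=random):
    """Return a child whose parameters are Gaussian perturbations of the
    parent's, clamped to the valid range [0, 1]. Each number controls one
    trait of the organism in a fixed, predefined way."""
    return {trait: min(1.0, max(0.0, value + rng.gauss(0.0, sigma)))
            for trait, value in params.items()}

# Hypothetical organism: three numeric traits, nothing else.
parent = {"speed": 0.5, "size": 0.3, "camouflage": 0.8}
child = mutate(parent, rng=random.Random(1))
```

Because the parameter set is finite and each entry has a well-defined meaning, the space of possible organisms is fully known in advance, unlike in program-based systems where mutation can produce qualitatively new code.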

Neural net–based

These simulations have creatures that learn and grow using neural nets or a close derivative. Emphasis is often, although not always, more on learning than on natural selection.
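A neural-net creature can be sketched as a mapping from sensor readings to motor commands; the single-layer "brain" below is a deliberately minimal illustration, not the architecture of any particular simulator:

```python
import math
import random

def make_brain(n_sensors, n_motors, rng=random):
    """A random single-layer brain: one weight vector per motor output."""
    return [[rng.uniform(-1.0, 1.0) for _ in range(n_sensors)]
            for _ in range(n_motors)]

def think(brain, sensors):
    """Map sensor readings to motor commands; tanh squashes each output
    into (-1, 1), which can be read directly as e.g. wheel speeds."""
    return [math.tanh(sum(w * s for w, s in zip(weights, sensors)))
            for weights in brain]

brain = make_brain(3, 2, rng=random.Random(42))
motors = think(brain, [0.2, 0.9, 0.1])
```

In practice such brains are made to "learn" either by adjusting weights during the creature's lifetime (e.g. Hebbian rules) or by mutating and selecting them across generations.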

Complex systems modeling

Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). [12] [13] In black-box models, the individual-based (mechanistic) mechanisms of a complex dynamic system remain hidden.

Mathematical models for complex systems

Black-box models are completely nonmechanistic. They are phenomenological and ignore the composition and internal structure of a complex system; we cannot investigate interactions of subsystems in such a non-transparent model. A white-box model of a complex dynamic system has "transparent walls" and directly shows the underlying mechanisms. All events at the micro-, meso- and macro-levels of a dynamic system are directly visible at all stages of its white-box model's evolution. In most cases mathematical modelers use black-box mathematical methods, which cannot produce mechanistic models of complex dynamic systems. Grey-box models are intermediate, combining the black-box and white-box approaches.

Logical deterministic individual-based cellular automata model of single species population growth

Creating a white-box model of a complex system requires a priori basic knowledge of the subject being modeled. Deterministic logical cellular automata are a necessary but not sufficient condition for a white-box model; the second necessary prerequisite is the presence of a physical ontology of the object under study. White-box modeling represents automatic hyper-logical inference from first principles, because it is based entirely on deterministic logic and the axiomatic theory of the subject. Its purpose is to derive from the basic axioms more detailed, more concrete mechanistic knowledge about the dynamics of the object under study. The need to formulate an intrinsic axiomatic system of the subject before creating its white-box model distinguishes white-box cellular automata models from cellular automata models based on arbitrary logical rules. If cellular automata rules have not been formulated from the first principles of the subject, the model may have weak relevance to the real problem. [13]
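The flavor of such a model can be conveyed by a deliberately simplified sketch (the rule below is illustrative, not the published Kalmykov model): a logical deterministic cellular automaton of single-species colonization, where every cell transition follows an explicit, mechanistic rule with no randomness.

```python
def step(grid):
    """One synchronous update of a logical deterministic CA on a torus:
    an empty cell (0) becomes occupied (1) if at least one of its four
    von Neumann neighbours is occupied; occupied cells persist."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 0 and (grid[(i - 1) % n][j] or grid[(i + 1) % n][j]
                                    or grid[i][(j - 1) % n] or grid[i][(j + 1) % n]):
                new[i][j] = 1
    return new

def population(grid):
    """Total number of occupied cells."""
    return sum(map(sum, grid))

# A single individual colonises an empty 5x5 lattice.
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
```

Because every transition is deterministic and individually inspectable, the mechanism of population growth is "visible through the walls" at every step, which is the defining property of a white-box model.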

Logical deterministic individual-based cellular automata model of interspecific competition for a single limited resource

Hardware-based ("hard")

Hardware-based artificial life mainly consists of robots, that is, automatically guided machines able to perform tasks on their own.

Biochemical-based ("wet")

Biochemical-based life is studied in the field of synthetic biology. It involves, for example, the creation of synthetic DNA. The term "wet" is an extension of the term "wetware".

In May 2019, researchers reported, in a milestone effort, the creation of a new synthetic (possibly artificial) form of viable life, a variant of the bacterium Escherichia coli, by reducing the natural number of 64 codons in the bacterial genome to 59 while still encoding the 20 amino acids. [14] [15]

Open problems

How does life arise from the nonliving? [16] [17]
What are the potentials and limits of living systems?
How is life related to mind, machines, and culture?
Related subjects

  1. Artificial intelligence has traditionally used a top-down approach, while alife generally works from the bottom up. [18]
  2. Artificial chemistry started as a method within the alife community to abstract the processes of chemical reactions.
  3. Evolutionary algorithms are a practical application of the weak alife principle applied to optimization problems. Many optimization algorithms have been crafted which borrow from or closely mirror alife techniques. The primary difference lies in explicitly defining the fitness of an agent by its ability to solve a problem, instead of its ability to find food, reproduce, or avoid death.[ citation needed ]
  4. Multi-agent system – A multi-agent system is a computerized system composed of multiple interacting intelligent agents within an environment.
  5. Evolutionary art uses techniques and methods from artificial life to create new forms of art.
  6. Evolutionary music uses similar techniques, but applied to music instead of visual art.
  7. Abiogenesis and the origin of life sometimes employ alife methodologies as well.
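The distinction drawn in point 3 can be made concrete with a minimal evolutionary algorithm (an illustrative sketch, not any particular published algorithm): fitness is an explicit function supplied by the caller, rather than emerging from survival and reproduction as in alife proper.

```python
import random

def evolve(fitness, genome_length=8, pop_size=20, generations=50,
           mutation_rate=0.1, rng=None):
    """Minimal elitist evolutionary algorithm over bit strings:
    truncation selection keeps the fitter half, each survivor
    produces one mutated child per generation."""
    rng = rng or random.Random()
    pop = [[rng.randint(0, 1) for _ in range(genome_length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [[1 - g if rng.random() < mutation_rate else g for g in p]
                    for p in parents]           # point mutation
        pop = parents + children
    return max(pop, key=fitness)

# "OneMax" toy problem: fitness is simply the number of 1 bits.
best = evolve(sum, rng=random.Random(0))
```

Swap the explicit `fitness` function for foraging, reproduction, and death inside a simulated environment, and the same machinery becomes an alife simulation rather than an optimizer.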



Alife has had a controversial history. John Maynard Smith criticized certain artificial life work in 1994 as "fact-free science". [19]

See also

Related Research Articles

Conway's Game of Life 2D cellular automaton devised by J. H. Conway in 1970

The Game of Life, also known simply as Life, is a cellular automaton devised by the British mathematician John Horton Conway in 1970.
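The Game of Life's rules fit in a few lines; the sparse-set implementation below is one common sketch (cells are (x, y) coordinates, so the grid is effectively unbounded):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of live cells."""
    # Count how many live neighbours every candidate cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

Despite the simplicity of these two rules, the automaton supports gliders, oscillators, and even universal computation, which is why it became a touchstone for artificial life.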

Langton's ant Two-dimensional Turing machine with emergent behavior

Langton's ant is a two-dimensional universal Turing machine with a very simple set of rules but complex emergent behavior. It was invented by Chris Langton in 1986 and runs on a square lattice of black and white cells. The universality of Langton's ant was proven in 2000. The idea has been generalized in several different ways, such as turmites which add more colors and more states.
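The ant's full rule set is small enough to state as code; this sketch uses a finite wrapped grid for simplicity (the classical construction is on an infinite plane):

```python
def langtons_ant(steps, size=64):
    """Run Langton's ant for `steps` moves on a size x size grid of
    white (0) cells, starting at the centre facing up.
    Rules: on white, turn right; on black, turn left; then flip the
    cell's colour and move forward one cell."""
    grid = [[0] * size for _ in range(size)]
    x = y = size // 2
    dx, dy = 0, -1                      # facing "up" (y grows downward)
    for _ in range(steps):
        if grid[y][x] == 0:
            dx, dy = -dy, dx            # white: turn right (clockwise)
        else:
            dx, dy = dy, -dx            # black: turn left
        grid[y][x] ^= 1                 # flip the cell's colour
        x, y = (x + dx) % size, (y + dy) % size
    return grid
```

After roughly 10,000 steps of apparently chaotic wandering, the ant famously settles into a repeating "highway" pattern, a standard example of simple rules producing complex emergent behavior.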

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

Social simulation is a research field that applies computational methods to study issues in the social sciences. The issues explored include problems in computational law, psychology, organizational behavior, sociology, political science, economics, anthropology, geography, engineering, archaeology and linguistics.

An agent-based model (ABM) is a class of computational models for simulating the actions and interactions of autonomous agents with a view to assessing their effects on the system as a whole. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to introduce randomness. Particularly within ecology, ABMs are also called individual-based models (IBMs), and individuals within IBMs may be simpler than fully autonomous agents within ABMs. A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in non-computing scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems.

Artificial society is a specific agent-based computational model for computer simulation in social analysis. It is closely connected to complex systems, emergence, the Monte Carlo method, computational sociology, multi-agent systems, and evolutionary programming. The concept itself is simple enough; actually reaching this conceptual point took a while. Complex mathematical models have been, and are, common; deceptively simple models only have their roots in the late forties, and took the advent of the microcomputer to really get up to speed.

An artificial chemistry is a chemical-like system that usually consists of objects, called molecules, that interact according to rules resembling chemical reaction rules. Artificial chemistries are created and studied in order to understand fundamental properties of chemical systems, including prebiotic evolution, as well as for developing chemical computing systems. Artificial chemistry is a field within computer science wherein chemical reactions—often biochemical ones—are computer-simulated, yielding insights on evolution, self-assembly, and other biochemical phenomena. The field does not use actual chemicals, and should not be confused with either synthetic chemistry or computational chemistry. Rather, bits of information are used to represent the starting molecules, and the end products are examined along with the processes that led to them. The field originated in artificial life but has proven to be a versatile method with applications in many fields such as chemistry, economics, sociology and linguistics.

Generative science Study of how complex behaviour can be generated by deterministic and finite rules and parameters

Generative science is an area of research that explores the natural world and its complex behaviours. It explores ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena". By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real world situation. An example field of study is how unintended consequences arise in social processes.

In silico Latin phrase

In silico is an expression meaning "performed on computer or via computer simulation" in reference to biological experiments. The phrase was coined in 1987 as an allusion to the Latin phrases in vivo, in vitro, and in situ, which are commonly used in biology and refer to experiments done in living organisms, outside living organisms, and where they are found in nature, respectively.

Christopher Langton American computer scientist

Christopher Gale Langton is an American computer scientist and one of the founders of the field of artificial life. He coined the term in the late 1980s when he organized the first "Workshop on the Synthesis and Simulation of Living Systems" at the Los Alamos National Laboratory in 1987. Following his time at Los Alamos, Langton joined the Santa Fe Institute (SFI), to continue his research on artificial life. He left SFI in the late 1990s, and abandoned his work on artificial life, publishing no research since that time.

Block cellular automaton type of cellular automata

A block cellular automaton or partitioning cellular automaton is a special kind of cellular automaton in which the lattice of cells is divided into non-overlapping blocks and the transition rule is applied to a whole block at a time rather than a single cell. Block cellular automata are useful for simulations of physical quantities, because it is straightforward to choose transition rules that obey physical constraints such as reversibility and conservation laws.

Artificial creation is a field of research that studies the primary synthesis of complex lifelike structures from primordial lifeless origins.

Life simulation games form a subgenre of simulation video games in which the player lives or controls one or more virtual characters. Such a game can revolve around "individuals and relationships, or it could be a simulation of an ecosystem". Other terms include artificial life game and simulated life game (SLG).

Langton's loops Self-reproducing patterns in a particular cellular automaton rule, investigated in 1984 by Christopher Langton

Langton's loops are a particular "species" of artificial life in a cellular automaton created in 1984 by Christopher Langton. They consist of a loop of cells containing genetic information, which flows continuously around the loop and out along an "arm", which will become the daughter loop. The "genes" instruct it to make three left turns, completing the loop, which then disconnects from its parent.

The idea of human artifacts being given life has fascinated humankind for as long as people have been recording their myths and stories. Whether Pygmalion or Frankenstein, humanity has been fascinated with the idea of artificial life.

Von Neumann universal constructor a self-replicating pattern in a cellular automaton, designed by John von Neumann

John von Neumann's universal constructor is a self-replicating machine in a cellular automaton (CA) environment. It was designed in the 1940s, without the use of a computer. The fundamental details of the machine were published in von Neumann's book Theory of Self-Reproducing Automata, completed in 1966 by Arthur W. Burks after von Neumann's death.

Agent-based social simulation consists of social simulations that are based on agent-based modeling and implemented using artificial agent technologies. It is a scientific discipline concerned with the simulation of social phenomena, using computer-based multiagent models. In these simulations, persons or groups of persons are represented by agents. Multiagent-based social simulation (MABSS) combines social science, multiagent simulation and computer simulation.

Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

Complex systems biology branch or subfield of mathematical and theoretical biology

Complex systems biology (CSB) is a branch or subfield of mathematical and theoretical biology concerned with complexity of both structure and function in biological organisms, as well as the emergence and evolution of organisms and species, with emphasis being placed on the complex interactions of, and within, bionetworks, and on the fundamental relations and relational patterns that are essential to life. CSB is thus a field of theoretical sciences aimed at discovering and modeling the relational patterns essential to life that has only a partial overlap with complex systems theory, and also with the systems approach to biology called systems biology; this is because the latter is restricted primarily to simplified models of biological organization and organisms, as well as to only a general consideration of philosophical or semantic questions related to complexity in biology. Moreover, a wide range of abstract theoretical complex systems are studied as a field of applied mathematics, with or without relevance to biology, chemistry or physics.

Luis M. Rocha Portuguese-American scientist

Luis M. Rocha is a Professor and director of the NSF-NRT Complex Networks and Systems graduate Program in Informatics, member of the Indiana University Network Science Institute, and core faculty of the Cognitive Science Program at Indiana University, Bloomington, USA. He is a Fulbright Scholar and is also the director of the Computational Biology Collaboratorium and in the Direction of the PhD program in Computational Biology at the Instituto Gulbenkian de Ciencia, Portugal. His research is on complex systems and networks, computational and systems biology, and computational intelligence.


  1. " definition" . Retrieved 2007-01-19.
  2. The MIT Encyclopedia of the Cognitive Sciences, The MIT Press, p.37. ISBN   978-0-262-73144-7
  3. "The Game Industry's Dr. Frankenstein". Next Generation . No. 35. Imagine Media. November 1997. p. 10.
  4. Mark A. Bedau (November 2003). "Artificial life: organization, adaptation and complexity from the bottom up" (PDF). Trends in Cognitive Sciences. Archived from the original (PDF) on 2008-12-02. Retrieved 2007-01-19.
  5. Maciej Komosinski and Andrew Adamatzky (2009). Artificial Life Models in Software. New York: Springer. ISBN   978-1-84882-284-9.
  6. Andrew Adamatzky and Maciej Komosinski (2009). Artificial Life Models in Hardware. New York: Springer. ISBN   978-1-84882-529-1.
  7. Langton, Christopher. "What is Artificial Life?". Archived from the original on 2007-01-17. Retrieved 2007-01-19.
  8. Aguilar, W., Santamaría-Bonfil, G., Froese, T., and Gershenson, C. (2014). The past, present, and future of artificial life. Frontiers in Robotics and AI, 1(8).
  9. See Langton, C. G. 1992. Artificial Life Archived March 11, 2007, at the Wayback Machine . Addison-Wesley. ., section 1
  10. See Red'ko, V. G. 1999. Mathematical Modeling of Evolution. in: F. Heylighen, C. Joslyn and V. Turchin (editors): Principia Cybernetica Web (Principia Cybernetica, Brussels). For the importance of ALife modeling from a cosmic perspective, see also Vidal, C. 2008.The Future of Scientific Simulations: from Artificial Life to Artificial Cosmogenesis. In Death And Anti-Death, ed. Charles Tandy, 6: Thirty Years After Kurt Gödel (1906-1978) p. 285-318. Ria University Press.)
  11. Ray, Thomas (1991). Taylor, C. C.; Farmer, J. D.; Rasmussen, S (eds.). "An approach to the synthesis of life". Artificial Life II, Santa Fe Institute Studies in the Sciences of Complexity. XI: 371–408. Archived from the original on 2015-07-11. Retrieved 24 January 2016. The intent of this work is to synthesize rather than simulate life.
  12. Kalmykov, Lev V.; Kalmykov, Vyacheslav L. (2015), "A Solution to the Biodiversity Paradox by Logical Deterministic Cellular Automata", Acta Biotheoretica, 63 (2): 1–19, doi:10.1007/s10441-015-9257-9, PMID   25980478
  13. 1 2 Kalmykov, Lev V.; Kalmykov, Vyacheslav L. (2015), "A white-box model of S-shaped and double S-shaped single-species population growth", PeerJ, 3:e948: e948, doi:10.7717/peerj.948, PMC   4451025 , PMID   26038717
  14. Zimmer, Carl (15 May 2019). "Scientists Created Bacteria With a Synthetic Genome. Is This Artificial Life? - In a milestone for synthetic biology, colonies of E. coli thrive with DNA constructed from scratch by humans, not nature". The New York Times . Retrieved 16 May 2019.
  15. Fredens, Julius; et al. (15 May 2019). "Total synthesis of Escherichia coli with a recoded genome". Nature . 569: 514–518. doi:10.1038/s41586-019-1192-5. PMID   31092918 . Retrieved 16 May 2019.
  16. "Libarynth" . Retrieved 2015-05-11.
  17. "Caltech" (PDF). Retrieved 2015-05-11.
  18. "AI Beyond Computer Games". Archived from the original on 2008-07-01. Retrieved 2008-07-04.
  19. Horgan, J. 1995. From Complexity to Perplexity. Scientific American. p107