Artificial life (often abbreviated ALife or A-Life) is a field of study wherein researchers examine systems related to natural life, its processes, and its evolution, through the use of simulations with computer models, robotics, and biochemistry. The discipline was named by Christopher Langton, an American computer scientist, in 1986. In 1987 Langton organized the first conference on the field, in Los Alamos, New Mexico. There are three main kinds of alife, named for their approaches: soft, from software; hard, from hardware; and wet, from biochemistry. Artificial life researchers study traditional biology by trying to recreate aspects of biological phenomena.
Artificial life studies the fundamental processes of living systems in artificial environments in order to gain a deeper understanding of the complex information processing that defines such systems. Its topics are broad but often include evolutionary dynamics, emergent properties of collective systems, and biomimicry, as well as related questions about the philosophy of the nature of life and the use of lifelike properties in artistic works.
The modeling philosophy of artificial life strongly differs from traditional modeling by studying not only "life-as-we-know-it" but also "life-as-it-might-be".
A traditional model of a biological system will focus on capturing its most important parameters. In contrast, an alife modeling approach will generally seek to decipher the simplest and most general principles underlying life and implement them in a simulation. The simulation then makes it possible to analyse new and different lifelike systems.
Vladimir Georgievich Red'ko proposed to generalize this distinction to the modeling of any process, leading to the more general distinction of "processes-as-we-know-them" and "processes-as-they-could-be".
At present, the commonly accepted definition of life does not consider any current alife simulations or software to be alive, and they do not constitute part of the evolutionary process of any ecosystem. However, opinions differ about artificial life's potential.
This is a list of artificial life/digital organism simulators, organized by the method of creature definition.
| Name | Driven by | Started | Ended |
|------|-----------|---------|-------|
| ApeSDK (formerly Noble Ape) | language/social simulation | 1996 | ongoing |
| Creatures | neural net/simulated biochemistry | 1996 | 2001 (fandom still active to this day; some abortive attempts at new products) |
| EcoSim | fuzzy cognitive map | 2009 | ongoing |
| Primordial Life | executable DNA | 1994 | 2003 |
| 3D Virtual Creature Evolution | neural net | 2008 | N/A |
Program-based simulations contain organisms with a complex DNA language, usually Turing complete. This language is more often in the form of a computer program than actual biological DNA. Assembly derivatives are the most common languages used. An organism "lives" when its code is executed, and there are usually various methods allowing self-replication. Mutations are generally implemented as random changes to the code. Use of cellular automata is common but not required. Other examples include artificial intelligence and multi-agent systems/programs.
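As a rough sketch (not modeled on any particular simulator), the Python toy below gives each organism a genome of instructions, "lives" it by executing the code, and replicates it with random point mutations. The instruction set and the explicit fitness-based selection are illustrative simplifications; in real program-based systems such as Tierra or Avida, the language is assembly-like and the ability to self-replicate is itself the only "fitness".

```python
import random

# Hypothetical toy instruction set; real program-based simulators use
# richer, assembly-like, Turing-complete languages.
OPS = ["inc", "dec", "swap", "copy"]

def execute(genome):
    """Run a genome once over a two-register machine; return register a."""
    a, b = 0, 0
    for op in genome:
        if op == "inc":
            a += 1
        elif op == "dec":
            a -= 1
        elif op == "swap":
            a, b = b, a
        elif op == "copy":
            b = a
    return a

def replicate(genome, rate=0.05):
    """Self-replication: copy the code, with random point mutations."""
    return [random.choice(OPS) if random.random() < rate else op
            for op in genome]

# Explicit selection is a simplification made here for brevity.
population = [[random.choice(OPS) for _ in range(8)] for _ in range(20)]
for generation in range(50):
    population.sort(key=execute, reverse=True)
    population = [replicate(g) for g in population[:10] for _ in range(2)]
```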
Individual modules are added to a creature. These modules modify the creature's behaviors and characteristics either directly, by hard coding into the simulation (leg type A increases speed and metabolism), or indirectly, through the emergent interactions between a creature's modules (leg type A moves up and down with a frequency of X, which interacts with other legs to create motion). Generally these are simulators which emphasize user creation and accessibility over mutation and evolution.
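A minimal sketch of the "direct" style described above, with hypothetical module names whose effects on the creature's stats are hard-coded into the simulation:

```python
# Each hypothetical module hard-codes its effect, as in "leg type A
# increases speed and metabolism" above.
MODULES = {
    "leg_a": {"speed": +2.0, "metabolism": +1.5},
    "leg_b": {"speed": +1.0, "metabolism": +0.5},
    "shell": {"speed": -1.0, "armor": +3.0},
}

def creature_stats(attached):
    """Sum the hard-coded contribution of every attached module."""
    stats = {"speed": 0.0, "metabolism": 0.0, "armor": 0.0}
    for name in attached:
        for stat, delta in MODULES[name].items():
            stats[stat] += delta
    return stats

print(creature_stats(["leg_a", "leg_a", "shell"]))
# {'speed': 3.0, 'metabolism': 3.0, 'armor': 3.0}
```

The indirect, emergent style would instead simulate each leg's up-and-down motion physically and let the gait arise from the interaction of the modules.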
Organisms are generally constructed with pre-defined and fixed behaviors that are controlled by various parameters that mutate. That is, each organism contains a collection of numbers or other finite parameters. Each parameter controls one or several aspects of an organism in a well-defined way.
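In such a scheme an organism can be little more than a collection of numbers plus a mutation operator, as in this sketch (parameter names and values are illustrative):

```python
import random

class Organism:
    """Fixed behavior; only the numeric parameters mutate."""
    def __init__(self, speed, vision, metabolism):
        self.params = {"speed": speed, "vision": vision,
                       "metabolism": metabolism}

    def offspring(self, sigma=0.1):
        # Each parameter drifts by small Gaussian noise, clipped at zero.
        child = {k: max(0.0, v + random.gauss(0, sigma))
                 for k, v in self.params.items()}
        return Organism(**child)

parent = Organism(speed=1.0, vision=2.0, metabolism=0.5)
print(parent.offspring().params)
```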
These simulations have creatures that learn and grow using neural nets or a close derivative. Emphasis is often, although not always, more on learning than on natural selection.
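A toy illustration of the neural-net approach, assuming a single-layer net and a simple delta-rule update (both chosen for brevity, not taken from any particular simulator):

```python
import math
import random

class NetCreature:
    """Two sensors (food left/right) drive one turning output."""
    def __init__(self):
        self.w = [random.uniform(-1, 1) for _ in range(2)]

    def act(self, sensors):
        return math.tanh(sum(w * s for w, s in zip(self.w, sensors)))

    def learn(self, sensors, target, lr=0.1):
        # Delta rule: nudge weights to reduce the output error.
        error = target - self.act(sensors)
        self.w = [w + lr * error * s for w, s in zip(self.w, sensors)]

creature = NetCreature()
for _ in range(200):   # teach it to turn toward the stronger food signal
    left, right = random.random(), random.random()
    creature.learn([left, right], 1.0 if right > left else -1.0)
```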
Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). In black-box models, the individual-based (mechanistic) mechanisms of a complex dynamic system remain hidden.
Black-box models are completely nonmechanistic. They are phenomenological and ignore the composition and internal structure of a complex system. The interactions of subsystems in such a non-transparent model cannot be investigated. A white-box model of a complex dynamic system has 'transparent walls' and directly shows underlying mechanisms. All events at the micro-, meso- and macro-levels of a dynamic system are directly visible at all stages of its white-box model's evolution. In most cases, mathematical modelers use heavy black-box mathematical methods, which cannot produce mechanistic models of complex dynamic systems. Grey-box models are intermediate and combine the black-box and white-box approaches.
Creating a white-box model of a complex system requires a priori basic knowledge of the subject being modeled. Deterministic logical cellular automata are a necessary but not sufficient condition for a white-box model; the second necessary prerequisite is a physical ontology of the object under study. White-box modeling represents an automatic hyper-logical inference from first principles, because it is based entirely on deterministic logic and the axiomatic theory of the subject. Its purpose is to derive, from the basic axioms, more detailed and concrete mechanistic knowledge about the dynamics of the object under study. The need to formulate an intrinsic axiomatic system of the subject before creating its white-box model distinguishes white-box cellular automata models from cellular automata models based on arbitrary logical rules. If cellular automata rules have not been formulated from the first principles of the subject, the model may have weak relevance to the real problem.
Hardware-based artificial life mainly consists of robots, that is, automatically guided machines able to do tasks on their own.
Biochemical-based life is studied in the field of synthetic biology. It involves, for example, the creation of synthetic DNA. The term "wet" is an extension of the term "wetware".
In May 2019, researchers, in a milestone effort, reported the creation of a new synthetic (possibly artificial) form of viable life, a variant of the bacterium Escherichia coli, by reducing the natural number of 64 codons in the bacterial genome to 59 codons, which still suffice to encode the 20 amino acids.
Alife has had a controversial history. John Maynard Smith criticized certain artificial life work in 1994 as "fact-free science".
The Game of Life, also known simply as Life, is a cellular automaton devised by the British mathematician John Horton Conway in 1970.
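As an illustration, here is a minimal Python step function implementing Conway's standard rules (birth on exactly three live neighbours, survival on two or three); the wrapping grid is a common convenience for finite implementations, not part of Conway's definition:

```python
def life_step(grid):
    """One Game of Life update on a wrapping (toroidal) grid."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # Birth on exactly 3 neighbours; survival on 2 or 3.
            new[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return new

# A "glider": a 5-cell pattern that travels diagonally forever.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1
for _ in range(4):
    grid = life_step(grid)   # after 4 steps the glider has shifted by (1, 1)
```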
Langton's ant is a two-dimensional universal Turing machine with a very simple set of rules but complex emergent behavior. It was invented by Chris Langton in 1986 and runs on a square lattice of black and white cells. The universality of Langton's ant was proven in 2000. The idea has been generalized in several different ways, such as turmites which add more colors and more states.
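The ant's complete rule set fits in a few lines; the sketch below assumes a wrapping grid for simplicity:

```python
def run_ant(steps, size=64):
    """Langton's ant on a wrapping grid of white (0) and black (1) cells."""
    grid = [[0] * size for _ in range(size)]
    r = c = size // 2
    dr, dc = -1, 0                 # facing "up"
    for _ in range(steps):
        if grid[r][c] == 0:
            dr, dc = dc, -dr       # white cell: turn right
        else:
            dr, dc = -dc, dr       # black cell: turn left
        grid[r][c] ^= 1            # flip the cell's colour
        r, c = (r + dr) % size, (c + dc) % size
    return grid

# After roughly 10,000 chaotic steps the famous recurrent "highway" emerges.
grid = run_ant(11_000)
```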
Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.
Social simulation is a research field that applies computational methods to study issues in the social sciences. The issues explored include problems in computational law, psychology, organizational behavior, sociology, political science, economics, anthropology, geography, engineering, archaeology and linguistics.
An agent-based model (ABM) is a class of computational models for simulating the actions and interactions of autonomous agents with a view to assessing their effects on the system as a whole. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to introduce randomness. Particularly within ecology, ABMs are also called individual-based models (IBMs), and individuals within IBMs may be simpler than fully autonomous agents within ABMs. A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in non-computing scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than to design agents or solve specific practical or engineering problems.
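As a concrete illustration, here is a minimal ABM in the tradition of Schelling's segregation model, where one simple local rule (move if too few neighbours are of one's own type) produces global segregation; all parameters below are illustrative:

```python
import random

SIZE, TOLERANCE = 20, 0.5   # illustrative parameters

# Two agent types plus ~10% empty cells, placed at random.
grid = [[random.choices(["A", "B", None], weights=[45, 45, 10])[0]
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(r, c):
    """An agent is unhappy if under half its neighbours share its type."""
    me = grid[r][c]
    neigh = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
             for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    occupied = [n for n in neigh if n is not None]
    return occupied and sum(n == me for n in occupied) / len(occupied) < TOLERANCE

for step in range(50):
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] and unhappy(r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if grid[r][c] is None]
    random.shuffle(movers)
    for (r, c) in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None   # relocate the agent
        empties.append((r, c))
```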
Artificial society is a specific agent-based computational model for computer simulation in social analysis. It is mostly connected to the themes of complex systems, emergence, the Monte Carlo method, computational sociology, multi-agent systems, and evolutionary programming. The concept itself is simple enough; actually reaching this conceptual point took a while. Complex mathematical models have been, and remain, common; deceptively simple models only have their roots in the late 1940s, and took the advent of the microcomputer to really get up to speed.
An artificial chemistry is a chemical-like system that usually consists of objects, called molecules, that interact according to rules resembling chemical reaction rules. Artificial chemistries are created and studied in order to understand fundamental properties of chemical systems, including prebiotic evolution, as well as to develop chemical computing systems. The field, which lies within computer science, does not use actual chemicals and should not be confused with either synthetic chemistry or computational chemistry: bits of information represent the starting molecules, and the end products are examined along with the processes that led to them. Artificial chemistry originated in artificial life but has proven to be a versatile method with applications in many fields such as chemistry, economics, sociology and linguistics.
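A well-known toy artificial chemistry (a "number chemistry" in the style discussed by Dittrich and Banzhaf) can illustrate the idea: molecules are integers, and a collision of a and b reacts to a/b whenever b divides a. No rule mentions primes, yet the soup enriches in them:

```python
import random

# "Molecules" are integers in a well-stirred soup.
soup = [random.randint(2, 1000) for _ in range(1000)]

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

for _ in range(100_000):
    i, j = random.randrange(len(soup)), random.randrange(len(soup))
    a, b = soup[i], soup[j]
    if i != j and a != b and a % b == 0:
        soup[i] = a // b           # the reaction rewrites one molecule

print(sum(map(is_prime, soup)) / len(soup))   # fraction of primes has risen
```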
Generative science is an area of research that explores the natural world and its complex behaviours. It explores ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena". By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real world situation. An example field of study is how unintended consequences arise in social processes.
In silico is an expression meaning "performed on computer or via computer simulation" in reference to biological experiments. The phrase was coined in 1987 as an allusion to the Latin phrases in vivo, in vitro, and in situ, which are commonly used in biology and refer to experiments done in living organisms, outside living organisms, and where they are found in nature, respectively.
Christopher Gale Langton is an American computer scientist and one of the founders of the field of artificial life. He coined the term in the late 1980s when he organized the first "Workshop on the Synthesis and Simulation of Living Systems" at the Los Alamos National Laboratory in 1987. Following his time at Los Alamos, Langton joined the Santa Fe Institute (SFI), to continue his research on artificial life. He left SFI in the late 1990s, and abandoned his work on artificial life, publishing no research since that time.
A block cellular automaton or partitioning cellular automaton is a special kind of cellular automaton in which the lattice of cells is divided into non-overlapping blocks and the transition rule is applied to a whole block at a time rather than a single cell. Block cellular automata are useful for simulations of physical quantities, because it is straightforward to choose transition rules that obey physical constraints such as reversibility and conservation laws.
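A minimal sketch of a block CA using the Margolus neighbourhood (2×2 blocks whose partition alternates between two offsets on successive steps); the rotate-clockwise rule used here is only illustrative, but it is genuinely reversible (rotate anticlockwise to undo) and conserves the number of live cells:

```python
def block_step(grid, phase):
    """One Margolus-neighbourhood update on an even-sized, wrapping grid."""
    size = len(grid)
    new = [row[:] for row in grid]
    off = phase % 2                      # alternate the block partition
    for r in range(off, size + off, 2):
        for c in range(off, size + off, 2):
            r0, r1 = r % size, (r + 1) % size
            c0, c1 = c % size, (c + 1) % size
            # Rotate the 2x2 block clockwise.
            (new[r0][c0], new[r0][c1],
             new[r1][c1], new[r1][c0]) = (grid[r1][c0], grid[r0][c0],
                                          grid[r0][c1], grid[r1][c1])
    return new

grid = [[0] * 8 for _ in range(8)]
grid[3][3] = 1
for step in range(10):
    grid = block_step(grid, step)        # the lone cell orbits, never lost
```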
Artificial creation is a field of research that studies the primary synthesis of complex lifelike structures from primordial lifeless origins.
Life simulation games form a subgenre of simulation video games in which the player lives or controls one or more virtual characters. Such a game can revolve around "individuals and relationships, or it could be a simulation of an ecosystem". Other terms include artificial life game and simulated life game (SLG).
Langton's loops are a particular "species" of artificial life in a cellular automaton created in 1984 by Christopher Langton. They consist of a loop of cells containing genetic information, which flows continuously around the loop and out along an "arm", which will become the daughter loop. The "genes" instruct it to make three left turns, completing the loop, which then disconnects from its parent.
The idea of human artifacts being given life has fascinated humankind for as long as people have been recording their myths and stories. Whether Pygmalion or Frankenstein, humanity has been fascinated with the idea of artificial life.
John von Neumann's universal constructor is a self-replicating machine in a cellular automaton (CA) environment. It was designed in the 1940s, without the use of a computer. The fundamental details of the machine were published in von Neumann's book Theory of Self-Reproducing Automata, completed in 1966 by Arthur W. Burks after von Neumann's death.
Agent-based social simulation consists of social simulations that are based on agent-based modeling and implemented using artificial agent technologies. It is a scientific discipline concerned with the simulation of social phenomena using computer-based multiagent models. In these simulations, persons or groups of persons are represented by agents. Multiagent-based social simulation (MABSS) is a combination of social science, multiagent simulation, and computer simulation.
Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.
Complex systems biology (CSB) is a branch or subfield of mathematical and theoretical biology concerned with complexity of both structure and function in biological organisms, as well as the emergence and evolution of organisms and species, with emphasis placed on the complex interactions of, and within, bionetworks, and on the fundamental relations and relational patterns that are essential to life. CSB is thus a field of theoretical science aimed at discovering and modeling the relational patterns essential to life. It has only a partial overlap with complex systems theory, and also with the systems approach to biology called systems biology, because the latter is restricted primarily to simplified models of biological organization and organisms, and to only a general consideration of philosophical or semantic questions related to complexity in biology. Moreover, a wide range of abstract theoretical complex systems are studied as a field of applied mathematics, with or without relevance to biology, chemistry or physics.
Luis M. Rocha is a Professor and director of the NSF-NRT Complex Networks and Systems graduate program in Informatics, a member of the Indiana University Network Science Institute, and core faculty of the Cognitive Science Program at Indiana University, Bloomington, USA. He is a Fulbright Scholar, director of the Computational Biology Collaboratorium, and helps direct the PhD program in Computational Biology at the Instituto Gulbenkian de Ciência, Portugal. His research is on complex systems and networks, computational and systems biology, and computational intelligence.
The intent of such wet artificial-life work is to synthesize, rather than merely simulate, life.