Generative science

[Figure: Conway's Game of Life running on a torus. Interaction between a few simple rules and parameters can produce endless, seemingly unpredictable complexity.]

Generative science is an area of research that explores the natural world and its complex behaviours. It seeks ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena".[1] By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real-world situation.[2] An example field of study is how unintended consequences arise in social processes.
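As a concrete illustration of finite deterministic rules producing complex behaviour, below is a minimal sketch of Conway's Game of Life (the automaton pictured above) in Python; the glider seed and step count are arbitrary choices for the example.

    from collections import Counter

    # Conway's Game of Life: two deterministic rules (birth on exactly 3
    # live neighbours, survival on 2 or 3) produce complex, seemingly
    # unpredictable patterns from simple seeds.
    def step(live):
        """Advance one generation; `live` is a set of (x, y) cells."""
        counts = Counter((x + dx, y + dy)
                         for (x, y) in live
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        return {cell for cell, n in counts.items()
                if n == 3 or (n == 2 and cell in live)}

    # A "glider" seed: five cells whose pattern translates across the plane forever.
    cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):
        cells = step(cells)
    print(sorted(cells))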


Generative sciences often explore natural phenomena at several levels of organization.[3][4] Self-organizing natural systems are a central subject, studied both theoretically and by simulation experiments. The study of complex systems in general has been grouped under the heading of "general systems theory", particularly by Ludwig von Bertalanffy, Anatol Rapoport, Ralph Gerard, and Kenneth Boulding.

Scientific and philosophical origins

[Figure: Turbulence in the tip vortex from an airplane wing. Studies of the critical point beyond which a system creates turbulence were important for chaos theory, analyzed for example by the Soviet physicist Lev Landau, who developed the Landau-Hopf theory of turbulence. David Ruelle and Floris Takens later predicted, against Landau, that fluid turbulence could develop through a strange attractor, a main concept of chaos theory.]
[Figure: Computer simulation of the branching architecture of the dendrites of pyramidal neurons.]
[Figure: The natural phenomenon of herd behaviour, as in a flock of birds, can be modelled artificially using simple rules in individual units, with swarm intelligence rather than any centralized control.]
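The flocking behaviour in the caption above can be sketched with a few local rules, in the spirit of Reynolds' "boids" model. This is an illustrative toy, not a specific published implementation; the rule weights and the separation distance are assumed values.

    import random

    # Rule-based flocking: each bird follows three local rules (cohesion,
    # alignment, separation); no bird coordinates the flock globally.
    # The weights 0.01, 0.1 and 0.05 are illustrative, not canonical.
    class Bird:
        def __init__(self):
            self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
            self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

    def step(flock):
        for b in flock:
            others = [o for o in flock if o is not b]
            n = len(others)
            cx, cy = sum(o.x for o in others) / n, sum(o.y for o in others) / n
            avx, avy = sum(o.vx for o in others) / n, sum(o.vy for o in others) / n
            b.vx += 0.01 * (cx - b.x) + 0.1 * (avx - b.vx)   # cohesion + alignment
            b.vy += 0.01 * (cy - b.y) + 0.1 * (avy - b.vy)
            for o in others:                                  # separation
                if abs(o.x - b.x) + abs(o.y - b.y) < 2:
                    b.vx -= 0.05 * (o.x - b.x)
                    b.vy -= 0.05 * (o.y - b.y)
        for b in flock:
            b.x += b.vx
            b.y += b.vy

    flock = [Bird() for _ in range(20)]
    for _ in range(100):
        step(flock)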

The development of computers and automata theory laid a technical foundation for the growth of the generative sciences.

One of the most influential advances in the generative sciences as related to cognitive science came from Noam Chomsky's (1957) development of generative grammar, which separated language generation from semantic content and thereby revealed important questions about human language. It was also in the early 1950s that psychologists at MIT, including Kurt Lewin, Jacob Levy Moreno and Fritz Heider, laid the foundations for group dynamics research, which later developed into social network analysis.
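To make the generative idea concrete, here is a minimal sketch of a toy context-free grammar whose finite rules generate an unbounded set of sentences; the grammar itself is invented for illustration and is not drawn from Chomsky's work.

    import random

    # A toy context-free grammar: finite rules, unbounded output.
    # The recursive NP rule lets sentences nest to arbitrary depth.
    GRAMMAR = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive
        "VP": [["V", "NP"], ["V"]],
        "N":  [["dog"], ["cat"], ["idea"]],
        "V":  [["chases"], ["sees"], ["sleeps"]],
    }

    def generate(symbol="S"):
        """Expand a symbol by picking one of its rules at random."""
        if symbol not in GRAMMAR:
            return [symbol]                    # terminal word
        rule = random.choice(GRAMMAR[symbol])
        return [word for part in rule for word in generate(part)]

    print(" ".join(generate()))  # e.g. "the dog that sees the cat sleeps"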


Related Research Articles

Computer science

Computer science is the study of computation, information, and automation. It spans theoretical disciplines, such as the theory of computation, through to applied disciplines, such as the design and implementation of hardware and software. Though more often considered an academic discipline, computer science is closely related to computer programming.

Cellular automaton

A cellular automaton is a discrete model of computation studied in automata theory. Cellular automata are also called cellular spaces, tessellation automata, homogeneous structures, cellular structures, tessellation structures, and iterative arrays. Cellular automata have found application in various areas, including physics, theoretical biology and microstructure modeling.
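As an illustrative sketch of the idea, the following implements a one-dimensional "elementary" cellular automaton; rule 30 is chosen here because its deterministic update nonetheless yields random-looking output.

    # Elementary cellular automaton: each cell's next state depends only on
    # itself and its two neighbours, so a rule fits in 8 bits. Rule 30 is a
    # classic example of simple local rules generating complex patterns.
    def step(cells, rule=30):
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                          + cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31
    row[15] = 1                      # single live cell in the middle
    for _ in range(16):
        print("".join("#" if c else "." for c in row))
        row = step(row)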

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.
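A minimal sketch of the population-based trial-and-error idea: a selection-and-mutation loop maximizing a toy fitness function. The problem ("count the 1 bits"), population size, and mutation rate are arbitrary choices for illustration.

    import random

    # Minimal evolutionary algorithm: keep a population of candidate
    # bit-strings, select the fitter half, and mutate copies of survivors.
    def fitness(bits):
        return sum(bits)                     # toy objective: count of 1s

    def mutate(bits, rate=0.05):
        return [b ^ (random.random() < rate) for b in bits]

    pop = [[random.randint(0, 1) for _ in range(32)] for _ in range(20)]
    for generation in range(50):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:10]                 # truncation selection
        pop = survivors + [mutate(s) for s in survivors]
    print("best fitness:", fitness(max(pop, key=fitness)))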

Theoretical computer science

Theoretical computer science (TCS) is a subfield of computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, formal language theory, the lambda calculus and type theory.

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

Social simulation is a research field that applies computational methods to study issues in the social sciences. The issues explored include problems in computational law, psychology, organizational behavior, sociology, political science, economics, anthropology, geography, engineering, archaeology and linguistics.

Social complexity

In sociology, social complexity is a conceptual framework used in the analysis of society. In the sciences, contemporary definitions of complexity are found in systems theory, wherein the phenomenon being studied has many parts and many possible arrangements of the parts; simultaneously, what is complex and what is simple are relative and change in time.

Computational sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.

An artificial society is an agent-based computational model for computer simulation in social analysis. It is mostly connected to the themes of complex systems, emergence, the Monte Carlo method, computational sociology, multi-agent systems, and evolutionary programming. While the concept is simple, realizing it in practice took some time: complex mathematical models have long been common, whereas deceptively simple agent-based models have their roots only in the late 1940s and needed the advent of the microcomputer to really gain momentum.
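A classic artificial-society example that can be sketched in a few lines is a Schelling-style segregation model, in which mildly tolerant agents still produce strongly segregated neighbourhoods; the grid size, population mix, and tolerance threshold below are illustrative assumptions.

    import random

    # Toy Schelling-style segregation model: an agent is unhappy if fewer
    # than TOL of its occupied neighbours share its type, and unhappy
    # agents relocate to random empty cells.
    SIZE, TOL = 20, 0.3
    grid = [[random.choice("AB.") for _ in range(SIZE)] for _ in range(SIZE)]

    def unhappy(r, c):
        me = grid[r][c]
        if me == ".":
            return False
        neigh = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr, dc) != (0, 0)]
        occupied = [x for x in neigh if x != "."]
        return bool(occupied) and sum(x == me for x in occupied) / len(occupied) < TOL

    for _ in range(10000):
        r, c = random.randrange(SIZE), random.randrange(SIZE)
        if unhappy(r, c):
            empties = [(i, j) for i in range(SIZE) for j in range(SIZE)
                       if grid[i][j] == "."]
            i, j = random.choice(empties)
            grid[i][j], grid[r][c] = grid[r][c], "."

    print("\n".join("".join(row) for row in grid))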

Neural network

A neural network can refer to either a neural circuit of biological neurons, or a network of artificial neurons or nodes in the case of an artificial neural network. Artificial neural networks are used for solving artificial intelligence (AI) problems; they model connections of biological neurons as weights between nodes. A positive weight reflects an excitatory connection, while negative values mean inhibitory connections. All inputs are modified by a weight and summed, an operation referred to as a linear combination. Finally, an activation function controls the amplitude of the output: an acceptable output range is usually between 0 and 1, or between −1 and 1.
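The linear-combination-plus-activation step described above is small enough to sketch directly; the weights, bias, and inputs here are arbitrary values, and the logistic function is one common choice of activation that bounds the output between 0 and 1.

    import math

    # One artificial neuron: a weighted sum of inputs (a linear
    # combination) passed through an activation function.
    # Positive weights excite; negative weights inhibit.
    def neuron(inputs, weights, bias=0.0):
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-z))    # logistic activation, in (0, 1)

    print(neuron([0.5, 1.0, 0.25], [0.8, -1.2, 2.0]))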

Cellular automata, as with other multi-agent system models, usually treat time as discrete and state updates as occurring synchronously. The state of every cell in the model is updated together, before any of the new states influence other cells. In contrast, an asynchronous cellular automaton is able to update individual cells independently, in such a way that the new state of a cell affects the calculation of states in neighbouring cells.
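The difference can be sketched as follows: the synchronous step computes every new state from the old configuration before writing any of them, while one simple asynchronous scheme (random sequential updating, an illustrative choice among several) writes each cell in place, so later cells see earlier updates within the same sweep. The majority-vote rule is a toy choice.

    import random

    # Synchronous vs. asynchronous updating of a 1-D two-state automaton.
    def rule(left, me, right):
        return 1 if left + me + right >= 2 else 0   # toy majority vote

    def sync_step(cells):
        # All new states are computed from the old configuration.
        n = len(cells)
        return [rule(cells[i - 1], cells[i], cells[(i + 1) % n])
                for i in range(n)]

    def async_step(cells):
        # Random sequential updating: changes take effect immediately,
        # so a cell may see neighbours already updated this sweep.
        n = len(cells)
        for i in random.sample(range(n), n):
            cells[i] = rule(cells[i - 1], cells[i], cells[(i + 1) % n])
        return cells

    state = [random.randint(0, 1) for _ in range(16)]
    print(sync_step(state[:]))
    print(async_step(state[:]))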

Joshua Morris Epstein is Professor of Epidemiology at the New York University College of Global Public Health. He was formerly Professor of Emergency Medicine at Johns Hopkins University, with joint appointments in the departments of Applied Mathematics, Economics, Biostatistics, International Health, and Environmental Health Sciences, and Director of the JHU Center for Advanced Modeling in the Social, Behavioral, and Health Sciences. He is an External Professor at the Santa Fe Institute, a member of the New York Academy of Sciences, and a member of the Institute of Medicine's Committee on Identifying and Prioritizing New Preventive Vaccines.

Agent-based social simulation consists of social simulations that are based on agent-based modeling and implemented using artificial agent technologies. It is a scientific discipline concerned with the simulation of social phenomena using computer-based multiagent models. In these simulations, persons or groups of persons are represented by agents. The field thus combines social science, multiagent simulation, and computer simulation.

Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is transformation of information. In other countries, the term "informatics" is used with a different meaning in the context of library science, in which case it is synonymous with data storage and retrieval.

Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

Artificial life

Artificial life is a field of study wherein researchers examine systems related to natural life, its processes, and its evolution, through the use of simulations with computer models, robotics, and biochemistry. The discipline was named by Christopher Langton, an American theoretical biologist, in 1986. In 1987 Langton organized the first conference on the field, in Los Alamos, New Mexico. There are three main kinds of alife, named for their approaches: soft, from software; hard, from hardware; and wet, from biochemistry. Artificial life researchers study traditional biology by trying to recreate aspects of biological phenomena.

Artificial Economics can be defined as "a research field that aims at improving our understanding of socioeconomic processes with the help of computer simulation". As in Theoretical Economics, the approach followed in Artificial Economics to gain understanding of socioeconomic processes involves building and analysing formal models. However, in contrast with Theoretical Economics, models in Artificial Economics are implemented in a programming language so that computers can be employed to analyse them.

Glossary of artificial intelligence

This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.

Outline of machine learning

The following outline is provided as an overview of and topical guide to machine learning. Machine learning is a subfield of soft computing within computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. In 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed". Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Such algorithms operate by building a model from an example training set of input observations in order to make data-driven predictions or decisions expressed as outputs, rather than following strictly static program instructions.
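As a minimal sketch of "building a model from example data rather than following static instructions", here is a one-variable least-squares fit; the data points are made up for the example.

    # Least-squares fit of y = a*x + b: the "model" (a, b) is learned from
    # example observations rather than written into the program.
    xs = [1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [2.1, 3.9, 6.2, 8.1, 9.8]           # made-up training data

    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    print(f"learned model: y = {a:.2f}*x + {b:.2f}")
    print("prediction for x = 6:", a * 6 + b)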

References

  1. Gordana Dodig-Crnkovic; Raffaela Giovagnoli (2013), "Computing Nature – A Network of Networks of Concurrent Information Processes", in Gordana Dodig-Crnkovic; Raffaela Giovagnoli (eds.), Computing Nature: Turing Centenary Perspective, Springer, p. 7, ISBN 978-3-642-37225-4.
  2. Ning Nan; Erik W. Johnston; Judith S. Olson (2008), "Unintended consequences of collocation: using agent-based modeling to untangle effects of communication delay and in-group favor", Computational and Mathematical Organization Theory, 14 (2): 57–83, doi:10.1007/s10588-008-9024-4, S2CID 397177.
  3. Farre, G. L. (1997). "The Energetic Structure of Observation: A Philosophical Disquisition". American Behavioral Scientist. 40 (6): 717–728. doi:10.1177/0002764297040006004. S2CID 144764570.
  4. J. Schmidhuber (1997). "A computer scientist's view of life, the universe, and everything". Foundations of Computer Science: Potential – Theory – Cognition, Lecture Notes in Computer Science, pages 201–208, Springer.
  5. Hermann Cuntz (2010). "PLoS Computational Biology Issue Image | Vol. 6(8) August 2010". PLOS Computational Biology. 6 (8): ev06.ei08. doi:10.1371/image.pcbi.v06.i08.
  6. Kenrick, D. T.; Li, N. P.; Butner, J. (2003). "Dynamical evolutionary psychology: individual decision rules and emergent social norms". Psychological Review. 110 (1): 3–28. CiteSeerX 10.1.1.526.5218. doi:10.1037/0033-295X.110.1.3. PMID 12529056. S2CID 43306158.
  7. Epstein, Joshua M.; Axtell, Robert L. (1996). Growing Artificial Societies: Social Science from the Bottom Up. Cambridge, MA: MIT Press/Brookings Institution. p. 224. ISBN 978-0-262-55025-3.
  8. Nowak, A.; Vallacher, R. R.; Tesser, A.; Borkowski, W. (2000), "Society of Self: The emergence of collective properties in self-structure", Psychological Review, 107 (1): 39–61, doi:10.1037/0033-295x.107.1.39, PMID 10687402.
  9. Epstein, J. M. (1999), "Agent Based Computational Models and Generative Social Science", Complexity, 4 (5): 41–60, Bibcode:1999Cmplx...4e..41E, CiteSeerX 10.1.1.353.5950, doi:10.1002/(SICI)1099-0526(199905/06)4:5<41::AID-CPLX9>3.0.CO;2-F.
  10. John Conway's Game of Life.