Generative science is an area of research that explores the natural world and its complex behaviours. It investigates ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena".[1] By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real-world situation.[2] An example field of study is how unintended consequences arise in social processes.
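The core idea, that finite deterministic rules can produce seemingly unpredictable behaviour, can be illustrated with a minimal sketch (not drawn from the sources cited above): the logistic map, a one-line deterministic update rule whose trajectories appear erratic for suitable parameter values.

```python
# A minimal illustrative sketch: the logistic map, a deterministic rule
# x -> r*x*(1-x) that produces seemingly erratic, non-repeating behaviour
# for suitable parameter values. The values of r, x0, and steps are
# arbitrary choices for demonstration.

def logistic_trajectory(r, x0, steps):
    """Iterate the logistic map and return the visited states."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

if __name__ == "__main__":
    # r = 3.9 lies in the chaotic regime: one finite rule, no randomness,
    # yet the trajectory never settles into an obvious pattern.
    for x in logistic_trajectory(r=3.9, x0=0.5, steps=10):
        print(f"{x:.6f}")
```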
Generative sciences often explore natural phenomena at several levels of organization.[3][4] Self-organizing natural systems are a central subject, studied both theoretically and by simulation experiments. The study of complex systems in general has been grouped under the heading of "general systems theory", particularly by Ludwig von Bertalanffy, Anatol Rapoport, Ralph Gerard, and Kenneth Boulding.
The development of computers and automata theory laid a technical foundation for the growth of the generative sciences.
One of the most influential advances in the generative sciences as related to cognitive science came from Noam Chomsky's (1957) development of generative grammar, which separated language generation from semantic content and thereby revealed important questions about human language. It was also in the early 1950s that psychologists at MIT, including Kurt Lewin, Jacob Levy Moreno and Fritz Heider, laid the foundations for group dynamics research, which later developed into social network analysis.
Computer science is the study of computation, information, and automation, spanning from theoretical disciplines (such as algorithms and the theory of computation) to applied disciplines (such as the design and implementation of hardware and software). One well-known subject classification system for the field is the ACM Computing Classification System devised by the Association for Computing Machinery.
A cellular automaton is a discrete model of computation studied in automata theory. Cellular automata are also called cellular spaces, tessellation automata, homogeneous structures, cellular structures, tessellation structures, and iterative arrays. Cellular automata have found application in various areas, including physics, theoretical biology and microstructure modeling.
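A minimal illustrative sketch of an elementary (one-dimensional, two-state) cellular automaton follows; the particular rule number (30) and grid width are arbitrary choices for demonstration.

```python
# A hedged sketch of an elementary cellular automaton using the standard
# Wolfram rule encoding: the three-cell neighbourhood is read as a number
# 0..7, which indexes a bit of the rule number.

def step(cells, rule=30):
    """Apply one synchronous update with wraparound edges."""
    n = len(cells)
    out = []
    for i in range(n):
        pattern = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> pattern) & 1)
    return out

if __name__ == "__main__":
    cells = [0] * 31
    cells[15] = 1  # single live cell in the middle
    for _ in range(16):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)
```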
In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial-and-error problem solvers with a metaheuristic or stochastic optimization character.
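The following sketch shows this population-based trial-and-error character in its simplest form; the toy objective (maximising the number of 1-bits), population size, and mutation rate are illustrative assumptions rather than any specific published algorithm.

```python
# A minimal sketch of a population-based evolutionary algorithm on the
# "OneMax" toy problem. All parameters here are illustrative assumptions.
import random

def fitness(bits):
    """Count the 1-bits; the optimum is the all-ones string."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [b ^ (random.random() < rate) for b in bits]

def evolve(length=40, pop_size=30, generations=100):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Trial and error: keep the fitter half, refill with mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in survivors]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", fitness(best))
```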
Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation.
Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.
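As an illustration of the "social behavior and emergence" theme, the sketch below implements a basic particle swarm optimisation loop, one of the standard bio-inspired methods; the coefficients and test function are assumptions chosen for demonstration, not a reference implementation.

```python
# An illustrative particle swarm optimisation (PSO) sketch: simple agents
# ("particles") share information about good solutions, and a useful global
# search behaviour emerges from local interactions.
import random

def sphere(x):
    """Toy objective to minimise: sum of squares, optimum at the origin."""
    return sum(v * v for v in x)

def pso(dim=5, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    best = [p[:] for p in pos]            # each particle's best-so-far
    gbest = min(best, key=sphere)[:]      # swarm-wide best-so-far
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                # Blend inertia, attraction to own best, attraction to swarm best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (best[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(best[i]):
                best[i] = pos[i][:]
                if sphere(best[i]) < sphere(gbest):
                    gbest = best[i][:]
    return gbest

if __name__ == "__main__":
    print("best found:", [round(v, 3) for v in pso()])
```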
Social simulation is a research field that applies computational methods to study issues in the social sciences. The issues explored include problems in computational law, psychology, organizational behavior, sociology, political science, economics, anthropology, geography, engineering, archaeology and linguistics.
In sociology, social complexity is a conceptual framework used in the analysis of society. In the sciences, contemporary definitions of complexity are found in systems theory, wherein the phenomenon being studied has many parts and many possible arrangements of the parts; simultaneously, what is complex and what is simple are relative and change over time.
Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.
An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems.
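A standard textbook example of such a model (not described in this article, but widely used in the ABM literature) is Schelling's segregation model, sketched below: agents obeying a simple local rule, relocating when too few neighbours share their type, produce strong collective segregation.

```python
# An illustrative agent-based model: Schelling's segregation model.
# Grid size, vacancy fraction, and tolerance threshold are arbitrary choices.
import random

SIZE, EMPTY_FRAC, THRESHOLD = 20, 0.2, 0.3

def neighbours(grid, r, c):
    """Return the eight surrounding cells (with wraparound edges)."""
    return [grid[(r + dr) % SIZE][(c + dc) % SIZE]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def unhappy(grid, r, c):
    """An agent is unhappy if too few occupied neighbours share its type."""
    me = grid[r][c]
    if me is None:
        return False
    occupied = [n for n in neighbours(grid, r, c) if n is not None]
    if not occupied:
        return False
    return sum(n == me for n in occupied) / len(occupied) < THRESHOLD

def run(steps=50):
    cells = [None if random.random() < EMPTY_FRAC else random.choice("AB")
             for _ in range(SIZE * SIZE)]
    grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]
    for _ in range(steps):
        movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
                  if unhappy(grid, r, c)]
        empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
                   if grid[r][c] is None]
        random.shuffle(movers)
        for r, c in movers:
            if not empties:
                break
            # Move the unhappy agent to a random empty cell.
            er, ec = empties.pop(random.randrange(len(empties)))
            grid[er][ec], grid[r][c] = grid[r][c], None
            empties.append((r, c))
    return grid

if __name__ == "__main__":
    for row in run():
        print("".join(x or "." for x in row))
```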
An artificial society is an agent-based computational model for computer simulation in social analysis. It is mostly connected to the themes of complex systems, emergence, the Monte Carlo method, computational sociology, multi-agent systems, and evolutionary programming. Although the concept was simple, realizing it took time: complex mathematical models have long been common, whereas deceptively simple models have their roots only in the late 1940s and needed the advent of the microcomputer to really gain momentum.
A computational model uses computer programs to simulate and study complex systems through an algorithmic or mechanistic approach, and is widely used in a diverse range of fields spanning from physics, engineering, chemistry and biology to economics, psychology, cognitive science and computer science.
Cellular automata, as with other multi-agent system models, usually treat time as discrete and state updates as occurring synchronously. The state of every cell in the model is updated together, before any of the new states influence other cells. In contrast, an asynchronous cellular automaton is able to update individual cells independently, in such a way that the new state of a cell affects the calculation of states in neighbouring cells.
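The distinction can be made concrete with a short sketch; the one-dimensional majority rule used here is an arbitrary stand-in for any local update function.

```python
# A sketch contrasting the two update disciplines described above.
import random

def local_rule(left, centre, right):
    """Illustrative rule: adopt the majority state of the neighbourhood."""
    return 1 if left + centre + right >= 2 else 0

def synchronous_step(cells):
    """All cells read the OLD state; new states go into a fresh buffer."""
    n = len(cells)
    return [local_rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
            for i in range(n)]

def asynchronous_step(cells):
    """Cells update one at a time, in place and in random order, so a cell's
    new state can immediately influence its neighbours within one sweep."""
    n = len(cells)
    order = list(range(n))
    random.shuffle(order)
    for i in order:
        cells[i] = local_rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
    return cells
```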
Joshua Morris Epstein is Professor of Epidemiology at the New York University College of Global Public Health. He was formerly Professor of Emergency Medicine at Johns Hopkins University, with joint appointments in the departments of Applied Mathematics, Economics, Biostatistics, International Health, and Environmental Health Sciences, and Director of the JHU Center for Advanced Modeling in the Social, Behavioral, and Health Sciences. He is an External Professor at the Santa Fe Institute, a member of the New York Academy of Sciences, and a member of the Institute of Medicine's Committee on Identifying and Prioritizing New Preventive Vaccines.
Agent-based social simulation consists of social simulations that are based on agent-based modeling and implemented using artificial agent technologies. It is a scientific discipline concerned with the simulation of social phenomena using computer-based multi-agent models, in which persons or groups of persons are represented by agents. The field combines social science, multi-agent simulation, and computer simulation.
Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is the transformation of information. In some cases, the term "informatics" may also be used with different meanings, e.g. in the context of social computing or in the context of library science.
Natural computing, also called natural computation, is a term introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.
Artificial life is a field of study wherein researchers examine systems related to natural life, its processes, and its evolution, through the use of simulations with computer models, robotics, and biochemistry. The discipline was named by Christopher Langton, an American computer scientist, in 1986. In 1987, Langton organized the first conference on the field, in Los Alamos, New Mexico. There are three main kinds of alife, named for their approaches: soft, from software; hard, from hardware; and wet, from biochemistry. Artificial life researchers study traditional biology by trying to recreate aspects of biological phenomena.
This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence (AI), its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.