Artificial society

An artificial society is an agent-based computational model for computer simulation in social analysis. It is closely connected to the themes of complex systems, emergence, the Monte Carlo method, computational sociology, multi-agent systems, and evolutionary programming. Although the underlying concept is simple, realizing it took time: complex mathematical models have long been, and remain, common, whereas deceptively simple computational models have their roots only in the late 1940s and needed the advent of the microcomputer to really get up to speed.

Overview

The aim is to construct parallel simulations consisting of computational devices, referred to as agents, with given properties, in order to model the target phenomena. The subject of study is the process of emergence from the lower (micro) level of a social system to the higher (macro) level.
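
To make this concrete, the sketch below is a toy model invented here purely for illustration (it is not one of the models cited in this article): agents carry a single property, wealth, interact by a simple micro-level rule, and a macro-level observable, the Gini coefficient of the wealth distribution, emerges from many such interactions.

```python
import random

# Minimal illustrative sketch of an artificial society: micro-level rules,
# macro-level pattern. Not drawn from any specific published model.

class Agent:
    def __init__(self, wealth=1):
        self.wealth = wealth

def step(agents):
    """One tick: every agent with wealth gives one unit to a randomly chosen agent."""
    for agent in agents:
        if agent.wealth > 0:
            other = random.choice(agents)  # may pick itself; accepted in this toy
            agent.wealth -= 1
            other.wealth += 1

def gini(values):
    """Macro-level observable: Gini coefficient of the wealth distribution."""
    sorted_vals = sorted(values)
    n = len(sorted_vals)
    cumulative = sum((i + 1) * v for i, v in enumerate(sorted_vals))
    return (2 * cumulative) / (n * sum(sorted_vals)) - (n + 1) / n

agents = [Agent() for _ in range(500)]
for tick in range(200):
    step(agents)

print("Gini coefficient after 200 ticks:", round(gini(a.wealth for a in agents), 3))
```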

The history of agent-based modeling can be traced back to the von Neumann machine, the concept of a machine capable of self-reproduction. The device von Neumann proposed would follow precisely detailed instructions to fashion a copy of itself. The concept was then extended by von Neumann's friend Stanislaw Ulam, also a mathematician, who suggested that the machine be built on paper, as a collection of cells on a grid. The idea intrigued von Neumann, who drew it up, thus creating the first of the devices later termed cellular automata.

A further advance was achieved by the mathematician John Conway, who devised the well-known Game of Life. Unlike von Neumann's machine, Conway's Game of Life operates according to tremendously simple rules in a virtual world in the form of a two-dimensional checkerboard.
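
A minimal sketch of those rules follows (the grid size and starting pattern are arbitrary choices for illustration): a live cell survives with two or three live neighbours, and a dead cell becomes live with exactly three.

```python
# Compact sketch of Conway's Game of Life on a small toroidal grid.

def step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping around at the edges (torus).
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            new[r][c] = 1 if neighbours == 3 or (grid[r][c] and neighbours == 2) else 0
    return new

# A "glider" pattern placed on an 8x8 grid.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for _ in range(4):
    grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid), "\n")
```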

The application of the agent-based model as a social model was primarily initiated by computer scientist Craig Reynolds. He attempted to model living biological agents, a method known as artificial life, a term coined by Christopher Langton.

The computational methods of artificial life were applied to the analysis of social systems, christened "the artificial society" by Joshua M. Epstein and Robert Axtell.[1] Eventually, the artificial society provided a new method for sociological analysis in the form of computational sociology. The principal problem addressed is the classical sociological issue of micro-macro linkage: as first articulated by the French sociologist Émile Durkheim, the question of how individuals within a social system influence, and are influenced by, the macrosocial level.

The artificial society has been widely accepted by recent sociology as a promising method characterized by the extensive use of computer programs and computer simulations, which include evolutionary algorithms (EA), genetic algorithms (GA), genetic programming (GP), memetic programming (MP), agent-based models, and cellular automata (CA).

For many, artificial society is a meeting point for people from more traditional fields in interdisciplinary research, such as linguistics, social physics, mathematics, philosophy, law, computer science, biology, and sociology, in which unusual computational and theoretical approaches that would be controversial within their native discipline can be discussed. As a field, it has had a controversial history; some have characterized it as "practical theology" or a "fact-free science". However, the recent publication of artificial society articles in peer-reviewed journals such as the Journal of Artificial Societies and Social Simulation and the Journal of Social Complexity shows that artificial society techniques are becoming somewhat more accepted within the sociological mainstream.

Related Research Articles

Cellular automaton – Discrete model studied in computer science

A cellular automaton is a discrete model of computation studied in automata theory. Cellular automata are also called cellular spaces, tessellation automata, homogeneous structures, cellular structures, tessellation structures, and iterative arrays. Cellular automata have found application in various areas, including physics, theoretical biology and microstructure modeling.
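
For illustration, the sketch below implements an elementary one-dimensional, two-state cellular automaton; the particular rule number (Rule 30) and the width are arbitrary example choices.

```python
# Sketch of an elementary cellular automaton. The rule number encodes the next
# state of a cell as a function of its left neighbour, itself, and its right
# neighbour (eight possible local patterns).

def elementary_ca(rule_number, width=63, steps=30):
    rule = [(rule_number >> i) & 1 for i in range(8)]  # lookup table for the 8 patterns
    cells = [0] * width
    cells[width // 2] = 1                              # single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else " " for c in cells))
        cells = [
            rule[(cells[(i - 1) % width] << 2) | (cells[i] << 1) | cells[(i + 1) % width]]
            for i in range(width)
        ]

elementary_ca(30)  # Rule 30, chosen only as an example
```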

Evolutionary computation – Trial-and-error problem solvers with a metaheuristic or stochastic optimization character

In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character.
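
The sketch below is a toy genetic algorithm, one member of this family, applied to the standard OneMax exercise (maximize the number of ones in a bit string); all parameter values are arbitrary illustrative choices.

```python
import random

# Toy genetic algorithm for the OneMax problem: evolve bit strings toward all ones.

LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 40, 30, 60, 0.02

def fitness(bits):
    return sum(bits)                            # number of ones: higher is better

def mutate(bits):
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in bits]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)           # single-point crossover
    return a[:cut] + b[cut:]

def select(population):
    # Tournament selection: the better of two randomly drawn individuals.
    return max(random.sample(population, 2), key=fitness)

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", LENGTH)
```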

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

Social simulation is a research field that applies computational methods to study issues in the social sciences. The issues explored include problems in computational law, psychology, organizational behavior, sociology, political science, economics, anthropology, geography, engineering, archaeology and linguistics.

In sociology, social complexity is a conceptual framework used in the analysis of society. Contemporary definitions of complexity in the sciences are found in relation to systems theory, in which a phenomenon under study has many parts and many possible arrangements of the relationships between those parts. At the same time, what is complex and what is simple is relative and may change with time.

Computational sociology – Branch of the discipline of sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.

An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems.
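
The sketch below illustrates the Monte Carlo point: a deliberately tiny, made-up stochastic contagion process (not any published ABM) is run many times with different seeds, and the spread of outcomes across runs is summarized.

```python
import random
import statistics

# One run of a stochastic model is only one sample; Monte Carlo repetition
# characterizes the distribution of outcomes.

def run_once(seed, n_agents=200, p_transmit=0.05, steps=50):
    rng = random.Random(seed)
    infected = [False] * n_agents
    infected[0] = True
    for _ in range(steps):
        for i in range(n_agents):
            if infected[i]:
                j = rng.randrange(n_agents)      # random contact with another agent
                if rng.random() < p_transmit:
                    infected[j] = True
    return sum(infected)

outcomes = [run_once(seed) for seed in range(100)]
print("mean final infected:", statistics.mean(outcomes))
print("std dev across runs:", round(statistics.pstdev(outcomes), 2))
```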

Generative science – Study of how complex behaviour can be generated by deterministic and finite rules and parameters

Generative science is an area of research that explores the natural world and its complex behaviours. It explores ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena". By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real world situation. An example field of study is how unintended consequences arise in social processes.

Human-based computation (HBC), human-assisted computation, ubiquitous human computing or distributed thinking is a computer science technique in which a machine performs its function by outsourcing certain steps to humans, usually as microwork. This approach uses differences in abilities and alternative costs between humans and computer agents to achieve symbiotic human–computer interaction. For computationally difficult tasks such as image recognition, human-based computation plays a central role in training Deep Learning-based Artificial Intelligence systems. In this case, human-based computation has been referred to as human-aided artificial intelligence.

Computer simulation is a prominent method in organizational studies and strategic management. While there are many uses for computer simulation, most academics in the fields of strategic management and organizational studies have used computer simulation to understand how organizations or firms operate. More recently, however, researchers have also started to apply computer simulation to understand organizational behaviour at a more micro-level, focusing on individual and interpersonal cognition and behavior such as team working.

The idea of human artifacts being given life has fascinated humankind for at least 3000 years. As seen in tales ranging from Pygmalion to Frankenstein, humanity has long been intrigued by the concept of artificial life.

Von Neumann universal constructor – Self-replicating machine in a cellular automaton environment

John von Neumann's universal constructor is a self-replicating machine in a cellular automaton (CA) environment. It was designed in the 1940s, without the use of a computer. The fundamental details of the machine were published in von Neumann's book Theory of Self-Reproducing Automata, completed in 1966 by Arthur W. Burks after von Neumann's death. While typically not as well known as von Neumann's other work, it is regarded as foundational for automata theory, complex systems, and artificial life. Indeed, Nobel Laureate Sydney Brenner considered von Neumann's work on self-reproducing automata central to biological theory as well, allowing us to "discipline our thoughts about machines, both natural and artificial."

Cellular automata, as with other multi-agent system models, usually treat time as discrete and state updates as occurring synchronously. The state of every cell in the model is updated together, before any of the new states influence other cells. In contrast, an asynchronous cellular automaton is able to update individual cells independently, in such a way that the new state of a cell affects the calculation of states in neighbouring cells.
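
The toy example below (a one-dimensional majority rule invented for illustration) contrasts the two schemes: the synchronous step computes every new state from the old configuration before applying any of them, while the asynchronous step updates cells one at a time so that earlier updates can affect later ones.

```python
import random

# Synchronous vs asynchronous update of a toy 1-D majority rule: each cell
# takes the majority state of itself and its two neighbours.

def majority(left, centre, right):
    return 1 if left + centre + right >= 2 else 0

def synchronous_step(cells):
    # All new states are computed from the old configuration.
    n = len(cells)
    return [majority(cells[(i - 1) % n], cells[i], cells[(i + 1) % n]) for i in range(n)]

def asynchronous_step(cells, rng):
    # Cells are updated one at a time in random order; a cell's new state
    # immediately influences the cells updated after it.
    cells = cells[:]
    n = len(cells)
    for i in rng.sample(range(n), n):
        cells[i] = majority(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
    return cells

rng = random.Random(42)
start = [rng.randint(0, 1) for _ in range(20)]
print("start       :", start)
print("synchronous :", synchronous_step(start))
print("asynchronous:", asynchronous_step(start, rng))
```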

Joshua M. Epstein is Professor of Epidemiology at the New York University College of Global Public Health. He was formerly Professor of Emergency Medicine at Johns Hopkins University, with joint appointments in the departments of Applied Mathematics, Economics, Biostatistics, International Health, and Environmental Health Sciences, and was Director of the JHU Center for Advanced Modeling in the Social, Behavioral, and Health Sciences. He is an External Professor at the Santa Fe Institute, a member of the New York Academy of Sciences, and a member of the Institute of Medicine's Committee on Identifying and Prioritizing New Preventive Vaccines.

Agent-based social simulation (ABSS) consists of social simulations that are based on agent-based modeling and implemented using artificial agent technologies. It is a scientific discipline concerned with the simulation of social phenomena using computer-based multi-agent models. In these simulations, persons or groups of persons are represented by agents. ABSS combines social science, multi-agent simulation, and computer simulation.

Nigel Gilbert

Geoffrey Nigel Gilbert is a British sociologist and a pioneer in the use of agent-based models in the social sciences. He is the founder and director of the Centre for Research in Social Simulation, author of several books on computational social science, social simulation and social research and past editor of the Journal of Artificial Societies and Social Simulation (JASSS), the leading journal in the field.

Sugarscape is a model for artificially intelligent agent-based social simulation following some or all of the rules presented by Joshua M. Epstein and Robert Axtell in their book Growing Artificial Societies.
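
The sketch below is a heavily simplified toy in the spirit of Sugarscape's basic movement, harvesting, and metabolism rules; it is not a faithful implementation of the model in Growing Artificial Societies, and all parameters are arbitrary illustrative choices.

```python
import random

# Toy Sugarscape-like sketch: agents move toward sugar, harvest it, pay a
# metabolic cost, and die when their wealth runs out; sugar regrows over time.

SIZE, N_AGENTS, STEPS = 20, 40, 50
rng = random.Random(0)

# Each site holds sugar that regrows by one unit per step up to its capacity.
capacity = [[rng.randint(1, 4) for _ in range(SIZE)] for _ in range(SIZE)]
sugar = [row[:] for row in capacity]

# An agent is [row, col, wealth, metabolism].
agents = [[rng.randrange(SIZE), rng.randrange(SIZE), 5, rng.randint(1, 3)]
          for _ in range(N_AGENTS)]

for _ in range(STEPS):
    for agent in agents:
        r, c, wealth, metabolism = agent
        # Movement rule: inspect the four neighbouring sites and move to the sugariest one.
        neighbours = [((r + dr) % SIZE, (c + dc) % SIZE)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        r, c = max(neighbours, key=lambda pos: sugar[pos[0]][pos[1]])
        # Harvest the site's sugar, then pay the metabolic cost.
        wealth += sugar[r][c] - metabolism
        sugar[r][c] = 0
        agent[:] = [r, c, wealth, metabolism]
    agents = [a for a in agents if a[2] > 0]              # starved agents die
    for r in range(SIZE):                                 # sugar regrows
        for c in range(SIZE):
            sugar[r][c] = min(capacity[r][c], sugar[r][c] + 1)

print("surviving agents:", len(agents), "of", N_AGENTS)
print("mean wealth:", round(sum(a[2] for a in agents) / max(len(agents), 1), 2))
```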

Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

Robert Axtell is a Professor at George Mason University, Krasnow Institute for Advanced Study, where he is Departmental Chair of the Department of Computational Social Science. He is also a member of the External Faculty of the Santa Fe Institute and co-Director of the new Computational Public Policy Lab at Mason.

Artificial life – Field of study

Artificial life is a field of study wherein researchers examine systems related to natural life, its processes, and its evolution, through the use of simulations with computer models, robotics, and biochemistry. The discipline was named by Christopher Langton, an American theoretical biologist, in 1986. In 1987 Langton organized the first conference on the field, in Los Alamos, New Mexico. There are three main kinds of alife, named for their approaches: soft, from software; hard, from hardware; and wet, from biochemistry. Artificial life researchers study traditional biology by trying to recreate aspects of biological phenomena.

References

  1. Epstein, Joshua M.; Axtell, Robert L. (1996). Growing Artificial Societies: Social Science from the Bottom Up. Cambridge, MA: MIT Press/Brookings Institution Press. p. 224. ISBN 978-0-262-55025-3.