Computer simulation

[Figure: A 48-hour computer simulation of Typhoon Mawar using the Weather Research and Forecasting model]
[Figure: Process of building a computer model, and the interplay between experiment, simulation, and theory]

Computer simulation is the process of mathematical modeling, performed on a computer, which is designed to predict the behavior of, or the outcome of, a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions. [1]


Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. In 1997, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. [2] Other examples include a 1-billion-atom model of material deformation; [3] a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005; [4] a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level. [5]

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification. [6]

Simulation versus model

A model consists of the equations used to capture the behavior of a system. By contrast, computer simulation is the actual running of a program that performs algorithms which solve those equations, often in an approximate manner. Simulation, therefore, is the process of running a model. Thus one would not "build a simulation"; instead, one would "build a model (or a simulator)", and then either "run the model" or, equivalently, "run a simulation".
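To make the distinction concrete, here is a minimal Python sketch; the decay equation, rate constant, and step size are illustrative assumptions, not a method taken from the literature. The model is the equation, and the simulation is the run that solves it approximately.

```python
# Model: the equation dx/dt = -k * x (exponential decay), a hypothetical
# example; the rate constant k is an assumption for illustration.
def model_derivative(x, k=0.5):
    return -k * x

# Simulation: actually running the model, solving the equation
# approximately with explicit Euler time steps.
def run_simulation(x0=1.0, dt=0.01, steps=1000):
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = x + dt * model_derivative(x)  # approximate update
        trajectory.append(x)
    return trajectory

print(run_simulation()[-1])  # compare with the exact solution exp(-0.5 * 10)
```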

History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation; that first simulation modeled 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulations; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible. [7]
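As a hedged illustration of that common feature, the sketch below uses Monte Carlo sampling to estimate π by drawing representative random points instead of enumerating states; the sample size and seed are arbitrary choices.

```python
import random

# Monte Carlo sketch: estimate pi by sampling random points in the unit
# square and counting how many fall inside the quarter circle, rather
# than enumerating every possible state.
random.seed(0)  # fixed seed so the run is reproducible
n = 100_000
inside = sum(1 for _ in range(n)
             if random.random() ** 2 + random.random() ** 2 <= 1.0)
print(4 * inside / n)  # approaches 3.14159... as n grows
```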

Data preparation

The external data requirements of simulations and models vary widely. For some, the input might be just a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others might require terabytes of information (such as weather and climate models).

Input sources also vary widely, ranging from sensors and other physical devices connected to the model, to data entered by hand, to values produced as output by other simulations, models, or processes.

Lastly, the time at which data is available varies: some values are built into the model code, some are read in when the simulation starts up, and some are supplied continuously during the simulation run.

Because of this variety, and because diverse simulation systems have many common elements, there are a large number of specialized simulation languages. The best-known may be Simula; many others now exist.

Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read in values from text or binary files, what is much harder is knowing what the accuracy (compared to measurement resolution and precision) of the values is. Such values are often expressed as "error bars", a minimum and maximum deviation from the value within which the true value is expected to lie. Because digital computer mathematics is not perfect, rounding and truncation errors compound this error, so it is useful to perform an "error analysis" [8] to confirm that values output by the simulation will still be usefully accurate.
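A minimal sketch of such an error analysis, under the assumption that inputs carry simple min/max error bars: sample inputs within their bars, run the model repeatedly, and report the output spread. The measured value, its uncertainty, and the toy free-fall model are invented for illustration.

```python
import random

# Hypothetical input: a measured value with error bars (min/max deviation).
value, err = 9.81, 0.05   # assumed measurement and its uncertainty

def simulate(g, t=2.0):
    # Toy model: distance fallen under gravity g after time t.
    return 0.5 * g * t * t

random.seed(1)
outputs = [simulate(random.uniform(value - err, value + err))
           for _ in range(10_000)]
print(min(outputs), max(outputs))  # output "error bars" after propagation
```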

Types

Models used for computer simulations can be classified according to several independent pairs of attributes, including: stochastic or deterministic (and, as a special case of deterministic, chaotic); steady-state or dynamic; continuous or discrete (and, as an important special case of discrete, discrete-event); and local or distributed.

Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes: simulations that store their data in regular grids and require only next-neighbor access, called stencil codes (many computational fluid dynamics applications belong to this category); and simulations whose underlying graph is not a regular grid, which may belong to the class of meshfree methods. A sketch of a stencil update follows.
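Here is a minimal stencil-code sketch, assuming a 1D heat-diffusion toy problem: the state lives on a regular grid and each time step reads only a cell and its two next neighbors. The coefficients are illustrative and chosen to satisfy the usual explicit-scheme stability bound.

```python
# Stencil-code sketch: explicit time stepping of 1D heat diffusion on a
# regular grid; each update touches only a cell and its two neighbors.
alpha, dx, dt = 0.1, 1.0, 1.0      # assumed coefficients (alpha*dt/dx**2 <= 0.5 for stability)
u = [0.0] * 50
u[25] = 100.0                      # initial hot spot
for _ in range(100):               # time-stepped loop
    u = [u[i] + alpha * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
         if 0 < i < len(u) - 1 else u[i]
         for i in range(len(u))]
print(max(u))                      # peak temperature after diffusion
```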

For steady-state simulations, equations define the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.
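For comparison, a hedged sketch of the steady-state case for the same toy problem: instead of stepping through time, iterate the equilibrium equations directly. Plain Gauss–Seidel sweeps are used so no libraries are required; the boundary temperatures are assumptions.

```python
# Steady-state sketch: solve u[i-1] - 2*u[i] + u[i+1] = 0 with fixed
# boundary temperatures, sweeping until the system stops changing.
n = 50
u = [0.0] * n
u[0], u[-1] = 100.0, 0.0           # assumed boundary conditions
for _ in range(5000):              # Gauss-Seidel sweeps toward equilibrium
    for i in range(1, n - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1])
print(u[n // 2])                   # ~49: linear temperature profile at equilibrium
```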

Visualization

Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters. The use of the matrix format was related to the traditional use of the matrix concept in mathematical models. However, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving images or motion pictures generated from the data, as displayed by computer-generated imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas, from observing a moving weather chart they might be able to predict events (and "see that rain was headed their way") much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain/snow clouds against a map that uses numeric coordinates and numeric timestamps of events.

Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head, as the tumor changes.

Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.

In science

[Figure: Computer simulation of the process of osmosis]

Generic examples of types of computer simulations in science, which are derived from an underlying mathematical description, include numerical simulations of differential equations that cannot be solved analytically (common for continuous systems such as those studied in fluid dynamics and climate modeling) and stochastic simulations, typically used for discrete systems in which events occur probabilistically and which cannot be described directly with differential equations.

Specific examples of computer simulations include weather forecasting models, flight simulators, vehicle crash models, and molecular-scale models such as the ribosome simulation noted above.

Notable, and sometimes controversial, computer simulations used in science include: Donella Meadows' World3, used in Limits to Growth; James Lovelock's Daisyworld; and Thomas Ray's Tierra.

In social sciences, computer simulation is an integral component of the five angles of analysis fostered by the data percolation methodology, [12] which also includes qualitative and quantitative methods, reviews of the literature (including scholarly literature), and interviews with experts, and which forms an extension of data triangulation. As with any other scientific method, replication is an important part of computational modeling. [13]

In practical contexts

Computer simulations are used in a wide variety of practical contexts, such as weather forecasting, flight simulation for pilot training, vehicle crash testing, and the modeling of computer performance. [14]

The reliability and the trust people put in computer simulations depends on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is the reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a particular point of attention in stochastic simulations, where the random numbers should actually be pseudo-random numbers drawn from a seeded generator. Human-in-the-loop simulations such as flight simulations and computer games are an exception to reproducibility: here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
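A minimal sketch of reproducible stochastic simulation, assuming Python's standard-library generator: fixing the seed of the pseudo-random number generator makes every execution return the same answer.

```python
import random

def stochastic_simulation(seed):
    rng = random.Random(seed)          # dedicated, seeded pseudo-random generator
    # Toy stochastic model: mean of noisy samples.
    return sum(rng.gauss(0.0, 1.0) for _ in range(1000)) / 1000

print(stochastic_simulation(42) == stochastic_simulation(42))  # True: reproducible
```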

Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a copy of the car in a physics simulation environment, they can save the hundreds of thousands of dollars that would otherwise be required to build and test a unique prototype. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype. [15]

Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real-time, e.g., in training simulations. In some cases animations may also be useful in faster than real-time or even slower than real-time modes. For example, faster than real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various ways of scientific visualization.
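As a hedged example of aggregating a run into a static image, assuming the matplotlib library is available (the random-walk queue model is invented for illustration):

```python
import random
import matplotlib.pyplot as plt

# Toy simulation output: queue length over time (a random walk, assumed).
random.seed(3)
queue = [0]
for _ in range(500):
    queue.append(max(0, queue[-1] + random.choice([-1, 1])))

plt.plot(queue)                        # aggregate the whole run into one image
plt.xlabel("time step")
plt.ylabel("queue length")
plt.savefig("queue_buildup.png")       # static scientific visualization
```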

In debugging, simulating a program execution under test (rather than executing natively) can detect far more errors than the hardware alone can, while logging useful debugging information such as instruction traces, memory alterations and instruction counts. This technique can also detect buffer overflows and similar "hard to detect" errors, as well as produce performance information and tuning data.
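A toy sketch of this idea (the instruction set and memory model are invented, not any real architecture): a tiny interpreter "executes" a program in simulation, logging an instruction trace and count, and flagging an out-of-bounds write that native execution might silently permit.

```python
# Toy instruction simulator: runs (op, arg) pairs against a small memory,
# keeping a trace and instruction count, and flagging out-of-bounds writes.
def simulate(program, mem_size=8):
    memory, trace, count = [0] * mem_size, [], 0
    for op, arg in program:
        count += 1
        trace.append((count, op, arg))
        if op == "store":
            if not 0 <= arg < mem_size:          # buffer-overflow check
                raise IndexError(f"write outside memory at step {count}")
            memory[arg] = count
    return memory, trace, count

memory, trace, count = simulate([("store", 2), ("store", 7)])
print(count, trace)
```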

Pitfalls

Although sometimes ignored in computer simulations, it is important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g., the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
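A minimal sketch of such a check, with invented numbers standing in for the oilfield parameters: sample a one-significant-figure input over its plausible range and observe how little of the output precision survives.

```python
import random

# Hypothetical parameter known to one significant figure: net ratio ~ 0.3,
# i.e., anywhere in [0.25, 0.35). All values here are illustrative stand-ins.
def recoverable_oil(net_ratio, volume=1.0e9):
    return net_ratio * volume          # toy stand-in for the full Monte Carlo model

random.seed(7)
samples = [recoverable_oil(random.uniform(0.25, 0.35)) for _ in range(10_000)]
print(min(samples), max(samples))      # spread shows only ~1 significant figure is meaningful
```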


Related Research Articles

Simulation: Imitation of the operation of a real-world process or system over time

A simulation is an imitative representation of a process or system that could exist in the real world. In this broad sense, simulation can often be used interchangeably with model. Sometimes a clear distinction between the two terms is made, in which simulations require the use of models; the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time. Another way to distinguish between the terms is to define simulation as experimentation with the help of a model. This definition includes time-independent simulations. Often, computers are used to execute the simulation.

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. The name comes from the Monte Carlo Casino in Monaco, where the primary developer of the method, physicist Stanislaw Ulam, was inspired by his uncle's gambling habits.

Computational physics: Numerical simulations of physical problems via computers

Computational physics is the study and implementation of numerical analysis to solve problems in physics. Historically, computational physics was the first application of modern computers in science, and is now a subset of computational science. It is sometimes regarded as a subdiscipline of theoretical physics, but others consider it an intermediate branch between theoretical and experimental physics — an area of study which supplements both theory and experiment.

A hybrid system is a dynamical system that exhibits both continuous and discrete dynamic behavior – a system that can both flow and jump. Often, the term "hybrid dynamical system" is used, to distinguish over hybrid systems such as those that combine neural nets and fuzzy logic, or electrical and mechanical drivelines. A hybrid system has the benefit of encompassing a larger class of systems within its structure, allowing for more flexibility in modeling dynamic phenomena.

Mathematical and theoretical biology: Branch of biology

Mathematical and theoretical biology, or biomathematics, is a branch of biology which employs theoretical analysis, mathematical models and abstractions of living organisms to investigate the principles that govern the structure, development and behavior of such systems, in contrast to experimental biology, which deals with conducting experiments to test scientific theories. The field is sometimes called mathematical biology to stress the mathematical side, or theoretical biology to stress the biological side. Theoretical biology focuses more on the development of theoretical principles for biology, while mathematical biology focuses on the use of mathematical tools to study biological systems, even though the two terms are sometimes interchanged.

Network traffic simulation is a process used in telecommunications engineering to measure the efficiency of a communications network.

Reservoir simulation: Using computer models to predict the flow of fluids through porous media

Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media.

A stochastic simulation is a simulation of a system that has variables that can change stochastically (randomly) with individual probabilities.

Systems immunology is a research field under systems biology that uses mathematical approaches and computational methods to examine the interactions within cellular and molecular networks of the immune system. The immune system has been thoroughly analyzed with regard to its components and function by using a "reductionist" approach, but its overall function cannot easily be predicted by studying the characteristics of its isolated components, because these strongly depend on the interactions among the numerous constituents. The field focuses on in silico experiments rather than in vivo work.

Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, such as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mock-up of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome.

Dynamic simulation is the use of a computer program to model the time-varying behavior of a dynamical system. The systems are typically described by ordinary differential equations or partial differential equations. A simulation run solves the state-equation system to find the behavior of the state variables over a specified period of time. The equations are solved through numerical integration methods to produce the transient behavior of the state variables. Simulation of dynamic systems predicts the values of model-system state variables, as they are determined by past state values. This relationship is found by creating a model of the system.

Process simulation

Process simulation is used for the design, development, analysis, and optimization of technical processes such as chemical plants, chemical processes, environmental systems, power stations, complex manufacturing operations, biological processes, and similar technical functions.

GoldSim is dynamic, probabilistic simulation software developed by GoldSim Technology Group. This general-purpose simulator is a hybrid of several simulation approaches, combining an extension of system dynamics with some aspects of discrete event simulation, and embedding the dynamic simulation engine within a Monte Carlo simulation framework.

Cellular model

A cellular model is a mathematical model of aspects of a biological cell, for the purposes of in silico research.

MLDesigner is an integrated modeling and simulation tool for the design and analysis of complex embedded and networked systems. MLDesigner speeds up modeling, simulation and analysis of discrete event, discrete time and continuous time systems concerning architecture, function and performance. The tool is based on ideas from the "Ptolemy Project" at the University of California, Berkeley. MLDesigner is developed by MLDesign Technologies Inc., Palo Alto, CA, USA, in collaboration with Mission Level Design GmbH, Ilmenau, Germany.

Equation-free modeling is a method for multiscale computation and computer-aided analysis. It is designed for a class of complicated systems in which one observes evolution at a macroscopic, coarse scale of interest, while accurate models are only given at a finely detailed, microscopic, level of description. The framework empowers one to perform macroscopic computational tasks using only appropriately initialized microscopic simulation on short time and small length scales. The methodology eliminates the derivation of explicit macroscopic evolution equations when these equations conceptually exist but are not available in closed form; hence the term equation-free.

Microscale and macroscale models: Classes of computational models

Microscale models form a broad class of computational models that simulate fine-scale details, in contrast with macroscale models, which amalgamate details into select categories. Microscale and macroscale models can be used together to understand different aspects of the same problem.

Multi-state modeling of biomolecules refers to a series of techniques used to represent and compute the behaviour of biological molecules or complexes that can adopt a large number of possible functional states.

Simulation-based optimization

Simulation-based optimization integrates optimization techniques into simulation modeling and analysis. Because of the complexity of the simulation, the objective function may become difficult and expensive to evaluate. Usually, the underlying simulation model is stochastic, so that the objective function must be estimated using statistical estimation techniques.

References

  1. Strogatz, Steven (2007). "The End of Insight". In Brockman, John (ed.). What Is Your Dangerous Idea?. HarperCollins. ISBN 9780061214950.
  2. "Researchers stage largest Military Simulation ever". Jet Propulsion Laboratory, Caltech. December 4, 1997. Archived from the original on 2008-01-22.
  3. "Molecular Simulation of Macroscopic Phenomena". IBM Research - Almaden. Archived from the original on 2013-05-22.
  4. Ambrosiano, Nancy (October 19, 2005). "Largest computational biology simulation mimics life's most essential nanomachine". Los Alamos, NM: Los Alamos National Laboratory. Archived from the original on 2007-07-04.
  5. Graham-Rowe, Duncan (June 6, 2005). "Mission to build a simulated brain begins". New Scientist. Archived from the original on 2015-02-09.
  6. Santner, Thomas J.; Williams, Brian J.; Notz, William I. (2003). The Design and Analysis of Computer Experiments. Springer Verlag.
  7. Bratley, Paul; Fox, Bennet L.; Schrage, Linus E. (2011-06-28). A Guide to Simulation. Springer Science & Business Media. ISBN 9781441987242.
  8. Taylor, John Robert (1999). An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements. University Science Books. pp. 128–129. ISBN 978-0-935702-75-0. Archived from the original on 2015-03-16.
  9. Gupta, Ankur; Rawlings, James B. (April 2014). "Comparison of Parameter Estimation Methods in Stochastic Chemical Kinetic Models: Examples in Systems Biology". AIChE Journal. 60 (4): 1253–1268. Bibcode:2014AIChE..60.1253G. doi:10.1002/aic.14409. ISSN 0001-1541. PMC 4946376. PMID 27429455.
  10. Atanasov, AG; Waltenberger, B; Pferschy-Wenzig, EM; Linder, T; Wawrosch, C; Uhrin, P; Temml, V; Wang, L; Schwaiger, S; Heiss, EH; Rollinger, JM; Schuster, D; Breuss, JM; Bochkov, V; Mihovilovic, MD; Kopp, B; Bauer, R; Dirsch, VM; Stuppner, H (2015). "Discovery and resupply of pharmacologically active plant-derived natural products: A review". Biotechnology Advances. 33 (8): 1582–1614. doi:10.1016/j.biotechadv.2015.08.001. PMC 4748402. PMID 26281720.
  11. Mizukami, Koichi; Saito, Fumio; Baron, Michel. "Study on grinding of pharmaceutical products with an aid of computer simulation". Archived 2011-07-21 at the Wayback Machine.
  12. Mesly, Olivier (2015). Creating Models in Psychological Research. United States: Springer Psychology. 126 pp. ISBN 978-3-319-15752-8.
  13. Wilensky, Uri; Rand, William (2007). "Making Models Match: Replicating an Agent-Based Model". Journal of Artificial Societies and Social Simulation. 10 (4): 2.
  14. Wescott, Bob (2013). The Every Computer Performance Book, Chapter 7: Modeling Computer Performance. CreateSpace. ISBN 978-1482657753.
  15. Baase, Sara (2007). A Gift of Fire: Social, Legal, and Ethical Issues for Computing and the Internet (3rd ed.). Upper Saddle River: Prentice Hall. pp. 363–364. ISBN 0-13-600848-8.
