Simulation governance

Simulation governance is a managerial function concerned with assuring the reliability of information generated by numerical simulation. The term was introduced in 2011 [1] and specific technical requirements were addressed from the perspective of mechanical design in 2012. [2] Its strategic importance was addressed in 2015. [3] [4] At the 2017 NAFEMS World Congress in Stockholm, simulation governance was identified as the first of eight "big issues" in numerical simulation.

Simulation governance is concerned with (a) selection and adoption of the best available simulation technology, (b) formulation of mathematical models, (c) management of experimental data, (d) data and solution verification procedures, and (e) revision of mathematical models in the light of new information collected from physical experiments and field observations. [5]

Plans for simulation governance have to be formulated to fit the mission of each organization, or of each department within an organization. In the terminology of structural and mechanical engineering, typical missions are:

  1. Application of established rules of design and certification: Given the allowable value F_all defined in a design rule, show that F ≤ F_all.
  2. Formulation of design rules (typically for new materials or material systems): What is F_all? This involves the interpretation of results from coupon tests and component tests.
  3. Condition-based maintenance (typically of high-value assets): Given a detected flaw, what is the probability that failure will occur after N load cycles?
  4. Structural analysis of large structures (such as airframes, marine structures, automobiles under crash conditions).
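The first mission above amounts to a simple acceptance check against an allowable value. The following sketch illustrates it; the function name, the symbols F and F_all, and the numeric values are illustrative assumptions, not taken from any cited design rule:

```python
# Design-rule check sketch: verify that a computed quantity of interest F
# does not exceed the allowable value F_all defined in a design rule.
# All names and values here are illustrative assumptions.

def check_design_rule(F: float, F_all: float) -> tuple[bool, float]:
    """Return (passes, margin of safety) for the rule F <= F_all."""
    margin = F_all / F - 1.0  # negative margin means the rule is violated
    return F <= F_all, margin

ok, ms = check_design_rule(F=250.0, F_all=300.0)  # e.g. stresses in MPa
```

In practice the computed value F would come from a verified numerical solution, and F_all from the governing design rule; the point of simulation governance is to control the reliability of both inputs.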

Note that items 1 to 3 require strength analysis, where the quantities of interest are related to the first derivatives of the displacement field. Item 4 refers to structural analysis, where the quantities of interest are force-displacement relations or accelerations (as in crash dynamics). This distinction is important because in strength analysis the errors associated with the formulation of mathematical models and the errors of their numerical solution, for example by the finite element method, must be treated separately, and verification, validation and uncertainty quantification must be applied. [6] [7]
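One common solution-verification procedure for separating numerical error from modeling error is Richardson extrapolation: the same quantity of interest is computed on a sequence of refined meshes, and the observed convergence rate and extrapolated limit are used to estimate the discretization error. The sketch below uses hypothetical stress values, not data from any cited study:

```python
# Solution verification sketch: estimate the discretization error of a
# quantity of interest from three solutions on uniformly refined meshes,
# using Richardson extrapolation. Input values are hypothetical.
import math

def richardson(q1: float, q2: float, q3: float, r: float = 2.0):
    """Given a quantity of interest on coarse (q1), medium (q2) and fine (q3)
    meshes with refinement ratio r, return (observed convergence rate,
    extrapolated value, estimated relative error of the fine solution)."""
    p = math.log((q1 - q2) / (q2 - q3)) / math.log(r)  # observed rate
    q_ext = q3 + (q3 - q2) / (r**p - 1.0)              # extrapolated limit
    err = abs((q3 - q_ext) / q_ext)                    # relative error estimate
    return p, q_ext, err

# Hypothetical maximum stress values (MPa) from successive h-refinements:
p, q_ext, err = richardson(312.0, 328.0, 332.0)
```

If the observed rate p is stable across refinements, the extrapolated value serves as a reference for reporting the estimated numerical error separately from modeling error, as verification requires.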

In structural analysis, on the other hand, numerical problems constructed by assembling elements from a finite element library, a practice known as finite element modeling, can produce satisfactory results. In this case the numerical solution stands on its own; typically it is not an approximation to a well-posed mathematical problem, and therefore neither solution verification nor model validation can be performed. Satisfactory results are obtained by artful tuning of finite element models against sets of experimental data, so that two large errors nearly cancel one another. One error is conceptual: inadmissible data violate basic assumptions of the formulation. The other error is numerical: one or more quantities of interest diverge, but the rate of divergence is slow and may not be visible at the mesh refinements used in practice. [6]
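The slow divergence described above can be illustrated with a synthetic example (not an actual finite element run). Near a reentrant corner the exact stress behaves like r^(λ−1) with λ < 1, so the peak stress recovered on a mesh of size h grows roughly like h^(λ−1) as h → 0. The exponent used below, λ ≈ 0.5445, is the commonly quoted value for the L-shaped elasticity domain and is an assumption of this sketch:

```python
# Synthetic illustration of slow divergence: the peak stress recovered at
# mesh size h is modeled as h**(lam - 1) with lam < 1, so it grows without
# bound as h -> 0, but slowly enough to look converged in practice.
lam = 0.5445  # singularity exponent (assumed; L-shaped elasticity domain)
sizes = [0.1, 0.05, 0.025, 0.0125]          # successively halved mesh sizes
peaks = [h ** (lam - 1.0) for h in sizes]   # modeled peak stress values
```

Each halving of h multiplies the modeled peak by only 2**(1 − lam) ≈ 1.37, so over the few refinements used in practice the unbounded growth is easy to mistake for convergence.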

References

  1. Szabó B. and Actis R. Simulation governance: New technical requirements for software tools in computational solid mechanics. International Workshop on Verification and Validation in Computational Science, University of Notre Dame, 17–19 October 2011.
  2. Szabó B. and Actis R. Simulation governance: Technical requirements for mechanical design. Comput. Methods Appl. Mech. Engrg. 249–252, 158–168, 2012.
  3. Meintjes K. Simulation Governance: Managing Simulation as a Strategic Capability. NAFEMS Benchmark Magazine, January 2015.
  4. Imbert J-F. Simulation Governance: Building Confidence, a Key Dimension of Simulation Strategy. NAFEMS World Congress NWC15, San Diego, June 2015.
  5. Oberkampf WL and Pilch M. Simulation Verification and Validation for Managers. NAFEMS, 2017. ISBN 978-1-910643-33-4.
  6. Szabó B and Babuška I. Finite Element Analysis: Method, Verification and Validation. 2nd edition. John Wiley & Sons, Hoboken, NJ, 2021.
  7. Szabó B and Babuška I. Methodology of model development in the applied sciences. Journal of Computational and Applied Mechanics 16(2), 75–86, 2021. DOI: 10.32973/jcam.2021.005.