Finite element updating

Finite element model updating is the process of adjusting a finite element model so that it reflects the measured data better than the initial model does. It is part of the verification and validation of numerical models.

A mathematical model is a description of a system using mathematical concepts and language. The process of developing a mathematical model is termed mathematical modeling. Mathematical models are used in the natural sciences and engineering disciplines, as well as in the social sciences.

Verification and validation are independent procedures that are used together for checking that a product, service, or system meets requirements and specifications and that it fulfills its intended purpose. These are critical components of a quality management system such as ISO 9000. The words "verification" and "validation" are sometimes preceded with "independent", indicating that the verification and validation is to be performed by a disinterested third party. "Independent verification and validation" can be abbreviated as "IV&V".

The process

The process begins by choosing the domain in which the data are presented. The domains used include the time domain, frequency domain, modal domain, and time-frequency domain.

Domain of a function (mathematical concept)

In mathematics, the domain of definition of a function is the set of "input" or argument values for which the function is defined. That is, the function provides an "output" or value for each member of the domain. Conversely, the set of values the function takes on as output is termed the image of the function, which is sometimes also referred to as the range of the function.

Time domain

Time domain refers to the analysis of mathematical functions, physical signals or time series of economic or environmental data, with respect to time. In the time domain, the signal or function's value is known for all real numbers, for the case of continuous time, or at various separate instants in the case of discrete time. An oscilloscope is a tool commonly used to visualize real-world signals in the time domain. A time-domain graph shows how a signal changes with time, whereas a frequency-domain graph shows how much of the signal lies within each given frequency band over a range of frequencies.

Frequency domain (signal representation)

In electronics, control systems engineering, and statistics, the frequency domain refers to the analysis of mathematical functions or signals with respect to frequency, rather than time. Put simply, a time-domain graph shows how a signal changes over time, whereas a frequency-domain graph shows how much of the signal lies within each given frequency band over a range of frequencies. A frequency-domain representation can also include information on the phase shift that must be applied to each sinusoid in order to be able to recombine the frequency components to recover the original time signal.
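
As a brief illustration of the two views, the following Python sketch (the signal and sampling rate are chosen here purely for illustration) builds a simple time-domain signal and computes its frequency-domain representation with the discrete Fourier transform:

```python
import numpy as np

# Time-domain signal: a 5 Hz sine wave sampled at 1 kHz for one second.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 5.0 * t)

# Frequency-domain view: the magnitude of the discrete Fourier transform shows
# how much of the signal lies in each frequency bin.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)  # frequency axis in Hz
peak_frequency = freqs[np.argmax(spectrum)]       # close to 5 Hz
```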

The second step is to determine which parts of the initial models are thought to have been modeled incorrectly.

The third step is to formulate an objective function whose parameters are the design variables to be updated and which quantifies the distance between the measured data and the data predicted by the finite element model.
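
A minimal sketch of such a distance function in Python is shown below, assuming the measured data are natural frequencies and that a hypothetical routine fe_predicted_frequencies(params) solves the finite element model for a given set of design variables; both names are illustrative, not part of any particular library:

```python
import numpy as np

def objective(params, measured_freqs, fe_predicted_frequencies):
    """Squared distance between measured and FE-predicted natural frequencies.

    params                   -- candidate values of the uncertain design variables
    measured_freqs           -- natural frequencies identified from test data (Hz)
    fe_predicted_frequencies -- hypothetical callable that solves the FE model for
                                `params` and returns the predicted frequencies (Hz)
    """
    predicted = fe_predicted_frequencies(params)
    # Use relative residuals so that low and high modes contribute comparably.
    residuals = (predicted - measured_freqs) / measured_freqs
    return float(np.sum(residuals ** 2))
```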

Function (mathematics): mapping that associates a single output value to each input

In mathematics, a function was originally the idealization of how a varying quantity depends on another quantity. For example, the position of a planet is a function of time. Historically, the concept was elaborated with the infinitesimal calculus at the end of the 17th century, and, until the 19th century, the functions that were considered were differentiable. The concept of function was formalized at the end of the 19th century in terms of set theory, and this greatly enlarged the domains of application of the concept.

The fourth step is to apply an optimization method to identify the parameters that minimize this function. In most cases, a gradient-based optimization strategy is used. For nonlinear problems, more specialized methods such as response surface modeling, particle swarm optimization, Monte Carlo optimization, and genetic algorithms can be used. Recently, finite element model updating has also been conducted using Bayesian statistics, which gives a probabilistic interpretation of model updating.
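
Continuing the hypothetical sketch above, a gradient-based update could look like the following, where SciPy's L-BFGS-B minimizer adjusts two illustrative design variables (a Young's modulus and a density) within bounds; all names and numerical values here are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# `objective`, `measured_freqs` and `fe_predicted_frequencies` are assumed to be
# defined as in the earlier sketch.
x0 = np.array([2.1e11, 7800.0])                # initial guess: E (Pa), density (kg/m^3)
bounds = [(1.5e11, 2.5e11), (7000.0, 8500.0)]  # physically plausible ranges

result = minimize(
    objective,
    x0,
    args=(measured_freqs, fe_predicted_frequencies),
    method="L-BFGS-B",   # gradient-based; gradients obtained by finite differences
    bounds=bounds,
)
updated_params = result.x   # parameter values that minimize the distance function
```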

Maxima and minima (largest and smallest values taken by a function)

In mathematical analysis, the maxima and minima of a function, known collectively as extrema, are the largest and smallest value of the function, either within a given range or on the entire domain of a function. Pierre de Fermat was one of the first mathematicians to propose a general technique, adequality, for finding the maxima and minima of functions.

Nonlinear system

In mathematics and science, a nonlinear system is a system in which the change of the output is not proportional to the change of the input. Nonlinear problems are of interest to engineers, biologists, physicists, mathematicians, and many other scientists because most systems are inherently nonlinear in nature. Nonlinear dynamical systems, describing changes in variables over time, may appear chaotic, unpredictable, or counterintuitive, contrasting with much simpler linear systems.

Response surface methodology

In statistics, response surface methodology (RSM) explores the relationships between several explanatory variables and one or more response variables. The method was introduced by George E. P. Box and K. B. Wilson in 1951. The main idea of RSM is to use a sequence of designed experiments to obtain an optimal response. Box and Wilson suggest using a second-degree polynomial model to do this. They acknowledge that this model is only an approximation, but they use it because such a model is easy to estimate and apply, even when little is known about the process.
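
As a sketch of that second-degree polynomial model in practice, the following Python function (the name and layout are illustrative) fits such a surface to sampled design points by ordinary least squares:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit the classical RSM model
    y ~ b0 + sum_i b_i x_i + sum_i b_ii x_i^2 + sum_{i<j} b_ij x_i x_j
    where X is an (n_runs, n_factors) array of design points and y the responses."""
    n, k = X.shape
    cols = [np.ones(n)]                                     # intercept
    cols += [X[:, i] for i in range(k)]                     # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]                # pure quadratic terms
    cols += [X[:, i] * X[:, j]                              # two-factor interactions
             for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs
```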

Related Research Articles

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
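
A minimal Python example of the numerical-integration use case (the integrand and sample size are chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo estimate of the integral of exp(-x**2) over [0, 1]: average the
# integrand over uniform random samples; since the interval has length 1, the
# sample mean itself is the estimate (true value is about 0.7468).
samples = rng.uniform(0.0, 1.0, size=100_000)
estimate = np.mean(np.exp(-samples ** 2))
```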

In computer science, soft computing is the use of inexact solutions to computationally hard tasks such as the solution of NP-complete problems, for which there is no known algorithm that can compute an exact solution in polynomial time. Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect the role model for soft computing is the human mind.

Model predictive control (MPC) is an advanced method of process control that is used to control a process while satisfying a set of constraints. It has been in use in the process industries in chemical plants and oil refineries since the 1980s. In recent years it has also been used in power system balancing models and in power electronics. Model predictive controllers rely on dynamic models of the process, most often linear empirical models obtained by system identification. The main advantage of MPC is that it allows the current timeslot to be optimized while taking future timeslots into account. This is achieved by optimizing over a finite time horizon but implementing only the current timeslot and then optimizing again, repeatedly, thus differing from a Linear-Quadratic Regulator (LQR). MPC also has the ability to anticipate future events and can take control actions accordingly. PID controllers do not have this predictive ability. MPC is nearly universally implemented as digital control, although there is research into achieving faster response times with specially designed analog circuitry.
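
The receding-horizon idea can be sketched for a toy scalar system x[k+1] = a*x[k] + b*u[k]; the model, cost weights, and horizon below are purely illustrative, and a general-purpose optimizer stands in for the specialized solvers used in practice:

```python
import numpy as np
from scipy.optimize import minimize

a, b, r, horizon = 0.9, 0.5, 0.1, 10   # toy plant and tuning values (assumptions)

def horizon_cost(u_seq, x0):
    """Quadratic cost of a candidate control sequence over the finite horizon."""
    x, cost = x0, 0.0
    for u in u_seq:
        cost += x ** 2 + r * u ** 2
        x = a * x + b * u
    return cost

def mpc_step(x0):
    """Optimize the whole horizon but return only the first control move."""
    res = minimize(horizon_cost, np.zeros(horizon), args=(x0,))
    return res.x[0]

x = 1.0
for _ in range(20):        # closed loop: re-optimize at every timeslot
    u = mpc_step(x)
    x = a * x + b * u
```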

Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in both computational and real-world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be predicting the acceleration of a human body in a head-on crash with another car: even if the speed were known exactly, small differences in the manufacturing of individual cars, in how tightly every bolt has been tightened, and so on will lead to different results that can only be predicted in a statistical sense.

Tshilidzi Marwala (South African academic administrator)

Tshilidzi Marwala is a South African mechanical engineer. He became a professor at the University of the Witwatersrand in 2003 and also chairperson of System and Control Engineering in South Africa. He previously worked at the CSIR and for South African Breweries.

The nested sampling algorithm is a computational approach to the problem of comparing models in Bayesian statistics, developed in 2004 by physicist John Skilling.

Polynomial chaos (PC), also called Wiener chaos expansion, is a non-sampling-based method to determine the evolution of uncertainty in a dynamical system when there is probabilistic uncertainty in the system parameters. PC was first introduced by Norbert Wiener, who used Hermite polynomials to model stochastic processes with Gaussian random variables. It can be thought of as an extension of Volterra's theory of nonlinear functionals for stochastic systems. According to Cameron and Martin, such an expansion converges in the L2 sense for any arbitrary stochastic process with finite second moment. This applies to most physical systems.
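
A rough sketch of such an expansion for a function of a single standard Gaussian variable, with the coefficients estimated by plain Monte Carlo (the expanded function, exp, and the truncation order are arbitrary examples):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

def pce_coefficients(g, order, n_samples=200_000, seed=0):
    """Hermite chaos coefficients of Y = g(X) with X ~ N(0, 1).

    Uses the orthogonality E[He_j(X) He_k(X)] = k! when j == k (0 otherwise),
    so c_k = E[g(X) He_k(X)] / k!, estimated here by Monte Carlo averaging."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)
    gx = g(x)
    coeffs = []
    for k in range(order + 1):
        he_k = hermeval(x, [0.0] * k + [1.0])   # He_k evaluated at the samples
        coeffs.append(np.mean(gx * he_k) / factorial(k))
    return np.array(coeffs)

# Example: expand Y = exp(X) up to order 6; the truncated series converges in L2.
c = pce_coefficients(np.exp, order=6)
```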

OptiY is a design environment providing modern optimization strategies and state-of-the-art probabilistic algorithms for uncertainty, reliability, robustness, and sensitivity analysis, data mining, and meta-modeling.

FEMtools is a multi-functional, cross-platform and solver-independent family of CAE software programs providing analysis and scripting solutions for many different types of applications. The program is developed, supported and licensed by Dynamic Design Solutions ("DDS") NV.

Kimeme is an open platform for multi-objective optimization and multidisciplinary design optimization. It is intended to be coupled with external numerical software such as computer-aided design (CAD), finite element analysis (FEA), structural analysis, and computational fluid dynamics tools. It was developed by Cyber Dyne Srl and provides both a design environment for problem definition and analysis and a software network infrastructure to distribute the computational load.

SmartDO is a multidisciplinary design optimization software package based on the Direct Global Search technology developed and marketed by FEA-Opt Technology. SmartDO specializes in CAE-based optimization, such as CAE, FEA, CAD, CFD and automatic control, with applications to various physical phenomena. It is both GUI and scripting driven, and can be integrated with almost any kind of CAD/CAE and in-house codes.

System identification is a method of identifying or measuring the mathematical model of a system from measurements of the system inputs and outputs. The applications of system identification include any system where the inputs and outputs can be measured and include industrial processes, control systems, economic data, biology and the life sciences, medicine, social systems and many more.
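
As a minimal illustration, the following Python function identifies a first-order discrete-time model y[k] = a*y[k-1] + b*u[k-1] from measured input and output sequences by least squares; the model structure is chosen here purely for illustration:

```python
import numpy as np

def identify_first_order(u, y):
    """Least-squares fit of y[k] = a*y[k-1] + b*u[k-1] from measured data.

    u, y -- equally long 1-D arrays of sampled input and output values."""
    Phi = np.column_stack([y[:-1], u[:-1]])        # regressors at step k-1
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    a_hat, b_hat = theta
    return a_hat, b_hat
```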

Ambient modal identification, also known as Operational Modal Analysis (OMA), aims at identifying the modal properties of a structure based on vibration data collected when the structure is under its operating conditions, i.e., no initial excitation or known artificial excitation. The modal properties of a structure include primarily the natural frequencies, damping ratios and mode shapes. In an ambient vibration test the subject structure can be under a variety of excitation sources which are not measured but are assumed to be 'broadband random'. The latter is a notion that one needs to apply when developing an ambient identification method. The specific assumptions vary from one method to another. Regardless of the method used, however, proper modal identification requires that the spectral characteristics of the measured response reflect the properties of the modes rather than those of the excitation.

Bayesian Operational Modal Analysis (BAYOMA) adopts a Bayesian system identification approach for Operational Modal Analysis (OMA). Operational Modal Analysis (OMA) aims at identifying the modal properties of a constructed structure using only its (output) vibration response measured under operating conditions. The (input) excitations to the structure are not measured but are assumed to be 'ambient'. In a Bayesian context, the set of modal parameters are viewed as uncertain parameters or random variables whose probability distribution is updated from the prior distribution to the posterior distribution. The peak(s) of the posterior distribution represents the most probable value(s) (MPV) suggested by the data, while the spread of the distribution around the MPV reflects the remaining uncertainty of the parameters.

Outline of machine learning

The following outline is provided as an overview of and topical guide to machine learning. Machine learning is a subfield of soft computing within computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. In 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed". Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Such algorithms operate by building a model from an example training set of input observations in order to make data-driven predictions or decisions expressed as outputs, rather than following strictly static program instructions.

In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters are learned.
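
A small example in Python using scikit-learn's grid search; the dataset, estimator, and parameter grid are chosen only for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# C and gamma are hyperparameters: they control the learning process and are not
# learned from the data, so grid search tries every combination with 5-fold
# cross-validation and keeps the best-scoring one.
param_grid = {"C": [0.1, 1.0, 10.0], "gamma": [0.01, 0.1, 1.0]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
best_hyperparameters = search.best_params_
```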
