DIVA [1] (Data-Interpolating Variational Analysis) allows the spatial interpolation (gridding) of data in an optimal way, comparable to optimal interpolation (OI), taking into account uncertainties in the observations. Unlike standard OI as used in data assimilation, DIVA, when applied to ocean data, accounts for coastlines, sub-basins and advection through its variational formulation on the real domain. [2] Calculations are highly optimized and rely on a finite element solution. Tools are provided to generate the finite element mesh and to optimize the parameters of the analysis; quality control of the data can be performed and error fields can be calculated. [3] Detrending of the data is also possible. Finally, 3D and 4D extensions are included, with emphasis on direct computation of climatologies from ODV [4] spreadsheet files.
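For orientation, the variational principle behind DIVA-type analyses can be sketched as the minimization of a cost functional of the following form (a sketch following the published DIVA literature; the data weights \( \mu_i \) and smoothness coefficients \( \alpha_0, \alpha_1, \alpha_2 \) are the conventional names there, and reference [1] should be consulted for the exact definitions):

\[
J[\varphi] = \sum_{i=1}^{N} \mu_i \left[ d_i - \varphi(x_i, y_i) \right]^2
+ \int_{D} \left( \alpha_2 \, \nabla\nabla\varphi : \nabla\nabla\varphi
+ \alpha_1 \, \nabla\varphi \cdot \nabla\varphi
+ \alpha_0 \, \varphi^2 \right) \, dD
\]

Here the \( d_i \) are the observations, the first term penalizes misfit to the data, and the integral penalizes roughness of the gridded field \( \varphi \) over the real domain \( D \); minimizing over a finite element mesh of \( D \) is what lets coastlines and sub-basins enter the analysis naturally.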
The software, whose first version was released in 1996, [5] can be downloaded from the DIVA site (archived 2017-12-01 at the Wayback Machine) and is the reference tool for calculating climatologies within the SeaDataNet projects. It has also been included as the state-of-the-art gridding method in Ocean Data View.
The classical DIVA version has now been superseded by an n-dimensional implementation, DIVAnd. [6]
In the mathematical field of numerical analysis, interpolation is a type of estimation, a method of constructing (finding) new data points based on the range of a discrete set of known data points.
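As a minimal illustration of constructing new data points from a discrete set of known ones (using NumPy's piecewise-linear np.interp; the data values are invented for the example):

```python
# Minimal illustration of interpolation: estimating new data points
# within the range of a discrete set of known points.
import numpy as np

x_known = np.array([0.0, 1.0, 2.0, 3.0])      # known sample locations
y_known = np.array([0.0, 0.8, 0.9, 0.1])      # known sample values

x_new = np.array([0.5, 1.5, 2.5])             # points to estimate
y_new = np.interp(x_new, x_known, y_known)    # piecewise-linear interpolation

print(y_new)  # -> [0.4  0.85 0.5 ]
```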
Numerical analysis is the study of algorithms that use numerical approximation for the problems of mathematical analysis. It is the study of numerical methods that attempt to find approximate solutions of problems rather than the exact ones. Numerical analysis finds application in all fields of engineering and the physical sciences, and in the 21st century also the life and social sciences, medicine, business and even the arts. Current growth in computing power has enabled the use of more complex numerical analysis, providing detailed and realistic mathematical models in science and engineering. Examples of numerical analysis include: ordinary differential equations as found in celestial mechanics, numerical linear algebra in data analysis, and stochastic differential equations and Markov chains for simulating living cells in medicine and biology.
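A small sketch of the flavor of numerical methods mentioned above: a forward-Euler approximation to an ordinary differential equation, compared against the exact solution (the equation and step count are illustrative choices, not taken from the text):

```python
# Forward Euler for the ODE dy/dt = -y, whose exact solution is y(t) = exp(-t).
import math

def euler(f, y0, t0, t1, n):
    """Approximate y(t1) with n forward-Euler steps of size (t1 - t0) / n."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(approx, math.exp(-1.0))  # numerical approximation vs. exact value
```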
Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.
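A minimal continuous-optimization example, assuming SciPy is available; the Rosenbrock test function and starting point are illustrative choices, not from the text:

```python
# A tiny continuous-optimization example: selecting the best element
# (here, the minimizer of a smooth function) from a set of alternatives.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Rosenbrock function: a classic continuous test problem
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(f, x0=np.array([-1.0, 2.0]))
print(result.x)  # close to the known minimum at (1, 1)
```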
Creo Parametric, formerly known, together with Creo Elements/Pro, as Pro/Engineer and Wildfire, is a solid modeling application for CAD, CAM, and CAE with associative 3D modeling, running on Microsoft Windows.
Topology optimization is a mathematical method that optimizes material layout within a given design space, for a given set of loads, boundary conditions and constraints with the goal of maximizing the performance of the system. Topology optimization is different from shape optimization and sizing optimization in the sense that the design can attain any shape within the design space, instead of dealing with predefined configurations.
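One widely used mathematical statement of the density-based, minimum-compliance topology optimization problem is sketched below; the symbols follow the common SIMP-style literature rather than anything defined in this article:

\[
\begin{aligned}
\min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{U}^{\mathsf{T}} \mathbf{K}(\boldsymbol{\rho}) \, \mathbf{U} \\
\text{subject to} \quad & \mathbf{K}(\boldsymbol{\rho}) \, \mathbf{U} = \mathbf{F}, \\
& V(\boldsymbol{\rho}) \le f \, V_0, \\
& 0 < \rho_{\min} \le \rho_e \le 1 \quad \text{for each element } e,
\end{aligned}
\]

where \( \boldsymbol{\rho} \) is the vector of element material densities, \( c \) the structural compliance, \( \mathbf{K} \) the stiffness matrix, \( \mathbf{F} \) the loads, and \( f V_0 \) the allowed material volume. Because each element's density is free to vary, the design can attain any shape within the design space.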
Lanczos filtering and Lanczos resampling are two applications of a mathematical formula. It can be used as a low-pass filter or used to smoothly interpolate the value of a digital signal between its samples. In the latter case, it maps each sample of the given signal to a translated and scaled copy of the Lanczos kernel, which is a sinc function windowed by the central lobe of a second, longer, sinc function. The sum of these translated and scaled kernels is then evaluated at the desired points.
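A short sketch of the scheme described above (the window size a = 3 and the test signal are illustrative choices):

```python
# Lanczos resampling sketch: each sample contributes a translated copy of the
# Lanczos kernel L(x) = sinc(x) * sinc(x/a) for |x| < a (zero outside),
# where sinc is the normalized sinc function sin(pi x) / (pi x).
import numpy as np

def lanczos_kernel(x, a=3):
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_interpolate(samples, x, a=3):
    """Evaluate the Lanczos reconstruction of `samples` at position x."""
    i = np.arange(len(samples))
    # Sum of translated, scaled kernels evaluated at the desired point
    return float(np.sum(samples * lanczos_kernel(x - i, a)))

signal = np.sin(0.4 * np.arange(20))          # a discrete signal
print(lanczos_interpolate(signal, 7.5))       # value between samples 7 and 8
```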
Data assimilation is a mathematical discipline that seeks to optimally combine theory with observations. There may be a number of different goals sought – for example, to determine the optimal state estimate of a system, to determine initial conditions for a numerical forecast model, to interpolate sparse observation data using knowledge of the system being observed, to set numerical parameters based on training a model from observed data. Depending on the goal, different solution methods may be used. Data assimilation is distinguished from other forms of machine learning, image analysis, and statistical methods in that it utilizes a dynamical model of the system being analyzed.
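The classical analysis step shared by optimal interpolation and Kalman-filter-style data assimilation can be sketched in a few lines; all matrices below are tiny illustrative placeholders, not from the article:

```python
# Single optimal-interpolation / BLUE analysis step:
#   x_a = x_b + B H^T (H B H^T + R)^(-1) (y - H x_b)
import numpy as np

x_b = np.array([1.0, 2.0, 3.0])        # background (model) state
B = 0.5 * np.eye(3)                    # background-error covariance
H = np.array([[1.0, 0.0, 0.0],         # observation operator: we observe
              [0.0, 0.0, 1.0]])        # the first and last state variables
R = 0.1 * np.eye(2)                    # observation-error covariance
y = np.array([1.4, 2.6])               # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
x_a = x_b + K @ (y - H @ x_b)                  # analysis state
print(x_a)
```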
In statistics, response surface methodology (RSM) explores the relationships between several explanatory variables and one or more response variables. RSM is an empirical method that uses mathematical and statistical techniques to relate input variables, otherwise known as factors, to the response. RSM became very useful because other available approaches, such as fully theoretical models, could be cumbersome, time-consuming, inefficient, error-prone, and unreliable. The method was introduced by George E. P. Box and K. B. Wilson in 1951. The main idea of RSM is to use a sequence of designed experiments to obtain an optimal response. Box and Wilson suggest using a second-degree polynomial model to do this. They acknowledge that this model is only an approximation, but they use it because such a model is easy to estimate and apply, even when little is known about the process.
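A minimal sketch of the RSM fitting step: a full second-degree polynomial fitted by least squares to noisy responses from a made-up process (all data here are synthetic):

```python
# Fit a second-degree polynomial response surface to experimental data
# (factors x1, x2 -> response y) by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)                  # factor settings from a design
x2 = rng.uniform(-1, 1, 30)
y = 5 - (x1 - 0.3)**2 - 2 * (x2 + 0.1)**2 + rng.normal(0, 0.05, 30)

# Design matrix for the full quadratic model
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # fitted coefficients; the surface's maximum approximates
             # the optimal factor settings (0.3, -0.1)
```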
In the field of numerical analysis, meshfree methods are those that do not require connection between nodes of the simulation domain, i.e. a mesh, but are rather based on interaction of each node with all its neighbors. As a consequence, original extensive properties such as mass or kinetic energy are no longer assigned to mesh elements but rather to the single nodes. Meshfree methods enable the simulation of some otherwise difficult types of problems, at the cost of extra computing time and programming effort. The absence of a mesh allows Lagrangian simulations, in which the nodes can move according to the velocity field.
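An SPH-flavoured sketch of the meshfree idea: extensive quantities live on nodes, and each node interacts with its neighbors through a smoothing kernel; no mesh connectivity is stored (the 1-D Gaussian kernel and node layout are illustrative choices):

```python
# Meshfree (SPH-style) kernel sum: mass is carried by nodes, and each node's
# density is estimated from its neighbors through a smoothing kernel.
import numpy as np

def gaussian_kernel(r, h):
    # 1-D Gaussian smoothing kernel with smoothing length h (integrates to 1)
    return np.exp(-(r / h)**2) / (h * np.sqrt(np.pi))

x = np.linspace(0.0, 1.0, 11)     # node positions (no connectivity stored)
m = np.full_like(x, 0.1)          # mass carried by each node
h = 0.15                          # smoothing length

# Density at each node = kernel-weighted sum over all neighboring nodes
r = np.abs(x[:, None] - x[None, :])
rho = (m[None, :] * gaussian_kernel(r, h)).sum(axis=1)
print(rho)
```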
A surrogate model is an engineering method used when an outcome of interest cannot be easily measured or computed, so an approximate mathematical model of the outcome is used instead. Most engineering design problems require experiments and/or simulations to evaluate design objective and constraint functions as a function of design variables. For example, in order to find the optimal airfoil shape for an aircraft wing, an engineer simulates the airflow around the wing for different shape variables. For many real-world problems, however, a single simulation can take many minutes, hours, or even days to complete. As a result, routine tasks such as design optimization, design space exploration, sensitivity analysis and "what-if" analysis become impossible since they require thousands or even millions of simulation evaluations.
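A toy sketch of the surrogate workflow: evaluate a stand-in "expensive" function at a few design points, fit a cheap polynomial surrogate, and query the surrogate instead (the function and sampling plan are made up for illustration):

```python
# Surrogate-model sketch: few costly evaluations -> cheap approximate model.
import numpy as np

def expensive_simulation(x):
    # Stand-in for a long-running solver call
    return np.sin(3 * x) + 0.5 * x

x_train = np.linspace(0.0, 2.0, 8)            # a small design of experiments
y_train = expensive_simulation(x_train)       # few costly evaluations

surrogate = np.polynomial.Polynomial.fit(x_train, y_train, deg=5)

# Cheap evaluations of the surrogate, e.g. for design-space exploration
x_query = np.linspace(0.0, 2.0, 5)
print(surrogate(x_query))
print(expensive_simulation(x_query))          # compare with the true model
```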
Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media.
The Global Ocean Data Analysis Project (GLODAP) is a synthesis project bringing together oceanographic data, featuring two major releases as of 2018. The central goal of GLODAP is to generate a global climatology of the World Ocean's carbon cycle for use in studies of both its natural and anthropogenically forced states. GLODAP is funded by the National Oceanic and Atmospheric Administration, the U.S. Department of Energy, and the National Science Foundation.
The finite element method (FEM) is an extremely popular method for numerically solving differential equations arising in engineering and mathematical modeling. Typical problem areas of interest include the traditional fields of structural analysis, heat transfer, fluid flow, mass transport, and electromagnetic potential.
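A minimal 1-D finite element sketch, assuming the textbook Poisson problem -u'' = 1 on (0, 1) with homogeneous Dirichlet conditions (a standard example, not specific to any application above):

```python
# 1-D FEM: solve -u'' = 1 on (0, 1) with u(0) = u(1) = 0 using
# piecewise-linear elements; the exact solution is u(x) = x (1 - x) / 2.
import numpy as np

n = 10                      # number of elements
h = 1.0 / n
nodes = np.linspace(0.0, 1.0, n + 1)

K = np.zeros((n + 1, n + 1))   # global stiffness matrix
F = np.zeros(n + 1)            # global load vector

for e in range(n):             # assemble element contributions
    i, j = e, e + 1
    k_local = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f_local = (h / 2.0) * np.array([1.0, 1.0])      # load f = 1
    K[np.ix_([i, j], [i, j])] += k_local
    F[[i, j]] += f_local

# Apply the Dirichlet conditions by solving on interior nodes only
u = np.zeros(n + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

# Nodal error (essentially zero: 1-D linear FEM is nodally exact here)
print(np.max(np.abs(u - nodes * (1 - nodes) / 2)))
```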
Ocean reanalysis is a method of combining historical ocean observations with a general ocean model driven by historical estimates of surface winds, heat, and freshwater, by way of a data assimilation algorithm to reconstruct historical changes in the state of the ocean.
Barnes interpolation, named after Stanley L. Barnes, is the interpolation of unevenly spread data points from a set of measurements of an unknown function in two dimensions into an analytic function of two variables. An example of a situation where the Barnes scheme is important is in weather forecasting where measurements are made wherever monitoring stations may be located, the positions of which are constrained by topography. Such interpolation is essential in data visualisation, e.g. in the construction of contour plots or other representations of analytic surfaces.
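A single-pass sketch of the Barnes scheme (the full method refines this first pass with successive-correction iterations using a reduced smoothing parameter; the stations and values below are invented):

```python
# Single-pass Barnes objective analysis: the gridded value is a
# Gaussian-weighted mean of scattered observations,
#   g(x, y) = sum_i w_i f_i / sum_i w_i,  with  w_i = exp(-d_i^2 / kappa),
# where d_i is the distance to observation i and kappa a smoothing parameter.
import numpy as np

obs_x = np.array([0.1, 0.4, 0.7, 0.9])   # scattered station positions
obs_y = np.array([0.2, 0.8, 0.5, 0.1])
obs_f = np.array([3.0, 5.0, 4.0, 2.5])   # measured values

def barnes(x, y, kappa=0.05):
    d2 = (x - obs_x)**2 + (y - obs_y)**2
    w = np.exp(-d2 / kappa)
    return np.sum(w * obs_f) / np.sum(w)

print(barnes(0.5, 0.5))   # analysed value at one grid point
```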
Optimus is a Process Integration and Design Optimization (PIDO) platform developed by Noesis Solutions. Noesis Solutions takes part in key research projects, such as PHAROS and MATRIX.
Gerris is computer software in the field of computational fluid dynamics (CFD). Gerris was released as free and open-source software, subject to the requirements of the GNU General Public License (GPL), version 2 or any later version.
Design optimization is an engineering design methodology using a mathematical formulation of a design problem to support selection of the optimal design among many alternatives. The formulation typically specifies design variables, an objective function to be minimized or maximized, and constraints that a candidate design must satisfy, as in the standard form below.
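The usual mathematical statement of such a problem (standard nonlinear-programming notation, not specific to this article) is:

\[
\begin{aligned}
\min_{x} \quad & f(x) \\
\text{subject to} \quad & g_j(x) \le 0, \quad j = 1, \dots, m, \\
& h_k(x) = 0, \quad k = 1, \dots, p,
\end{aligned}
\]

where \( x \) collects the design variables, \( f \) measures design quality, and the \( g_j \) and \( h_k \) encode the inequality and equality constraints on admissible designs.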