DIVA software

DIVA [1] (Data-Interpolating Variational Analysis) performs spatial interpolation/gridding of data (analysis) in an optimal way, comparable to optimal interpolation (OI), while taking the uncertainties of the observations into account. Unlike standard OI as used in data assimilation, DIVA, when applied to ocean data, accounts for coastlines, sub-basins and advection, because its variational formulation is posed on the real domain. [2] Calculations are highly optimized and rely on a finite element solver. Tools are provided to generate the finite element mesh and to optimize the parameters of the analysis. Quality control of the data can be performed and error fields can be computed. [3] Detrending of the data is also possible. Finally, 3D and 4D extensions are included, with emphasis on direct computation of climatologies from ODV [4] spreadsheet files.
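In its two-dimensional form, the analysis minimizes a cost function that penalizes both the misfit to the observations and the non-smoothness of the gridded field. The following is a sketch of that functional, following the formulation described by Troupin et al. (2012); [3] the data weights μ_j and the coefficients α_0, α_1, α_2 are parameters of the analysis:

    J[\varphi] = \sum_{j=1}^{N_d} \mu_j \left[ d_j - \varphi(x_j, y_j) \right]^2
               + \int_D \left( \alpha_2\, \nabla\nabla\varphi : \nabla\nabla\varphi
               + \alpha_1\, \nabla\varphi \cdot \nabla\varphi
               + \alpha_0\, \varphi^2 \right) \mathrm{d}D

Because the integral runs only over the real (sea) domain D, regions separated by land are naturally decoupled, which is the property that distinguishes DIVA from standard OI.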

The software, whose first version was released in 1996, [5] can be downloaded from the DIVA site (archived 2017-12-01 at the Wayback Machine) and is the reference tool for calculating climatologies within the SeaDataNet projects. It has also been included as the state-of-the-art gridding method in Ocean Data View.

The classical DIVA version is now superseded by an N-dimensional implementation, DIVAnd: [6]

https://github.com/gher-uliege/DIVAnd.jl
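A minimal two-dimensional usage sketch in Julia, adapted from the example in the DIVAnd.jl README (DIVAndrun, ndgrid and the parameters shown are as documented there; the observations are synthetic):

    using DIVAnd

    # synthetic observations at 75 random locations in the unit square
    x = rand(75)
    y = rand(75)
    f = sin.(x * 6) .* cos.(y * 6)

    # target grid: 100 x 100 points covering the unit square
    xi, yi = ndgrid(range(0, stop = 1, length = 100),
                    range(0, stop = 1, length = 100))

    # pm and pn are the inverse of the grid resolution
    # along the first and second dimension
    pm = ones(size(xi)) / (xi[2, 1] - xi[1, 1])
    pn = ones(size(xi)) / (yi[1, 2] - yi[1, 1])

    # all grid points are valid: no land mask in this toy example
    mask = trues(size(xi))

    len = 0.1       # correlation length
    epsilon2 = 1.0  # obs. error variance relative to the background variance

    # fi is the gridded (analysed) field; s carries the analysis structure
    fi, s = DIVAndrun(mask, (pm, pn), (xi, yi), (x, y), f, len, epsilon2)

Increasing len smooths the field over larger distances, while increasing epsilon2 gives the observations less weight relative to the background.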

Notes and references

  1. "DIVA homepage". Archived from the original on 2017-12-01. Retrieved 2013-01-14.
  2. "DIVA formulation". Archived from the original on 2017-12-01. Retrieved 2013-01-14.
  3. Troupin, C., Barth, A., Sirjacobs, D., Ouberdous, M., Brankart, J.-M., Brasseur, P., Rixen, M., Alvera-Azcárate, A., Belounis, M., Capet, A., Lenartz, F., Toussaint, M.-E., & Beckers, J.-M. (2012). Generation of analysis and consistent error fields using the Data Interpolating Variational Analysis (Diva). Ocean Modelling, 52-53, 90-101. doi:10.1016/j.ocemod.2012.05.002
  4. ODV homepage
  5. Brasseur, P., Beckers, J.-M., Brankart, J.-M., and Schoenauen, R. (1996). Seasonal temperature and salinity fields in the Mediterranean Sea: Climatological analyses of an historical data set. Deep-Sea Research, 43(2):159-192.
  6. Barth, A., Beckers, J.-M., Troupin, C., Alvera-Azcárate, A., and Vandenbulcke, L.: DIVAnd-1.0: n-dimensional variational data analysis for ocean observations, Geosci. Model Dev., 7, 225-241, doi:10.5194/gmd-7-225-2014, 2014.

Related Research Articles

Interpolation: Method for estimating new data within known data points

In the mathematical field of numerical analysis, interpolation is a type of estimation, a method of constructing (finding) new data points based on the range of a discrete set of known data points.
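In the simplest (linear) case, for example, the value at a point x between two known data points (x_0, y_0) and (x_1, y_1) is estimated as

    y = y_0 + (y_1 - y_0)\, \frac{x - x_0}{x_1 - x_0}

so that, given the known points (1, 2) and (3, 6), the interpolated value at x = 2 is y = 2 + 4 · 1/2 = 4.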

Numerical analysis: Study of algorithms using numerical approximation

Numerical analysis is the study of algorithms that use numerical approximation for the problems of mathematical analysis. It is the study of numerical methods that attempt to find approximate solutions of problems rather than the exact ones. Numerical analysis finds application in all fields of engineering and the physical sciences, and in the 21st century also the life and social sciences, medicine, business and even the arts. Current growth in computing power has enabled the use of more complex numerical analysis, providing detailed and realistic mathematical models in science and engineering. Examples of numerical analysis include: ordinary differential equations as found in celestial mechanics, numerical linear algebra in data analysis, and stochastic differential equations and Markov chains for simulating living cells in medicine and biology.

Mathematical optimization: Study of mathematical algorithms for optimization problems

Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.

Creo Parametric, formerly known, together with Creo Elements/Pro, as Pro/Engineer and Wildfire, is a solid modeling or CAD, CAM, CAE, and associative 3D modeling application, running on Microsoft Windows.

Topology optimization is a mathematical method that optimizes material layout within a given design space, for a given set of loads, boundary conditions and constraints with the goal of maximizing the performance of the system. Topology optimization is different from shape optimization and sizing optimization in the sense that the design can attain any shape within the design space, instead of dealing with predefined configurations.
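One widely used concrete formulation (a sketch of the density-based SIMP approach; other formulations exist) is minimum-compliance design subject to a volume constraint:

    \min_{\rho}\; c(\rho) = U(\rho)^{T} K(\rho)\, U(\rho)
    \quad \text{subject to} \quad V(\rho) \le f\, V_0, \qquad 0 < \rho_{\min} \le \rho_e \le 1

where ρ_e is the material density assigned to element e, K and U are the stiffness matrix and the displacement vector from the equilibrium equation K(ρ)U = F, and f is the prescribed volume fraction.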

Lanczos resampling: Application of a mathematical formula

Lanczos filtering and Lanczos resampling are two applications of the same mathematical formula. The formula can be used as a low-pass filter or to smoothly interpolate the value of a digital signal between its samples. In the latter case, it maps each sample of the given signal to a translated and scaled copy of the Lanczos kernel, which is a sinc function windowed by the central lobe of a second, longer sinc function. The sum of these translated and scaled kernels is then evaluated at the desired points.
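Concretely, the Lanczos kernel of size a (a positive integer, commonly 2 or 3) and the resulting interpolation of samples s_i can be written as

    L(x) = \begin{cases} \operatorname{sinc}(x)\, \operatorname{sinc}(x/a) & -a \le x \le a \\ 0 & \text{otherwise} \end{cases}
    \qquad S(x) = \sum_{i} s_i\, L(x - i)

with sinc denoting the normalized sinc function, sinc(x) = sin(πx)/(πx).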

Data assimilation is a mathematical discipline that seeks to optimally combine theory with observations. There may be a number of different goals sought – for example, to determine the optimal state estimate of a system, to determine initial conditions for a numerical forecast model, to interpolate sparse observation data using knowledge of the system being observed, to set numerical parameters based on training a model from observed data. Depending on the goal, different solution methods may be used. Data assimilation is distinguished from other forms of machine learning, image analysis, and statistical methods in that it utilizes a dynamical model of the system being analyzed.
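For the first of these goals, a standard concrete form is the optimal-interpolation (BLUE) analysis update, sketched here in conventional data-assimilation notation:

    x^{a} = x^{b} + K \left( y - H x^{b} \right), \qquad K = B H^{T} \left( H B H^{T} + R \right)^{-1}

where x^b is the background (model) state, y the vector of observations, H the observation operator, and B and R the background- and observation-error covariance matrices.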

Response surface methodology: Statistical approach

In statistics, response surface methodology (RSM) explores the relationships between several explanatory variables and one or more response variables. RSM is an empirical modelling approach that uses mathematical and statistical techniques to relate input variables, also known as factors, to the response. RSM became widely used because the alternatives, such as fully theoretical models, can be cumbersome, time-consuming, inefficient, error-prone and unreliable. The method was introduced by George E. P. Box and K. B. Wilson in 1951. The main idea of RSM is to use a sequence of designed experiments to obtain an optimal response. Box and Wilson suggest using a second-degree polynomial model for this purpose; they acknowledge that such a model is only an approximation, but adopt it because it is easy to estimate and apply, even when little is known about the process.
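For k factors x_1, …, x_k, the second-degree polynomial model takes the standard form

    y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon

whose coefficients β are estimated from the designed experiments, typically by least squares.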

Meshfree methods: Methods in numerical analysis not requiring knowledge of neighboring points

In the field of numerical analysis, meshfree methods are those that do not require connections between the nodes of the simulation domain, i.e. a mesh, but are instead based on the interaction of each node with all its neighbors. As a consequence, original extensive properties such as mass or kinetic energy are no longer assigned to mesh elements but rather to the individual nodes. Meshfree methods enable the simulation of some otherwise difficult types of problems, at the cost of extra computing time and programming effort. The absence of a mesh allows Lagrangian simulations, in which the nodes can move according to the velocity field.
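As one widely used example, smoothed-particle hydrodynamics (SPH) approximates a field f at a point x from the neighboring nodes j (a sketch of the standard SPH summation):

    f(x) \approx \sum_{j} \frac{m_j}{\rho_j}\, f_j\, W\!\left( x - x_j, h \right)

where m_j and ρ_j are the mass and density carried by node j, W is a smoothing kernel, and h its smoothing length.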

A surrogate model is an engineering method used when an outcome of interest cannot be easily measured or computed, so an approximate mathematical model of the outcome is used instead. Most engineering design problems require experiments and/or simulations to evaluate design objective and constraint functions as functions of the design variables. For example, in order to find the optimal airfoil shape for an aircraft wing, an engineer simulates the airflow around the wing for different shape variables. For many real-world problems, however, a single simulation can take many minutes, hours, or even days to complete. As a result, routine tasks such as design optimization, design space exploration, sensitivity analysis and "what-if" analysis become impossible, since they would require thousands or even millions of simulation evaluations.

Reservoir simulation: Using computer models to predict the flow of fluids through porous media

Reservoir simulation is an area of reservoir engineering in which computer models are used to predict the flow of fluids through porous media.

The Global Ocean Data Analysis Project (GLODAP) is a synthesis project bringing together oceanographic data, featuring two major releases as of 2018. The central goal of GLODAP is to generate a global climatology of the World Ocean's carbon cycle for use in studies of both its natural and anthropogenically forced states. GLODAP is funded by the National Oceanic and Atmospheric Administration, the U.S. Department of Energy, and the National Science Foundation.

Finite element method: Numerical method for solving physical or engineering problems

The finite element method (FEM) is an extremely popular method for numerically solving differential equations arising in engineering and mathematical modeling. Typical problem areas of interest include the traditional fields of structural analysis, heat transfer, fluid flow, mass transport, and electromagnetic potential.

Ocean reanalysis is a method of combining historical ocean observations with a general ocean model driven by historical estimates of surface winds, heat, and freshwater, by way of a data assimilation algorithm to reconstruct historical changes in the state of the ocean.

Barnes interpolation, named after Stanley L. Barnes, is the interpolation of unevenly spread data points from a set of measurements of an unknown function in two dimensions into an analytic function of two variables. An example of a situation where the Barnes scheme is important is in weather forecasting where measurements are made wherever monitoring stations may be located, the positions of which are constrained by topography. Such interpolation is essential in data visualisation, e.g. in the construction of contour plots or other representations of analytic surfaces.
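In its first pass, the Barnes scheme estimates the field at a point x as a Gaussian-weighted average of the N observations f_i (a sketch; the full scheme then refines this estimate over successive passes with a reduced falloff parameter κ):

    f(x) = \frac{\sum_{i=1}^{N} w_i f_i}{\sum_{i=1}^{N} w_i}, \qquad w_i = \exp\!\left( -\frac{\lVert x - x_i \rVert^2}{\kappa} \right)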

Optimus is a Process Integration and Design Optimization (PIDO) platform developed by Noesis Solutions. Noesis Solutions takes part in key research projects, such as PHAROS and MATRIX.

Gerris (software): Computer software

Gerris is computer software in the field of computational fluid dynamics (CFD). Gerris was released as free and open-source software, subject to the requirements of the GNU General Public License (GPL), version 2 or any later version.

Design optimization is an engineering design methodology that uses a mathematical formulation of a design problem to support selection of the optimal design among many alternatives. Design optimization involves the following stages, formalized in the sketch after this list:

  1. Variables: describe the design alternatives
  2. Objective: a selected function of the variables that is to be minimized or maximized
  3. Constraints: combinations of the variables, expressed as equalities or inequalities, that must be satisfied by any acceptable design alternative
  4. Feasibility: values of the variables that satisfy all constraints and minimize/maximize the objective
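These stages map onto the standard mathematical form of a design optimization problem, sketched here in conventional notation:

    \min_{x \in X} \; f(x) \quad \text{subject to} \quad g_j(x) \le 0, \; j = 1, \dots, m, \qquad h_k(x) = 0, \; k = 1, \dots, p

where x collects the design variables, f is the objective, and g_j and h_k are the inequality and equality constraints; any x satisfying all constraints is feasible.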