Data assimilation


Data assimilation refers to a large group of methods that update information from numerical computer models with information from observations. It is used to update model states, model trajectories over time, model parameters, and combinations thereof. What distinguishes data assimilation from other estimation methods is, first, that the computer model is a dynamical model, i.e. it describes how model variables change over time, and second, that the method has a firm mathematical foundation in Bayesian inference. As such, it generalizes inverse methods and has close connections with machine learning.


Data assimilation initially developed in the field of numerical weather prediction. Numerical weather prediction models are equations describing the evolution of the atmosphere, typically coded into a computer program. When these models are used for forecasting, the model output quickly deviates from the real atmosphere. Hence, observations of the atmosphere are used to keep the model on track. Data assimilation provides a very large number of practical ways to bring these observations into the models.

Simply inserting point-wise measurements into the numerical models did not provide a satisfactory solution. Real-world measurements contain errors due both to the quality of the instrument and to how accurately the position of the measurement is known. These errors can cause instabilities in the models that eliminate any level of skill in a forecast. Thus, more sophisticated methods were needed to initialize a model using all available data while maintaining stability in the numerical model. Such data typically include the measurements as well as a previous forecast valid at the same time the measurements are made. Applied iteratively, this process accumulates information from past observations into all subsequent forecasts.

Because data assimilation developed out of the field of numerical weather prediction, it initially gained popularity in the geosciences. In fact, one of the most cited publications in all of the geosciences is an application of data assimilation to reconstruct the observed history of the atmosphere. [1]

Details of the data assimilation process

Classically, data assimilation has been applied to chaotic dynamical systems that are too difficult to predict using simple extrapolation methods. The cause of this difficulty is that small changes in initial conditions can lead to large changes in prediction accuracy. This is sometimes known as the butterfly effect – the sensitive dependence on initial conditions in which a small change in one state of a deterministic nonlinear system can result in large differences in a later state.

At any update time, data assimilation usually takes a forecast (also known as the first guess, or background information) and applies a correction to the forecast based on a set of observed data and estimated errors that are present in both the observations and the forecast itself. The difference between the forecast and the observations at that time is called the departure or the innovation (as it provides new information to the data assimilation process). A weighting factor is applied to the innovation to determine how much of a correction should be made to the forecast based on the new information from the observations. The forecast corrected by the weighting factor times the innovation gives the best estimate of the state of the system, called the analysis. In one dimension, computing the analysis could be as simple as forming a weighted average of a forecasted and observed value. In multiple dimensions the problem becomes more difficult. Much of the work in data assimilation is focused on adequately estimating the appropriate weighting factor based on intricate knowledge of the errors in the system.
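For example, in the one-dimensional case the weighting factor (gain) can be derived from the assumed error variances of the background and the observation. The following sketch, with made-up numbers, illustrates a single analysis step (a minimal illustration, not any operational scheme):

```python
# Minimal 1-D analysis step: a weighted average of background and observation.
x_b = 14.0      # background (forecast) value, e.g. temperature in deg C
y = 15.2        # observed value
var_b = 1.0     # assumed background error variance
var_o = 0.25    # assumed observation error variance

innovation = y - x_b             # the departure: new information
gain = var_b / (var_b + var_o)   # weighting factor, between 0 and 1
x_a = x_b + gain * innovation    # the analysis

print(x_a)  # 14.96, closer to the more accurate observation
```

With var_o much smaller than var_b, the analysis is pulled mostly toward the observation, and vice versa.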

The measurements are usually made of a real-world system, rather than of the model's incomplete representation of that system, and so a special function called the observation operator (usually depicted by h() for a nonlinear operator or H for its linearization) is needed to map the modeled variable to a form that can be directly compared with the observation.
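As a concrete (hypothetical) illustration, if the model state is a temperature field on a one-dimensional grid and an observation is taken between two grid points, the observation operator h() can be a simple linear interpolation:

```python
import numpy as np

# Hypothetical model state: temperature at fixed grid positions (km).
grid = np.array([0.0, 100.0, 200.0])
state = np.array([14.0, 12.5, 11.0])

def h(state, obs_position):
    """Observation operator: interpolate the modeled field to the
    location of the observation so the two can be compared."""
    return np.interp(obs_position, grid, state)

y = 13.4                        # an observation taken at 60 km
innovation = y - h(state, 60.0) # departure computed in observation space
print(innovation)               # ~0.3
```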

Data assimilation as statistical estimation

A common mathematical perspective is to view data assimilation as a Bayesian estimation problem. From this perspective, the analysis step is an application of Bayes' theorem, and the overall assimilation procedure is an example of recursive Bayesian estimation. However, the probabilistic analysis is usually simplified to a computationally feasible form. Advancing the probability distribution in time would be done exactly in the general case by the Fokker–Planck equation, but that is not feasible for high-dimensional systems; so various approximations operating on simplified representations of the probability distributions are used instead. Often the probability distributions are assumed Gaussian, so that they can be represented by their mean and covariance, which gives rise to the Kalman filter.
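Under the Gaussian assumption the analysis step has a closed form. The sketch below applies the standard Kalman analysis update to a toy two-variable state; all numbers are illustrative:

```python
import numpy as np

# Kalman analysis step for a toy 2-variable state (illustrative values).
x_b = np.array([1.0, 2.0])              # background mean
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])              # background error covariance
H = np.array([[1.0, 0.0]])              # observe the first variable only
R = np.array([[0.25]])                  # observation error covariance
y = np.array([1.6])                     # observation

# Kalman gain: K = B H^T (H B H^T + R)^-1
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)

x_a = x_b + K @ (y - H @ x_b)           # analysis mean
P_a = (np.eye(2) - K @ H) @ B           # analysis error covariance

print(x_a)  # [1.48, 2.24]: both variables are corrected
```

Note that the second, unobserved variable is also corrected, through the off-diagonal terms of the background error covariance.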

Many methods represent the probability distributions only by the mean and input some pre-calculated covariance. An example of a direct (or sequential) method to compute this is called optimal statistical interpolation, or simply optimal interpolation (OI). An alternative approach is to iteratively minimize a cost function that solves an equivalent problem. These are called variational methods, such as 3D-Var and 4D-Var. Typical minimization algorithms are the conjugate gradient method and the generalized minimal residual method. The ensemble Kalman filter is a sequential method that uses a Monte Carlo approach to estimate both the mean and the covariance of a Gaussian probability distribution from an ensemble of simulations. More recently, hybrid combinations of ensemble and variational methods have become popular (e.g. they are used for operational forecasts both at the European Centre for Medium-Range Weather Forecasts (ECMWF) and at the NOAA National Centers for Environmental Prediction (NCEP)).
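In an ensemble method, the covariance used in the update is not prescribed but estimated from a sample of model runs. A bare-bones sketch of a (stochastic) ensemble Kalman filter analysis step, with toy dimensions, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: n_ens model states of dimension n (columns are members).
n, n_ens = 2, 50
X = rng.normal([1.0, 2.0], 1.0, size=(n_ens, n)).T

H = np.array([[1.0, 0.0]])     # observe the first variable
R = np.array([[0.25]])
y = np.array([1.6])

# The sample mean and covariance replace the prescribed B of OI.
x_mean = X.mean(axis=1, keepdims=True)
anomalies = X - x_mean
B_ens = anomalies @ anomalies.T / (n_ens - 1)

K = B_ens @ H.T @ np.linalg.inv(H @ B_ens @ H.T + R)

# Stochastic EnKF: each member assimilates a perturbed observation.
y_pert = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(1, n_ens))
X_a = X + K @ (y_pert - H @ X)

print(X_a.mean(axis=1))  # analysis ensemble mean
```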

Data assimilation as a model update

Data assimilation can also be achieved within a model update loop, in which an initial model (or initial guess) is iterated in an optimisation loop to constrain the model to the observed data. Many optimisation approaches exist, and all of them can be set up to update the model; for instance, evolutionary algorithms have proven efficient because they are free of hypotheses about the model, but they are computationally expensive. A minimal sketch of such a loop follows.
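The sketch below uses a simple random-perturbation search over a single model parameter and a hypothetical misfit function (not any particular published scheme):

```python
import numpy as np

rng = np.random.default_rng(1)

def misfit(rate, observations):
    """Hypothetical cost: squared mismatch between a toy model
    (an exponential decay with the given rate) and the data."""
    t = np.arange(len(observations))
    return np.sum((np.exp(-rate * t) - observations) ** 2)

# Synthetic observations generated with a "true" rate of 0.3.
obs = np.exp(-0.3 * np.arange(10)) + rng.normal(0.0, 0.01, 10)

# Model-update loop: perturb the current model, keep improvements.
rate, best = 1.0, np.inf
for _ in range(1000):
    trial = rate + rng.normal(0.0, 0.05)
    cost = misfit(trial, obs)
    if cost < best:
        rate, best = trial, cost

print(rate)  # converges near the true rate of 0.3
```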

Weather forecasting applications

In numerical weather prediction applications, data assimilation is most widely known as a method for combining observations of meteorological variables such as temperature and atmospheric pressure with prior forecasts in order to initialize numerical forecast models.

Necessity

The atmosphere is a fluid. The idea of numerical weather prediction is to sample the state of the fluid at a given time and use the equations of fluid dynamics and thermodynamics to estimate the state of the fluid at some time in the future. The process of entering observation data into the model to generate initial conditions is called initialization. On land, terrain maps available at resolutions down to 1 kilometer (0.6 mi) globally are used to help model atmospheric circulations within regions of rugged topography, in order to better depict features such as downslope winds, mountain waves and related cloudiness that affects incoming solar radiation. [2] The main inputs from country-based weather services are observations from devices (called radiosondes) in weather balloons that measure various atmospheric parameters and transmit them to a fixed receiver, as well as from weather satellites. The World Meteorological Organization acts to standardize the instrumentation, observing practices and timing of these observations worldwide. Stations either report hourly in METAR reports, [3] or every six hours in SYNOP reports. [4] These observations are irregularly spaced, so they are processed by data assimilation and objective analysis methods, which perform quality control and obtain values at locations usable by the model's mathematical algorithms. [5] Some global models use finite differences, in which the world is represented as discrete points on a regularly spaced grid of latitude and longitude; [6] other models use spectral methods that solve for a range of wavelengths. The data are then used in the model as the starting point for a forecast. [7]

A variety of methods are used to gather observational data for use in numerical models. Sites launch radiosondes in weather balloons which rise through the troposphere and well into the stratosphere. [8] Information from weather satellites is used where traditional data sources are not available. Commerce provides pilot reports along aircraft routes [9] and ship reports along shipping routes. [10] Research projects use reconnaissance aircraft to fly in and around weather systems of interest, such as tropical cyclones. [11] [12] Reconnaissance aircraft are also flown over the open oceans during the cold season into systems which cause significant uncertainty in forecast guidance, or are expected to be of high impact from three to seven days into the future over the downstream continent. [13] Sea ice began to be initialized in forecast models in 1971. [14] Efforts to involve sea surface temperature in model initialization began in 1972 due to its role in modulating weather in higher latitudes of the Pacific. [15]

History

Lewis Fry Richardson

In 1922, Lewis Fry Richardson published the first attempt at forecasting the weather numerically. Using a hydrostatic variation of Bjerknes's primitive equations, [16] Richardson produced by hand a 6-hour forecast for the state of the atmosphere over two points in central Europe, taking at least six weeks to do so. [17] His forecast calculated that the change in surface pressure would be 145 millibars (4.3 inHg), an unrealistic value incorrect by two orders of magnitude. The large error was caused by an imbalance in the pressure and wind velocity fields used as the initial conditions in his analysis, [16] indicating the need for a data assimilation scheme.

Originally "subjective analysis" had been used in which numerical weather prediction (NWP) forecasts had been adjusted by meteorologists using their operational expertise. Then "objective analysis" (e.g. Cressman algorithm) was introduced for automated data assimilation. These objective methods used simple interpolation approaches, and thus[ why? ] were 3DDA (three-dimensional data assimilation) methods.

Later, 4DDA (four-dimensional data assimilation) methods, called "nudging", were developed, such as in the MM5 model. They are based on the simple idea of Newtonian relaxation. They introduce into the right-hand side of the model's dynamical equations a term that is proportional to the difference between the calculated meteorological variable and the observed value. This term, which has a negative sign, keeps the calculated state vector closer to the observations. Nudging can be interpreted as a variant of the Kalman–Bucy filter (a continuous-time version of the Kalman filter) with the gain matrix prescribed rather than obtained from covariances.
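A schematic scalar implementation of nudging, with hypothetical dynamics f(x) and relaxation coefficient g, might look like this:

```python
# Nudging: add a relaxation term -g * (x - x_obs) to the model tendency.
def f(x):
    """Toy model dynamics (hypothetical)."""
    return -0.1 * x

x = 5.0        # current model state
x_obs = 3.0    # observed value (held constant here for simplicity)
g = 0.5        # nudging coefficient, g > 0
dt = 0.1       # time step

for _ in range(200):
    # Right-hand side = model tendency plus the (negative) nudging term.
    x += dt * (f(x) - g * (x - x_obs))

print(x)  # settles near 2.5, pulled from the model attractor toward x_obs
```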

A major development was achieved by L. Gandin (1963), who introduced the "statistical interpolation" (or "optimal interpolation") method, which developed earlier ideas of Kolmogorov. This is a 3DDA method; it is a type of regression analysis that uses information about the spatial distributions of the covariance functions of the errors of the "first guess" field (previous forecast) and the "true" field. These functions are never known exactly, so different approximations to them are assumed.

The optimal interpolation algorithm is a reduced version of the Kalman filter (KF) algorithm, in which the covariance matrices are not calculated from the dynamical equations but are determined in advance.
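In the linear Gaussian setting this update can be written explicitly. With background state $\mathbf{x}_b$, observations $\mathbf{y}$, observation operator $\mathbf{H}$, and prescribed covariances $\mathbf{B}$ (background) and $\mathbf{R}$ (observation), the OI analysis is

$$
\mathbf{x}_a = \mathbf{x}_b + \mathbf{K}\left(\mathbf{y} - \mathbf{H}\mathbf{x}_b\right),
\qquad
\mathbf{K} = \mathbf{B}\mathbf{H}^{\mathrm{T}}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathrm{T}} + \mathbf{R}\right)^{-1},
$$

which is the Kalman analysis update with $\mathbf{B}$ fixed in advance rather than propagated by the model.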

Attempts to introduce KF algorithms as a 4DDA tool for NWP models came later. However, this was (and remains) a difficult task because the full version requires the solution of an enormous number of additional equations (of order N² ≈ 10¹², where N = Nx·Ny·Nz is the size of the state vector, with Nx ≈ 100, Ny ≈ 100, Nz ≈ 100 the dimensions of the computational grid). To overcome this difficulty, approximate or suboptimal Kalman filters were developed. These include the ensemble Kalman filter and the reduced-rank Kalman filters (RRSQRT). [18]

Another significant advance in the development of 4DDA methods was the use of optimal control theory (the variational approach) in the works of Le Dimet and Talagrand (1986), building on the earlier work of J.-L. Lions and G. Marchuk, the latter being the first to apply that theory to environmental modeling. The significant advantage of the variational approaches is that the meteorological fields satisfy the dynamical equations of the NWP model while at the same time minimizing the functional characterizing their difference from the observations; thus, a problem of constrained minimization is solved. Variational 3DDA methods were first developed by Sasaki (1958).

As was shown by Lorenc (1986), all of the above-mentioned 4DDA methods are equivalent in some limit, i.e. under certain assumptions they minimize the same cost function. In practical applications, however, these assumptions are never fulfilled; the different methods perform differently, and it is generally unclear which approach (Kalman filtering or variational) is better. Fundamental questions also arise in the application of advanced DA techniques, such as the convergence of the computational method to the global minimum of the functional to be minimised; for instance, the cost function, or the set in which the solution is sought, may not be convex. The 4DDA method which is currently most successful [19] [20] is hybrid incremental 4D-Var, where an ensemble is used to augment the climatological background error covariances at the start of the data assimilation time window, but the background error covariances are evolved during the time window by a simplified version of the NWP forecast model. This data assimilation method is used operationally at forecast centres such as the Met Office. [21] [22]

Cost function

The process of creating the analysis in data assimilation often involves minimization of a cost function. A typical cost function is the sum of the squared deviations of the analysis values from the observations, weighted by the accuracy of the observations, plus the sum of the squared deviations of the analysis fields from the forecast fields, weighted by the accuracy of the forecast. This ensures that the analysis does not drift too far from observations and forecasts that are usually known to be reliable.

3D-Var

The 3D-Var cost function is

$$
J(\mathbf{x}) = \frac{1}{2}(\mathbf{x} - \mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x} - \mathbf{x}_b) + \frac{1}{2}(\mathbf{y} - H[\mathbf{x}])^{\mathrm{T}}\mathbf{R}^{-1}(\mathbf{y} - H[\mathbf{x}]),
$$

where $\mathbf{B}$ denotes the background error covariance and $\mathbf{R}$ the observational error covariance.
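A small numerical sketch of this minimization on a toy two-variable problem, using scipy (all values illustrative):

```python
import numpy as np
from scipy.optimize import minimize

# Toy 3D-Var: minimize J(x) for a 2-variable state (illustrative values).
x_b = np.array([1.0, 2.0])
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])
H = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
y = np.array([1.6])

B_inv = np.linalg.inv(B)
R_inv = np.linalg.inv(R)

def J(x):
    """3D-Var cost: background term plus observation term."""
    db = x - x_b
    do = y - H @ x
    return 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do

result = minimize(J, x0=x_b)   # start the search from the background
print(result.x)                # matches the Kalman analysis in this linear case
```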

4D-Var

The 4D-Var cost function sums the observation term over the assimilation time window:

$$
J(\mathbf{x}) = \frac{1}{2}(\mathbf{x} - \mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x} - \mathbf{x}_b) + \frac{1}{2}\sum_{i=0}^{n}(\mathbf{y}_i - H_i[\mathbf{x}_i])^{\mathrm{T}}\mathbf{R}_i^{-1}(\mathbf{y}_i - H_i[\mathbf{x}_i]),
$$

where $\mathbf{x}_i$ is the model state at observation time $i$, obtained by integrating the forecast model from $\mathbf{x}$, provided that $H_i$ is a linear operator (matrix).

Future development

Several factors are driving the rapid development of data assimilation methods for NWP models.

Other applications

Monitoring water and energy transfers

General Data Assimilation diagram (Alpilles-ReSeDA)

Data assimilation has been used, in the 1980s and 1990s, in several HAPEX (Hydrologic and Atmospheric Pilot Experiment) projects for monitoring energy transfers between the soil, vegetation and atmosphere. For instance:

- HAPEX-MobilHy, [24] HAPEX-Sahel, [25]

- the "Alpilles-ReSeDA" (Remote Sensing Data Assimilation) experiment, [26] [27] a European project in the FP4-ENV program [28] which took place in the Alpilles region, South-East of France (1996–97). The Flow-chart diagram (right), excerpted from the final report of that project, [23] shows how to infer variables of interest such as canopy state, radiative fluxes, environmental budget, production in quantity and quality, from remote sensing data and ancillary information. In that diagram, the small blue-green arrows indicate the direct way the models actually run.[ citation needed ] [29]

Other forecasting applications

Data assimilation methods are currently also used in other environmental forecasting problems, e.g. in hydrological and hydrogeological forecasting. [30] Bayesian networks may also be used in a data assimilation approach to assess natural hazards such as landslides. [31]

Given the abundance of spacecraft data for other planets in the solar system, data assimilation is now also applied beyond the Earth to obtain re-analyses of the atmospheric state of extraterrestrial planets. Mars is the only extraterrestrial planet to which data assimilation has been applied so far. Available spacecraft data include, in particular, retrievals of temperature and dust/water/ice optical thicknesses from the Thermal Emission Spectrometer onboard NASA's Mars Global Surveyor and the Mars Climate Sounder onboard NASA's Mars Reconnaissance Orbiter. Two methods of data assimilation have been applied to these datasets: an Analysis Correction scheme [32] and two Ensemble Kalman Filter schemes, [33] [34] both using a global circulation model of the Martian atmosphere as a forward model. The Mars Analysis Correction Data Assimilation (MACDA) dataset is publicly available from the British Atmospheric Data Centre. [35]

Data assimilation remains a central part of the challenge in every forecasting problem.

Dealing with biased data is a serious challenge in data assimilation. Further development of methods to deal with biases will be of particular use. If several instruments observe the same variable, intercomparing them using probability distribution functions can be instructive.

Numerical forecast models are reaching ever higher resolution owing to the increase in computational power, with operational atmospheric models now running with horizontal resolutions of the order of 1 km (e.g. at the German National Meteorological Service, the Deutscher Wetterdienst (DWD), and the Met Office in the UK). This increase in horizontal resolution is starting to make it possible to resolve more chaotic features of the non-linear models, e.g. to resolve convection on the grid scale, or clouds, in the atmospheric models. This increasing non-linearity in the models and observation operators poses a new problem in data assimilation. Existing data assimilation methods, such as the many variants of ensemble Kalman filters and variational methods that are well established for linear or near-linear models, are being assessed on non-linear models.

Many new methods are being developed, e.g. particle filters for high-dimensional problems, and hybrid data assimilation methods. [36]

Other uses include trajectory estimation for the Apollo program, GPS, and atmospheric chemistry.


References

  1. Kalnay, Eugenia; and coauthors (1996). "The NCEP/NCAR 40-Year Reanalysis Project". Bulletin of the American Meteorological Society. 77 (March): 437–471. Bibcode:1996BAMS...77..437K. doi: 10.1175/1520-0477(1996)077<0437:TNYRP>2.0.CO;2 . ISSN   1520-0477. S2CID   124135431.
  2. Stensrud, David J. (2007). Parameterization schemes: keys to understanding numerical weather prediction models. Cambridge University Press. p. 56. ISBN   978-0-521-86540-1.
  3. National Climatic Data Center (2008-08-20). "Key to METAR Surface Weather Observations". National Oceanic and Atmospheric Administration . Retrieved 2011-02-11.
  4. "SYNOP Data Format (FM-12): Surface Synoptic Observations". UNISYS. 2008-05-25. Archived from the original on 2007-12-30.
  5. Krishnamurti, T N (1995). "Numerical Weather Prediction". Annual Review of Fluid Mechanics. 27: 195–225. Bibcode:1995AnRFM..27..195K. doi:10.1146/annurev.fl.27.010195.001211. S2CID   122230747.
  6. Chaudhari, H. S.; Lee, K. M.; Oh, J. H. (2007). "Weather prediction and computational aspects of icosahedral-hexagonal gridpoint model GME". In Kwon, Jang-Hyuk; Periaux, Jacques; Fox, Pat; Satofuka, N.; Ecer, A. (eds.). Parallel computational fluid dynamics: parallel computings and its applications : proceedings of the Parallel CFD 2006 Conference, Busan city, Korea (May 15–18, 2006). Elsevier. pp. 223–30. ISBN   978-0-444-53035-6 . Retrieved 2011-01-06.
  7. "The WRF Variational Data Assimilation System (WRF-Var)". University Corporation for Atmospheric Research. 2007-08-14. Archived from the original on 2007-08-14.
  8. Gaffen, Dian J. (2007-06-07). "Radiosonde Observations and Their Use in SPARC-Related Investigations". Archived from the original on 2007-06-07.
  9. Ballish, Bradley A; Kumar, V. Krishna (2008). "Systematic Differences in Aircraft and Radiosonde Temperatures". Bulletin of the American Meteorological Society. 89 (11): 1689. Bibcode:2008BAMS...89.1689B. doi: 10.1175/2008BAMS2332.1 .
  10. National Data Buoy Center (2009-01-28). "The WMO Voluntary Observing Ships (VOS) Scheme". National Oceanic and Atmospheric Administration . Retrieved 2011-02-15.
  11. 403rd Wing (2011). "The Hurricane Hunters". 53rd Weather Reconnaissance Squadron. Retrieved 2006-03-30.
  12. Lee, Christopher (2007-10-08). "Drone, Sensors May Open Path Into Eye of Storm". The Washington Post. Retrieved 2008-02-22.
  13. National Oceanic and Atmospheric Administration (2010-11-12). "NOAA Dispatches High-Tech Research Plane to Improve Winter Storm Forecasts" . Retrieved 2010-12-22.
  14. Stensrud, David J. (2007). Parameterization schemes: keys to understanding numerical weather prediction models. Cambridge University Press. p. 137. ISBN   978-0-521-86540-1.
  15. Houghton, John Theodore (1985). The Global Climate. Cambridge University Press archive. pp. 49–50. ISBN   978-0-521-31256-1.
  16. Lynch, Peter (2008). "The origins of computer weather prediction and climate modeling". Journal of Computational Physics. 227 (7): 3431–3444. Bibcode:2008JCoPh.227.3431L. doi:10.1016/j.jcp.2007.02.034.
  17. Lynch, Peter (2006). "Weather Prediction by Numerical Process". The Emergence of Numerical Weather Prediction. Cambridge University Press. pp. 1–27. ISBN   978-0-521-85729-1.
  18. Todling, Ricardo, and Stephen E. Cohn. "Suboptimal schemes for atmospheric data assimilation based on the Kalman filter." Monthly Weather Review 122, no. 11 (1994): 2530-2557.
  19. "Abstract: Mesoscale ensemble 4DVAR and its comparison with EnKF and 4DVAR (91st American Meteorological Society Annual Meeting)". 27 January 2011.
  20. Yang, Eun-Gyeong; Kim, Hyun Mee (February 2021). "A comparison of variational, ensemble-based, and hybrid data assimilation methods over East Asia for two one-month periods" (PDF). Atmospheric Research. 249: 105257. Bibcode:2021AtmRe.24905257Y. doi:10.1016/j.atmosres.2020.105257. S2CID   224864029 . Retrieved 9 November 2022.
  21. Barker, Dale; Lorenc, Andrew; Clayton, Adam (September 2011). "Hybrid Variational/Ensemble Data Assimilation" (PDF).
  22. "Numerical weather prediction models".
  23. Baret, Frederic (June 2000). "ReSeDA: Assimilation of Multi-Sensor & Multi-Temporal Remote Sensing Data to Monitor Soil & Vegetation Functioning" (PDF) (final report, European contract number ENV4CT960326). Avignon: Institut national de la recherche agronomique. p. 59. Retrieved 8 July 2019.
  24. André, Jean-Claude; Goutorbe, Jean-Paul; Perrier, Alain (1986). "HAPEX—MOBLIHY: A Hydrologic Atmospheric Experiment for the Study of Water Budget and Evaporation Flux at the Climatic Scale". Bulletin of the American Meteorological Society. 67 (2): 138. Bibcode:1986BAMS...67..138A. doi: 10.1175/1520-0477(1986)067<0138:HAHAEF>2.0.CO;2 .
  25. Goutorbe, J.P; Lebel, T; Dolman, A.J; Gash, J.H.C; Kabat, P; Kerr, Y.H; Monteny, B; Prince, S.D; Stricker, J.N.M; Tinga, A; Wallace, J.S (1997). "An overview of HAPEX-Sahel: A study in climate and desertification". Journal of Hydrology. 188–189: 4–17. Bibcode:1997JHyd..188....4G. doi:10.1016/S0022-1694(96)03308-2.
  26. Prevot L, Baret F, Chanzy A, Olioso A, Wigneron JP, Autret H, Baudin F, Bessemoulin P, Bethenod O, Blamont D, Blavoux B, Bonnefond JM, Boubkraoui S, Bouman BA, Braud I, Bruguier N, Calvet JC, Caselles V, Chauki H, Clevers JG, Coll C, Company A, Courault D, Dedieu G, Degenne P, Delecolle R, Denis H, Desprats JF, Ducros Y, Dyer D, Fies JC, Fischer A, Francois C, Gaudu JC, Gonzalez E, Goujet R, Gu XF, Guerif M, Hanocq JF, Hautecoeur O, Haverkamp R, Hobbs S, Jacob F, Jeansoulin R, Jongschaap RE, Kerr Y, King C, Laborie P, Lagouarde JP, Laques AE, et al. (July 1998). "Assimilation of Multi-Sensor and Multi-Temporal Remote Sensing Data, to Monitor Vegetation and Soil: the Alpilles-ReSeDA project" (PDF). Seattle, WA, USA: IGARSS'98, International Geoscience and Remote Sensing Symposium. Retrieved 8 July 2019.
  27. Eibl, B; Mauser, W; Moulin, S; Noilhan, J; Ottle, C; Paloscia, S; Pampaloni, P; Podvin, T; Quaracino, F; Roujean, J.L; Rozier, C; Ruisi, R; Susini, C; Taconet, O; Tallet, N; Thony, J.L; Travi, Y; Van Leewen, H; Vauclin, M; Vidal-Madjar, D; Vonder, O.W (1998). "Comparison of the albedo derived from MOS-B and WIFS with NOAA-AVHRR". IGARSS '98. Sensing and Managing the Environment. 1998 IEEE International Geoscience and Remote Sensing. Symposium Proceedings. (Cat. No.98CH36174) (PDF). pp. 2402–4. doi:10.1109/IGARSS.1998.702226. ISBN   978-0-7803-4403-7. S2CID   55492076.
  28. "ReSeDA". cordis.europa.eu. Retrieved 8 July 2019.
  29. Olioso A, Prevot L, Baret F, Chanzy A, Braud I, Autret H, Baudin F, Bessemoulin P, Bethenod O, Blamont D, Blavoux B, Bonnefond JM, Boubkraoui S, Bouman BA, Bruguier N, Calvet JC, Caselles V, Chauki H, Clevers JW, Coll C, Company A, Courault D, Dedieu G, Degenne P, Delecolle R, Denis H, Desprats JF, Ducros Y, Dyer D, Fies JC, Fischer A, Francois C, Gaudu JC, Gonzalez E, Gouget R, Gu XF, Guerif M, Hanocq JF, Hautecoeur O, Haverkamp R, Hobbs S, Jacob F, Jeansoulin R, Jongschaap RE, Kerr Y, King C, Laborie P, Lagouarde JP, Laques AE, Larcena D, Laurent G, Laurent JP, Leroy M, McAneney J, Macelloni G, Moulin S, Noilhan J, Ottle C, Paloscia S, Pampaloni P, Podvin T, Quaracino F, Roujean JL, Rozier C, Ruisi R, Susini C, Taconet O, Tallet N, Thony JL, Travi Y, van Leewen H, Vauclin M, Vidal-Madjar D, Vonder OW, Weiss M, Wigneron JP (19–21 March 1998). D. Marceau (ed.). Spatial Aspects in the Alpilles-ReSeDA Project (PDF). International Workshop on Scaling and Modelling in Forestry: Applications in Remote Sensing and GIS. University of Montreal, Montréal, Québec, Canada. pp. 93–102. Retrieved 8 July 2019.
  30. Chen, Shang-Ying; Wei, Jian-Yu; Hsu, Kuo-Chin (2023-10-01). "Data assimilation for real-time subsurface flow modeling with dynamically adaptive meshless node adjustments". Engineering with Computers. 40 (3): 1893–1925. doi:10.1007/s00366-023-01897-6. ISSN   1435-5663.
  31. Cardenas, IC (2019). "On the use of Bayesian networks as a meta-modelling approach to analyse uncertainties in slope stability analysis". Georisk: Assessment and Management of Risk for Engineered Systems and Geohazards. 13 (1): 53–65. Bibcode:2019GAMRE..13...53C. doi:10.1080/17499518.2018.1498524. S2CID   216590427.
  32. "Oxford Physics: Atmospheric, Oceanic and Planetary Physics: SRC: Research". July 2019. Archived from the original on 2011-09-28. Retrieved 2011-08-19.
  33. http://www.eps.jhu.edu/~mjhoffman/pages/research.html
  34. "marsclimatecenter.com". marsclimatecenter.com. Retrieved 2022-04-19.
  35. http://badc.nerc.ac.uk/home/
  36. Vetra-Carvalho, Sanita; P. J. van Leeuwen; L. Nerger; A. Barth; A.M. Umer; P. Brasseur; P. Kirchgessner; J-M. Beckers (2018). "State-of-the-art stochastic data assimilation methods for high-dimensional non-Gaussian problems". Tellus A. 70 (1): 1445364. Bibcode:2018TellA..7045364V. doi: 10.1080/16000870.2018.1445364 . hdl: 10754/630565 .
