Ocean reanalysis

Ocean reanalysis is a method of combining historical ocean observations with a general ocean model (typically a numerical ocean circulation model) driven by historical estimates of surface winds, heat, and freshwater fluxes. A data assimilation algorithm merges the observations and the model to reconstruct historical changes in the state of the ocean.

Historical ocean observations are sparse and, on their own, insufficient for understanding the history of the ocean and its circulation. By applying data assimilation techniques to advanced computational models of the global ocean, researchers can interpolate the historical observations to all points in the ocean. This process has an analog in the construction of atmospheric reanalyses and is closely related to ocean state estimation.
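As a rough illustration of how data assimilation spreads information from a sparse observation to unobserved grid points, the following sketch applies the standard analysis update x_a = x_b + K (y - H x_b) to a three-point toy grid; all values and covariances here are invented for illustration, not taken from any reanalysis system:

```python
import numpy as np

# Toy analysis update: blend a model background state with one observation.
#   x_a = x_b + K (y - H x_b),   K = B H^T (H B H^T + R)^(-1)
x_b = np.array([10.0, 12.0, 14.0])   # background temperatures at 3 grid points
B = np.array([[1.0, 0.5, 0.1],       # background-error covariance: nearby
              [0.5, 1.0, 0.5],       # points are correlated, so one
              [0.1, 0.5, 1.0]])      # observation also corrects its neighbours
H = np.array([[0.0, 1.0, 0.0]])      # observe only the middle grid point
R = np.array([[0.25]])               # observation-error variance
y = np.array([13.0])                 # the observation itself

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
x_a = x_b + (K @ (y - H @ x_b)).ravel()        # analysis state
```

Because the background-error covariance links neighbouring points, the single observation at the middle point also nudges the two unobserved points, which is exactly how sparse data get interpolated to the full grid.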

Current projects

A number of efforts have been initiated in recent years to apply data assimilation to estimate the physical state of the ocean, including temperature, salinity, currents, and sea level. [1] There are three alternative state-estimation approaches. The first is used by the ‘no-model’ analyses, in which temperature or salinity observations update a first guess provided by climatological monthly estimates.

The second approach is that of the sequential data assimilation analyses, which move forward in time from a previous analysis using an ocean general circulation model to simulate the evolution of temperature and other variables. The simulation provides the first guess of the state of the ocean at the next analysis time, and this first guess is then corrected using observations of variables such as temperature, salinity, or sea level.
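A minimal sketch of such a forecast-analysis cycle, using a made-up one-variable "model" and a fixed analysis weight rather than any real assimilation system:

```python
# Hypothetical sequential cycle for a single scalar state (say, temperature
# at one point): each cycle runs the model forward to get a first guess,
# then corrects it toward an observation with a fixed gain (optimal-
# interpolation style; a Kalman filter would also evolve the error variance).

def model_step(x):
    # stand-in "ocean model": relax toward a seasonal mean of 15.0
    return x + 0.1 * (15.0 - x)

gain = 0.5                          # fixed analysis weight
x = 10.0                            # initial analysis
obs = [14.0, 14.5, 14.8, 15.0]      # one observation per cycle
analyses = []
for y in obs:
    x_f = model_step(x)             # forecast: first guess at the next time
    x = x_f + gain * (y - x_f)      # analysis: correct toward the observation
    analyses.append(x)
```

Each analysis becomes the initial condition for the next forecast, so information from earlier observations is carried forward in time by the model.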

The third approach is 4D-Var. In the implementation described here, the initial conditions and surface forcing serve as control variables, which are adjusted through the iterative solution of a giant optimization problem until the solution is consistent with both the observations and a numerical representation of the equations of motion.

Methodologies

No-model approach

The ISHII and LEVITUS analyses begin with a first guess given by the climatological monthly upper-ocean temperature fields produced by the NOAA National Oceanographic Data Center. The innovations (the differences between the observations and the first guess) are mapped onto the analysis levels. ISHII uses an alternative 3D-Var approach to perform the objective mapping, with a decorrelation scale of 300 km in midlatitudes that elongates in the zonal direction by a factor of 3 at equatorial latitudes. LEVITUS begins similarly, but uses the Cressman and Barnes successive-correction technique with a homogeneous scale of 555 km to objectively map the temperature innovations onto a uniform grid.
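A single Cressman-style correction pass can be sketched as follows; the 1-D grid, observation positions, and innovation values are hypothetical, and only the 555 km influence radius comes from the text above:

```python
import numpy as np

# One Cressman pass: map observation innovations (obs minus climatological
# first guess) onto grid points using distance-dependent weights that fall
# to zero outside the influence radius.

def cressman_weights(dist, radius=555.0):
    # classic Cressman weight (R^2 - d^2) / (R^2 + d^2), zero beyond R
    w = (radius**2 - dist**2) / (radius**2 + dist**2)
    return np.where(dist < radius, w, 0.0)

grid_x = np.arange(0.0, 2001.0, 250.0)   # 1-D grid positions (km)
obs_x = np.array([400.0, 1200.0])        # observation positions (km)
obs_innov = np.array([0.8, -0.5])        # obs minus first guess

analysis_innov = np.zeros_like(grid_x)
for i, gx in enumerate(grid_x):
    w = cressman_weights(np.abs(obs_x - gx))
    if w.sum() > 0:                      # leave unobserved regions unchanged
        analysis_innov[i] = np.dot(w, obs_innov) / w.sum()
# analysis = climatological first guess + analysis_innov at each grid point
```

Grid points beyond 555 km of every observation keep a zero innovation, i.e. the analysis there relaxes to the climatological first guess.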

Sequential approaches

The sequential approaches can be further divided into those using optimal interpolation and its more sophisticated cousin, the Kalman filter, and those using 3D-Var. Among the projects mentioned above, INGV and SODA use versions of optimal interpolation, while CERFACS, GODAS, and GFDL all use 3D-Var. "To date we are unaware of any attempt to use Kalman Filter for multi-decadal ocean reanalyses." [1] The 4-Dimensional Local Ensemble Transform Kalman Filter (4D-LETKF) has, however, been applied to the Geophysical Fluid Dynamics Laboratory's (GFDL) Modular Ocean Model (MOM2) for a seven-year ocean reanalysis spanning January 1997 to 2004. [2]
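The flavor of an ensemble-based update can be shown for a single observed scalar; this is a stochastic simplification for illustration, not the deterministic transform the actual LETKF uses, and all numbers are invented:

```python
import numpy as np

# Minimal stochastic ensemble Kalman update: the background-error variance
# is estimated from the ensemble spread instead of being prescribed.

rng = np.random.default_rng(0)
ens = np.array([11.0, 12.0, 13.0, 12.5, 11.5])   # background ensemble members
y, r = 13.5, 0.2                                 # observation and its error variance

b = np.var(ens, ddof=1)                 # background variance from ensemble spread
k = b / (b + r)                         # scalar gain
perturbed = y + rng.normal(0.0, np.sqrt(r), ens.size)  # perturbed observations
ens_a = ens + k * (perturbed - ens)     # update each member toward its obs copy
```

The analysis ensemble mean is pulled toward the observation, and the amount of pull depends on the flow-dependent spread of the ensemble, which is the key advantage of ensemble filters over fixed-covariance optimal interpolation.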

Variational (4D-Var) approach

GECCO has made an innovative attempt to apply 4D-Var to the decadal ocean estimation problem. This approach faces daunting computational challenges, but provides some interesting benefits, including the satisfaction of conservation laws and the construction of the ocean model adjoint.
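A toy version of the 4D-Var idea, with a scalar linear model whose initial condition is the only control variable; the dynamics, observations, and step sizes are all invented for illustration (real systems adjust millions of variables the same way):

```python
# Toy 4D-Var for the scalar linear model x_{t+1} = a * x_t. The adjoint of
# this model is just multiplication by a, which gives the exact gradient of
# the observation-misfit cost with respect to the initial condition.

a = 0.9                                  # model dynamics
obs = {2: 4.05, 5: 2.95}                 # observations at selected time steps
nsteps, r = 6, 0.1                       # window length, obs-error variance

def cost_and_grad(x0):
    # forward sweep: run the model and accumulate the misfit cost
    traj = [x0]
    for _ in range(nsteps):
        traj.append(a * traj[-1])
    J = sum((traj[t] - y)**2 / (2 * r) for t, y in obs.items())
    # adjoint (backward) sweep: propagate misfits back to the initial time
    adj = 0.0
    for t in range(nsteps, -1, -1):
        if t in obs:
            adj += (traj[t] - obs[t]) / r
        if t > 0:
            adj *= a                     # adjoint of x_{t+1} = a * x_t
    return J, adj

x0 = 0.0                                 # first guess for the control variable
for _ in range(200):                     # steepest-descent iterations
    J, g = cost_and_grad(x0)
    x0 -= 0.01 * g
```

The observations here were generated from an initial condition of 5.0, so the iteration recovers that value; in a full system the same forward/adjoint machinery drives the iterative optimization described above.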

See also

Climate model
Climatology
Kalman filter
World Ocean Database Project
Data assimilation
Tropical cyclone forecast model
World Ocean Circulation Experiment
World Ocean Atlas
Global Ocean Data Analysis Project (GLODAP)
Ensemble Kalman filter
Backtesting
Ecological forecasting
Wildfire modeling
Wind wave model
Atmospheric reanalysis
Eugenia Kalnay
Moving horizon estimation
Neutral density
Simple Ocean Data Assimilation (SODA)
DIVA

References

  1. Carton, J.A., and A. Santorelli, 2008: Global upper ocean heat content as viewed in nine analyses. J. Climate, 21, 6015–6035.
  2. Hunt, B.R., E.J. Kostelich, and I. Szunyogh: Efficient data assimilation for spatiotemporal chaos: A local ensemble transform Kalman filter. arXiv:physics/0511236, 2005.