The ECMWF reanalysis project is a meteorological reanalysis project carried out by the European Centre for Medium-Range Weather Forecasts (ECMWF). The first reanalysis product, ERA-15, generated reanalyses for approximately 15 years, from December 1978 to February 1994. The second product, ERA-40 (originally intended as a 40-year reanalysis), begins in 1957 (the International Geophysical Year) and covers 45 years to 2002. As a precursor to a revised extended reanalysis product to replace ERA-40, ECMWF released ERA-Interim, which covers the period from 1979 to 2019. A newer reanalysis product, ERA5, has more recently been released by ECMWF as part of the Copernicus Climate Change Service. This product has higher spatial resolution (31 km) and covers the period from 1979 to the present; an extension back to 1940 became available in 2023. [1]
In addition to reanalysing all the old data using a consistent system, the reanalyses also make use of much archived data that was not available to the original analyses. This allows the correction of many historical hand-drawn maps, in which features often had to be estimated in areas of sparse data. It also makes it possible to produce new maps of atmospheric levels that were not commonly analysed until more recent times.
Many sources of meteorological observations were used, including radiosondes, balloons, aircraft, buoys, satellites, and scatterometers. This data was run through the ECMWF computer model at a 125 km resolution. [2] As the ECMWF's computer model is one of the more highly regarded in the field of forecasting, many scientists take its reanalysis to have similar merit. The data is stored in GRIB format. The reanalysis was done in an effort to improve the accuracy of historical weather maps and to aid more detailed analysis of weather systems through a period that was severely lacking in computerized data. With data from reanalyses such as this, many of the more modern computerized tools for analyzing storm systems can be applied to past events, because a computerized simulation of the atmospheric state is available for them.
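As an illustration of working with such output, GRIB files are commonly opened in Python with xarray and the cfgrib engine. The following is a minimal sketch only: the file name, and the variable short name "t2m", are assumptions that depend on what a given file actually contains.

```python
import xarray as xr  # requires the cfgrib engine (pip install cfgrib) for GRIB support

# Open a GRIB file lazily; the file name and its contents here are hypothetical.
ds = xr.open_dataset("era5.grib", engine="cfgrib")
print(ds)  # lists the variables, coordinates and times stored in the file

# Example: area-mean 2 m temperature at the first time step
# ("t2m" is the usual short name in ERA products, but depends on the file).
t2m = ds["t2m"].isel(time=0)
print(float(t2m.mean()))
```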
The ECMWF reanalysis products are accessible from the Copernicus Climate Change Service homepage. [3] The data can be downloaded for research use from ECMWF's homepage (see external links) and from the National Center for Atmospheric Research data archives; both require registration. A Python web API can be used to download a subset of parameters for a selected region and time period.
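For example, a regional, single-day subset of ERA5 surface fields might be requested with the cdsapi client roughly as follows. This is a sketch only: the dataset name, variable names and request keys shown are typical of the Climate Data Store catalogue and should be verified there, and the client expects credentials in a local configuration file.

```python
import cdsapi  # Copernicus Climate Data Store client; reads credentials from ~/.cdsapirc

c = cdsapi.Client()

# Request a small regional, two-time-step subset of ERA5 surface data.
# The area box is given as [North, West, South, East] in degrees.
c.retrieve(
    "reanalysis-era5-single-levels",
    {
        "product_type": "reanalysis",
        "variable": ["2m_temperature", "mean_sea_level_pressure"],
        "year": "2020",
        "month": "01",
        "day": "15",
        "time": ["00:00", "12:00"],
        "area": [60, -10, 35, 30],
        "format": "grib",
    },
    "era5_subset.grib",
)
```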
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an independent intergovernmental organisation supported by most of the nations of Europe. It is based at three sites: Shinfield Park, Reading, United Kingdom; Bologna, Italy; and Bonn, Germany. It operates one of the largest supercomputer complexes in Europe and the world's largest archive of numerical weather prediction data.
The National Weather Service (NWS) is an agency of the United States federal government that is tasked with providing weather forecasts, warnings of hazardous weather, and other weather-related products to organizations and the public for the purposes of protection, safety, and general information. It is a part of the National Oceanic and Atmospheric Administration (NOAA) branch of the Department of Commerce, and is headquartered in Silver Spring, Maryland, within the Washington metropolitan area. The agency was known as the United States Weather Bureau from 1891 until it adopted its current name in 1970.
The Meteorological Office, abbreviated as the Met Office, is the United Kingdom's national weather and climate service. It is an executive agency and trading fund of the Department for Science, Innovation and Technology and is led by CEO Penelope Endersby, who took up the role of Chief Executive in December 2018 and is the first woman to hold the post. The Met Office makes meteorological predictions across all timescales, from weather forecasts to climate change projections.
The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) is an intergovernmental organisation created through an international convention agreed by a current total of 30 European Member States.
Numerical weather prediction (NWP) uses mathematical models of the atmosphere and oceans to predict the weather based on current weather conditions. Though first attempted in the 1920s, it was not until the advent of computer simulation in the 1950s that numerical weather predictions produced realistic results. A number of global and regional forecast models are run in different countries worldwide, using current weather observations relayed from radiosondes, weather satellites and other observing systems as inputs.
Ensemble forecasting is a method used within numerical weather prediction. Instead of making a single forecast of the most likely weather, a set of forecasts is produced. This set of forecasts aims to give an indication of the range of possible future states of the atmosphere.
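A minimal sketch of how such a set of forecasts is typically summarised, using made-up member values: the ensemble mean gives a single best estimate, the spread indicates forecast uncertainty, and exceedance probabilities can be estimated from the fraction of members crossing a threshold.

```python
import numpy as np

# Hypothetical ensemble of 2 m temperature forecasts (degrees C) from
# ten perturbed members, for a single location and lead time.
members = np.array([4.1, 3.8, 5.2, 4.6, 3.9, 4.4, 5.0, 4.2, 3.7, 4.8])

ensemble_mean = members.mean()           # single best estimate
ensemble_spread = members.std(ddof=1)    # indication of forecast uncertainty

# Probability of exceeding a threshold, estimated as the fraction of members.
p_above_5c = (members > 5.0).mean()

print(f"mean={ensemble_mean:.1f} C, spread={ensemble_spread:.1f} C, P(T>5C)={p_above_5c:.0%}")
```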
In atmospheric science, an atmospheric model is a mathematical model constructed around the full set of primitive, dynamical equations which govern atmospheric motions. It can supplement these equations with parameterizations for turbulent diffusion, radiation, moist processes, heat exchange, soil, vegetation, surface water, the kinematic effects of terrain, and convection. Most atmospheric models are numerical, i.e. they discretize equations of motion. They can predict microscale phenomena such as tornadoes and boundary layer eddies, sub-microscale turbulent flow over buildings, as well as synoptic and global flows. The horizontal domain of a model is either global, covering the entire Earth, or regional (limited-area), covering only part of the Earth. Atmospheric models also differ in how they compute vertical fluid motions; some types of models are thermotropic, barotropic, hydrostatic, and non-hydrostatic. These model types are differentiated by their assumptions about the atmosphere, which must balance computational speed with the model's fidelity to the atmosphere it is simulating.
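For instance, the hydrostatic assumption that separates hydrostatic from non-hydrostatic model types replaces the full vertical momentum equation with a simple balance between the vertical pressure gradient force and gravity:

$$\frac{\partial p}{\partial z} = -\rho g,$$

where $p$ is pressure, $z$ is height, $\rho$ is air density and $g$ is the gravitational acceleration. Non-hydrostatic models retain the vertical acceleration term and can therefore represent strong convective motions explicitly.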
The Atlantic hurricane reanalysis project of the National Oceanic and Atmospheric Administration seeks to correct and add new information about past North Atlantic hurricanes. It was started around 2000 to update HURDAT, the official hurricane database for the Atlantic Basin, which has become outdated since its creation due to various systematic errors introduced into the database over time. This effort has involved reanalyses of ship observations from the International Comprehensive Ocean-Atmosphere Data Set (ICOADS) as well as reanalyses done by other researchers over the years. It is ongoing as of 2024.
Backtesting is a term used in modeling to refer to testing a predictive model on historical data. Backtesting is a type of retrodiction, and a special type of cross-validation applied to previous time period(s).
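A minimal, hypothetical sketch of the idea: a trivial persistence forecast ("tomorrow equals today") is evaluated against a past observation series, with skill measured by root-mean-square error; the observation values below are invented for illustration.

```python
import numpy as np

# Hypothetical observed daily temperature series (degrees C).
observed = np.array([3.2, 4.1, 5.0, 4.8, 3.9, 2.7, 3.3, 4.6])

# Persistence forecast: the prediction for day t is the observation from day t-1.
forecast = observed[:-1]
truth = observed[1:]

# Backtest skill over the historical period.
rmse = np.sqrt(np.mean((forecast - truth) ** 2))
print(f"Persistence-forecast RMSE: {rmse:.2f} C")
```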
The NCEP/NCAR Reanalysis is an atmospheric reanalysis produced by the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR). It is a continually updated globally gridded data set that represents the state of the Earth's atmosphere, incorporating observations and numerical weather prediction (NWP) model output from 1948 to present.
The following outline is provided as an overview of and topical guide to the field of Meteorology.
Jagadish Shukla is an Indian meteorologist and Distinguished University Professor at George Mason University in the United States.
Computational geophysics is the field of study that uses any type of numerical computations to generate and analyze models of complex geophysical systems. It can be considered an extension, or sub-field, of both computational physics and geophysics. In recent years, computational power, data availability, and modelling capabilities have all improved exponentially, making computational geophysics a more populated discipline. Due to the large computational size of many geophysical problems, high-performance computing can be required to handle analysis. Modeling applications of computational geophysics include atmospheric modelling, oceanic modelling, general circulation models, and geological modelling. In addition to modelling, some problems in remote sensing fall within the scope of computational geophysics such as tomography, inverse problems, and 3D reconstruction.
An atmospheric reanalysis is a meteorological and climate data assimilation project which aims to assimilate historical atmospheric observational data spanning an extended period, using a single consistent assimilation scheme throughout.
Masao Kanamitsu was a Japanese and American atmospheric scientist working in the field of data assimilation. His research greatly influenced global and regional climate change studies, including the development of breakthrough reanalysis and downscaling datasets, as well as weather forecasting studies. He was a co-author of one of the most cited geophysics papers of his time.
The Simple Ocean Data Assimilation (SODA) analysis is an oceanic reanalysis data set consisting of gridded state variables for the global ocean, as well as several derived fields. SODA was developed in the 1990s as a collaborative project between the Department of Atmospheric and Oceanic Science at the University of Maryland and the Department of Oceanography at Texas A&M University, with the goal of providing an improved estimate of the ocean state over those based solely on observations or numerical simulations. Since its first release there have been several updates, the most recent of which extends from 1958 to 2008, as well as a “beta release” of a long-term reanalysis for 1871–2008.
The North American Ensemble Forecast System (NAEFS) is a joint project involving the Meteorological Service of Canada (MSC) in Canada, the National Weather Service (NWS) in the United States, and the National Meteorological Service of Mexico (NMSM) in Mexico providing numerical weather prediction ensemble guidance for the 1- to 16-day forecast period. The NAEFS combines the Canadian MSC and the US NWS global ensemble prediction systems, improving probabilistic operational guidance over what can be built from any individual country's ensemble. Model guidance from the NAEFS is incorporated into the forecasts of the respective national agencies.
The Copernicus Atmosphere Monitoring Service (CAMS) is a service implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF), launched on 11 November 2014, that provides continuous data and information on atmospheric composition. CAMS, which is part of the Copernicus Programme, describes the current situation, forecasts the situation a few days ahead, and consistently analyses retrospective data records for recent years. The service builds on around 10 years of development, and its precursor project, MACC-III, delivered the pre-operational Copernicus Atmosphere Service. CAMS tracks air pollution, solar energy, greenhouse gases and climate forcing globally.
Florence Rabier is a French meteorologist who is Director-General of the European Centre for Medium-Range Weather Forecasts. She works on numerical weather prediction. She was appointed a Knight of the Legion of Honour in 2014.
Atmospheric correction for the Interferometric Synthetic Aperture Radar (InSAR) technique is a set of different methods to remove artefact displacement from an interferogram caused by the effect of weather variables such as humidity, temperature, and pressure. An interferogram is generated by processing two synthetic-aperture radar images taken before and after a geophysical event such as an earthquake. Correction for atmospheric variations is an important stage of InSAR data processing in many study areas measuring surface displacement, because relative humidity differences of 20% can cause inaccuracies of 10–14 cm in InSAR measurements due to varying delays in the radar signal. Overall, atmospheric correction methods can be divided into two categories: a) using Atmospheric Phase Screen (APS) statistical properties and b) using auxiliary (external) data such as GPS measurements, multi-spectral observations, local meteorological models, and global atmospheric models.