QuakeSim is a NASA project for modeling earthquake fault systems: a computational framework for modeling and understanding earthquake and tectonic processes. It was started in 2001 with NASA funding as a follow-up to the General Earthquake Models (GEM) initiative. The multi-scale nature of earthquakes requires integrating many data types and models to fully simulate and understand the earthquake process.
QuakeSim focuses on modeling interseismic processes through various boundary element, finite element, and analytical applications, which run on various platforms, including desktop and high-end computers. The QuakeTables database allows modelers to access geological and geophysical data. A goal of QuakeSim is to develop significant improvements in earthquake forecast quality, thereby mitigating the danger from this natural hazard.
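As an illustration of the analytical end of that spectrum, the sketch below implements the classic elastic-dislocation (arctangent) model of interseismic surface velocity across a locked strike-slip fault. This is a standard textbook model, not QuakeSim's own code, and the slip rate and locking depth used are illustrative assumptions.

```python
import numpy as np

def interseismic_velocity(x_km, slip_rate_mm_yr=30.0, locking_depth_km=15.0):
    """Elastic-dislocation (arctangent) model of interseismic surface
    velocity across a locked strike-slip fault.

    x_km             -- distance from the fault trace (km)
    slip_rate_mm_yr  -- long-term deep slip rate (mm/yr); illustrative value
    locking_depth_km -- depth above which the fault is locked between earthquakes
    """
    return (slip_rate_mm_yr / np.pi) * np.arctan(x_km / locking_depth_km)

# Fault-parallel velocity profile from 100 km on one side of the fault to 100 km on the other
x = np.linspace(-100, 100, 9)
v = interseismic_velocity(x)
for xi, vi in zip(x, v):
    print(f"{xi:7.1f} km  {vi:6.2f} mm/yr")
```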
The QuakeSim Portal allows users to access data and ingest it into models and simulations, and it provides the computational infrastructure for the entire project. QuakeSim users can create an account and interact with different data sets and software through the portal, which is organized as a collection of portlets.
QuakeTables is the database used to access information for QuakeSim; it serves the geological and geophysical data that the models and simulations draw on.
This information plays an important role in forecasting and damage mitigation: it enables simulations and data mining, which in turn improve forecasts of potential earthquakes. Combined with attenuation modeling and site effects, this leads to a better understanding of probable ground motion and creates opportunities to improve structural response.
QuakeSim includes several applications. GeoFEST, PARK, and Virtual California are used to model different aspects of the earthquake cycle.
QuakeSim utilizes GPS data from NASA, the National Science Foundation, and the US Geological Survey Southern California Integrated GPS Network (SCIGN). QuakeSim was also establishing the computational infrastructure for the planned NASA DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice) mission, which was cancelled in 2012.[1] DESDynI (pronounced "destiny") would have flown InSAR and LIDAR instruments for studying hazards and global environmental change.
The San Andreas Fault is a continental right-lateral strike-slip transform fault that extends roughly 1,200 kilometers (750 mi) through the Californias. It forms the tectonic boundary between the Pacific Plate and the North American Plate. Traditionally, for scientific purposes, the fault has been classified into three main segments, each with different characteristics and a different degree of earthquake risk. The average slip rate along the entire fault ranges from 20 to 35 mm per year.
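A back-of-the-envelope consequence of those slip rates: slip deficit on a locked segment accumulates as rate multiplied by elapsed time. The elapsed time in the sketch below is a hypothetical value, not a statement about any particular segment of the fault.

```python
# Slip deficit accumulated on a locked fault segment: rate * elapsed time.
# Numbers below are illustrative, using the 20-35 mm/yr range quoted above.
for rate_mm_per_yr in (20, 35):
    years_since_last_rupture = 150          # hypothetical elapsed time
    deficit_m = rate_mm_per_yr * years_since_last_rupture / 1000.0
    print(f"{rate_mm_per_yr} mm/yr over {years_since_last_rupture} yr "
          f"-> {deficit_m:.1f} m of accumulated slip deficit")
```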
Computer simulation is the process of mathematical modeling, performed on a computer, which is designed to predict the behavior of, or the outcome of, a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics, astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.
The Gravity Recovery and Climate Experiment (GRACE) was a joint mission of NASA and the German Aerospace Center (DLR). Twin satellites took detailed measurements of Earth's gravity field anomalies from its launch in March 2002 to the end of its science mission in October 2017. The GRACE Follow-On (GRACE-FO) is a continuation of the mission on near-identical hardware, launched in May 2018.
A physics engine is computer software that provides an approximate simulation of certain physical systems, such as rigid body dynamics, soft body dynamics, and fluid dynamics, of use in the domains of computer graphics, video games and film (CGI). Their main uses are in video games, in which case the simulations are in real-time. The term is sometimes used more generally to describe any software system for simulating physical phenomena, such as high-performance scientific simulation.
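At the core of a rigid-body engine is a fixed-timestep integration loop. The minimal sketch below (semi-implicit Euler for a single bouncing particle with a restitution coefficient) is a generic illustration of that loop, not any particular engine's API.

```python
# Minimal fixed-timestep loop of the kind a rigid-body physics engine runs
# in real time: integrate velocity, then position (semi-implicit Euler),
# and resolve a single contact (the ground) with a restitution coefficient.
GRAVITY = -9.81     # m/s^2
DT = 1.0 / 60.0     # 60 simulation steps per second, typical for games
RESTITUTION = 0.6   # fraction of speed retained after bouncing

y, vy = 10.0, 0.0   # height (m) and vertical velocity (m/s) of a falling body
for step in range(300):
    vy += GRAVITY * DT          # integrate acceleration into velocity
    y += vy * DT                # integrate velocity into position
    if y < 0.0:                 # contact with the ground plane
        y = 0.0
        vy = -vy * RESTITUTION  # bounce with energy loss
print(f"after 5 s: height={y:.3f} m, velocity={vy:.3f} m/s")
```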
Seasat was the first Earth-orbiting satellite designed for remote sensing of the Earth's oceans and had on board one of the first spaceborne synthetic-aperture radar (SAR) instruments. The mission was designed to demonstrate the feasibility of global satellite monitoring of oceanographic phenomena and to help determine the requirements for an operational ocean remote sensing satellite system. Specific objectives were to collect data on sea-surface winds, sea-surface temperatures, wave heights, internal waves, atmospheric water, sea ice features and ocean topography. Seasat was managed by NASA's Jet Propulsion Laboratory and was launched on 27 June 1978 into a nearly circular 800 km (500 mi) orbit with an inclination of 108°. Seasat operated until 10 October 1978 (UTC), when a massive short circuit in the Agena-D bus electrical system ended the mission.
The EarthScope project was a National Science Foundation (NSF)-funded earth science program that, from 2003 to 2018, used geological and geophysical techniques to explore the structure and evolution of the North American continent and to understand the processes controlling earthquakes and volcanoes. The project had three components: USArray, the Plate Boundary Observatory, and the San Andreas Fault Observatory at Depth. Organizations associated with the project included UNAVCO, the Incorporated Research Institutions for Seismology (IRIS), Stanford University, the United States Geological Survey (USGS) and National Aeronautics and Space Administration (NASA). Several international organizations also contributed to the initiative. EarthScope data are publicly accessible.
The Plate Boundary Observatory (PBO) was the geodetic component of the EarthScope Facility. EarthScope was an earth science program that explored the 4-dimensional structure of the North American Continent. EarthScope was a 15-year project (2003-2018) funded by the National Science Foundation (NSF) in conjunction with NASA. PBO construction took place from October 2003 through September 2008. Phase 1 of operations and maintenance concluded in September 2013. Phase 2 of operations ended in September 2018, along with the end of the EarthScope project. In October 2018, PBO was assimilated into a broader Network of the Americas (NOTA), along with networks in Mexico (TLALOCNet) and the Caribbean (COCONet), as part of the NSF's Geodetic Facility for the Advancement of Geosciences (GAGE). GAGE is operated by UNAVCO.
Interferometric synthetic aperture radar, abbreviated InSAR, is a radar technique used in geodesy and remote sensing. This geodetic method uses two or more synthetic aperture radar (SAR) images to generate maps of surface deformation or digital elevation, using differences in the phase of the waves returning to the satellite or aircraft. The technique can potentially measure millimetre-scale changes in deformation over spans of days to years. It has applications for geophysical monitoring of natural hazards, for example earthquakes, volcanoes and landslides, and in structural engineering, in particular monitoring of subsidence and structural stability.
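The phase-to-displacement relationship behind this sensitivity is simple: each full interferometric fringe (2π of phase) corresponds to half a radar wavelength of line-of-sight motion. The sketch below assumes a C-band wavelength of about 5.6 cm; sign conventions vary between processing systems.

```python
import math

def los_displacement_cm(unwrapped_phase_rad, wavelength_cm=5.6):
    """Convert unwrapped interferometric phase to line-of-sight range change.
    One full fringe (2*pi rad) corresponds to half a wavelength of motion
    toward or away from the radar. wavelength_cm=5.6 is a typical C-band
    value; sign conventions differ between processing packages."""
    return unwrapped_phase_rad * wavelength_cm / (4.0 * math.pi)

# One fringe of phase corresponds to half a wavelength (about 2.8 cm at C-band)
print(f"{los_displacement_cm(2.0 * math.pi):.2f} cm per fringe")
```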
Earthquake forecasting is a branch of the science of seismology concerned with the probabilistic assessment of general earthquake seismic hazard, including the frequency and magnitude of damaging earthquakes in a given area over years or decades. While forecasting is usually considered to be a type of prediction, earthquake forecasting is often differentiated from earthquake prediction, whose goal is the specification of the time, location, and magnitude of future earthquakes with sufficient precision that a warning can be issued. Both forecasting and prediction of earthquakes are distinguished from earthquake warning systems, which, upon detection of an earthquake, provide a real-time warning to regions that might be affected.
Specialized wind energy software applications aid in the development and operation of wind farms.
Soil Moisture Active Passive (SMAP) is a NASA environmental monitoring satellite that measures soil moisture across the planet. It is designed to collect a global 'snapshot' of soil moisture every 2 to 3 days. With this frequency, changes from specific storms can be measured while also assessing impacts across seasons of the year. SMAP was launched on 31 January 2015. It was one of the first Earth observation satellites developed by NASA in response to the National Research Council's Decadal Survey.
UNAVCO is a non-profit university-governed consortium that facilitates geology research and education using geodesy.
The NASA-ISRO Synthetic Aperture Radar (NISAR) mission is a joint project between NASA and ISRO to co-develop and launch a dual-frequency synthetic aperture radar on an Earth observation satellite. The satellite will be the first radar imaging satellite to use dual frequencies. It will be used for remote sensing, to observe and understand natural processes on Earth. For example, its left-facing instruments will study the Antarctic cryosphere. With a total cost estimated at US$1.5 billion, NISAR is likely to be the world's most expensive Earth-imaging satellite.
VERITAS is an upcoming mission from NASA's Jet Propulsion Laboratory (JPL) to map the surface of the planet Venus in high resolution. The combination of topography, near-infrared spectroscopy, and radar image data will provide knowledge of Venus's tectonic and impact history, gravity, geochemistry, the timing and mechanisms of volcanic resurfacing, and the mantle processes responsible for them.
The 2015 Uniform California Earthquake Rupture Forecast, Version 3, or UCERF3, is the latest official earthquake rupture forecast (ERF) for the state of California, superseding UCERF2. It provides authoritative estimates of the likelihood and severity of potentially damaging earthquake ruptures in the long- and near-term. Combining this with ground motion models produces estimates of the severity of ground shaking that can be expected during a given period, and of the threat to the built environment. This information is used to inform engineering design and building codes, planning for disaster, and evaluating whether earthquake insurance premiums are sufficient for the prospective losses. A variety of hazard metrics can be calculated with UCERF3; a typical metric is the likelihood of a magnitude 6.7 or larger earthquake in the 30 years from 2014.
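A hazard metric of this kind can be illustrated with the simplest possible model, a time-independent Poisson process, in which the probability of at least one qualifying earthquake in t years at annual rate r is 1 - exp(-r*t). UCERF3 itself combines far richer time-dependent models; the rate used below is purely illustrative.

```python
import math

def poisson_probability(annual_rate, years):
    """Probability of at least one event in `years`, assuming events occur
    independently at a constant annual rate (time-independent Poisson model).
    The rate used below is purely illustrative, not a UCERF3 value."""
    return 1.0 - math.exp(-annual_rate * years)

# e.g. an assumed long-term rate of one qualifying earthquake per 50 years
print(f"30-year probability: {poisson_probability(1 / 50, 30):.0%}")
```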
Remote sensing in geology is remote sensing used in the geological sciences as a data acquisition method complementary to field observation, because it allows mapping of geological characteristics of regions without physical contact with the areas being explored. About one-fourth of the Earth's total surface area is exposed land where information is ready to be extracted from detailed earth observation via remote sensing. Remote sensing is conducted via detection of electromagnetic radiation by sensors. The radiation can be naturally sourced, or produced by machines and reflected off the Earth's surface. The electromagnetic radiation acts as an information carrier for two main variables. First, the intensities of reflectance at different wavelengths are detected and plotted on a spectral reflectance curve. This spectral fingerprint is governed by the physico-chemical properties of the surface of the target object and therefore helps mineral identification and hence geological mapping, for example by hyperspectral imaging. Second, the two-way travel time of radiation from the sensor to the target and back can be used to calculate the distance in active remote sensing systems, for example, interferometric synthetic-aperture radar. This helps geomorphological studies of ground motion, and thus can illuminate deformations associated with landslides, earthquakes, etc.
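The second principle above, ranging by two-way travel time in active systems, reduces to distance = c·t/2, since the pulse covers the sensor-to-target path twice. A small worked sketch, with an assumed travel time chosen to correspond to a satellite-altitude range:

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_two_way_time(t_seconds):
    """Distance to the target from the round-trip travel time of the pulse:
    the signal covers the sensor-target distance twice, hence the factor 2."""
    return C * t_seconds / 2.0

# A pulse returning after about 5.3 ms has travelled to a target roughly
# 794 km away, comparable to the ~800 km orbital altitude quoted for Seasat above.
print(f"{range_from_two_way_time(5.3e-3) / 1000:.0f} km")
```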
Nadia Lapusta is a Professor of Mechanical Engineering and Geophysics at the California Institute of Technology. She designed the first computational model that could accurately and efficiently simulate sequences of earthquakes and interseismic slow deformation on a planar fault in a single consistent physical framework.
The earthquake cycle refers to the phenomenon that earthquakes repeatedly occur on the same fault as the result of continual stress accumulation and periodic stress release. Earthquake cycles can occur on a variety of faults including subduction zones and continental faults. Depending on the size of the earthquake, an earthquake cycle can last decades, centuries, or longer. The Parkfield portion of the San Andreas fault is a well-known example where similarly located M6.0 earthquakes have been instrumentally recorded every 30–40 years.
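A highly idealized way to see why repeat times scale this way: if each characteristic earthquake releases roughly the same amount of slip, the average recurrence interval is that slip divided by the long-term loading rate. The values in the sketch below are illustrative, not measured Parkfield parameters.

```python
def recurrence_interval_years(slip_per_event_m, slip_rate_mm_yr):
    """Average repeat time if each characteristic earthquake releases
    `slip_per_event_m` of slip and tectonic loading supplies
    `slip_rate_mm_yr` of slip budget per year (a highly idealized model)."""
    return slip_per_event_m * 1000.0 / slip_rate_mm_yr

# Illustrative numbers: ~0.5 m of slip per moderate event, loaded at ~25 mm/yr,
# gives a repeat time of roughly two decades, the same order as observed cycles.
print(f"{recurrence_interval_years(0.5, 25):.0f} years")
```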
Atmospheric correction for the Interferometric Synthetic Aperture Radar (InSAR) technique is a set of different methods to remove artefact displacement from an interferogram caused by the effect of weather variables such as humidity, temperature, and pressure. An interferogram is generated by processing two synthetic-aperture radar images before and after a geophysical event like an earthquake. Corrections for atmospheric variations are an important stage of InSAR data processing in many study areas to measure surface displacement, because relative humidity differences of 20% can cause inaccuracies of 10–14 cm in InSAR measurements due to varying delays in the radar signal. Overall, atmospheric correction methods can be divided into two categories: a) using Atmospheric Phase Screen (APS) statistical properties and b) using auxiliary (external) data such as GPS measurements, multi-spectral observations, local meteorological models, and global atmospheric models.
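A very simple empirical approach, often used as a baseline alongside the methods above, fits a low-order relationship between interferometric phase and topographic height over areas assumed not to be deforming and subtracts it, since much of the tropospheric delay correlates with elevation. The sketch below uses synthetic arrays and a hypothetical function name; operational workflows rely on dedicated InSAR packages and external weather-model or GPS delay data.

```python
import numpy as np

def remove_elevation_dependent_delay(phase, elevation, mask=None):
    """Fit phase = a*elevation + b over (assumed) non-deforming pixels and
    subtract the fitted ramp, a very simple empirical tropospheric
    correction. Real processing chains use more sophisticated methods
    (APS filtering, GPS or weather-model delays)."""
    if mask is None:
        mask = np.ones(phase.shape, dtype=bool)
    a, b = np.polyfit(elevation[mask].ravel(), phase[mask].ravel(), 1)
    return phase - (a * elevation + b)

# Synthetic example: phase made of a topography-correlated delay plus noise
elevation = np.random.uniform(0, 2000, (100, 100))        # metres
phase = 0.002 * elevation + np.random.normal(0, 0.1, (100, 100))
corrected = remove_elevation_dependent_delay(phase, elevation)
print(f"std before: {phase.std():.2f} rad, after: {corrected.std():.2f} rad")
```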