A one-hundred-year flood is a flood event that has a 1 in 100 chance (1% probability) of being equaled or exceeded in any given year.
The 100-year flood is also referred to as the 1% flood, since its annual exceedance probability is 1%. For coastal or lake flooding, the 100-year flood is generally expressed as a flood elevation or depth, and may include wave effects. For river systems, the 100-year flood is generally expressed as a flowrate. Based on the expected 100-year flood flow rate, the flood water level can be mapped as an area of inundation. The resulting floodplain map is referred to as the 100-year floodplain. Estimates of the 100-year flood flowrate and other streamflow statistics for any stream in the United States are available. In the UK, the Environment Agency publishes a comprehensive map of all areas at risk of a 1 in 100 year flood. Areas near the coast of an ocean or large lake also can be flooded by combinations of tide, storm surge, and waves. Maps of the riverine or coastal 100-year floodplain may figure importantly in building permits, environmental regulations, and flood insurance.
A common misunderstanding is that a 100-year flood is likely to occur only once in a 100-year period. In fact, there is approximately a 63.4% chance of one or more 100-year floods occurring in any 100-year period. On the Danube River at Passau, Germany, the actual intervals between 100-year floods during 1501 to 2013 ranged from 37 to 192 years.

The probability Pe that one or more floods during any period will equal or exceed a given flood threshold can be expressed, using the binomial distribution, as

Pe = 1 − (1 − 1/T)^n

where T is the threshold return period (e.g. 100-yr, 50-yr, 25-yr, and so forth), and n is the number of years in the period. The probability of exceedance Pe is also described as the natural, inherent, or hydrologic risk of failure. However, the expected value of the number of 100-year floods occurring in any 100-year period is 1.
Ten-year floods have a 10% chance of occurring in any given year (Pe = 0.10); 500-year floods have a 0.2% chance of occurring in any given year (Pe = 0.002); etc. The percent chance of an X-year flood occurring in a single year is 100/X. A similar analysis is commonly applied to coastal flooding or rainfall data. The recurrence interval of a storm is rarely identical to that of an associated riverine flood, because of rainfall timing and location variations among different drainage basins.
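The binomial relation above is simple enough to sketch directly. The function name below is illustrative, not from any standard library:

```python
# Probability of one or more T-year floods in an n-year period,
# from the binomial/annual-exceedance relation: Pe = 1 - (1 - 1/T)**n
def exceedance_probability(T, n):
    """Chance that at least one flood equaling or exceeding the
    T-year flood occurs within an n-year period."""
    return 1.0 - (1.0 - 1.0 / T) ** n

# A 100-year flood has roughly a 63.4% chance of occurring
# at least once in any 100-year period.
print(round(exceedance_probability(100, 100), 3))  # 0.634

# A 10-year flood has a 10% chance of occurring in a single year.
print(round(exceedance_probability(10, 1), 2))  # 0.1
```

Note that for n = 1 the expression reduces to 1/T, matching the 100/X percent rule stated above.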
The field of extreme value theory was created to model rare events such as 100-year floods for the purposes of civil engineering. This theory is most commonly applied to the maximum or minimum observed stream flows of a given river. In desert areas where there are only ephemeral washes, the method is instead applied to the maximum observed rainfall over a given duration (24-hour, 6-hour, or 3-hour). The extreme value analysis considers only the most extreme event observed in a given year. So, between a large spring runoff and a heavy summer rain storm, whichever resulted in more runoff would be considered the extreme event, while the smaller event would be ignored in the analysis (even though both may have been capable of causing terrible flooding in their own right).
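Building the annual maximum series described above amounts to keeping one value per year. A minimal sketch, using hypothetical flow data:

```python
# Hypothetical daily record: (year, flow in m^3/s).  Extreme value
# analysis keeps only the single largest observation from each year,
# discarding all smaller events in that year.
daily_flows = [
    (1984, 120.0), (1984, 310.0), (1984, 95.0),   # spring runoff peaks at 310
    (1985, 80.0),  (1985, 150.0), (1985, 140.0),  # summer storm peaks at 150
    (1986, 200.0), (1986, 60.0),
]

annual_maxima = {}
for year, flow in daily_flows:
    annual_maxima[year] = max(flow, annual_maxima.get(year, float("-inf")))

print(sorted(annual_maxima.items()))
# [(1984, 310.0), (1985, 150.0), (1986, 200.0)]
```

Note how the 1985 summer storm (150 m³/s) is retained while the nearly-as-large 140 m³/s event that same year is dropped, exactly the behavior the text describes.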
There are a number of assumptions made to complete the analysis that determines the 100-year flood. First, the extreme events observed in each year must be independent from year to year. In other words, the maximum river flow rate from 1984 cannot be significantly correlated with the observed flow rate in 1985, which in turn cannot be correlated with 1986, and so forth. Second, the observed extreme events must come from the same probability distribution function. Third, the probability distribution relates to the largest storm (rainfall or river flow rate measurement) that occurs in any one year. Fourth, the probability distribution function is stationary, meaning that the mean (average), standard deviation, and maximum and minimum values are not increasing or decreasing over time; this property is referred to as stationarity.
The first assumption is often but not always valid and should be tested on a case-by-case basis. The second assumption is often valid if the extreme events are observed under similar climate conditions. For example, if the extreme events on record all come from late summer thunderstorms (as is the case in the southwest U.S.), or from snow pack melting (as is the case in north-central U.S.), then this assumption should be valid. If, however, some extreme events come from thunderstorms, others from snow pack melting, and others from hurricanes, then this assumption is most likely not valid. The third assumption is only a problem when trying to forecast a low, but maximum flow event (for example, an event smaller than a 2-year flood). Since this is not typically a goal in extreme analysis, or in civil engineering design, the situation rarely presents itself. The final assumption about stationarity is difficult to test from data for a single site because of the large uncertainties in even the longest flood records (see next section). More broadly, substantial evidence of climate change strongly suggests that the probability distribution is also changing and that managing flood risks in the future will become even more difficult. The simplest implication of this is that not all of the historical data are, or can be, considered valid as input into the extreme event analysis.
When these assumptions are violated, an unknown amount of uncertainty is introduced into the reported value of what the 100-year flood means in terms of rainfall intensity or flood depth. When all of the inputs are known, the uncertainty can be measured in the form of a confidence interval. For example, one might say there is a 95% chance that the 100-year flood is greater than X, but less than Y.
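One common way to obtain such an interval is to fit an extreme value distribution to the annual maximum series and bootstrap the fit. The sketch below uses a method-of-moments Gumbel fit on synthetic data; the function names and the record itself are illustrative assumptions, not a prescribed procedure:

```python
import math
import random
import statistics

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_quantile(sample, T):
    """Method-of-moments Gumbel fit; returns the estimated T-year flood.
    Gumbel quantile: x = mu - beta * ln(-ln(1 - 1/T))."""
    mean = statistics.fmean(sample)
    std = statistics.stdev(sample)
    beta = std * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta        # location parameter
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

random.seed(42)
# Hypothetical 50-year annual maximum series (m^3/s)
record = [random.gauss(500.0, 120.0) for _ in range(50)]

estimate = gumbel_quantile(record, 100)

# Bootstrap: refit on resampled records to get a rough 95% interval
boot = sorted(
    gumbel_quantile(random.choices(record, k=len(record)), 100)
    for _ in range(2000)
)
low, high = boot[int(0.025 * 2000)], boot[int(0.975 * 2000)]
print(f"100-yr flood estimate {estimate:.0f}, 95% CI ({low:.0f}, {high:.0f})")
```

Even with 50 years of clean synthetic data the interval is wide, which previews the point made in the following section: real records are usually shorter and noisier than this.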
Direct statistical analysis to estimate the 100-year riverine flood is possible only at the relatively few locations where an annual series of maximum instantaneous flood discharges has been recorded. In the United States as of 2014, taxpayers have supported such records for at least 60 years at fewer than 2,600 locations, for at least 90 years at fewer than 500, and for at least 120 years at only 11. For comparison, the total area of the nation is about 3,800,000 square miles (9,800,000 km2), so there are perhaps 3,000 stream reaches that drain watersheds of 1,000 square miles (2,600 km2) and 300,000 reaches that drain 10 square miles (26 km2). In urban areas, 100-year flood estimates are needed for watersheds as small as 1 square mile (2.6 km2). For reaches without sufficient data for direct analysis, 100-year flood estimates are derived from indirect statistical analysis of flood records at other locations in a hydrologically similar region or from other hydrologic models. Similarly for coastal floods, tide gauge data exist for only about 1,450 sites worldwide, of which only about 950 added information to the global data center between January 2010 and March 2016.
Much longer records of flood elevations exist at a few locations around the world, such as the Danube River at Passau, Germany, but they must be evaluated carefully for accuracy and completeness before any statistical interpretation.
For an individual stream reach, the uncertainties in any analysis can be large, so 100-year flood estimates have large individual uncertainties for most stream reaches. For the largest recorded flood at any specific location, or any potentially larger event, the recurrence interval always is poorly known. Spatial variability adds more uncertainty, because a flood peak observed at different locations on the same stream during the same event commonly represents a different recurrence interval at each location. If an extreme storm drops enough rain on one branch of a river to cause a 100-year flood, but no rain falls over another branch, the flood wave downstream from their junction might have a recurrence interval of only 10 years. Conversely, a storm that produces a 25-year flood simultaneously in each branch might form a 100-year flood downstream. During a time of flooding, news accounts necessarily simplify the story by reporting the greatest damage and largest recurrence interval estimated at any location. The public can easily and incorrectly conclude that the recurrence interval applies to all stream reaches in the flood area.
Peak elevations of 14 floods as early as 1501 on the Danube River at Passau, Germany, reveal great variability in the actual intervals between floods. Flood events greater than the 50-year flood occurred at intervals of 4 to 192 years since 1501, and the 50-year flood of 2002 was followed only 11 years later by a 500-year flood. Only half of the intervals between 50- and 100-year floods were within 50 percent of the nominal average interval. Similarly, the intervals between 5-year floods during 1955 to 2007 ranged from 5 months to 16 years, and only half were within 2.5 to 7.5 years.
In the United States, the 100-year flood provides the risk basis for flood insurance rates under the National Flood Insurance Program (NFIP). A regulatory flood or base flood is routinely established for river reaches through a science-based rule-making process targeted to a 100-year flood at the historical average recurrence interval. In addition to historical flood data, the process accounts for previously established regulatory values, the effects of flood-control reservoirs, and changes in land use in the watershed. Coastal flood hazards have been mapped by a similar approach that includes the relevant physical processes. Most areas where serious floods can occur in the United States have been mapped consistently in this manner. On average nationwide, those 100-year flood estimates are sufficient for the purposes of the NFIP and offer reasonable estimates of future flood risk, if the future is like the past. Approximately 3% of the U.S. population lives in areas subject to the 1% annual chance coastal flood hazard.
In theory, removing homes and businesses from areas that flood repeatedly can protect people and reduce insurance losses, but in practice it is difficult for people to retreat from established neighborhoods.
A flood is an overflow of water that submerges land that is usually dry. In the sense of "flowing water", the word may also be applied to the inflow of the tide. Floods are an area of study of the discipline hydrology and are of significant concern in agriculture, civil engineering and public health.
In survival analysis, the hazard ratio (HR) is the ratio of the hazard rates corresponding to the conditions described by two levels of an explanatory variable. For example, in a drug study, the treated population may die at twice the rate per unit time as the control population. The hazard ratio would be 2, indicating higher hazard of death from the treatment. Or in another study, men receiving the same treatment may suffer a certain complication ten times more frequently per unit time than women, giving a hazard ratio of 10.
In statistics, a confidence interval (CI) is a type of estimate computed from the statistics of the observed data. It proposes a range of plausible values for an unknown parameter. The interval has an associated confidence level that the true parameter is in the proposed range. This is more clearly stated as: the confidence level represents the probability that the unknown parameter lies in the stated interval. The level of confidence can be chosen by the investigator. In general terms, a confidence interval for an unknown parameter is based on the sampling distribution of a corresponding estimator.
In statistical inference, specifically predictive inference, a prediction interval is an estimate of an interval in which a future observation will fall, with a certain probability, given what has already been observed. Prediction intervals are often used in regression analysis.
A hydrograph is a graph showing the rate of flow (discharge) versus time past a specific point in a river, channel, or conduit carrying flow. The rate of flow is typically expressed in cubic meters per second or cubic feet per second. It can also refer to a graph showing the volume of water reaching a particular outfall, or location in a sewerage network. Hydrographs are commonly used in the design of sewerage, more specifically, the design of surface water sewerage systems and combined sewers.
A return period, also known as a recurrence interval or repeat interval, is an average time, or an estimated average time, between events such as earthquakes, floods, landslides, or river discharge flows.
Probabilistic risk assessment (PRA) is a systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity or, for example, the effects of stressors on the environment.
WASH-1400, 'The Reactor Safety Study', was a report produced in 1975 for the Nuclear Regulatory Commission by a committee of specialists under Professor Norman Rasmussen. It "generated a storm of criticism in the years following its release". In the years immediately after its release, WASH-1400 was followed by a number of reports that either peer reviewed its methodology or offered their own judgments about probabilities and consequences of various events at commercial reactors. In at least a few instances, some offered critiques of the study's assumptions, methodology, calculations, peer review procedures, and objectivity. A succession of reports, including NUREG-1150, the State-of-the-Art Reactor Consequence Analyses and others, have carried on the tradition of PRA and its application to commercial power plants.
The May 1995 Louisiana flood, also known as the May 1995 Southeast Louisiana and Southern Mississippi Flood, was a heavy rainfall event which occurred across an area stretching from the New Orleans metropolitan area into southern Mississippi. A storm total rainfall maximum of 27.5 inches (700 mm) was recorded near Necaise, Mississippi. Considerable flooding was caused by the rainfall including several record flood crests along impacted river systems.
Streamflow, or channel runoff, is the flow of water in streams, rivers, and other channels, and is a major element of the water cycle. It is one component of the runoff of water from the land to waterbodies, the other component being surface runoff. Water flowing in channels comes from surface runoff from adjacent hillslopes, from groundwater flow out of the ground, and from water discharged from pipes. The discharge of water flowing in a channel is measured using stream gauges or can be estimated by the Manning equation. The record of flow over time is called a hydrograph. Flooding occurs when the volume of water exceeds the capacity of the channel.
The history of flooding in Canada includes floods caused by snowmelt runoff or freshet flooding, storm-rainfall and "flash flooding", ice jams during ice formation and spring break-up, natural dams, coastal flooding on ocean or lake coasts from storm surges, hurricanes and tsunamis. Urban flooding can be caused by stormwater runoff, riverine flooding and structural failure when engineered flood management structures, including dams and levees, prove inadequate to manage the quantities and force of flood waters. Floods can also occur when rising groundwater enters buildings through cracks in foundations, floors, and basements (Sandink, 2010). Flooding is part of the natural environmental process. Flooding along large river systems is more frequent in spring, where peak flows are often governed by runoff volume due to rainfall and snowmelt, but can take place in summer with flash floods in urban systems that respond to short-duration, heavy rainfall. Flooding due to hurricanes, or downgraded severe storms, is a concern from August to October when tropical storms can affect Eastern North America. Flood events have had a significant effect on various regions of the country. Flooding is the costliest natural disaster for Canadians. Most home insurance claims in Canada deal with water damage due to sewer back-up, not fire.
In economics and finance, a Taleb distribution is the statistical profile of an investment which normally provides a payoff of small positive returns, while carrying a small but significant risk of catastrophic losses. The term was coined by journalist Martin Wolf and economist John Kay to describe investments with a "high probability of a modest gain and a low probability of huge losses in any period."
The San Jacinto Fault Zone (SJFZ) is a major strike-slip fault zone that runs through San Bernardino, Riverside, San Diego, and Imperial Counties in Southern California. The SJFZ is a component of the larger San Andreas transform system and is considered to be the most seismically active fault zone in the area. Together they relieve the majority of the stress between the Pacific and North American tectonic plates.
An ARkStorm is a hypothetical but scientifically realistic "megastorm" scenario developed and published by the Multi Hazards Demonstration Project (MHDP) of the United States Geological Survey, based on historical occurrences. It describes an extreme storm that could devastate much of California, causing up to $725 billion in losses, and affect a quarter of California's homes. The event would be similar to exceptionally intense California storms that occurred between December 1861 and January 1862, which dumped nearly 10 feet of rain in parts of California, over a period of 43 days. The name "ARkStorm" means "Atmospheric River (AR) 1,000 (k)" as the storm was originally projected as a 1-in-1000-year event. However, more recent geologic data suggests that the actual frequency of the event is likely in the 100- to 200-year range.
The survival function is a function that gives the probability that a patient, device, or other object of interest will survive beyond any specified time.
The 869 Sanriku earthquake and its associated tsunami struck the area around Sendai in the northern part of Honshu on 9 July 869 AD. The earthquake had an estimated magnitude of at least 8.4 on the moment magnitude scale, but may have been as high as 9.0, similar to the 2011 Tōhoku earthquake and tsunami. The tsunami caused widespread flooding of the Sendai plain. In 2001, researchers identified sand deposits in a trench more than 4.5 kilometres (2.8 mi) from the coast as coming from this tsunami.
A probability box is a characterization of an uncertain number consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.
Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
Probability distribution fitting or simply distribution fitting is the fitting of a probability distribution to a series of data concerning the repeated measurement of a variable phenomenon.
Cumulative frequency analysis is the analysis of the frequency of occurrence of values of a phenomenon less than a reference value. The phenomenon may be time- or space-dependent. Cumulative frequency is also called frequency of non-exceedance.
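Cumulative frequency analysis connects directly to return periods: ranking an annual maximum series gives an empirical non-exceedance frequency for each observation, from which an empirical return period follows. A minimal sketch using the Weibull plotting position (the flow values are hypothetical):

```python
# Empirical non-exceedance frequency of an annual maximum series using
# the Weibull plotting position P = m / (n + 1), where m is the rank of
# a value when sorted ascending (smallest = 1).  The corresponding
# empirical return period is T = 1 / (1 - P).
flows = [410.0, 350.0, 620.0, 480.0, 530.0]  # hypothetical annual maxima (m^3/s)

n = len(flows)
for m, flow in enumerate(sorted(flows), start=1):
    p_non_exceed = m / (n + 1)
    return_period = 1.0 / (1.0 - p_non_exceed)
    print(f"{flow:6.1f}  P={p_non_exceed:.3f}  T={return_period:.1f} yr")
```

With only 5 years of record, the largest observed flow gets an empirical return period of just 6 years, illustrating why estimating a 100-year flood requires fitting a distribution rather than ranking data alone.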