Rohn emergency scale

The Rohn emergency scale [1] is a scale for measuring the magnitude (intensity) [2] of an emergency. It was first proposed in 2006 and explained in more detail in a peer-reviewed paper presented at a 2007 system sciences conference. [3] The idea was further refined later that year. [4] The need for such a scale was affirmed in two later independent publications. [5] [6] It is the first scale to quantify any emergency based on a mathematical model. The scale can be tailored for use at any geographic level – city, county, state or continent. It can be used to monitor the development of an ongoing emergency event, to forecast the probability and nature of a potentially developing emergency, and to assist in the planning and execution of a National Response Plan.

Review of related scales

Scales relating to natural phenomena that may result in an emergency are numerous. This section reviews several notable emergency-related scales, concentrating mainly on weather and environmental scales that provide a common understanding and lexicon for gauging the level of intensity and impact of a crisis. Some scales are used before and/or during a crisis to predict the potential intensity and impact of an event, informing both preventative and recovery measures. Others are used for post-event classification. Most of these scales are descriptive rather than quantitative, which makes them subjective and ambiguous.

1805 Beaufort scale [7]
1931 Modified Mercalli intensity scale [8]
1935 Richter magnitude scale [9] (superseded by the Moment magnitude scale)
1969 Saffir–Simpson scale [10]
1971 Fujita scale [11] (superseded by Enhanced Fujita scale in 2007 [12] )
1982 Volcanic explosivity index
1990 International Nuclear Event Scale [13]
1999 Air quality index [14]

Variables common to all emergencies

According to the Rohn emergency scale, all emergencies can be described by three independent dimensions: (a) scope; (b) topographical change (or lack thereof); and (c) speed of change. The intersection of the three dimensions provides a detailed scale for defining any emergency, [1] as depicted on the Emergency Scale Website. [15]

Scope

The scope of an emergency in the Rohn scale is represented as a continuous variable with a lower limit of zero and a theoretical, calculable upper limit. The Rohn emergency scale uses two parameters that form the scope: the percentage of affected humans out of the entire population, and damages, or loss, as a percentage of a given gross national product (GNP). When applied to a specific locality, the latter parameter may be represented by a gross state product, gross regional product, or any similar measure of economic activity appropriate to the entity under emergency.
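
The two scope parameters can be sketched in code. The combination rule below (normalizing each parameter against the breaking points stated in the model section, then averaging) is a hypothetical illustration, not the published Rohn formula:

```python
def scope(affected_fraction: float, gnp_loss_fraction: float,
          pop_break: float = 0.70, gnp_break: float = 0.50) -> float:
    """Illustrative scope value in [0, 1].

    affected_fraction: affected people / total population (0..1)
    gnp_loss_fraction: losses / GNP (0..1)
    pop_break, gnp_break: the model's assumed societal breaking points
    (70% of the population affected, 50% of GNP lost).

    NOTE: averaging the two normalized terms is an assumed stand-in
    for the published formula, which is not reproduced here.
    """
    p = min(affected_fraction / pop_break, 1.0)
    g = min(gnp_loss_fraction / gnp_break, 1.0)
    return (p + g) / 2.0

# e.g. 7% of the population affected and 5% of GNP lost gives ~0.1
print(scope(0.07, 0.05))
```

Both terms saturate at 1.0, so the sketch caps out once either breaking point is reached, matching the model's idea of a societal disintegration threshold.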

Topography

A topographical change means a measurable and noticeable change in land characteristics – elevation, slope, orientation, and land coverage – whether natural (e.g., trees) or artificial (e.g., houses). Non-topographical emergencies are situations where the emergency is non-physical in nature; the collapse of the New York stock market in 1929 is one example, and the global liquidity crisis of August 2007 [16] is another. The model treats topographical change as a continuum between 0 and 1 that gives the estimated visual fractional change in the environment.
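
A minimal way to estimate that fractional change, assuming some consistent before/after measure of intact land cover is available (the measure itself is a hypothetical input, not part of the published model):

```python
def topo_change(area_before: float, area_after: float) -> float:
    """Estimated visual fractional change in the environment, in [0, 1].

    area_before / area_after: any consistent measure of intact land
    characteristics (e.g. standing structures, tree cover) before and
    after the event. Returns 0 when there is nothing measurable,
    matching the model's zero value for non-topographical events.
    """
    if area_before <= 0:
        return 0.0
    change = abs(area_before - area_after) / area_before
    return min(max(change, 0.0), 1.0)  # clamp to the model's [0, 1] range

# A storm that destroys 300 of 1000 structures:
print(topo_change(1000, 700))  # 0.3
```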

Speed of change

An emergency is typified by a departure from the normal state of affairs. The scale uses the change over time in the number of victims and in economic losses to calculate a rate of change for the quantities of utmost importance to society (life, and a proxy for quality of life).
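
The two rates can be sketched as finite differences over sampled time series; the pairing of victim counts and losses follows the text, but the specific function below is only an illustration:

```python
def rates_of_change(victims: list[float], losses: list[float],
                    dt_hours: float = 1.0) -> tuple[float, float]:
    """Most recent rate of change of victims and of economic losses.

    victims, losses: cumulative totals sampled every dt_hours.
    Returns (victims per hour, losses per hour) over the last
    interval; (0, 0) when there are not yet two samples.
    """
    if len(victims) < 2 or len(losses) < 2:
        return (0.0, 0.0)
    dv = (victims[-1] - victims[-2]) / dt_hours
    dl = (losses[-1] - losses[-2]) / dt_hours
    return (dv, dl)

# Victims rose from 40 to 100, losses (in millions) from 2.0 to 3.5,
# over the last one-hour interval:
print(rates_of_change([10, 40, 100], [0.5, 2.0, 3.5]))  # (60.0, 1.5)
```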

Emergency scale mathematical model

The scale is a normalized function whose variables are scope (S), topography (T), and rate of change (D). [equation not reproduced] These parameters are defined as follows:

Scope

The scope combines the two parameters above – the fraction of the population affected and losses as a fraction of GNP – scaled by a coefficient β, which the model's creator calculated to be 1.26 ± 0.03. [equations not reproduced]

The model loosely assumes that a society reaches a breaking point of disintegration when a majority of its population (70% in this model) is affected and half of its GNP is drained as a result of a calamity. Sociologists and economists may come up with better estimates.

Topographical change

T is the estimated visual fractional change in the environment, between 0 and 1, or zero for non-topographical events. [equation not reproduced]

Rate of change

The rates of change of the number of victims and of economic losses are of utmost importance to society and are therefore incorporated in the model. [equations not reproduced]
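
Since the published normalization is not reproduced here, the combination below is purely a hypothetical stand-in, showing how three components already scaled to [0, 1] might be reduced to one normalized value:

```python
def emergency_value(s: float, t: float, d: float) -> float:
    """Hypothetical normalized emergency value in [0, 1].

    s, t, d: scope, topographical change, and (normalized) rate of
    change, each assumed pre-scaled to [0, 1]. An unweighted mean is
    used purely as a placeholder; the published Rohn model defines
    its own normalized function over these variables.
    """
    clamp = lambda x: min(max(x, 0.0), 1.0)
    return (clamp(s) + clamp(t) + clamp(d)) / 3.0

print(emergency_value(0.3, 0.3, 0.3))  # ~0.3
```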

Simplified scale for public communications

In some instances, it may be preferable to have an integer scale that conveys the extent of an emergency more simply and dramatically – with a range, say, from 1 to 10, and 10 representing the direst emergency. This can be obtained from the function above in any number of ways. One is to apply the ceiling function to a scaled value of the function. Another is a single number representing the volume under the three-dimensional emergency-scale surface.
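
The ceiling-function approach can be sketched directly, assuming the underlying emergency value has been normalized to [0, 1]:

```python
import math

def public_level(e: float) -> int:
    """Map a normalized emergency value e in [0, 1] to an integer
    level 1..10 for public communication, using the ceiling function.
    Level 10 is the direst emergency; values at or below zero are
    floored to level 1 so there is always a reportable level."""
    return max(1, math.ceil(min(max(e, 0.0), 1.0) * 10))

print(public_level(0.07))  # 1
print(public_level(0.42))  # 5
print(public_level(1.0))   # 10
```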

References

  1. Rohn, Eli and Blackmore, Denis (2009). A Unified Localizable Emergency Events Scale. International Journal of Information Systems for Crisis Response Management (IJISCRAM), Volume 1, Issue 4, October 2009.
  2. "FEMA Intensity Scales". Archived from the original on 24 September 2010. Retrieved 13 September 2010.
  3. Gomeza, Elizabeth; Plotnick, Linda; Rohn, Eli; Morgan, Jon; Turoff, Murray (2007). "The US is replacing its historical federalist". 2007 40th Annual Hawaii International Conference on System Sciences (HICSS'07). p. 23. doi:10.1109/HICSS.2007.557.
  4. Plotnick, Linda; Gomez, Elizabeth; White, Connie; Turoff, Murray (May 2007). "Furthering Development of a Unified Emergency Scale Using Thurstone's Law of Comparative Judgment: A Progress Report". ISCRAM. CiteSeerX 10.1.1.103.5779.
  5. Turoff, Murray and Hiltz, Roxanne (2008). Assessing the health information needs of the emergency preparedness and management community. Inf. Serv. Use 28, 3–4 (Aug. 2008), 269–280.
  6. Turoff, M., White, C., Plotnick, L., and Hiltz, S. R., Dynamic Emergency Response Management for Large Scale Decision Making in Extreme Events, Proceedings of ISCRAM 2008, Washington, D.C., May 2008.
  7. "The Beaufort Wind Scale". National Oceanic and Atmospheric Administration, Storm Prediction Center. Retrieved 13 September 2010.
  8. "Modified Mercalli Intensity Scale". U.S. Geological Survey, Earthquake Hazards Program. Retrieved 13 September 2010.
  9. "The Richter Magnitude Scale". U.S. Geological Survey, Earthquake Hazards Program. Archived from the original on 26 September 2010. Retrieved 13 September 2010.
  10. "The Saffir-Simpson Hurricane Wind Scale". National U.S. Oceanic and Atmospheric Administration – National Hurricane Center. Retrieved 13 September 2010.
  11. "Fujita Tornado Damage Scale". National Oceanic and Atmospheric Administration. Retrieved 13 September 2010.
  12. "The Enhanced Fujita Scale (EF Scale)". National Oceanic and Atmospheric Administration.
  13. IAEA fact sheet
  14. "Air-Quality Index". The U.S. EPA, NOAA and NPS AIRnow Project. Retrieved 13 September 2010.
  15. "The Emergency Scale Website". Archived from the original on 29 January 2011. Retrieved 8 March 2011.
  16. "CNN Money (2007)". Retrieved 13 September 2010.