A global catastrophic risk is a hypothetical future event which could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an existential risk.
Potential global catastrophic risks include anthropogenic risks, caused by humans (technology, governance, climate change), and non-anthropogenic or external risks. Examples of technology risks are hostile artificial intelligence and destructive biotechnology or nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as a global war (including nuclear holocaust), bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure like the electrical grid, or the failure to manage a natural pandemic. Problems and risks in the domain of earth system governance include global warming, environmental degradation (including extinction of species), famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture.
Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the Sun's predictable transformation into a red giant star that engulfs the Earth.
A global catastrophic risk is any risk that is at least global in scope and is not subjectively imperceptible in intensity. Those that affect all future generations and are "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on Earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely or prevents any chance of civilization's recovery.
Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. Posner highlights such events as worthy of special attention on cost-benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole. Posner's events include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.
Near-human-extinction events cannot be studied directly, and modelling existential risks is difficult, due in part to survivorship bias. However, individual civilizations have collapsed many times in human history. While there is no known precedent for a complete collapse into an amnesic pre-agricultural society, civilizations such as the Roman Empire have ended in a loss of centralized governance and a major civilization-wide loss of infrastructure and advanced technology. Societies are often resilient to catastrophe; for example, medieval Europe survived the Black Death without suffering anything resembling a civilization collapse.
Some risks are due to phenomena that have occurred in Earth's past and left a geological record. Together with contemporary observations, this makes it possible to make informed estimates of the likelihood that such events will occur in the future. For example, the probability of an extinction-level comet or asteroid impact event before the year 2100 has been estimated at one in a million. Supervolcanoes are another example. There are several known historical supervolcanoes, including Mt. Toba, whose last eruption some say nearly wiped out humanity. The geologic record suggests this particular supervolcano erupts about every 50,000 years.
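As a rough illustration of how such a recurrence interval translates into a probability (assuming, for simplicity, a constant annual rate and independence between years):

$$p_{\text{annual}} \approx \frac{1}{50{,}000} = 2\times10^{-5}, \qquad p_{\text{next century}} \approx 1-(1-2\times10^{-5})^{100} \approx 0.2\%$$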
Without the benefit of geological records and direct observation, the relative danger posed by other threats is much more difficult to calculate. In addition, it is one thing to estimate the likelihood of an event taking place, quite another to assess how likely an event is to cause extinction if it does occur, and most difficult of all to assess the risk posed by synergistic effects of multiple events taking place simultaneously.
The closest the Doomsday Clock has come to midnight was in 2020, when the Clock was set to one minute forty seconds before midnight, due to continued tensions between the North Korean and United States governments, as well as rising tensions between the US and Iran.
Given the limitations of ordinary calculation and modeling, expert elicitation is frequently used instead to obtain probability estimates. In 2008, an informal survey of experts on different global catastrophic risks at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction by the year 2100. The conference report cautions that the results should be taken "with a grain of salt"; the results were not meant to capture all large risks, did not include things like climate change, and likely reflect many cognitive biases of the conference participants.
| Risk | Estimated probability for human extinction |
| Molecular nanotechnology weapons | 5% |
| All wars (including civil wars) | 4% |
The 2016 annual report by the Global Challenges Foundation estimates that an average American is more than five times more likely to die during a human-extinction event than in a car crash.
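The comparison rests on simple probability arithmetic. As an illustrative sketch (the inputs here are round numbers chosen for exposition, not the Foundation's published figures): if the annual probability of a human-extinction event were 0.1%, then over a 50-year horizon

$$P_{\text{extinction}} \approx 1-(1-0.001)^{50} \approx 5\%,$$

which is about five times the roughly 1% lifetime odds of an American dying in a car crash.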
There are significant methodological challenges in estimating these risks with precision. Most attention has been given to risks to human civilization over the next hundred years, but forecasting over this length of time is difficult. The types of threats posed by nature have been argued to be relatively constant, though this has been disputed, and new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have been an issue only since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited. Man-made threats such as nuclear war or nanotechnology are harder to predict than natural threats, due to the inherent methodological difficulties in the social sciences. In general, it is hard to estimate the magnitude of the risk from this or other dangers, especially as both international relations and technology can change rapidly.
Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the absence of complete extinction events in the past is not evidence against their likelihood in the future: every world that has experienced such an extinction event has no observers, so regardless of their frequency, no civilization observes existential risks in its history. These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.
In addition to known and tangible risks, unforeseeable black swan extinction events may occur, presenting an additional methodological problem.
Some scholars have strongly favored reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for four billion years before the expansion of the Sun makes the Earth uninhabitable. Nick Bostrom argues that there is even greater potential in colonizing space. If future humans colonize space, they may be able to support a very large number of people on other planets, potentially lasting for trillions of years. Therefore, reducing existential risk by even a small amount would have a very significant impact on the expected number of people who will exist in the future.
Exponential discounting might make these future benefits much less significant. However, Jason Matheny has argued that such discounting is inappropriate when assessing the value of existential risk reduction.
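Under standard exponential discounting, a benefit $V$ realized $t$ years from now has present value

$$PV = \frac{V}{(1+r)^{t}},$$

so at a discount rate of, say, $r = 3\%$, a benefit arriving a thousand years hence is shrunk by a factor of $(1.03)^{1000} \approx 7\times10^{12}$; under such discounting even astronomically large far-future benefits become negligible, which is the outcome Matheny argues is inappropriate in this context.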
Some economists have discussed the importance of global catastrophic risks, though not existential risks. Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage. Richard Posner has argued that humanity is doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.
Numerous cognitive biases can influence people's judgment of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect.
Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as concerned about 200,000 birds getting stuck in oil as they are about 2,000. Similarly, people are often more concerned about threats to individuals than to larger groups.
There are economic reasons that can explain why so little effort is going into existential risk reduction. It is a global public good, so even if a large nation decreases it, that nation will enjoy only a small fraction of the benefit of doing so. Furthermore, the vast majority of the benefits may be enjoyed by far-future generations, and though these quadrillions of future people would in theory perhaps be willing to pay massive sums for existential risk reduction, no mechanism for such a transaction exists.
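The free-rider logic can be stated as a simple inequality (the variables here are illustrative): if reducing a risk yields a global benefit $B$ at cost $C$, a nation that captures only a fraction $f$ of the benefit will invest only when

$$f \cdot B > C,$$

so even when $B$ vastly exceeds $C$ globally, a sufficiently small $f$ makes the investment privately unattractive.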
Some sources of catastrophic risk are natural, such as meteor impacts or supervolcanoes. Some of these have caused mass extinctions in the past. On the other hand, some risks are man-made, such as global warming, environmental degradation, engineered pandemics and nuclear war.
The Cambridge Project at Cambridge University says the "greatest threats" to the human species are man-made: artificial intelligence, global warming, nuclear war, and rogue biotechnology. The Future of Humanity Institute also states that human extinction is more likely to result from anthropogenic causes than natural causes.
It has been suggested that if AI systems rapidly become superintelligent, they may take unforeseen actions or out-compete humanity. According to philosopher Nick Bostrom, it is possible that the first superintelligence to emerge would be able to bring about almost any possible outcome it valued, as well as to foil virtually any attempt to prevent it from achieving its objectives. Thus, even a superintelligence indifferent to humanity could be dangerous if it perceived humans as an obstacle to unrelated goals. In his book Superintelligence, Bostrom defines this as the control problem. Physicist Stephen Hawking, Microsoft founder Bill Gates, and SpaceX founder Elon Musk have echoed these concerns, with Hawking theorizing that such an AI could "spell the end of the human race".
In 2009, the Association for the Advancement of Artificial Intelligence (AAAI) hosted a conference to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence". They noted that self-awareness as depicted in science fiction is probably unlikely, but that there are other potential hazards and pitfalls. Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns.
A survey of AI experts estimated that the chance of human-level machine intelligence having an "extremely bad (e.g., human extinction)" long-term effect on humanity is 5%. A 2008 survey by the Future of Humanity Institute estimated a 5% probability of extinction by superintelligence by 2100. Eliezer Yudkowsky believes risks from artificial intelligence are harder to predict than any other known risks due to bias from anthropomorphism. Since people base their judgments of artificial intelligence on their own experience, he claims they underestimate the potential power of AI.
Biotechnology can pose a global catastrophic risk in the form of bioengineered organisms (viruses, bacteria, fungi, plants or animals). In many cases the organism will be a pathogen of humans, livestock, crops or other organisms we depend upon (e.g. pollinators or gut bacteria). However, any organism able to catastrophically disrupt ecosystem functions, such as a highly competitive weed outcompeting essential crops, poses a biotechnology risk.
A biotechnology catastrophe may be caused by accidentally releasing a genetically engineered organism from controlled environments, by the planned release of such an organism which then turns out to have unforeseen and catastrophic interactions with essential natural or agro-ecosystems, or by the intentional use of biological agents in biological warfare or bioterrorism attacks. Pathogens may be intentionally or unintentionally genetically modified to change virulence and other characteristics. For example, a group of Australian researchers unintentionally changed characteristics of the mousepox virus while trying to develop a virus to sterilize rodents. The modified virus became highly lethal even in vaccinated and naturally resistant mice. The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated.
Terrorist applications of biotechnology have historically been infrequent. To what extent this is due to a lack of capabilities or of motivation is not resolved. However, given current developments, more risk from novel, engineered pathogens is to be expected in the future. Exponential growth has been observed in the biotechnology sector, and Nouri and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades. They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control (especially as the technological capabilities are becoming available even to individual users). In 2008, a survey by the Future of Humanity Institute estimated a 2% probability of extinction from engineered pandemics by 2100.
Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and the development of facilities to mitigate disease outbreaks (e.g. better and/or more widely distributed vaccines).
Cyberattacks have the potential to destroy everything from personal data to electric grids. Christine Peterson, co-founder and past president of the Foresight Institute, believes a cyberattack on electric grids has the potential to be a catastrophic risk. She notes that little has been done to mitigate such risks, and that mitigation could take several decades of readjustment.
An environmental or ecological disaster, such as world crop failure and collapse of ecosystem services, could be induced by the present trends of overpopulation, economic development, and non-sustainable agriculture. Most environmental scenarios involve one or more of the following: the Holocene extinction event, scarcity of water that could lead to approximately half the Earth's population being without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, or massive water pollution episodes. Detected in the early 21st century, a threat in this direction is colony collapse disorder, a phenomenon that might foreshadow the imminent extinction of the Western honeybee. As the bee plays a vital role in pollination, its extinction would severely disrupt the food chain.
An October 2017 report published in The Lancet stated that toxic air, water, soils, and workplaces were collectively responsible for nine million deaths worldwide in 2015, particularly from air pollution, which was linked to deaths by increasing susceptibility to non-infectious diseases such as heart disease, stroke, and lung cancer. The report warned that the pollution crisis was exceeding "the envelope on the amount of pollution the Earth can carry" and "threatens the continuing survival of human societies".
Nick Bostrom suggested that in the pursuit of knowledge, humanity might inadvertently create a device that could destroy Earth and the Solar System. Investigations in nuclear and high-energy physics could create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere. Others worried that the RHIC or the Large Hadron Collider might start a chain-reaction global disaster involving black holes, strangelets, or false vacuum states. These particular concerns have been challenged, but the general concern remains.
Biotechnology could lead to the creation of a pandemic; chemical warfare could be taken to an extreme; and nanotechnology could lead to grey goo, in which out-of-control self-replicating robots consume all living matter on Earth while building more of themselves. In each case, the catastrophe could occur either deliberately or by accident.
Global warming refers to the warming caused by human technology since the 19th century or earlier. Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses to existing food-producing systems, increased spread of known infectious diseases such as malaria, and rapid mutation of microorganisms. In November 2017, a statement by 15,364 scientists from 184 countries indicated that increasing levels of greenhouse gases from use of fossil fuels, human population growth, deforestation, and overuse of land for agricultural production, particularly by farming ruminants for meat consumption, are trending in ways that forecast an increase in human misery over coming decades.
Romanian American economist Nicholas Georgescu-Roegen, a progenitor in economics and the paradigm founder of ecological economics, has argued that the carrying capacity of Earth (that is, Earth's capacity to sustain human populations and consumption levels) is bound to decrease sometime in the future, as Earth's finite stock of mineral resources is presently being extracted and put to use, and consequently that the world economy as a whole is heading towards an inevitable future collapse, leading to the demise of human civilization itself. Ecological economist and steady-state theorist Herman Daly, a student of Georgescu-Roegen, has propounded the same argument by asserting that "... all we can do is to avoid wasting the limited capacity of creation to support present and future life [on Earth]".
Ever since Georgescu-Roegen and Daly published these views, various scholars in the field have been discussing the existential impossibility of allocating Earth's finite stock of mineral resources evenly among an unknown number of present and future generations. This number of generations is likely to remain unknown to us, as there is no way, or only little way, of knowing in advance if or when mankind will ultimately face extinction. In effect, any conceivable intertemporal allocation of the stock will inevitably end up with universal economic decline at some future point.
Many nanoscale technologies are in development or currently in use. The only one that appears to pose a significant global catastrophic risk is molecular manufacturing, a technique that would make it possible to build complex structures at atomic precision. Molecular manufacturing requires significant advances in nanotechnology, but once achieved it could produce highly advanced products at low cost and in large quantities in nanofactories of desktop proportions. When nanofactories gain the ability to produce other nanofactories, production may only be limited by relatively abundant factors such as input materials, energy and software.
Molecular manufacturing could be used to cheaply produce, among many other products, highly advanced, durable weapons. Being equipped with compact computers and motors, these could be increasingly autonomous and have a large range of capabilities.
Chris Phoenix and Mike Treder classify catastrophic risks posed by nanotechnology into three categories.
Several researchers say the bulk of risk from nanotechnology comes from its potential to lead to war, arms races and destructive global government. Several reasons have been suggested why the availability of nanotech weaponry may plausibly lead to unstable arms races, compared with nuclear arms races, for example.
Since self-regulation by all state and non-state actors seems hard to achieve, measures to mitigate war-related risks have mainly been proposed in the area of international cooperation. International infrastructure may be expanded, giving more sovereignty to the international level; this could help coordinate efforts for arms control. International institutions dedicated specifically to nanotechnology (perhaps analogously to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed. Parties may also jointly make differential technological progress on defensive technologies, a policy that players should usually favour. The Center for Responsible Nanotechnology also suggests some technical restrictions. Improved transparency regarding technological capabilities may be another important facilitator for arms control.
Grey goo is another catastrophic scenario, proposed by Eric Drexler in his 1986 book Engines of Creation and a recurring theme in mainstream media and fiction. This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nowadays, however, nanotech experts, including Drexler, discredit the scenario. According to Phoenix, a "so-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident".
The scenarios that have been explored most frequently are nuclear warfare and doomsday devices. Mistakenly launching a nuclear attack in response to a false alarm is one possible scenario; this nearly happened during the 1983 Soviet nuclear false alarm incident. Although the probability of a nuclear war per year is slim, Professor Martin Hellman has described it as inevitable in the long run; unless the probability approaches zero, inevitably there will come a day when civilization's luck runs out. During the Cuban Missile Crisis, U.S. president John F. Kennedy estimated the odds of nuclear war at "somewhere between one out of three and even". The United States and Russia have a combined arsenal of 14,700 nuclear weapons, and there is an estimated total of 15,700 nuclear weapons in existence worldwide. Beyond nuclear, other military threats to humanity include biological warfare (BW). By contrast, chemical warfare, while able to create multiple local catastrophes, is unlikely to create a global one.
Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating large numbers of nuclear weapons would have immediate, short-term and long-term effects on the climate, causing cold weather and reduced sunlight and photosynthesis that may generate significant upheaval in advanced civilizations. However, while popular perception sometimes takes nuclear war as "the end of the world", experts assign low probability to human extinction from nuclear war. In 1982, Brian Martin estimated that a US–Soviet nuclear exchange might kill 400–450 million directly, mostly in the United States, Europe and Russia, and maybe several hundred million more through follow-up consequences in those same areas. In 2008, a survey by the Future of Humanity Institute estimated a 4% probability of extinction from warfare by 2100, with a 1% chance of extinction from nuclear warfare.
The 20th century saw a rapid increase in human population due to medical developments and massive increases in agricultural productivity such as the Green Revolution. Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. The Green Revolution in agriculture helped food production keep pace with worldwide population growth, or actually enabled population growth. The energy for the Green Revolution was provided by fossil fuels in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon-fueled irrigation. David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place in their 1994 study Food, Land, Population and the U.S. Economy the maximum U.S. population for a sustainable economy at 200 million. To achieve a sustainable economy and avert disaster, the study says, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds.
The authors of this study believe the mentioned agricultural crisis will begin to affect the world after 2020 and will become critical after 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global scale never before experienced.
Since supplies of petroleum and natural gas are essential to modern agriculture techniques, a fall in global oil supplies (see peak oil for global concerns) could cause spiking food prices and unprecedented famine in the coming decades.
Wheat is humanity's third-most-produced cereal. Extant fungal infections such as Ug99 (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible, and infection spreads on the wind. Should the world's large grain-producing areas become infected, the ensuing crisis in wheat availability would lead to price spikes and shortages in other food products.
Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, was about six miles in diameter and is theorized to have caused the extinction of non-avian dinosaurs at the end of the Cretaceous. No sufficiently large asteroid currently exists in an Earth-crossing orbit; however, a comet of sufficient size to cause human extinction could impact the Earth, though the annual probability may be less than 10⁻⁸. Asteroids with around a 1 km diameter have impacted the Earth on average once every 500,000 years; these are probably too small to pose an extinction risk, but might kill billions of people. Larger asteroids are less common. Small near-Earth asteroids are regularly observed and can impact anywhere on the Earth, injuring local populations. As of 2013, Spaceguard estimates it has identified 95% of all NEOs over 1 km in size. Geoscientist Brian Toon estimates that while a few people, such as "some fishermen in Costa Rica", could plausibly survive a six-mile meteorite, a sixty-mile meteorite would be large enough to "incinerate everybody".
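The gulf between a six-mile and a sixty-mile impactor follows from simple scaling: for a fixed density and impact speed, kinetic energy grows with the cube of the diameter (a rough approximation that ignores differences in composition and velocity), so

$$E \propto d^{3} \quad\Rightarrow\quad \frac{E_{60\text{ mi}}}{E_{6\text{ mi}}} \approx 10^{3} = 1000.$$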
In April 2018, the B612 Foundation reported: "It's 100 per cent certain we'll be hit [by a devastating asteroid], but we're not 100 per cent sure when." Also in 2018, physicist Stephen Hawking, in his final book Brief Answers to the Big Questions, considered an asteroid collision to be the biggest threat to the planet. In June 2018, the US National Science and Technology Council warned that America is unprepared for an asteroid impact event, and developed and released the "National Near-Earth Object Preparedness Strategy Action Plan" to better prepare. According to expert testimony in the United States Congress in 2013, NASA would require at least five years of preparation before a mission to intercept an asteroid could be launched.
A number of astronomical threats have been identified. Massive objects, e.g. a star, large planet or black hole, could be catastrophic if a close encounter occurred in the Solar System. In April 2008, it was announced that two simulations of long-term planetary movement, one at the Paris Observatory and the other at the University of California, Santa Cruz, indicate a 1% chance that Mercury's orbit could be made unstable by Jupiter's gravitational pull sometime during the lifespan of the Sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the Solar System altogether). If Mercury were to collide with Earth, all life on Earth could be obliterated entirely: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, whereas Mercury is 4,879 km in diameter.
If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning. Such an occurrence is called vacuum decay.
Another cosmic threat is a gamma-ray burst, typically produced by a supernova when a star collapses inward on itself and then "bounces" outward in a massive explosion. Under certain circumstances, these events are thought to produce massive bursts of gamma radiation emanating outward from the axis of rotation of the star. If such an event were to occur oriented towards the Earth, the massive amounts of gamma radiation could significantly affect the Earth's atmosphere and pose an existential threat to all life. Such a gamma-ray burst may have been the cause of the Ordovician–Silurian extinction events. Neither this scenario nor the destabilization of Mercury's orbit are likely in the foreseeable future.
A powerful solar flare or solar superstorm, which is a drastic and unusual decrease or increase in the Sun's power output, could have severe consequences for life on Earth.
Astrophysicists currently calculate that in a few billion years the Earth will probably be swallowed by the expansion of the Sun into a red giant star.
Intelligent extraterrestrial life, if existent, could invade Earth, either to exterminate and supplant human life, enslave it under a colonial system, steal the planet's resources, or destroy the planet altogether.
Although evidence of alien life has never been found, scientists such as Carl Sagan have postulated that the existence of extraterrestrial life is very likely. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991. Scientists consider such a scenario technically possible, but unlikely.
An article in The New York Times discussed the possible threats to humanity of intentionally sending messages aimed at extraterrestrial life into the cosmos in the context of the SETI efforts. Several renowned public figures, such as Stephen Hawking and Elon Musk, have argued against sending such messages, on the grounds that extraterrestrial civilizations with technology are probably far more advanced than humanity and could pose an existential threat.
There are numerous historical examples of pandemics that have had a devastating effect on a large number of people. The present, unprecedented scale and speed of human movement make it more difficult than ever to contain an epidemic through local quarantines, and other sources of uncertainty and the evolving nature of the risk mean natural pandemics may pose a realistic threat to human civilization.
There are several classes of argument about the likelihood of pandemics. One stems from the history of pandemics, where the limited size of historical pandemics is taken as evidence that larger pandemics are unlikely. This argument has been disputed on several grounds, including the changing risk due to changing population and behavioral patterns among humans, the limited historical record, and the existence of an anthropic bias.
Another argument is based on an evolutionary model that predicts that naturally evolving pathogens will ultimately develop an upper limit to their virulence. This is because pathogens with high enough virulence quickly kill their hosts, reducing their chances of spreading the infection to new hosts or carriers. This model has limits, however, because the fitness advantage of limited virulence is primarily a function of a limited number of hosts. Any pathogen with high virulence, a high transmission rate and a long incubation time may have already caused a catastrophic pandemic before virulence is ultimately limited through natural selection. Additionally, a pathogen that infects humans as a secondary host and primarily infects another species (a zoonosis) has no constraints on its virulence in people, since the accidental secondary infections do not affect its evolution. Lastly, in models where virulence level and rate of transmission are related, high levels of virulence can evolve. Virulence is instead limited by the existence of complex populations of hosts with different susceptibilities to infection, or by some hosts being geographically isolated. The size of the host population and competition between different strains of pathogens can also alter virulence.
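The trade-off underlying this model can be sketched with the standard expression for a pathogen's basic reproduction number (a simplified formulation, not drawn from any specific study cited here): with transmission rate $\beta(\alpha)$ increasing but saturating in virulence $\alpha$, host background mortality $\mu$, and recovery rate $\gamma$,

$$R_{0}(\alpha) = \frac{\beta(\alpha)}{\mu + \alpha + \gamma}.$$

Raising $\alpha$ increases the numerator but also shortens the infectious period in the denominator, so $R_{0}$ peaks at an intermediate virulence; each caveat above (zoonoses, linked virulence and transmission, structured host populations) breaks one of the assumptions behind this result.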
Neither of these arguments applies to bioengineered pathogens, which pose entirely different pandemic risks. Experts have concluded that "Developments in science and technology could significantly ease the development and use of high consequence biological weapons," and that these "highly virulent and highly transmissible [bio-engineered pathogens] represent new potential pandemic threats".
Climate change refers to a lasting change in the Earth's climate. The climate has ranged from ice ages to warmer periods when palm trees grew in Antarctica. It has been hypothesized that there was also a period called "snowball Earth" when all the oceans were covered in a layer of ice. These global climatic changes occurred slowly, near the end of the last Major Ice Age when the climate became more stable. However, abrupt climate change on the decade time scale has occurred regionally. A natural variation into a new climate regime (colder or hotter) could pose a threat to civilization.
In the history of the Earth, many ice ages are known to have occurred. An ice age would have a serious impact on civilization because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. Currently, the world is in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved later than this. Scientists do not predict that a natural ice age will occur anytime soon. The amount of heat-trapping gases emitted into Earth's oceans and atmosphere may prevent the next ice age, which otherwise would begin in around 50,000 years, and likely further glacial cycles after that.
A geological event such as massive flood basalt, volcanism, or the eruption of a supervolcano could lead to a so-called volcanic winter, similar to a nuclear winter. One such event, the Toba eruption, occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory, the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years. A massive volcanic eruption would eject extraordinary volumes of volcanic dust, toxic and greenhouse gases into the atmosphere, with serious effects on global climate (towards extreme global cooling: volcanic winter if short-term, and ice age if long-term) or global warming (if greenhouse gases were to prevail).
When the supervolcano at Yellowstone last erupted 640,000 years ago, the thinnest layers of the ash ejected from the caldera spread over most of the United States west of the Mississippi River and part of northeastern Mexico. The magma covered much of what is now Yellowstone National Park and extended beyond, covering much of the ground from the Yellowstone River in the east to Idaho Falls in the west, with some of the flows extending north beyond Mammoth Springs.
According to a recent study, if the Yellowstone caldera erupted again as a supervolcano, an ash layer one to three millimeters thick could be deposited as far away as New York, enough to "reduce traction on roads and runways, short out electrical transformers and cause respiratory problems". There would be centimeters of thickness over much of the U.S. Midwest, enough to disrupt crops and livestock, especially if it happened at a critical time in the growing season. The worst-affected city would likely be Billings, Montana, population 109,000, which the model predicted would be covered with ash estimated as 1.03 to 1.8 meters thick.
The main long-term effect of such an eruption would be through global climate change, which could reduce global temperatures by about 5–15 °C for a decade, together with the direct effects of ash deposits on crops. A large supervolcano like Toba would deposit one or two meters' thickness of ash over an area of several million square kilometers (1,000 cubic kilometers is equivalent to a one-meter thickness of ash spread over a million square kilometers). If that happened in some densely populated agricultural area, such as India, it could destroy one or two seasons of crops for two billion people.
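The parenthetical conversion is straightforward unit arithmetic:

$$\frac{1000\ \text{km}^{3}}{10^{6}\ \text{km}^{2}} = 10^{-3}\ \text{km} = 1\ \text{m}.$$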
However, Yellowstone shows no signs of a supereruption at present, and it is not certain that a future supereruption will occur there.
Research published in 2011 found evidence that massive volcanic eruptions caused massive coal combustion, supporting models for significant generation of greenhouse gases. Researchers have suggested that massive volcanic eruptions through coal beds in Siberia would generate significant greenhouse gases and cause a runaway greenhouse effect. Massive eruptions can also throw enough pyroclastic debris and other material into the atmosphere to partially block out the sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions of deaths worldwide, due to the failure of the monsoons, resulting in major crop failures and starvation on a profound scale.
A much more speculative concept is the verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.
Planetary management and respecting planetary boundaries have been proposed as approaches to preventing ecological catastrophes. Within the scope of these approaches, the field of geoengineering encompasses the deliberate large-scale engineering and manipulation of the planetary environment to combat or counteract anthropogenic changes in atmospheric chemistry. Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario. Solutions of this scope may require megascale engineering. Global food storage has also been proposed, but the monetary cost would be high; furthermore, by raising food prices, it would likely contribute to the millions of deaths per year already caused by malnutrition.
Some survivalists stock survival retreats with multiple-year food supplies.
The Svalbard Global Seed Vault is buried 400 feet (120 m) inside a mountain on an island in the Arctic. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world's crops. The surrounding rock is −6 °C (21 °F) (as of 2015) but the vault is kept at −18 °C (0 °F) by refrigerators powered by locally sourced coal.
More speculatively, if society continues to function and if the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.
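The scale of the task can be gauged with rough arithmetic (assuming, for illustration, a population of about eight billion and a requirement of roughly 2,000 kilocalories per person per day):

$$8\times10^{9}\ \text{people} \times 2000\ \text{kcal/day} \approx 1.6\times10^{13}\ \text{kcal/day},$$

so any sunlight-independent food source would need to scale to this output, which is why such proposals focus on feedstocks, such as dead plant biomass and natural gas, that already exist in enormous quantities.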
Insufficient global governance creates risks in the social and political domain, but governance mechanisms develop more slowly than technological and social change. There are concerns from governments, the private sector, and the general public about the lack of governance mechanisms for efficiently dealing with risks and for negotiating and adjudicating between diverse and conflicting interests. This is further underlined by an understanding of the interconnectedness of global systemic risks. In the absence or anticipation of global governance, national governments can act individually to better understand, mitigate and prepare for global catastrophes.
In 2018, the Club of Rome called for greater climate change action and published its Climate Emergency Plan, which proposes ten action points to limit global average temperature increase to 1.5 degrees Celsius. Further, in 2019, the Club published the more comprehensive Planetary Emergency Plan.
The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler who postulated "grey goo".
Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia.
Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute (est. 2000), which aims to reduce the risk of a catastrophe caused by artificial intelligence, with donors including Peter Thiel and Jed McCaleb. The Nuclear Threat Initiative (est. 2001) seeks to reduce global threats from nuclear, biological and chemical weapons, and to contain damage after an event. It maintains a nuclear material security index. The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe. Most of the research money funds projects at universities. The Global Catastrophic Risk Institute (est. 2011) is a think tank for catastrophic risk. It is funded by the NGO Social and Environmental Entrepreneurs. The Global Challenges Foundation (est. 2012), based in Stockholm and founded by Laszlo Szombatfalvy, releases a yearly report on the state of global risks. The Future of Life Institute (est. 2014) aims to support research and initiatives for safeguarding life, considering new technologies and challenges facing humanity. Elon Musk is one of its biggest donors. The Center on Long-Term Risk (est. 2016), formerly known as the Foundational Research Institute, is a British organization focused on reducing risks of astronomical suffering (s-risks) from emerging technologies.
University-based organizations include the Future of Humanity Institute (est. 2005), which researches questions about humanity's long-term future, particularly existential risk. It was founded by Nick Bostrom and is based at Oxford University. The Centre for the Study of Existential Risk (est. 2012) is a Cambridge-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare. All are man-made risks, as Huw Price explained to the AFP news agency: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us." Stephen Hawking was an acting adviser. The Millennium Alliance for Humanity and the Biosphere is a Stanford University-based organization focusing on many issues related to global catastrophe by bringing together members of academia in the humanities. It was founded by Paul Ehrlich, among others. Stanford University also has the Center for International Security and Cooperation, which focuses on political cooperation to reduce global catastrophic risk. The Center for Security and Emerging Technology was established in January 2019 at Georgetown's Walsh School of Foreign Service and will focus on policy research of emerging technologies, with an initial emphasis on artificial intelligence. It received a grant of 55M USD from Good Ventures as suggested by the Open Philanthropy Project.
Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises. GAR helps member states with training and the coordination of response to epidemics. The United States Agency for International Development (USAID) has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source. The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate which researches, on behalf of the government, issues such as bio-security and counter-terrorism.
Europe survived losing 25 to 50 percent of its population in the Black Death, while keeping civilization firmly intact
The USGS puts it like this: "If another large caldera-forming eruption were to occur at Yellowstone, its effects would be worldwide. Thick ash deposits would bury vast areas of the United States, and injection of huge volumes of volcanic gases into the atmosphere could drastically affect global climate. Fortunately, the Yellowstone volcanic system shows no signs that it is headed toward such an eruption. The probability of a large caldera-forming eruption within the next few thousand years is exceedingly low."
The Machine Intelligence Research Institute aims to reduce the risk of a catastrophe, should such an event eventually occur.
We currently focus on efforts to reduce the worst risks of astronomical suffering (s-risks) from emerging technologies, with a focus on transformative artificial intelligence.
Apocalyptic and post-apocalyptic fiction is a subgenre of science fiction, science fantasy, dystopia or horror in which the Earth's technological civilization is collapsing or has collapsed. The apocalypse event may be climatic, such as runaway climate change; astronomical, such as an impact event; destructive, such as nuclear holocaust or resource depletion; medical, such as a pandemic, whether natural or human-caused; end time, such as the Last Judgment, Second Coming or Ragnarök; or more imaginative, such as a zombie apocalypse, cybernetic revolt, technological singularity, dysgenics or alien invasion.
An impact winter is a hypothesized period of prolonged cold weather due to the impact of a large asteroid or comet on the Earth's surface. If an asteroid were to strike land or a shallow body of water, it would eject an enormous amount of dust, ash, and other material into the atmosphere, blocking the radiation from the Sun. This would cause the global temperature to decrease drastically. If an asteroid or comet with the diameter of about 5 km (3.1 mi) or more were to hit in a large deep body of water or explode before hitting the surface, there would still be an enormous amount of debris ejected into the atmosphere. It has been proposed that an impact winter could lead to mass extinction, wiping out many of the world's existing species.
Nick Bostrom is a Swedish-born philosopher at the University of Oxford known for his work on existential risk, the anthropic principle, human enhancement ethics, superintelligence risks, and the reversal test. In 2011, he founded the Oxford Martin Programme on the Impacts of Future Technology, and is the founding director of the Future of Humanity Institute at Oxford University. In 2009 and 2015, he was included in Foreign Policy's Top 100 Global Thinkers list.
Space and survival is the idea that the long-term survival of the human species and technological civilization requires the building of a spacefaring civilization that utilizes the resources of outer space, and that not doing this could lead to human extinction. A related observation is that the window of opportunity for doing this may be limited due to the decreasing amount of surplus resources that will be available over time as a result of an ever-growing population.
The Great Filter, in the context of the Fermi paradox, is whatever prevents non-living matter from giving rise, in time, to expanding, lasting life as measured by the Kardashev scale. The concept originates in Robin Hanson's argument that the failure to find any extraterrestrial civilizations in the observable universe implies that something may be wrong with one or more of the arguments, from various scientific disciplines, that the appearance of advanced intelligent life is probable; this observation is conceptualized in terms of a "Great Filter" which acts to reduce the great number of sites where intelligent life might arise to the tiny number of intelligent species with advanced civilizations actually observed. This probability threshold, which could lie behind us or in front of us, might work as a barrier to the evolution of intelligent life, or as a high probability of self-destruction. The main counter-intuitive conclusion of this observation is that the easier it was for life to evolve to our stage, the bleaker our future chances probably are.
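The argument can be phrased probabilistically (the notation here is an illustrative sketch): if reaching an observable, expanding civilization requires passing through a series of steps with probabilities $p_1, p_2, \ldots, p_n$ (abiogenesis, multicellularity, intelligence, surviving one's own technology, and so on), then the expected number of such civilizations per candidate site scales with the product

$$P_{\text{civ}} = \prod_{i=1}^{n} p_{i}.$$

The Fermi observation implies this product is very small, so at least one factor must be tiny; the Great Filter is whichever step that is, and evidence that the early steps are easy shifts the tiny factor toward steps that may still lie ahead of us.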
In futures studies, human extinction is the hypothetical complete end of the human species. This may result either from natural causes or due to anthropogenic (human) causes, but the risks of extinction through natural disaster, such as a meteorite impact or large-scale volcanism, are generally considered to be comparatively low. Anthropogenic human extinction is sometimes called omnicide.
The Future of Humanity Institute (FHI) is an interdisciplinary research centre at the University of Oxford investigating big-picture questions about humanity and its prospects. It was founded in 2005 as part of the Faculty of Philosophy and the Oxford Martin School. Its director is philosopher Nick Bostrom, and its research staff and associates include futurist Anders Sandberg, engineer K. Eric Drexler, economist Robin Hanson, and Giving What We Can founder Toby Ord.
A nuclear holocaust, nuclear apocalypse or atomic holocaust is a theoretical scenario in which the mass use of nuclear weapons causes widespread destruction and radioactive fallout, leading to the collapse of civilization. Under such a scenario, some or all of the Earth is made uninhabitable by nuclear warfare in future world wars.
In futurology, a singleton is a hypothetical world order in which there is a single decision-making agency at the highest level, capable of exerting effective control over its domain, and permanently preventing both internal and external threats to its supremacy. The term was first defined by Nick Bostrom.
The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The co-founders of the centre are Huw Price, Lord Martin Rees and Jaan Tallinn.
Superintelligence: Paths, Dangers, Strategies is a 2014 book by the Swedish philosopher Nick Bostrom from the University of Oxford. It argues that if machine brains surpass human brains in general intelligence, then this new superintelligence could replace humans as the dominant lifeform on Earth. Sufficiently intelligent machines could improve their own capabilities faster than human computer scientists, and the outcome could be an existential catastrophe for humans.
Feeding Everyone No Matter What: Managing Food Security After Global Catastrophe is a book written by David Denkenberger and Joshua M. Pearce, published by Elsevier under its Academic Press imprint.
Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could someday result in human extinction or some other unrecoverable global catastrophe. It is argued that the human species currently dominates other species because the human brain has some distinctive capabilities that other animals lack. If AI surpasses humanity in general intelligence and becomes "superintelligent", then it could become difficult or impossible for humans to control. Just as the fate of the mountain gorilla depends on human goodwill, so might the fate of humanity depend on the actions of a future machine superintelligence.
Biotechnology risk is a form of existential risk that could come from biological sources, such as genetically engineered biological agents. Such agents may be released intentionally or unintentionally. A chapter on biotechnology and biosecurity, covering risks such as viral agents, was published in Nick Bostrom's Global Catastrophic Risks. Since then, new technologies like CRISPR and gene drives have been introduced.
Michael R. Rampino is a geologist and professor of biology and environmental studies at New York University, known for his scientific contributions on the causes of mass extinctions of life. Along with colleagues, he has developed theories that periodic mass extinctions are strongly related to the Earth's position in relation to the galaxy: the Solar System and its planets experience cataclysms every time they pass "up" or "down" through the plane of the disk-shaped galaxy. These roughly 30-million-year cyclical breaks are an important factor in evolutionary theory, along with other, longer 60-million- and 140-million-year cycles potentially caused by mantle plumes within the planet; in Rampino's words, "The Earth seems to have a pulse." He is also a research consultant at NASA's Goddard Institute for Space Studies (GISS) in New York City.
On the Future: Prospects for Humanity is a 2018 nonfiction book by British cosmologist and Astronomer Royal Martin Rees. It is a short, "big concept" book on the future of humanity and on potential dangers, such as nuclear warfare, climate change, biotech, and artificial intelligence, and the possibility of human extinction.
Solutions to global issues generally require cooperation among nations.
A climate apocalypse is a hypothetical scenario involving the global collapse of human civilization and potential human extinction as either a direct or indirect result of anthropogenic climate change. The scenario posits some or all of the Earth being rendered uninhabitable as a result of extreme temperatures, severe weather events, an inability to grow crops, and an altered composition of the Earth's atmosphere.
The Precipice: Existential Risk and the Future of Humanity is a 2020 non-fiction book by the Australian philosopher Toby Ord, of the Future of Humanity Institute in Oxford. It argues that safeguarding humanity's future is among the most important moral issues of our time.
End Times: A Brief Guide to the End of the World is a 2019 non-fiction book by journalist Bryan Walsh. The book discusses various risks of human extinction, including asteroids, volcanos, nuclear war, global warming, pathogens, biotech, AI, and extraterrestrial intelligence. The book includes interviews with astronomers, anthropologists, biologists, climatologists, geologists, and other scholars. The book advocates strongly for greater action.