Scenarios in which a global catastrophe could cause harm have been widely discussed. Some sources of catastrophic risk are anthropogenic (caused by humans), such as global warming, [1] environmental degradation, and nuclear war. [2] Others are non-anthropogenic or natural, such as meteor impacts or supervolcanoes. The impact of these scenarios can vary widely, depending on the cause and the severity of the event, ranging from temporary economic disruption to human extinction. Many societal collapses have already happened throughout human history.
Experts at the Future of Humanity Institute at the University of Oxford and the Centre for the Study of Existential Risk at the University of Cambridge prioritize anthropogenic over natural risks due to their much greater estimated likelihood. [3] [4] [5] [6] They are especially concerned by, and consequently focus on, risks posed by advanced technology, such as artificial intelligence and biotechnology. [7] [8]
The creators of a superintelligent entity could inadvertently give it goals that lead it to annihilate the human race. [9] [10] It has been suggested that if AI systems rapidly become superintelligent, they may take unforeseen actions or out-compete humanity. [11] According to philosopher Nick Bostrom, it is possible that the first superintelligence to emerge would be able to bring about almost any possible outcome it valued, as well as to foil virtually any attempt to prevent it from achieving its objectives. [12] Thus, even a superintelligence indifferent to humanity could be dangerous if it perceived humans as an obstacle to unrelated goals. In his book Superintelligence, Bostrom defines this as the control problem. [13] Physicist Stephen Hawking, Microsoft founder Bill Gates, and SpaceX founder Elon Musk have echoed these concerns, with Hawking theorizing that such an AI could "spell the end of the human race". [14]
In 2009, the Association for the Advancement of Artificial Intelligence (AAAI) hosted a conference to discuss whether computers and robots might be able to acquire any sort of autonomy, and to what degree these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence". They noted that self-awareness as depicted in science fiction is unlikely, but that there are other potential hazards and pitfalls. [15] Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns. [16] [17]
A survey of AI experts estimated that the chance of human-level machine intelligence having an "extremely bad (e.g., human extinction)" long-term effect on humanity is 5%. [18] A 2008 survey by the Future of Humanity Institute estimated a 5% probability of extinction by super-intelligence by 2100. [19] Eliezer Yudkowsky believes risks from artificial intelligence are harder to predict than any other known risks due to bias from anthropomorphism. Since people base their judgments of artificial intelligence on their own experience, he claims they underestimate the potential power of AI. [20]
Biotechnology can pose a global catastrophic risk in the form of bioengineered organisms (viruses, bacteria, fungi, plants, or animals). In many cases the organism will be a pathogen of humans, livestock, crops, or other organisms we depend upon (e.g. pollinators or gut bacteria). However, any organism able to catastrophically disrupt ecosystem functions, e.g. highly competitive weeds, outcompeting essential crops, poses a biotechnology risk.
A biotechnology catastrophe may be caused by accidentally releasing a genetically engineered organism from controlled environments, by the planned release of such an organism which then turns out to have unforeseen and catastrophic interactions with essential natural or agro-ecosystems, or by intentional usage of biological agents in biological warfare or bioterrorism attacks. [21] Pathogens may be intentionally or unintentionally genetically modified to change virulence and other characteristics. [21] For example, a group of Australian researchers unintentionally changed characteristics of the mousepox virus while trying to develop a virus to sterilize rodents. [21] The modified virus became highly lethal even in vaccinated and naturally resistant mice. [22] [23] The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated. [21]
Biological weapons, [24] whether used in war or terrorism, could result in human extinction. Terrorist applications of biotechnology have historically been infrequent. To what extent this is due to a lack of capabilities or motivation is not resolved. [21] However, given current developments, more risk from novel, engineered pathogens is to be expected in the future. [21] Exponential growth has been observed in the biotechnology sector, and Nouri and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades. [21] They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control (especially as the technological capabilities are becoming available even to individual users). [21] In 2008, a survey by the Future of Humanity Institute estimated a 2% probability of extinction from engineered pandemics by 2100. [19]
Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and development of facilities to mitigate disease outbreaks (e.g. better and/or more widely distributed vaccines). [21]
By contrast with nuclear and biological weapons, chemical warfare, while able to create multiple local catastrophes, is unlikely to create a global one. [citation needed]
Population decline could occur through a preference for fewer children. [25] If developing-world demographics are assumed to converge with developed-world demographics, and if the latter are extrapolated, some projections suggest an extinction before the year 3000. [citation needed] John A. Leslie estimates that if the reproduction rate drops to the German or Japanese level, the extinction date will be 2400. [lower-alpha 1] However, some models suggest the demographic transition may reverse itself due to evolutionary biology. [26] [27]
Human-caused climate change has been driven by technology since the 19th century or earlier. Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses to existing food-producing systems, increased spread of known infectious diseases such as malaria, and rapid mutation of microorganisms.
A common belief is that the current climate crisis could spiral into human extinction. [30] [31] In November 2017, a statement by 15,364 scientists from 184 countries indicated that increasing levels of greenhouse gases from use of fossil fuels, human population growth, deforestation, and overuse of land for agricultural production, particularly by farming ruminants for meat consumption, are trending in ways that forecast an increase in human misery over coming decades. [32] An October 2017 report published in The Lancet stated that toxic air, water, soils, and workplaces were collectively responsible for nine million deaths worldwide in 2015, particularly from air pollution which was linked to deaths by increasing susceptibility to non-infectious diseases, such as heart disease, stroke, and lung cancer. [33] The report warned that the pollution crisis was exceeding "the envelope on the amount of pollution the Earth can carry" and "threatens the continuing survival of human societies". [33] Carl Sagan and others have raised the prospect of extreme runaway global warming turning Earth into an uninhabitable Venus-like planet. Some scholars argue that much of the world would become uninhabitable under severe global warming, but even these scholars do not tend to argue that it would lead to complete human extinction, according to Kelsey Piper of Vox. All the IPCC scenarios, including the most pessimistic ones, predict temperatures compatible with human survival. The question of human extinction under "unlikely" outlier models is not generally addressed by the scientific literature. [34] Factcheck.org judges that climate change fails to pose an established "existential risk", stating: "Scientists agree climate change does pose a threat to humans and ecosystems, but they do not envision that climate change will obliterate all people from the planet." [35] [36]
Cyberattacks have the potential to destroy everything from personal data to electric grids. Christine Peterson, co-founder and past president of the Foresight Institute, believes a cyberattack on electric grids has the potential to be a catastrophic risk. She notes that little has been done to mitigate such risks, and that mitigation could take several decades of readjustment. [37]
An environmental or ecological disaster, such as world crop failure and collapse of ecosystem services, could be induced by the present trends of overpopulation, economic development, and non-sustainable agriculture. Most environmental scenarios involve one or more of the following: a Holocene extinction event, [38] scarcity of water that could lead to approximately half the Earth's population being without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, or massive water pollution episodes. One such threat, detected in the early 21st century, is colony collapse disorder, [39] a phenomenon that might foreshadow the imminent extinction [40] of the Western honeybee. As the bee plays a vital role in pollination, its extinction would severely disrupt the food chain.
A May 2020 analysis published in Scientific Reports found that if deforestation and resource consumption continue at current rates they could culminate in a "catastrophic collapse in human population" and possibly "an irreversible collapse of our civilization" within the next several decades. The study says humanity should pass from a civilization dominated by the economy to a "cultural society" that "privileges the interest of the ecosystem above the individual interest of its components, but eventually in accordance with the overall communal interest." The authors also note that "while violent events, such as global war or natural catastrophic events, are of immediate concern to everyone, a relatively slow consumption of the planetary resources may be not perceived as strongly as a mortal danger for the human civilization." [41]
Some scenarios envision that humans could use genetic engineering or technological modifications to split into normal humans and a new species – posthumans. [42] [43] [44] [45] [46] [47] [48] [49] Such a species could be fundamentally different from any previous life form on Earth, e.g. by merging humans with technological systems. [50] Such scenarios assess the risk that the "old" human species will be outcompeted and driven to extinction by the new, posthuman entity. [51]
Nick Bostrom suggested that in the pursuit of knowledge, humanity might inadvertently create a device that could destroy Earth and the Solar System. [52] Investigations in nuclear and high-energy physics could create unusual conditions with catastrophic consequences. All of these worries have so far proven unfounded.
For example, scientists worried that the first nuclear test might ignite the atmosphere. [53] [54] Early in the development of thermonuclear weapons there were some concerns that a fusion reaction could "ignite" the atmosphere in a chain reaction that would engulf Earth. Calculations showed the energy would dissipate far too quickly to sustain a reaction. [55]
Others worried that the Relativistic Heavy Ion Collider (RHIC) [56] or the Large Hadron Collider might start a chain-reaction global disaster involving black holes, strangelets, or false vacuum states. [57] It has been pointed out that much more energetic collisions currently take place in Earth's atmosphere. [58] [59] [60]
Though these particular concerns have been challenged, [61] [62] [63] [64] the general concern about new experiments remains.
Romanian American economist Nicholas Georgescu-Roegen, a progenitor in economics and the paradigm founder of ecological economics, has argued that the carrying capacity of Earth—that is, Earth's capacity to sustain human populations and consumption levels—is bound to decrease sometime in the future as Earth's finite stock of mineral resources is presently being extracted and put to use; and consequently, that the world economy as a whole is heading towards an inevitable future collapse, leading to the demise of human civilization itself. [65] : 303f Ecological economist and steady-state theorist Herman Daly, a student of Georgescu-Roegen, has propounded the same argument by asserting that "all we can do is to avoid wasting the limited capacity of creation to support present and future life [on Earth]." [66] : 370
Ever since Georgescu-Roegen and Daly published these views, various scholars in the field have been discussing the existential impossibility of allocating Earth's finite stock of mineral resources evenly among an unknown number of present and future generations. This number of generations is likely to remain unknown to us, as there is no way—or only little way—of knowing in advance if or when mankind will ultimately face extinction. In effect, any conceivable intertemporal allocation of the stock will inevitably end up with universal economic decline at some future point. [67]
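The arithmetic behind this argument can be sketched: a finite stock can sustain any constant consumption rate for only a finite time, so a schedule meant to last indefinitely must decline toward zero. A minimal illustration, with all quantities hypothetical (nothing here comes from Georgescu-Roegen's or Daly's own figures):

```python
# Hypothetical finite resource stock, in arbitrary units.
STOCK = 1000.0

def periods_until_exhaustion(stock: float, per_period: float) -> int:
    """Count how many periods a constant consumption rate lasts."""
    periods = 0
    while stock >= per_period:
        stock -= per_period
        periods += 1
    return periods

# Constant consumption exhausts the stock in finitely many periods.
print(periods_until_exhaustion(STOCK, 10.0))  # 100 periods, then nothing

# A geometrically declining schedule c_t = c0 * r**t can last forever,
# because its total c0 / (1 - r) is bounded -- but only because
# consumption itself shrinks toward zero, i.e. perpetual decline.
c0, r = 100.0, 0.9
total = c0 / (1 - r)  # mathematically 1000.0, the entire stock
print(total)
```

Either way the stock binds: consumption ends abruptly or declines without limit, which is the dilemma the intertemporal-allocation literature formalizes.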
Many nanoscale technologies are in development or currently in use. [68] The only one that appears to pose a significant global catastrophic risk is molecular manufacturing, a technique that would make it possible to build complex structures at atomic precision. [69] Molecular manufacturing requires significant advances in nanotechnology, but once achieved could produce highly advanced products at low costs and in large quantities in nanofactories of desktop proportions. [68] [69] When nanofactories gain the ability to produce other nanofactories, production may only be limited by relatively abundant factors such as input materials, energy and software. [68]
Molecular manufacturing could be used to cheaply produce, among many other products, highly advanced, durable weapons. [68] Being equipped with compact computers and motors these could be increasingly autonomous and have a large range of capabilities. [68]
Chris Phoenix and Mike Treder classify catastrophic risks posed by nanotechnology into three categories.
Several researchers say the bulk of risk from nanotechnology comes from the potential to lead to war, arms races, and destructive global government. [22] [68] [70] Several reasons have been suggested why the availability of nanotech weaponry may, with significant likelihood, lead to unstable arms races, compared to, for example, nuclear arms races.
Since self-regulation by all state and non-state actors seems hard to achieve, [72] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation. [68] [73] International infrastructure could be expanded, giving more sovereignty to the international level; this could help coordinate efforts for arms control. Institutions dedicated specifically to nanotechnology (perhaps analogous to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed. [73] States could also jointly pursue differential technological progress on defensive technologies, a policy that players should usually favour. [68] The Center for Responsible Nanotechnology also suggests some technical restrictions. [74] Improved transparency regarding technological capabilities may be another important facilitator for arms control.
Gray goo is another catastrophic scenario, which was proposed by Eric Drexler in his 1986 book Engines of Creation [75] and has been a theme in mainstream media and fiction. [76] [77] This scenario involves tiny self-replicating robots that consume the entire biosphere (ecophagy) using it as a source of energy and building blocks. Nowadays, however, nanotech experts—including Drexler—discredit the scenario. According to Phoenix, a "so-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident". [78]
Some fear a hypothetical World War III could cause the annihilation of humankind. [79] [80] Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating large numbers of nuclear weapons would have immediate, short-term, and long-term effects on the climate, potentially causing a period of cold weather known as a "nuclear winter", [81] with reduced sunlight and photosynthesis [82] that could generate significant upheaval in advanced civilizations. [83] However, while popular perception sometimes takes nuclear war to be "the end of the world", experts assign a low probability to human extinction from nuclear war. [84] [85] In 1982, Brian Martin estimated that a US–Soviet nuclear exchange might kill 400–450 million people directly, mostly in the United States, Europe, and Russia, and perhaps several hundred million more through follow-on consequences in those same areas. [84] In 2008, a survey by the Future of Humanity Institute estimated a 4% probability of extinction from warfare by 2100, with a 1% chance of extinction from nuclear warfare. [19]
The scenarios that have been explored most frequently are nuclear warfare and doomsday devices. Mistakenly launching a nuclear attack in response to a false alarm is one possible scenario; this nearly happened during the 1983 Soviet nuclear false alarm incident. Although the probability of a nuclear war per year is slim, Professor Martin Hellman has described it as inevitable in the long run; unless the probability approaches zero, inevitably there will come a day when civilization's luck runs out. [86] During the Cuban Missile Crisis, U.S. president John F. Kennedy estimated the odds of nuclear war at "somewhere between one out of three and even". [87] The United States and Russia have a combined arsenal of 14,700 nuclear weapons, [88] and there is an estimated total of 15,700 nuclear weapons in existence worldwide. [88]
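Hellman's point that even a small per-year probability compounds toward near-certainty can be made concrete with a short calculation. The 1% annual probability below is purely illustrative, not an estimate drawn from any of the sources above:

```python
def cumulative_risk(p_annual: float, years: int) -> float:
    """Probability of at least one event in `years` years, assuming an
    independent, constant annual probability `p_annual` (an illustrative
    simplification of Hellman's argument)."""
    return 1.0 - (1.0 - p_annual) ** years

# With an illustrative 1% annual probability, the cumulative risk grows
# from modest over decades to near-certainty over centuries.
for n in (10, 50, 100, 500):
    print(n, cumulative_risk(0.01, n))
```

The calculation shows why only driving the annual probability toward zero changes the long-run conclusion: for any fixed p greater than zero, the cumulative probability approaches 1 as the horizon lengthens.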
The Global Footprint Network estimates that current activity uses resources twice as fast as they can be naturally replenished, and that growing human population and increased consumption pose the risk of resource depletion and a concomitant population crash. [89] Evidence suggests birth rates may be rising in the 21st century in the developed world. [26] Projections vary; researcher Hans Rosling has projected population growth to start to plateau around 11 billion, and then to slowly grow or possibly even shrink thereafter. [90] A 2014 study published in Science asserts that the human population will grow to around 11 billion by 2100 and that growth will continue into the next century. [91]
The 20th century saw a rapid increase in human population due to medical developments and massive increases in agricultural productivity [92] such as the Green Revolution. [93] Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. The Green Revolution helped food production keep pace with worldwide population growth, or actually enabled population growth. The energy for the Green Revolution was provided by fossil fuels in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon-fueled irrigation. [94] In their 1994 study Food, Land, Population and the U.S. Economy, David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place the maximum U.S. population for a sustainable economy at 200 million; to achieve a sustainable economy and avert disaster, the study says, the United States must reduce its population by at least one-third, and world population would have to be reduced by two-thirds. [95]
The authors of this study believe the agricultural crisis described will begin to affect the world after 2020 and will become critical after 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global level such as never experienced before. [96] [97]
Since supplies of petroleum and natural gas are essential to modern agriculture techniques, a fall in global oil supplies (see peak oil for global concerns) could cause spiking food prices and unprecedented famine in the coming decades. [98] [99]
Wheat is humanity's third-most-produced cereal. Extant fungal infections such as Ug99 [100] (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible and the infection spreads on the wind. Should the world's large grain-producing areas become infected, the ensuing crisis in wheat availability would lead to price spikes and shortages in other food products. [101]
Human activity has triggered an extinction event often referred to as the sixth "mass extinction", [102] [103] [104] [105] which scientists consider a major threat to the continued existence of human civilization. [106] [107] The 2019 Global Assessment Report on Biodiversity and Ecosystem Services , published by the United Nations' Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, asserts that roughly one million species of plants and animals face extinction from human impacts such as expanding land use for industrial agriculture and livestock rearing, along with overfishing. [108] [109] [110] A 1997 assessment states that over a third of Earth's land has been modified by humans, that atmospheric carbon dioxide has increased around 30 percent, that humans are the dominant source of nitrogen fixation, that humans control most of the Earth's accessible surface fresh water, and that species extinction rates may be over a hundred times faster than normal. [111] Ecological destruction which impacts food production could produce a human population crash.
Of all species that have ever lived, 99% have gone extinct. [112] Earth has experienced numerous mass extinction events, in which up to 96% of all species present at the time were eliminated. [112] A notable example is the K-T extinction event, which killed the dinosaurs. The types of threats posed by nature have been argued to be relatively constant, though this has been disputed. [113] A number of other astronomical threats have also been identified. [114] [115] [116]
An impact event involving a near-Earth object (NEO) could result in localized or widespread destruction, including widespread extinction and possibly human extinction. [117] [118] [119]
Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, was about ten kilometers (six miles) in diameter and is theorized to have caused the extinction of non-avian dinosaurs at the end of the Cretaceous. No sufficiently large asteroid currently exists in an Earth-crossing orbit; however, a comet of sufficient size to cause human extinction could impact the Earth, though the annual probability may be less than 10⁻⁸. [120] Geoscientist Brian Toon estimates that while a few people, such as "some fishermen in Costa Rica", could plausibly survive a ten-kilometer (six-mile) meteorite, a hundred-kilometer (sixty-mile) meteorite would be large enough to "incinerate everybody". [121] Asteroids around 1 km in diameter have impacted the Earth on average once every 500,000 years; these are probably too small to pose an extinction risk, but might kill billions of people. [120] [122] Larger asteroids are less common. Small near-Earth asteroids are regularly observed and can impact anywhere on the Earth, injuring local populations. [123] As of 2013, Spaceguard estimates it has identified 95% of all NEOs over 1 km in size. [124] None of the large "dinosaur-killer" asteroids known to Spaceguard pose a near-term threat of collision with Earth. [125]
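The quoted impact rates translate into rough per-year and per-window probabilities. Treating impacts as a Poisson process is a standard simplification for rare, independent events; the only input below is the one-per-500,000-years figure cited above:

```python
import math

def prob_at_least_one(mean_interval_years: float, window_years: float) -> float:
    """Probability of at least one impact in a time window, modeling
    impacts as a Poisson process with the given mean interval."""
    rate = 1.0 / mean_interval_years  # expected impacts per year
    return 1.0 - math.exp(-rate * window_years)

# ~1 km asteroids: on average one impact per 500,000 years.
print(prob_at_least_one(500_000, 1))    # annual probability, ~2e-6
print(prob_at_least_one(500_000, 100))  # probability over a century, ~2e-4
```

At these small rates the result is nearly identical to the naive ratio window/interval, which is why such figures are often quoted directly as annual probabilities.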
In April 2018, the B612 Foundation reported "It's 100 per cent certain we'll be hit [by a devastating asteroid], but we're not 100 per cent sure when." [126] Also in 2018, physicist Stephen Hawking, in his final book Brief Answers to the Big Questions, considered an asteroid collision to be the biggest threat to the planet. [127] [128] In June 2018, the US National Science and Technology Council warned that America is unprepared for an asteroid impact event, and has developed and released the "National Near-Earth Object Preparedness Strategy Action Plan" to better prepare. [129] [130] [131] [132] [133] According to expert testimony in the United States Congress in 2013, NASA would require at least five years of preparation before a mission to intercept an asteroid could be launched. [134]
In 2024, a NASA planetary defense exercise considered a hypothetical asteroid, estimated at 100–320 meters wide, with a 72% chance of striking Earth on July 12, 2038. Participants assessed the risk of the impact killing at least 1,000 people, and the extent to which mitigation measures could protect the affected population by that date. [135]
In April 2008, it was announced that two simulations of long-term planetary movement, one at the Paris Observatory and the other at the University of California, Santa Cruz, indicate a 1% chance that Mercury's orbit could be made unstable by Jupiter's gravitational pull sometime during the lifespan of the Sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the Solar System altogether).
Collision with or a near miss by a large object from outside the Solar System could also be catastrophic to life on Earth. Interstellar objects, including asteroids, comets, and rogue planets, are difficult to detect with current technology until they enter the Solar System, and could potentially do so at high speed.
If Mercury or a rogue planet of similar size were to collide with Earth, all life on Earth could be obliterated entirely: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, whereas Mercury is 4,879 km in diameter. [136] The destabilization of Mercury's orbit is unlikely in the foreseeable future. [137]
A close pass by a large object could cause massive tidal forces, triggering anything from minor earthquakes to liquefaction of the Earth's crust to the Earth being torn apart and becoming a disrupted planet.
Stars and black holes are easier to detect from a longer distance, but are much more difficult to deflect. Such an object passing through the Solar System could destroy the Earth or the Sun by consuming it directly. Astronomers expect the Milky Way Galaxy to collide with the Andromeda Galaxy in about four billion years, but due to the large amount of empty space between stars, few direct stellar collisions are expected. [138]
The passage of another star system into or close to the outer reaches of the Solar System could trigger a swarm of asteroid impacts as the orbits of objects in the Oort Cloud are disturbed, or as objects orbiting the two stars collide. It also increases the risk of catastrophic irradiation of the Earth. Astronomers have identified fourteen stars with a 90% chance of coming within 3.26 light-years of the Sun in the next few million years, and four within 1.6 light-years, including HIP 85605 and Gliese 710. [139] [140] Observational data on nearby stars has so far been too incomplete for a full catalog of near misses, but more data is being collected by the Gaia spacecraft. [141]
Strangelets, if they exist, might naturally be produced by strange stars, and in the case of a collision, might escape and hit the Earth. Likewise, a false vacuum collapse could be triggered elsewhere in the universe.
Another interstellar threat is a gamma-ray burst, typically produced by a supernova when a star collapses inward on itself and then "bounces" outward in a massive explosion. Under certain circumstances, these events are thought to produce massive bursts of gamma radiation emanating outward from the axis of rotation of the star. If such an event were to occur oriented towards the Earth, the massive amounts of gamma radiation could significantly affect the Earth's atmosphere and pose an existential threat to all life. Such a gamma-ray burst may have been the cause of the Ordovician–Silurian extinction events. This scenario is unlikely in the foreseeable future. [137] Astroengineering projects proposed to mitigate the risk of gamma-ray bursts include shielding the Earth with ionised smartdust and star lifting of nearby high mass stars likely to explode in a supernova. [142] A gamma-ray burst would be able to vaporize anything in its beams out to around 200 light-years. [143] [144]
A powerful solar flare, solar superstorm, or solar micronova, involving a drastic and unusual change in the Sun's power output, could have severe consequences for life on Earth. [145] [146]
The Earth will naturally become uninhabitable due to the Sun's stellar evolution, within about a billion years. [148] In around 1 billion years from now, the Sun's brightness may increase as a result of a shortage of hydrogen, and the heating of its outer layers may cause the Earth's oceans to evaporate, leaving only minor forms of life. [149] Well before this time, the level of carbon dioxide in the atmosphere will be too low to support plant life, destroying the foundation of the food chains. [150] See Future of the Earth.
About 7–8 billion years from now, if and after the Sun has become a red giant, the Earth will probably be engulfed by an expanding Sun and destroyed. [151] [152] [153] [154]
The ultimate fate of the universe is uncertain, but is likely to eventually become uninhabitable, either suddenly or gradually. If it does not collapse into the Big Crunch, over very long time scales the heat death of the universe may render life impossible. [155] [156] The expansion of spacetime could cause the destruction of all matter in a Big Rip scenario.
If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that is known without forewarning. Such an occurrence is called vacuum decay, [157] [158] or the "Big Slurp".
Intelligent extraterrestrial life, if it exists, could invade Earth, either to exterminate and supplant human life, enslave it under a colonial system, exploit the planet's resources, or destroy it altogether. [159]
Although the existence of sentient alien life has never been conclusively proven, scientists such as Carl Sagan have posited it to be very likely. Scientists consider such a scenario technically possible, but unlikely. [160]
An article in The New York Times Magazine discussed the possible threats for humanity of intentionally sending messages aimed at extraterrestrial life into the cosmos in the context of the SETI efforts. Several public figures such as Stephen Hawking and Elon Musk have argued against sending such messages, on the grounds that extraterrestrial civilizations with technology are probably far more advanced than, and could therefore pose an existential threat to, humanity. [161]
Invasion by microscopic life is also a possibility. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991. [162]
A pandemic [163] could involve one or more viruses, prions, or antibiotic-resistant bacteria. Epidemic diseases that have killed millions of people include smallpox, bubonic plague, influenza, HIV/AIDS, COVID-19, cocoliztli, typhus, and cholera. Endemic tuberculosis and malaria kill over a million people each year. The sudden introduction of various European viruses decimated indigenous American populations. A deadly pandemic restricted to humans alone would be self-limiting, as its mortality would reduce the density of its target population. A pathogen with a broad host range in multiple species, however, could eventually reach even isolated human populations. [164] U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity", if left unchecked, is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction. [165]
There are numerous historical examples of pandemics [166] that have had a devastating effect on a large number of people. The present, unprecedented scale and speed of human movement make it more difficult than ever to contain an epidemic through local quarantines, and other sources of uncertainty and the evolving nature of the risk mean natural pandemics may pose a realistic threat to human civilization. [113]
There are several classes of argument about the likelihood of pandemics. One stems from history, where the limited size of historical pandemics is evidence that larger pandemics are unlikely. This argument has been disputed on grounds including the changing risk due to changing population and behavioral patterns among humans, the limited historical record, and the existence of an anthropic bias. [113]
Another argument is based on an evolutionary model that predicts that naturally evolving pathogens will ultimately develop an upper limit to their virulence. [167] This is because pathogens with high enough virulence quickly kill their hosts, reducing their chances of spreading the infection to new hosts or carriers. [168] This model has limits, however, because the fitness advantage of limited virulence is primarily a function of a limited number of hosts. A pathogen with high virulence, a high transmission rate, and a long incubation time may already have caused a catastrophic pandemic before its virulence is ultimately limited through natural selection. Additionally, a pathogen that infects humans only as a secondary host and primarily infects another species (a zoonosis) faces no constraints on its virulence in people, since the accidental secondary infections do not affect its evolution. [169] Lastly, in models where virulence level and rate of transmission are related, high levels of virulence can evolve. [170] Virulence is instead limited by the existence of complex populations of hosts with different susceptibilities to infection, or by some hosts being geographically isolated. [167] The size of the host population and competition between different strains of pathogens can also alter virulence. [171]
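The trade-off described above can be sketched numerically. The following is a minimal illustrative model, not drawn from the cited sources: transmission is assumed to rise with virulence but with diminishing returns, while more virulent strains lose their hosts faster, so the basic reproduction number peaks at an intermediate virulence. All parameter values and the square-root transmission function are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical trade-off: transmission beta(a) = b * sqrt(a) rises with
# virulence a (host death rate) but with diminishing returns, while the
# infectious period shrinks as 1 / (a + g), g being the recovery rate.
# Basic reproduction number: R0(a) = beta(a) / (a + g).
b, g = 2.0, 0.5  # illustrative transmission coefficient and recovery rate

a = np.linspace(0.01, 5.0, 10_000)   # candidate virulence levels
r0 = b * np.sqrt(a) / (a + g)

# R0 is maximized at an intermediate virulence; for this particular
# trade-off the analytic optimum is a* = g.
a_opt = a[np.argmax(r0)]
print(f"virulence maximizing R0: {a_opt:.2f} (analytic optimum a* = g = {g})")
```

Under this toy model, selection favors intermediate rather than zero virulence, which matches the qualitative point in the text: natural selection limits virulence, but does not eliminate it.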
Neither of these arguments applies to bioengineered pathogens, which pose entirely different pandemic risks. Experts have concluded that "Developments in science and technology could significantly ease the development and use of high consequence biological weapons", and these "highly virulent and highly transmissible [bio-engineered pathogens] represent new potential pandemic threats". [172]
Climate change refers to a lasting change in the Earth's climate. The climate has ranged from ice ages to warmer periods when palm trees grew in Antarctica. It has been hypothesized that there was also a period called "snowball Earth", when all the oceans were covered in a layer of ice. These global climatic changes occurred slowly, near the end of the last major ice age, when the climate became more stable. However, abrupt climate change on the decade time scale has occurred regionally. A natural variation into a new climate regime (colder or hotter) could pose a threat to civilization. [173] [174]
In the history of the Earth, many ice ages are known to have occurred. An ice age would have a serious impact on civilization because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. Currently, the world is in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved later than this. Scientists do not predict that a natural ice age will occur anytime soon.[ citation needed ] The amount of heat-trapping gases emitted into Earth's oceans and atmosphere will prevent the next ice age, which would otherwise begin in around 50,000 years, and likely further glacial cycles after that. [175] [176]
On a long time scale, natural shifts such as Milankovitch cycles (hypothesized quaternary climatic oscillations) could create unknown climate variability and change. [177]
A geological event such as massive flood basalt volcanism or the eruption of a supervolcano [178] could lead to a so-called volcanic winter, similar to a nuclear winter. Human extinction is a possibility. [179] One such event, the Toba eruption, [180] occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory, [181] the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years. [182] A massive volcanic eruption would eject extraordinary volumes of volcanic dust and toxic and greenhouse gases into the atmosphere, with serious effects on the global climate: either extreme global cooling (a volcanic winter if short-term, an ice age if long-term) or global warming (if greenhouse gases were to prevail).
When the supervolcano at Yellowstone last erupted 640,000 years ago, the thinnest layers of the ash ejected from the caldera spread over most of the United States west of the Mississippi River and part of northeastern Mexico. The magma covered much of what is now Yellowstone National Park and extended beyond, covering much of the ground from the Yellowstone River in the east to Idaho Falls in the west, with some of the flows extending north beyond Mammoth Springs. [183]
According to a recent study, if the Yellowstone caldera erupted again as a supervolcano, an ash layer one to three millimeters thick could be deposited as far away as New York, enough to "reduce traction on roads and runways, short out electrical transformers and cause respiratory problems". There would be centimeters of thickness over much of the U.S. Midwest, enough to disrupt crops and livestock, especially if it happened at a critical time in the growing season. The worst-affected city would likely be Billings, Montana, population 109,000, which the model predicted would be covered with ash estimated at 1.03 to 1.8 meters thick. [184]
The main long-term effect is global climate change, which would reduce temperatures globally by about 5–15 °C for a decade, together with the direct effects of ash deposits on crops. A large supervolcano like Toba would deposit one or two meters of ash over an area of several million square kilometers (1,000 cubic kilometers is equivalent to a one-meter thickness of ash spread over a million square kilometers). If that happened in a densely populated agricultural area, such as India, it could destroy one or two seasons of crops for two billion people. [185]
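The parenthetical equivalence above is a straightforward unit conversion, which can be verified directly:

```python
# Sanity check of the claim: 1,000 km^3 of ash spread evenly over
# 1,000,000 km^2 gives a layer one meter thick.
volume_km3 = 1_000
area_km2 = 1_000_000

volume_m3 = volume_km3 * 1e9   # 1 km^3 = 1e9 m^3
area_m2 = area_km2 * 1e6       # 1 km^2 = 1e6 m^2

thickness_m = volume_m3 / area_m2
print(thickness_m)  # 1.0
```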
However, Yellowstone shows no signs of a supereruption at present, and it is not certain that a future supereruption will occur. [186] [187]
Research published in 2011 found evidence that massive volcanic eruptions caused large-scale coal combustion, supporting models of significant greenhouse gas generation. Researchers have suggested that massive volcanic eruptions through coal beds in Siberia would generate significant greenhouse gases and cause a runaway greenhouse effect. [188] Massive eruptions can also throw enough pyroclastic debris and other material into the atmosphere to partially block out the sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred kilometers from the eruption, and perhaps billions of deaths worldwide due to the failure of the monsoons, resulting in major crop failures and starvation on a profound scale. [189]
A much more speculative concept is the verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.
A supervolcano is a volcano that has had an eruption with a volcanic explosivity index (VEI) of 8, the highest value on the index. This means the volume of deposits for such an eruption is greater than 1,000 cubic kilometers.
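The VEI is roughly logarithmic in ejecta volume: above VEI 4 (0.1 km³ of deposits), each step corresponds to a tenfold increase in volume, with VEI 8 at 1,000 km³ or more. A simplified classifier based only on this volume threshold can be sketched as follows; note that real VEI assignments also weigh plume height and eruption style, so this is an approximation.

```python
import math

def vei_from_ejecta(volume_km3: float) -> int:
    """Approximate Volcanic Explosivity Index from bulk ejecta volume.

    Simplified: above the VEI 1 threshold (0.0001 km^3), each index step
    corresponds to a tenfold increase in volume, capped at VEI 8
    (>= 1,000 km^3). Plume height and eruption style are ignored.
    """
    if volume_km3 < 0.0001:
        return 0
    return min(8, math.floor(math.log10(volume_km3)) + 5)

print(vei_from_ejecta(2_800))  # Toba-scale volume -> 8
print(vei_from_ejecta(100))    # threshold of VEI 7 -> 7
```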
Nick Bostrom is a philosopher known for his work on existential risk, the anthropic principle, human enhancement ethics, whole brain emulation, superintelligence risks, and the reversal test. He was the founding director of the now dissolved Future of Humanity Institute at the University of Oxford and is now Principal Researcher at the Macrostrategy Research Initiative.
Space and survival is the idea that the long-term survival of the human species and technological civilization requires the building of a spacefaring civilization that utilizes the resources of outer space, and that not doing this might lead to human extinction. A related observation is that the window of opportunity for doing this may be limited due to the decreasing amount of surplus resources that will be available over time as a result of an ever-growing population.
The Great Filter is the idea that, in the development of life from the earliest stages of abiogenesis to reaching the highest levels of development on the Kardashev scale, there is a barrier to development that makes detectable extraterrestrial life exceedingly rare. The Great Filter is one possible resolution of the Fermi paradox.
Human extinction is the hypothetical end of the human species, either by population decline due to extraneous natural causes, such as an asteroid impact or large-scale volcanism, or via anthropogenic destruction (self-extinction), for example by sub-replacement fertility.
The Future of Humanity Institute (FHI) was an interdisciplinary research centre at the University of Oxford investigating big-picture questions about humanity and its prospects. It was founded in 2005 as part of the Faculty of Philosophy and the Oxford Martin School. Its director was philosopher Nick Bostrom, and its research staff included futurist Anders Sandberg and Giving What We Can founder Toby Ord.
A nuclear holocaust, also known as a nuclear apocalypse, nuclear annihilation, nuclear armageddon, or atomic holocaust, is a theoretical scenario where the mass detonation of nuclear weapons causes widespread destruction and radioactive fallout. Such a scenario envisages large parts of the Earth becoming uninhabitable due to the effects of nuclear warfare, potentially causing the collapse of civilization, the extinction of humanity, and/or the termination of most biological life on Earth.
A global catastrophic risk or a doomsday scenario is a hypothetical event that could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's existence or potential is known as an "existential risk".
The biological and geological future of Earth can be extrapolated based on the estimated effects of several long-term influences. These include the chemistry at Earth's surface, the cooling rate of the planet's interior, the gravitational interactions with other objects in the Solar System, and a steady increase in the Sun's luminosity. An uncertain factor is the pervasive influence of technology introduced by humans, such as climate engineering, which could cause significant changes to the planet. For example, the current Holocene extinction is being caused by technology, and the effects may last for up to five million years. In turn, technology may result in the extinction of humanity, leaving the planet to gradually return to a slower evolutionary pace resulting solely from long-term natural processes.
In futurology, a singleton is a hypothetical world order in which there is a single decision-making agency at the highest level, capable of exerting effective control over its domain, and permanently preventing both internal and external threats to its supremacy. The term was first defined by Nick Bostrom.
The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The co-founders of the centre are Huw Price, Martin Rees and Jaan Tallinn.
Superintelligence: Paths, Dangers, Strategies is a 2014 book by the philosopher Nick Bostrom. It explores how superintelligence could be created and what its features and motivations might be. It argues that superintelligence, if created, would be difficult to control, and that it could take over the world in order to accomplish its goals. The book also presents strategies to help make superintelligences whose goals benefit humanity. It was particularly influential for raising concerns about existential risk from artificial intelligence.
Feeding Everyone No Matter What: Managing Food Security After Global Catastrophe is a 2014 book by David Denkenberger and Joshua M. Pearce and published by Elsevier under their Academic Press.
Existential risk from artificial general intelligence refers to the idea that substantial progress in artificial general intelligence (AGI) could lead to human extinction or an irreversible global catastrophe.
Global Catastrophic Risks is a 2008 non-fiction book edited by philosopher Nick Bostrom and astronomer Milan M. Ćirković. The book is a collection of essays from 26 academics written about various global catastrophic and existential risks.
Biotechnology risk is a form of existential risk from biological sources, such as genetically engineered biological agents. The release of such high-consequence pathogens could be catastrophic on a global scale.
The Precipice: Existential Risk and the Future of Humanity is a 2020 non-fiction book by the Australian philosopher Toby Ord, a senior research fellow at the Future of Humanity Institute in Oxford. It argues that humanity faces unprecedented risks over the next few centuries and examines the moral significance of safeguarding humanity's future.
End Times: A Brief Guide to the End of the World is a 2019 non-fiction book by journalist Bryan Walsh. The book discusses various risks of human extinction, including asteroids, volcanoes, nuclear war, global warming, pathogens, biotech, AI, and extraterrestrial intelligence. The book includes interviews with astronomers, anthropologists, biologists, climatologists, geologists, and other scholars. The book advocates strongly for greater action.
Longtermism is the ethical view that positively influencing the long-term future is a key moral priority of our time. It is an important concept in effective altruism and a primary motivation for efforts that aim to reduce existential risks to humanity.
Existential risk studies (ERS) is a field of study focused on the definition and theorization of "existential risks", their ethical implications, and the related strategies of long-term survival. Existential risks are diversely defined by ERS theorists as global kinds of calamity that have the capacity to induce the extinction of intelligent life on Earth, such as humans, or at least a severe limitation of its potential. The field's development and expansion can be divided into waves according to its conceptual changes as well as its evolving relationship with related fields and theories, such as futures studies, disaster studies, AI safety, effective altruism, and longtermism.
The great bulk of existential risk in the foreseeable future is anthropogenic; that is, arising from human activity.
Moreover, we have unleashed a mass extinction event, the sixth in roughly 540 million years, wherein many current life forms could be annihilated or at least committed to extinction by the end of this century.
Humanity is causing a rapid loss of biodiversity and, with it, Earth's ability to support complex life. But the mainstream is having difficulty grasping the magnitude of this loss, despite the steady erosion of the fabric of human civilization.
The USGS puts it like this: 'If another large caldera-forming eruption were to occur at Yellowstone, its effects would be worldwide. Thick ash deposits would bury vast areas of the United States, and an injection of huge volumes of volcanic gases into the atmosphere could drastically affect the global climate. Fortunately, the Yellowstone volcanic system shows no signs that it is headed toward such an eruption. The probability of a large caldera-forming eruption within the next few thousand years is exceedingly low.'