| Author | Toby Ord |
| --- | --- |
| Country | United Kingdom |
| Language | English |
| Subject | Existential risk |
| Genre | Philosophy, popular science |
| Publisher | Bloomsbury Publishing[1]; Hachette Book Group[2] |
| Publication date | 5 March 2020 (UK); 24 March 2020 (US) |
| Media type | Print, e-book, audiobook |
| Pages | 480 |
| ISBN | 1526600218 |
| Website | www.theprecipice.com |
The Precipice: Existential Risk and the Future of Humanity is a 2020 non-fiction book by the Australian philosopher Toby Ord, a senior research fellow at the Future of Humanity Institute in Oxford. It argues that humanity faces unprecedented risks over the next few centuries and examines the moral significance of safeguarding humanity's future.
Ord argues that humanity is in a uniquely dangerous period in its development, which he calls the Precipice. Beginning with the first atomic bomb test in 1945, the Precipice is characterized by unprecedented destructive capability paired with inadequate wisdom and restraint. Ord predicts that the Precipice is likely to last no more than a few centuries, as humanity will either quickly develop the necessary self-control or succumb to the rapidly accumulating risk of catastrophe. Ord estimates that the Cuban Missile Crisis in 1962, which leaders at the time thought had a 10–50% chance of causing nuclear war, was the closest humanity has yet come to self-destruction in its 200,000-year history.
Ord uses the concepts of existential catastrophe and existential risk, citing their definitions by Nick Bostrom. Existential catastrophe refers to the realized destruction of humanity's long-term potential, whereas existential risk refers to the probability that a given hazard will lead to existential catastrophe. Human extinction is one mechanism of existential catastrophe, but others can be imagined, such as a permanent totalitarian dystopia. This concept of existential catastrophe is strictly defined as a permanent, irreversible loss of potential; for example, even a disaster that killed a majority of humans would not be an existential catastrophe under this definition, provided that the survivors eventually recovered and resumed scientific and technological progress. Ord examines the immense moral implications of existential catastrophe from a variety of perspectives: existential catastrophe would simultaneously betray all that past humans have built, bring great harm upon humans existing at the time, and cut off the possibility of a vast future flourishing among the stars.
Ord estimates a 1 in 6 total risk of existential catastrophe occurring in the next century. This includes a relatively negligible existential risk from natural catastrophes such as asteroid impacts but is overwhelmingly dominated by anthropogenic (human-caused) existential risk. Ord estimates the existential risk associated with unaligned artificial general intelligence to be 1 in 10 over the next century, higher than all other sources of existential risk combined. Other anthropogenic existential risks include nuclear war, engineered pandemics, and climate change.
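The way individual hazards combine into a total risk can be related with a short calculation. The sketch below assumes, purely for illustration, that the hazards are statistically independent (Ord's own totals involve judgment, not a simple formula); the 1-in-10 AI figure is Ord's estimate quoted above, while the probabilities attached to the other hazards here are hypothetical placeholders, not Ord's estimates:

```python
# Illustrative only: combining per-century existential risks under an
# independence assumption. Only the 1-in-10 AI figure comes from the text;
# the other probabilities are hypothetical placeholders.
from math import prod

risks = {
    "unaligned AI": 1 / 10,          # Ord's estimate, from the text
    "engineered pandemics": 1 / 30,  # hypothetical placeholder
    "nuclear war": 1 / 1000,         # hypothetical placeholder
}

# If the hazards were independent, the chance of avoiding all of them
# would be the product of the individual survival probabilities.
p_catastrophe = 1 - prod(1 - p for p in risks.values())
print(f"combined risk per century ≈ {p_catastrophe:.3f}")
```

Note that under independence the risks combine sub-additively (the total is slightly less than the sum of the parts), which is consistent with a single 1-in-10 hazard dominating a roughly 1-in-6 total.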
Ord states that humanity spends less than 0.001% of gross world product on targeted existential risk reduction interventions. He argues that motivation to fund such interventions is limited by insufficient global coordination, which could be improved via specialized global institutions. Moreover, interventions such as governance of dangerous emerging technologies may inherently require increased global coordination. Ord outlines a number of policy and research recommendations intended to reduce existential risk. He also explores several ways individuals can contribute to existential risk reduction, such as selecting high-impact careers, effective giving, and contributing to a public conversation on the issue.
A review in the Evening Standard called The Precipice "a startling and rigorous contribution".[3] In The Spectator, Tom Chivers called The Precipice "a powerful book, written with a philosopher's eye for counterarguments so that he can meet them in advance. And Ord's love for humanity and hope for its future is infectious, as is his horrified wonder at how close we have come to destroying it".[4]
Writing in The Sunday Times, journalist and author Bryan Appleyard expressed skepticism toward some of the moral philosophy in the book, stating "I doubt that it can redirect humanity away from its self-destructive ways", but ultimately praised the book, calling it "dense and often thrillingly written" and highlighting Ord's analysis of the science as "exemplary".[5] Reviewer Steven Carroll in The Sydney Morning Herald called it authoritative and accessible.[6]
A review in The New Yorker published in April 2020 during the coronavirus pandemic noted that the book seemed "made for the present moment" and said "readers may find the sections that argue for why humanity deserves saving, and why we're equipped to face the challenges, even more arresting than the array of potential cataclysms".[7]
Nick Bostrom is a Swedish philosopher at the University of Oxford known for his work on existential risk, the anthropic principle, human enhancement ethics, whole brain emulation, superintelligence risks, and the reversal test. He is the founding director of the Future of Humanity Institute at Oxford University.
Space and survival is the idea that the long-term survival of the human species and technological civilization requires the building of a spacefaring civilization that utilizes the resources of outer space, and that not doing this will lead to human extinction. A related observation is that the window of opportunity for doing this may be limited due to the decreasing amount of surplus resources that will be available over time as a result of an ever-growing population.
Human extinction is the hypothetical end of the human species, due either to natural causes, such as population decline from sub-replacement fertility, an asteroid impact, or large-scale volcanism, or to anthropogenic destruction (self-extinction).
Evidence-based policy is a concept in public policy that advocates for policy decisions to be grounded in, or influenced by, rigorously established objective evidence. This concept presents a stark contrast to policymaking predicated on ideology, 'common sense,' anecdotes, or personal intuitions. The approach mirrors the effective altruism movement's philosophy within governmental circles. The methodology employed in evidence-based policy often includes comprehensive research methods such as randomized controlled trials (RCT). Good data, analytical skills, and political support for the use of scientific information are typically seen as the crucial elements of an evidence-based approach.
Differential technological development is a strategy of technology governance aiming to decrease risks from emerging technologies by influencing the sequence in which they are developed. On this strategy, societies would strive to delay the development of harmful technologies and their applications, while accelerating the development of beneficial technologies, especially those that offer protection against the harmful ones.
The Future of Humanity Institute (FHI) is an interdisciplinary research centre at the University of Oxford investigating big-picture questions about humanity and its prospects. It was founded in 2005 as part of the Faculty of Philosophy and the Oxford Martin School. Its director is philosopher Nick Bostrom, and its research staff include futurist Anders Sandberg and Giving What We Can founder Toby Ord.
A global catastrophic risk or a doomsday scenario is a hypothetical future event that could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's existence or potential is known as an "existential risk."
Toby David Godfrey Ord is an Australian philosopher. In 2009 he founded Giving What We Can, an international society whose members pledge to donate at least 10% of their income to effective charities, and is a key figure in the effective altruism movement, which promotes using reason and evidence to help the lives of others as much as possible.
Scope neglect or scope insensitivity is a cognitive bias that occurs when the valuation of a problem does not scale in proportion to its size. Scope neglect is a specific form of extension neglect.
Effective altruism is a philosophical and social movement that advocates "using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis". Effective altruists may choose careers based on the amount of good that they expect the career to achieve or donate to charities based on the goal of maximising impact. The movement developed during the 2000s, and the name effective altruism was coined in 2011. Philosophers influential to the movement include Peter Singer, Toby Ord, and William MacAskill.
Jim Holt is an American journalist, popular-science author, and essayist. He has contributed to The New York Times, The New York Times Magazine, The New York Review of Books, The New Yorker, The American Scholar, and Slate. In 1997 he was editor of The New Leader, a political magazine. His book Why Does the World Exist? was a 2013 New York Times bestseller.
Superintelligence: Paths, Dangers, Strategies is a 2014 book by the philosopher Nick Bostrom. It explores how superintelligence could be created and what its features and motivations might be. It argues that superintelligence, if created, would be difficult to control, and that it could take over the world in order to accomplish its goals. The book also presents strategies to help make superintelligences whose goals benefit humanity. It was particularly influential for raising concerns about existential risk from artificial intelligence.
Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could result in human extinction or an irreversible global catastrophe.
Global Catastrophic Risks is a 2008 non-fiction book edited by philosopher Nick Bostrom and astronomer Milan M. Ćirković. The book is a collection of essays from 26 academics written about various global catastrophic and existential risks.
End Times: A Brief Guide to the End of the World is a 2019 non-fiction book by journalist Bryan Walsh. It discusses various risks of human extinction, including asteroids, volcanoes, nuclear war, global warming, pathogens, biotech, AI, and extraterrestrial intelligence, drawing on interviews with astronomers, anthropologists, biologists, climatologists, geologists, and other scholars. The book advocates strongly for greater action.
Longtermism is the ethical view that positively influencing the long-term future is a key moral priority of our time. It is an important concept in effective altruism and serves as a primary motivation for efforts that claim to reduce existential risks to humanity.
Scenarios in which a global catastrophic risk creates harm have been widely discussed. Some sources of catastrophic risk are anthropogenic, such as global warming, environmental degradation, and nuclear war. Others are non-anthropogenic or natural, such as meteor impacts or supervolcanoes. The impact of these scenarios can vary widely, depending on the cause and the severity of the event, ranging from temporary economic disruption to human extinction. Many societal collapses have already happened throughout human history.
What We Owe the Future is a 2022 book by the Scottish philosopher and ethicist William MacAskill, an associate professor in philosophy at the University of Oxford. It argues for effective altruism and the philosophy of longtermism, which MacAskill defines as "the idea that positively influencing the long-term future is a key moral priority of our time."