On the Future

On the Future: Prospects for Humanity
Author: Martin Rees
Country: United Kingdom
Language: English
Genre: Science
Published: October 16, 2018 (Princeton University Press)
Media type: Print (hardcover)
Pages: 272
ISBN: 9780691180441

On the Future: Prospects for Humanity is a 2018 nonfiction book by the British cosmologist and Astronomer Royal Martin Rees.[1] It is a short, "big concept" book on the future of humanity, covering potential dangers such as nuclear warfare, climate change, biotechnology, and artificial intelligence, as well as the possibility of human extinction.[2]

Ideas

As in his 2003 book Our Final Century, Rees warns that human civilization faces grave existential risks.[3]

Rees considers a scenario 20 years from now in which carbon dioxide levels have continued to rise and climate models have improved. If the improved models predict imminent catastrophe, political pressure might emerge to deploy poorly understood geoengineering techniques immediately. To mitigate this "nightmare" scenario, Rees advocates exploring these techniques now, so as to better understand their limitations, risks, and side effects.[4]

Other risks include nuclear war, an asteroid strike, rogue biotechnology, and artificial intelligence. Rees advocates that nations empower supranational institutions to collaborate better against such risks, a difficult task given populist trends against globalism.[5] Some scholars, such as Stephen Hawking, have advocated space colonization as a way to mitigate existential risk; Rees breaks with Hawking on this point, writing that it is "a dangerous delusion to think that space offers an escape from Earth's problems".[6]

Reception

Vanity Fair assessed the book as "uncontroversial... written in a way that's accessible to the general reader, and sprinkled with moments of infectious awe."[2] Scientific American recommended it as a "spirited assessment of technology's role in shaping our future".[7] Engineering & Technology called it "short, but persuasive".[8] A New Statesman review stated: "Remarkably, what seems like an extended state-of-the-planet essay does not feel as depressing as it ought: Rees dispenses his apocalyptic overview of the coming decades like cocktail party wisdom".[5] The Financial Times called it "crisply written";[3] Publishers Weekly called it "far-ranging but easily understood".[9] Kirkus Reviews called it "A book to be read by anyone on Earth who cares about its future."[10]

On November 23, 2018, the Israeli newspaper Yediot Aharonot reported that, during a tense parliamentary debate in the Knesset, Prime Minister Netanyahu was observed engrossed in a copy of On the Future, marking passages and writing notes, only occasionally raising his head to follow the debate in the plenum.[11]

Related Research Articles

<span class="mw-page-title-main">Transhumanism</span> Philosophical movement

Transhumanism is a philosophical and intellectual movement which advocates the enhancement of the human condition by developing and making widely available sophisticated technologies that can greatly enhance longevity and cognition.

<span class="mw-page-title-main">Max Tegmark</span> Swedish-American cosmologist

Max Erik Tegmark is a Swedish-American physicist, cosmologist and machine learning researcher. He is a professor at the Massachusetts Institute of Technology and the president of the Future of Life Institute. He is also a scientific director at the Foundational Questions Institute and a supporter of the effective altruism movement.

<span class="mw-page-title-main">Nick Bostrom</span> Swedish philosopher and author

Nick Bostrom is a Swedish-born philosopher at the University of Oxford known for his work on existential risk, the anthropic principle, human enhancement ethics, superintelligence risks, and the reversal test. In 2011, he founded the Oxford Martin Program on the Impacts of Future Technology, and is the founding director of the Future of Humanity Institute at Oxford University. In 2009 and 2015, he was included in Foreign Policy's Top 100 Global Thinkers list.

<span class="mw-page-title-main">Martin Rees</span> British cosmologist and astrophysicist

Martin John Rees, Baron Rees of Ludlow is a British cosmologist and astrophysicist. He is the fifteenth Astronomer Royal, appointed in 1995, and was Master of Trinity College, Cambridge, from 2004 to 2012 and President of the Royal Society from 2005 to 2010.

<span class="mw-page-title-main">AI takeover</span> Hypothetical artificial intelligence scenario

An AI takeover is a hypothetical scenario in which an artificial intelligence (AI) becomes the dominant form of intelligence on Earth, as computer programs or robots effectively take control of the planet away from the human species. Possible scenarios include replacement of the entire human workforce, takeover by a superintelligent AI, and the popular notion of a robot uprising. Some public figures, such as Stephen Hawking and Elon Musk, have advocated research into precautionary measures to ensure future superintelligent machines remain under human control.

<span class="mw-page-title-main">Space and survival</span> Idea that long-term human presence requires to be spacefaring

Space and survival is the idea that the long-term survival of the human species and technological civilization requires the building of a spacefaring civilization that utilizes the resources of outer space, and that not doing this will lead to human extinction. A related observation is that the window of opportunity for doing this may be limited due to the decreasing amount of surplus resources that will be available over time as a result of an ever-growing population.

<span class="mw-page-title-main">Human extinction</span> Hypothetical end of the human species

Human extinction is the hypothetical end of the human species, due either to natural causes, such as population decline from sub-replacement fertility, an asteroid impact, or large-scale volcanism, or to anthropogenic (human) causes.

Our Final Hour (2003 book by Martin Rees)

Our Final Hour is a 2003 book by the British Astronomer Royal Sir Martin Rees. The full title of the book is Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future In This Century—On Earth and Beyond. It was published in the United Kingdom under the title Our Final Century: Will the Human Race Survive the Twenty-first Century?.

<span class="mw-page-title-main">Future of Humanity Institute</span> Oxford interdisciplinary research centre

The Future of Humanity Institute (FHI) is an interdisciplinary research centre at the University of Oxford investigating big-picture questions about humanity and its prospects. It was founded in 2005 as part of the Faculty of Philosophy and the Oxford Martin School. Its director is philosopher Nick Bostrom, and its research staff and associates include futurist Anders Sandberg, engineer K. Eric Drexler, economist Robin Hanson, and Giving What We Can founder Toby Ord.

<span class="mw-page-title-main">Global catastrophic risk</span> Potentially harmful worldwide events

A global catastrophic risk or a doomsday scenario is a hypothetical future event that could damage human and animal well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an "existential risk."

The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The co-founders of the centre are Huw Price, Martin Rees and Jaan Tallinn.

<span class="mw-page-title-main">Starmus Festival</span> Astronomy, space exploration, music, art, and allied sciences international festival

The Starmus International Festival is an international gathering focused on celebrating astronomy, space exploration, music, art, and other sciences such as biology and chemistry. It was founded by Garik Israelian, an astronomer at the Institute for Astrophysics in Tenerife, Canary Islands, Spain.

<span class="mw-page-title-main">Future of Life Institute</span> International nonprofit research institute

The Future of Life Institute (FLI) is a nonprofit organization that works to reduce global catastrophic and existential risks facing humanity, particularly existential risk from advanced artificial intelligence (AI). The Institute's work is made up of three main strands: grantmaking for risk reduction, educational outreach, and advocacy within the United Nations, US government and European Union institutions. Its founders include MIT cosmologist Max Tegmark and Skype co-founder Jaan Tallinn, and its advisors include entrepreneur Elon Musk.

Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could result in human extinction or some other unrecoverable global catastrophe. It is argued that the human species currently dominates other species because the human brain has some distinctive capabilities that other animals lack. If AI surpasses humanity in general intelligence and becomes "superintelligent", then it could become difficult or impossible for humans to control. Just as the fate of the mountain gorilla depends on human goodwill, so might the fate of humanity depend on the actions of a future machine superintelligence.

Brief Answers to the Big Questions (2018 popular science book by Stephen Hawking)

Brief Answers to the Big Questions is a popular science book written by physicist Stephen Hawking, and published by Hodder & Stoughton (hardcover) and Bantam Books (paperback) on 16 October 2018. The book examines some of the universe's greatest mysteries, and promotes the view that science is very important in helping to solve problems on planet Earth. The publisher describes the book as "a selection of [Hawking's] most profound, accessible, and timely reflections from his personal archive"; according to one reviewer, it is based on "half a million or so words" from his essays, lectures and keynote speeches.

The All-Party Parliamentary Group for Future Generations is an informal cross-party group within the Parliament of the United Kingdom. Its role is to raise the profile of long-term issues, protect the interests of future generations, and provide a forum for discussion about ways to reduce short-termism in politics and policy making.

The Precipice: Existential Risk and the Future of Humanity (2020 book about existential risks by Toby Ord)

The Precipice: Existential Risk and the Future of Humanity is a 2020 non-fiction book by the Australian philosopher Toby Ord, a senior research fellow at the Future of Humanity Institute in Oxford. It argues that humanity faces unprecedented risks over the next few centuries and examines the moral significance of safeguarding humanity's future.

<span class="mw-page-title-main">End Times (book)</span> 2019 book by Bryan Walsh

End Times: A Brief Guide to the End of the World is a 2019 non-fiction book by journalist Bryan Walsh. It discusses various risks of human extinction, including asteroids, volcanoes, nuclear war, global warming, pathogens, biotech, AI, and extraterrestrial intelligence, and includes interviews with astronomers, anthropologists, biologists, climatologists, geologists, and other scholars. The book advocates strongly for greater action.

<span class="mw-page-title-main">Longtermism</span> Philosophical view which prioritises the long-term future

Longtermism is an ethical stance which gives priority to improving the long-term future. It is an important concept in effective altruism and serves as a primary motivation for efforts that claim to reduce existential risks to humanity.

<span class="mw-page-title-main">Global catastrophe scenarios</span> Scenarios in which a global catastrophe creates harm

Scenarios in which a global catastrophic risk creates harm have been widely discussed. Some sources of catastrophic risk are anthropogenic, such as global warming, environmental degradation, engineered pandemics, and nuclear war. Others are non-anthropogenic or natural, such as meteor impacts or supervolcanoes. The impact of these scenarios can vary widely, depending on the cause and the severity of the event, ranging from temporary economic disruption to human extinction. Many societal collapses have already happened throughout human history.

References