Toby Ord | |
---|---|
Born | Toby David Godfrey Ord, July 1979 (age 45), Melbourne, Australia |
Spouse | Bernadette Young |
Era | Contemporary philosophy |
School | Western philosophy |
Thesis | Beyond Action: Applying Consequentialism to Decision Making and Motivation (2009) |
Toby David Godfrey Ord (born July 1979) [1] is an Australian philosopher. In 2009 he founded Giving What We Can, an international society whose members pledge to donate at least 10% of their income to effective charities. He is a key figure in the effective altruism movement, which promotes using reason and evidence to help others as much as possible. [2]
He was a senior research fellow at Oxford University's Future of Humanity Institute, where his work focused on existential risk. [3] His book on the subject, The Precipice: Existential Risk and the Future of Humanity, was published in March 2020. [4]
Ord was born in Melbourne, Australia, in 1979. [5] He later attended the University of Melbourne, where he initially studied computer science. On completing his first degree, he switched to studying philosophy to pursue his interest in ethics, later stating: "At this stage I knew that I wanted to make a large positive difference in the world and it seemed that studying ethics would help." [6]
For his graduate studies, Ord moved to the University of Oxford, where he obtained a B.Phil. and a D.Phil. in philosophy. Having submitted his doctoral thesis, Beyond Action: Applying Consequentialism to Decision Making and Motivation, Ord was retained as a junior research fellow by Balliol College, Oxford. [7]
Ord held the position of research fellow at Oxford's Future of Humanity Institute from 2014 until 2019, [7] and senior research fellow from 2019 until the institute's shutdown [8] in 2024. Ord describes his focus as "the big picture questions facing humanity." [9] He is a trustee of the Centre for Effective Altruism [10] and of the non-profit organization 80,000 Hours, which researches careers with the largest positive social impact and provides career advice based on that research. [11]
Ord's work has been primarily in moral philosophy. In applied ethics, he has worked on bioethics, the demands of morality, and global priority setting. He has also made contributions in global health, as an advisor to the third edition of Disease Control Priorities Project. [12] In normative ethics, his research has focused on consequentialism and on moral uncertainty.
Ord's current main research interest is existential risk. His book on the topic, The Precipice: Existential Risk and the Future of Humanity, was published in March 2020. [4] The New Yorker characterizes Ord's research motivation as follows: [5]
> A concern for existential risk seemed, to Ord, to be the next logical expansion of a broadening moral circle. If we can learn to value the lives of people in other places and circumstances equally to our own, then we can do the same for people situated at a different moment in time. Those future people, whose quality of life and very existence will be intimately affected by our choices today, matter as much as we do.
Ord has written papers on the viability and potential of hypercomputation: models of computation that can produce outputs that are not Turing-computable, such as a machine that could solve the halting problem. [13]
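To see why solving the halting problem would require going beyond Turing machines, the classic diagonalization argument can be sketched in code. This is an illustrative sketch only, not drawn from Ord's papers; the `halts` oracle is hypothetical and deliberately left unimplemented, since no ordinary program can compute it.

```python
def halts(program, arg):
    """Hypothetical oracle: True iff program(arg) eventually halts.
    No Turing machine computes this; a machine that could would be a
    hypercomputer in the sense discussed above."""
    raise NotImplementedError("not Turing-computable")

def diagonal(program):
    """If `halts` existed, this program would contradict it:
    diagonal(diagonal) would halt exactly when the oracle says
    it does not, so no such oracle can exist as ordinary software."""
    if halts(program, program):
        while True:      # oracle says we halt, so loop forever
            pass
    return               # oracle says we loop, so halt immediately
```

The contradiction at `diagonal(diagonal)` is what rules out a Turing-computable `halts`, which is why hypercomputation is framed as a strictly stronger model of computation.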
At Oxford, Ord resolved to give a significant proportion of his income to the most cost-effective charities he could find. Following a number of enquiries from people interested in making a similar commitment, Ord decided to set up an organisation geared towards supporting like-minded donors. [14] In 2009, Ord launched Giving What We Can, an international society whose members have each pledged to donate at least 10% of their income to the most cost-effective charities. The organisation is aligned with, and part of, the effective altruism movement. Giving What We Can not only encourages people to give more of their money to charity but also stresses the importance of giving to the most cost-effective charities, [15] arguing that "you can often do 100x more good with your dollar by donating to the best charities." [16] [17] By July 2024, Giving What We Can had grown to over 9,000 members, who had collectively donated $253 million to effective charities. [18]
Ord himself decided initially to cap his income at £20,000 per year, and to give away everything he earned above that to well-researched charities. A year later, he revised this figure down to £18,000. [19] This threshold rises annually with inflation. [20] As of December 2019, he had donated £106,000, or 28 percent of his income. [21] Over the course of his career, he expects his donations to total around £1 million. [22]
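The giving rule described above (donate everything earned above an inflation-adjusted cap) can be sketched as a small calculation. The inflation rates and income figure below are assumptions for illustration; only the £18,000 base cap comes from the text.

```python
BASE_CAP = 18_000  # Ord's revised annual cap in GBP, per the text

def adjusted_cap(base_cap, inflation_rates):
    """Raise the cap by each year's inflation rate in sequence
    (rates are hypothetical illustrative values, e.g. 0.02 for 2%)."""
    cap = base_cap
    for rate in inflation_rates:
        cap *= 1 + rate
    return cap

def donation(income, cap):
    """Everything above the cap is given away; nothing below it."""
    return max(0.0, income - cap)

# Example: assumed 2% inflation for three years, assumed £40,000 income.
cap = adjusted_cap(BASE_CAP, [0.02, 0.02, 0.02])
print(round(cap, 2), round(donation(40_000, cap), 2))
```

The point of the structure is that the donor's own consumption stays fixed in real terms while any real income growth is donated in full.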
Ord lives in Oxford with his wife, Bernadette Young, a medical doctor. [5] [23]