Toby Ord

About Toby Ord
Toby Ord is a philosopher at Oxford University, working on the big-picture questions facing humanity. His earlier work explored the ethics of global poverty, leading him to make a lifelong pledge to donate 10% of his income to the most effective charities working to improve the world. He created a society, Giving What We Can, for people to join this mission, and together its members have pledged over $1.5 billion. He then broadened these ideas by co-founding the Effective Altruism movement, in which thousands of people are using reason and evidence to improve the lives of others as much as possible.
His current research is on risks that threaten human extinction or the permanent collapse of civilization, and on how to safeguard humanity through these dangers, which he considers to be among the most pressing and neglected issues we face. In his new book, The Precipice, he explains the risks we face, the stakes for humanity, and how we can find a path forward. Toby has advised the World Health Organization, the World Bank, the World Economic Forum, the US National Intelligence Council, and the UK Prime Minister's Office.
tobyord.com
theprecipice.com
givingwhatwecan.org
Books By Toby Ord
What existential threats does humanity face? And how can we secure our future?
'The Precipice is a powerful book . . . Ord's love for humanity and hope for its future is infectious' Spectator
'Ord's analysis of the science is exemplary . . . Thrillingly written' Sunday Times
We live during the most important era of human history. In the twentieth century, we developed the means to destroy ourselves – without developing the moral framework to ensure we won't. This is the Precipice, and how we respond to it will be the most crucial decision of our time.
Oxford moral philosopher Toby Ord explores the risks to humanity's future, from the familiar man-made threats of climate change and nuclear war, to the potentially greater, more unfamiliar threats from engineered pandemics and advanced artificial intelligence.
With clear and rigorous thinking, Ord calculates the various risk levels and shows how our own time fits within the larger story of human history. We can say with certainty that the novel coronavirus does not pose an existential risk. But could the next pandemic? And what can we do, in our present moment, to face the risks head on?
A major work that brings together the disciplines of physics, biology, earth and computer science, history, anthropology, statistics, international relations, political science and moral philosophy, The Precipice is a call for a new understanding of our age: a major reorientation in the way we see the world, our history, and the role we play in it.
Very often we're uncertain about what we ought, morally, to do. We don't know how to weigh the interests of animals against those of humans, how strong our duties are to improve the lives of distant strangers, or how to think about the ethics of bringing new people into existence. But we still need to act. So how should we make decisions in the face of such uncertainty?
Though economists and philosophers have extensively studied decision-making in the face of uncertainty about matters of fact, the question of decision-making given fundamental moral uncertainty has been neglected. Philosophers William MacAskill, Krister Bykvist, and Toby Ord try to fill this gap. Moral Uncertainty argues that there are distinctive norms governing how one ought to make decisions under such uncertainty. It defends an information-sensitive account of how to make these decisions by developing an analogy between moral uncertainty and social choice: the correct way to act in the face of moral uncertainty depends on whether the moral theories in which one has credence are merely ordinal, cardinal, or both cardinal and intertheoretically comparable. The book also tackles the problem of how to make intertheoretical comparisons, discussing potential solutions and the implications of the authors' view for metaethics and practical ethics.
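To make the cardinal, intertheoretically comparable case concrete, here is a minimal Python sketch of "maximize expected choiceworthiness", the decision rule the book defends for that case. The theories, options, credences, and choiceworthiness scores below are hypothetical, chosen only to illustrate the mechanics.

```python
# Hypothetical sketch: maximize expected choiceworthiness (MEC) under
# moral uncertainty, for theories that are cardinal and comparable.

# Credence in each moral theory (these must sum to 1).
credences = {"utilitarianism": 0.6, "deontology": 0.4}

# Choiceworthiness of each option under each theory, on a shared
# cardinal scale (assumed intertheoretically comparable).
choiceworthiness = {
    "keep_promise":  {"utilitarianism": 10, "deontology": 5},
    "break_promise": {"utilitarianism": 12, "deontology": -20},
}

def expected_choiceworthiness(option: str) -> float:
    """Credence-weighted average of an option's choiceworthiness."""
    return sum(credences[t] * choiceworthiness[option][t] for t in credences)

for option in choiceworthiness:
    print(option, expected_choiceworthiness(option))

best = max(choiceworthiness, key=expected_choiceworthiness)
print("MEC recommends:", best)
```

In this toy example, breaking the promise scores slightly higher under utilitarianism (12 vs 10), but the 40% credence in deontology, which condemns it strongly (-20), makes keeping the promise the option with higher expected choiceworthiness (8.0 vs -0.8). This is the sense in which the account is information-sensitive: a lower-credence theory with strong verdicts can still swing the decision.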