Superforecasting: The Art and Science of Prediction Kindle Edition
The international bestseller
'A manual for thinking clearly in an uncertain world. Read it.' Daniel Kahneman, author of Thinking, Fast and Slow
What if we could improve our ability to predict the future?
Everything we do involves forecasts about how the future will unfold. Whether buying a new house or changing job, designing a new product or getting married, our decisions are governed by implicit predictions of how things are likely to turn out. The problem is, we're not very good at it.
In a landmark, twenty-year study, Wharton professor Philip Tetlock showed that the average expert was only slightly better at predicting the future than a layperson using random guesswork. Tetlock's latest project – an unprecedented, government-funded forecasting tournament involving over a million individual predictions – has since shown that there are, however, some people with real, demonstrable foresight. These are ordinary people, from former ballroom dancers to retired computer programmers, who have an extraordinary ability to predict the future with a degree of accuracy 60% greater than average. They are superforecasters.
In Superforecasting, Tetlock and his co-author Dan Gardner offer a fascinating insight into what we can learn from this elite group. They show the methods used by these superforecasters which enable them to outperform even professional intelligence analysts with access to classified data. And they offer practical advice on how we can all use these methods for our own benefit – whether in business, in international affairs, or in everyday life.
'The techniques and habits of mind set out in this book are a gift to anyone who has to think about what the future might bring. In other words, to everyone.' Economist
'A terrific piece of work that deserves to be widely read . . . Highly recommended.' Independent
'The best thing I have read on predictions . . . Superforecasting is an indispensable guide to this indispensable activity.' The Times
This marvelous book tells an exciting story of ordinary people beating experts in a very serious game. It is also a manual for thinking clearly in an uncertain world. Read it. -- Daniel Kahneman
Full of excellent advice – it is the best thing I have read on predictions . . . Superforecasting is an indispensable guide to this indispensable activity. ― The Times
Philip Tetlock has transformed the science of prediction. ― Spectator
The most important book on decision making since Daniel Kahneman's Thinking, Fast and Slow. ― Wall Street Journal
Fascinating and breezily written. ― Sunday Times
Superforecasting is a fascinating book. ― Daily Mail
Superforecasting is a very good book. In fact it is essential reading. ― Management Today
The best way to know if an idea is right is to see if it predicts the future. But which ideas, which methods, which people have a track record of non-obvious predictions vindicated by the course of events? The answers will surprise you, and they have radical implications for politics, policy, journalism, education, and even epistemology – how we can best gain knowledge about the world. The casual style of Superforecasting belies the profundity of its message. -- Steven Pinker
Superforecasting is a rare book that will make you smarter and wiser. One of the giants of behavioral science reveals how to improve at predicting the future. -- Adam Grant
The material in Superforecasting is new, and includes a compendium of best practices for prediction . . . [It offers] us all an opportunity to understand and react more intelligently to the confusing world around us. ― New York Times Book Review
Tetlock's 'Ten Commandments For Aspiring Superforecasters' should probably have a place of honor in most business meeting rooms. ― Forbes
There isn't a social scientist in the world I admire more than Phil Tetlock. -- Tim Harford
Superforecasting is the most important scientific study I’ve ever read on prediction. ― Bloomberg View
A fascinating study of what it is that makes some forecasters consistently better than others. ― International Politico
Tetlock's work is fascinating and important, and he and Gardner have written it up with verve. -- Stephen Cave ― Financial Times
Superforecasting, by Philip Tetlock and Dan Gardner, is one of the most interesting business and finance books published in 2015. -- John Kay ― Financial Times
The lessons of superforecasting are keenly relevant to huge swathes of our lives. -- Matthew Syed ― The Times
Tetlock writes boldly about wanting to improve what he sees as the bloated, expensive – and not terribly accurate – intelligence apparatus that advises our politicians and drives global affairs. ― City A.M.
Philip Tetlock’s Superforecasting is a common-sense guide to thinking about decision-making and the future by a man who knows this terrain like no one else. -- Books of the Year ― Bloomberg Business
Tetlock and Gardner believe anyone can improve their forecasting ability by learning from the way they work. If that's true, people in business and finance who make an effort to do so have a lot to gain – and those who don't, much to lose. ― Financial Post
What I found most interesting was the continuous process of integrating new information to test and modify existing beliefs … clearly a beneficial skill in financial markets ― Citywire
Social science has enormous potential, especially when it combines 'rigorous empiricism with a resistance to absolute answers.' The work of Philip Tetlock possesses these qualities. ― Scientific American
A fascinating book. ― PR Week
Offers a valuable insight into the future of management. -- CMI Management Book of the Year judges
Both rigorous and readable. The lessons are directly relevant to business, finance, government, and politics. -- Books of the Year ― Bloomberg Business
A scientific analysis of the ancient art of divination which shows that forecasting is a talent. -- Books of the Year ― Economist
Captivating . . . [Tetlock's] writing is so engaging and his argument so tantalizing, readers will quickly be drawn into the challenge . . . A must-read field guide for the intellectually curious. ― Kirkus Reviews
A top choice [for best book of 2015] among the world’s biggest names in finance and economics . . . Eurasia Group founder Ian Bremmer, Deutsche Bank Chief U.S. Economist Joe LaVorgna, and Citigroup Vice Chairman Peter Orszag were among those giving it a thumbs-up. ― Bloomberg Businessweek
Just as modern medicine began when a farsighted few began to collect data and keep track of outcomes, to trust objective 'scoring' over their own intuitions, it's time now for similar demands to be made of the experts who lead public opinion. It's time for evidence-based forecasting. ― Washington Post
Tetlock and his colleagues [have] found that there is such a thing as foresight, and it’s not a gift that’s bestowed upon special people, but is a skill that can be learned and developed . . . To obtain this apparent superpower does not take a PhD or an exceptionally high IQ; it takes a certain mindset. ― Guardian
Superforecasting is a very good book. In fact it is essential reading - which I have never said in any of my previous Management Today reviews . . . It should be on every manager's and investor's reading list around the topics du jour of decision-making, prediction and behavioural economics. -- Andrew Wileman ― Management Today
Read Philip Tetlock’s Superforecasting, instead of political pundits who don’t know what they’re talking about. -- Dominic Cummings
We should indeed apply superforecasting more systematically to government. Like systematic opinion polling, it is an aid to decision-makers and informed debate. It is ideologically neutral, unless you have a bias in favour of ignorance. This is all good. -- Andrew Adonis ― Independent
Excerpt. © Reprinted by permission. All rights reserved.
An Optimistic Skeptic
We are all forecasters. When we think about changing jobs, getting married, buying a home, making an investment, launching a product, or retiring, we decide based on how we expect the future will unfold. These expectations are forecasts. Often we do our own forecasting. But when big events happen--markets crash, wars loom, leaders tremble--we turn to the experts, those in the know. We look to people like Tom Friedman.
If you are a White House staffer, you might find him in the Oval Office with the president of the United States, talking about the Middle East. If you are a Fortune 500 CEO, you might spot him in Davos, chatting in the lounge with hedge fund billionaires and Saudi princes. And if you don’t frequent the White House or swanky Swiss hotels, you can read his New York Times columns and bestselling books that tell you what’s happening now, why, and what will come next.1 Millions do.
Like Tom Friedman, Bill Flack forecasts global events. But there is a lot less demand for his insights.
For years, Bill worked for the US Department of Agriculture in Arizona--“part pick-and-shovel work, part spreadsheet”--but now he lives in Kearney, Nebraska. Bill is a native Cornhusker. He grew up in Madison, Nebraska, a farm town where his parents owned and published the Madison Star-Mail, a newspaper with lots of stories about local sports and county fairs. He was a good student in high school and he went on to get a bachelor of science degree from the University of Nebraska. From there, he went to the University of Arizona. He was aiming for a PhD in math, but he realized it was beyond his abilities--“I had my nose rubbed in my limitations” is how he puts it--and he dropped out. It wasn’t wasted time, however. Classes in ornithology made Bill an avid bird-watcher, and because Arizona is a great place to see birds, he did fieldwork part-time for scientists, then got a job with the Department of Agriculture and stayed for a while.
Bill is fifty-five and retired, although he says if someone offered him a job he would consider it. So he has free time. And he spends some of it forecasting.
Bill has answered roughly three hundred questions like “Will Russia officially annex additional Ukrainian territory in the next three months?” and “In the next year, will any country withdraw from the eurozone?” They are questions that matter. And they’re difficult. Corporations, banks, embassies, and intelligence agencies struggle to answer such questions all the time. “Will North Korea detonate a nuclear device before the end of this year?” “How many additional countries will report cases of the Ebola virus in the next eight months?” “Will India or Brazil become a permanent member of the UN Security Council in the next two years?” Some of the questions are downright obscure, at least for most of us. “Will NATO invite new countries to join the Membership Action Plan (MAP) in the next nine months?” “Will the Kurdistan Regional Government hold a referendum on national independence this year?” “If a non-Chinese telecommunications firm wins a contract to provide Internet services in the Shanghai Free Trade Zone in the next two years, will Chinese citizens have access to Facebook and/or Twitter?” When Bill first sees one of these questions, he may have no clue how to answer it. “What on earth is the Shanghai Free Trade Zone?” he may think. But he does his homework. He gathers facts, balances clashing arguments, and settles on an answer.
No one bases decisions on Bill Flack’s forecasts, or asks Bill to share his thoughts on CNN. He has never been invited to Davos to sit on a panel with Tom Friedman. And that’s unfortunate. Because Bill Flack is a remarkable forecaster. We know that because each one of Bill’s predictions has been dated, recorded, and assessed for accuracy by independent scientific observers. His track record is excellent.
Bill is not alone. There are thousands of others answering the same questions. All are volunteers. Most aren’t as good as Bill, but about 2% are. They include engineers and lawyers, artists and scientists, Wall Streeters and Main Streeters, professors and students. We will meet many of them, including a mathematician, a filmmaker, and some retirees eager to share their underused talents. I call them superforecasters because that is what they are. Reliable evidence proves it. Explaining why they’re so good, and how others can learn to do what they do, is my goal in this book.
How our low-profile superforecasters compare with cerebral celebrities like Tom Friedman is an intriguing question, but it can’t be answered because the accuracy of Friedman’s forecasting has never been rigorously tested. Of course Friedman’s fans and critics have opinions one way or the other--“he nailed the Arab Spring” or “he screwed up on the 2003 invasion of Iraq” or “he was prescient on NATO expansion.” But there are no hard facts about Tom Friedman’s track record, just endless opinions--and opinions on opinions.2 And that is business as usual. Every day, the news media deliver forecasts without reporting, or even asking, how good the forecasters who made the forecasts really are. Every day, corporations and governments pay for forecasts that may be prescient or worthless or something in between. And every day, all of us--leaders of nations, corporate executives, investors, and voters--make critical decisions on the basis of forecasts whose quality is unknown. Baseball managers wouldn’t dream of getting out the checkbook to hire a player without consulting performance statistics. Even fans expect to see player stats on scoreboards and TV screens. And yet when it comes to the forecasters who help us make decisions that matter far more than any baseball game, we’re content to be ignorant.3
In that light, relying on Bill Flack’s forecasts looks quite reasonable. Indeed, relying on the forecasts of many readers of this book may prove quite reasonable, for it turns out that forecasting is not a “you have it or you don’t” talent. It is a skill that can be cultivated. This book will show you how.
The One About the Chimp
I want to spoil the joke, so I’ll give away the punch line: the average expert was roughly as accurate as a dart-throwing chimpanzee.
You’ve probably heard that one before. It’s famous--in some circles, infamous. It has popped up in the New York Times, the Wall Street Journal, the Financial Times, the Economist, and other outlets around the world. It goes like this: A researcher gathered a big group of experts--academics, pundits, and the like--to make thousands of predictions about the economy, stocks, elections, wars, and other issues of the day. Time passed, and when the researcher checked the accuracy of the predictions, he found that the average expert did about as well as random guessing. Except that’s not the punch line because “random guessing” isn’t funny. The punch line is about a dart-throwing chimpanzee. Because chimpanzees are funny.
I am that researcher and for a while I didn’t mind the joke. My study was the most comprehensive assessment of expert judgment in the scientific literature. It was a long slog that took about twenty years, from 1984 to 2004, and the results were far richer and more constructive than the punch line suggested. But I didn’t mind the joke because it raised awareness of my research (and, yes, scientists savor their fifteen minutes of fame too). And I myself had used the old “dart-throwing chimp” metaphor, so I couldn’t complain too loudly.
I also didn’t mind because the joke makes a valid point. Open any newspaper, watch any TV news show, and you find experts who forecast what’s coming. Some are cautious. More are bold and confident. A handful claim to be Olympian visionaries able to see decades into the future. With few exceptions, they are not in front of the cameras because they possess any proven skill at forecasting. Accuracy is seldom even mentioned. Old forecasts are like old news--soon forgotten--and pundits are almost never asked to reconcile what they said with what actually happened. The one undeniable talent that talking heads have is their skill at telling a compelling story with conviction, and that is enough. Many have become wealthy peddling forecasting of untested value to corporate executives, government officials, and ordinary people who would never think of swallowing medicine of unknown efficacy and safety but who routinely pay for forecasts that are as dubious as elixirs sold from the back of a wagon. These people--and their customers--deserve a nudge in the ribs. I was happy to see my research used to give it to them.
But I realized that as word of my work spread, its apparent meaning was mutating. What my research had shown was that the average expert had done little better than guessing on many of the political and economic questions I had posed. “Many” does not equal all. It was easiest to beat chance on the shortest-range questions that only required looking one year out, and accuracy fell off the further out experts tried to forecast--approaching the dart-throwing-chimpanzee level three to five years out. That was an important finding. It tells us something about the limits of expertise in a complex world--and the limits on what it might be possible for even superforecasters to achieve. But as in the children’s game of “telephone,” in which a phrase is whispered to one child who passes it on to another, and so on, and everyone is shocked at the end to discover how much it has changed, the actual message was garbled in the constant retelling and the subtleties were lost entirely. The message became “all expert forecasts are useless,” which is nonsense. Some variations were even cruder--like “experts know no more than chimpanzees.” My research had become a backstop reference for nihilists who see the future as inherently unpredictable and know-nothing populists who insist on preceding “expert” with “so-called.”
So I tired of the joke. My research did not support these more extreme conclusions, nor did I feel any affinity for them. Today, that is all the more true.
There is plenty of room to stake out reasonable positions between the debunkers and the defenders of experts and their forecasts. On the one hand, the debunkers have a point. There are shady peddlers of questionable insights in the forecasting marketplace. There are also limits to foresight that may just not be surmountable. Our desire to reach into the future will always exceed our grasp. But debunkers go too far when they dismiss all forecasting as a fool’s errand. I believe it is possible to see into the future, at least in some situations and to some extent, and that any intelligent, open-minded, and hardworking person can cultivate the requisite skills.
Call me an “optimistic skeptic.”
To understand the “skeptic” half of that label, consider a young Tunisian man pushing a wooden handcart loaded with fruits and vegetables down a dusty road to a market in the Tunisian town of Sidi Bouzid. When the man was three, his father died. He supports his family by borrowing money to fill his cart, hoping to earn enough selling the produce to pay off the debt and have a little left over. It’s the same grind every day. But this morning, the police approach the man and say they’re going to take his scales because he has violated some regulation. He knows it’s a lie. They’re shaking him down. But he has no money. A policewoman slaps him and insults his dead father. They take his scales and his cart. The man goes to a town office to complain. He is told the official is busy in a meeting. Humiliated, furious, powerless, the man leaves.
1. Why single out Tom Friedman when so many other celebrity pundits could have served the purpose? The choice was driven by a simple formula: (status of pundit) X (difficulty of pinning down his/her forecasts) X (relevance of pundit’s work to world politics). Highest score wins. Friedman has high status; his claims about possible futures are highly difficult to pin down--and his work is highly relevant to geopolitical forecasting. The choice of Friedman was in no way driven by an aversion to his editorial opinions. Indeed, I reveal in the last chapter a sneaky admiration for some aspects of his work. Exasperatingly evasive though Friedman can be as a forecaster, he proves to be a fabulous source of forecasting questions.
2. Again, this is not to imply that Friedman is unusual in this regard. Virtually every political pundit on the planet operates under the same tacit ground rules. They make countless claims about what lies ahead but couch their claims in such vague verbiage that it is impossible to test them. How should we interpret intriguing claims like “expansion of NATO could trigger a ferocious response from the Russian bear and may even lead to a new Cold War” or “the Arab Spring might signal that the days of unaccountable autocracy in the Arab world are numbered” or . . . ? The key terms in these semantic dances, may or could or might, are not accompanied by guidance on how to interpret them. Could could mean anything from a 0.0000001 chance of “a large asteroid striking our planet in the next one hundred years” to a 0.7 chance of “Hillary Clinton winning the presidency in 2016.” All this makes it impossible to track accuracy across time and questions. It also gives pundits endless flexibility to claim credit when something happens (I told you it could) and to dodge blame when it does not (I merely said it could happen). We shall encounter many examples of such linguistic mischief.
3. It is as though we have collectively concluded that sizing up the starting lineup for the Yankees deserves greater care than sizing up the risk of genocide in the South Sudan. Of course the analogy between baseball and politics is imperfect. Baseball is played over and over under standard conditions. Politics is a quirky game in which the rules are continually being contorted and contested. So scoring political forecasting is much harder than compiling baseball statistics. But “harder” doesn’t mean impossible. It turns out to be quite possible.
There is also another objection to the analogy. Pundits do more than forecasting. They put events in historical perspective, offer explanations, engage in policy advocacy, and pose provocative questions. All true, but pundits also make lots of implicit or explicit forecasts. For instance, the historical analogies pundits invoke contain implicit forecasts: the Munich appeasement analogy is trotted out to support the conditional forecast “if you appease country X, it will ramp up its demands”; and the World War I analogy is trotted out to support “if you use threats, you will escalate the conflict.” I submit that it is logically impossible to engage in policy advocacy (which pundits routinely do) without making assumptions about whether we would be better or worse off if we went down one or another policy path. Show me a pundit who does not make at least implicit forecasts and I will show you one who has faded into Zen-like irrelevance.
- ASIN : B00Y78X7HY
- Publisher : Cornerstone Digital; 1st edition (24 September 2015)
- Language : English
- Print length : 354 pages
Reviewed in India on 21 January 2020
Top reviews from India
Need this tool to understand
The book is about how you correctly assign probabilities to events that are very uncertain. Some examples given in the book: Will they find traces of polonium poisoning in the exhumed body of Yasser Arafat? Will there be another terrorist attack in Europe in the next quarter? Who will win the election? You get the idea of the kind of uncertainty we are dealing with in all these outcomes. The book articulates in great detail the methodology used by a few superforecasters, ordinary folks, to arrive at the correct probability of such very uncertain events. These people were all part of the Good Judgment Project, initiated by the author and his wife and funded by a US intelligence research agency. It ran for 4 years, and these superforecasters beat CIA analysts (who had real-time field intel) by 30% or so. They used the internet, Google alerts, Bayesian probability and other tools to gather and process data and arrive at their predictions. Many of these superforecasters have backgrounds in computer science, maths, technology and engineering, but the methods are not beyond the capabilities of any graduate. It just calls for a very structured analytical examination of the data, rather than basing your decisions on intuition alone.
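The Bayesian updating the reviewer mentions boils down to a one-line formula. Here is a minimal sketch; the question and all the probabilities below are invented for illustration, not taken from the book or the Good Judgment Project.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after seeing one piece of evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothetical question: "Will country X hold a referendum this year?"
prior = 0.30  # initial estimate, e.g. from base rates of similar countries
# News of draft legislation: assumed 80% likely if a referendum is coming,
# 20% likely otherwise (both numbers made up for the sketch).
posterior = bayes_update(prior, 0.80, 0.20)
print(round(posterior, 2))  # 0.63
```

Each new piece of evidence feeds the previous posterior back in as the next prior, which is the "continuous process of integrating new information" the reviews describe.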
The book also tells us that a team of superforecasters is even more accurate than a single superforecaster, provided they freely share data with each other, independently research and process it, share their opinions without fear, and overall treat each other with respect.
The decision-making methodology and concepts posited in this book have serious practical implications for certain groups of people: for example, stock investors, top executives, business development managers, opinion makers and forecasters.
The book is written by the author with help from Gardner, who is a reporter and professional writer, and it is greatly enhanced by this partnership: Gardner's contribution makes it a very easy and interesting read.
I have no hesitation in recommending this splendid book to all who deal with uncertainty in their professional life. Five-star book.
The subject becomes interesting from chapter 4, "Superforecasters", and then it really picks up pace. I am only halfway through the book and have started liking the approach. The only shortcoming is the small lettering and the print not being very clear.
A good read for everyone wanting to improve their decision-making skills using the tools mentioned in the book. These tools can be added to our framework of "mental models". Some key takeaways:
o Have an "inside view / outside view" approach to decision making.
o Know the base rate and avoid the "base rate fallacy".
o Have a probabilistic way of thinking.
o Beliefs are hypotheses to be tested, not treasures to be protected.
o Take the "wisdom of the crowd" and consider different viewpoints, like being "dragonfly-eyed".
o Be aware of cognitive and emotional biases.
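The "wisdom of the crowd" point can be shown with a minimal sketch: pool several independent probability estimates for the same event by averaging them. The forecaster numbers below are invented for illustration only.

```python
def pool_forecasts(probs):
    """Average several independent probability estimates for the same event."""
    return sum(probs) / len(probs)

# Five hypothetical forecasters' probabilities for one question.
forecasts = [0.55, 0.70, 0.60, 0.65, 0.50]
print(round(pool_forecasts(forecasts), 2))  # 0.6
```

A simple mean lets individual errors cancel out; the crowd estimate is typically closer to the truth than most of its members, which is why the book stresses taking many independent viewpoints.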
Verdict: Highly recommended!
The Ten Commandments at the end are the best part of the book. The notes sections has some quite valuable resources too.
This is not to say it’s a boring book of sorts, it’s just that it tells way more stories rather than exploring the core idea of how to actually work on forecasting.
Top reviews from other countries
The issue with the book is not the content but the padding. There seems to be a lot of it. This is a 300+ page book that could be edited down to half the size without losing information; many of the same superforecasting examples are repeated more than once.
It was funny to read that a lot of businesses are not actually that interested in whether a forecast is right or wrong, provided the forecast tells them what they want to hear. Speaking from experience, I know this to be true. In addition, other forecasters are reluctant to revisit old forecasts for fear of exposing their inaccuracies, which, to me, makes zero sense, and I am glad Tetlock agrees with this view.
Overall it is a good read, just nothing special if you do this sort of thing for a living.
Certainly in this time of COVID-19, after reading this book you'll start noticing a lot of public figures fall into basic data interpretation mistakes, make predictions that turn out to be totally wrong, and then continue as normal anyway!
One criticism I have is that I would've liked it to be slightly less "popular" science: include a bit more hard data, remove a little of the padding. Even with this criticism, however, there was much for me to learn, and it did include substantial references to evidence.
Prediction is an extremely important component of testing whether your hypotheses are correct. Therefore, knowing about prediction is a key issue in science. Anyone who cares a lot about science should read a book like this or something similar. For any such person, I would gladly recommend this book.