Philip Tetlock

About Philip Tetlock
Philip E. Tetlock (born 1954) is a Canadian-American political scientist and psychologist. He is currently the Annenberg University Professor at the University of Pennsylvania, where he is cross-appointed at the Wharton School and the School of Arts and Sciences.
He has written several non-fiction books at the intersection of psychology, political science and organizational behavior, including Superforecasting: The Art and Science of Prediction; Expert Political Judgment: How Good Is It? How Can We Know?; Unmaking the West: What-if Scenarios that Rewrite World History; and Counterfactual Thought Experiments in World Politics. Tetlock is also co-principal investigator of The Good Judgment Project, a multi-year study of the feasibility of improving the accuracy of probability judgments of high-stakes, real-world events.
Wikipedia: https://en.wikipedia.org/wiki/Philip_E._Tetlock
CV: https://www.dropbox.com/s/uorzufg1v0nhcii/Tetlock%20CV%20%20march%2018%2C%202016.docx?dl=0
Twitter: https://twitter.com/PTetlock
LinkedIn: https://www.linkedin.com/in/philip-tetlock-64aa108a?trk=hp-identity-name
Interview: https://www.edge.org/conversation/philip_tetlock-how-to-win-at-forecasting
Books By Philip Tetlock
Superforecasting: The Art and Science of Prediction
The international bestseller
'A manual for thinking clearly in an uncertain world. Read it.' Daniel Kahneman, author of Thinking, Fast and Slow
_________________________
What if we could improve our ability to predict the future?
Everything we do involves forecasts about how the future will unfold. Whether buying a new house or changing jobs, designing a new product or getting married, our decisions are governed by implicit predictions of how things are likely to turn out. The problem is, we're not very good at it.
In a landmark, twenty-year study, Wharton professor Philip Tetlock showed that the average expert was only slightly better at predicting the future than a layperson using random guesswork. Tetlock's latest project – an unprecedented, government-funded forecasting tournament involving over a million individual predictions – has since shown that there are, however, some people with real, demonstrable foresight. These are ordinary people, from former ballroom dancers to retired computer programmers, who have an extraordinary ability to predict the future with a degree of accuracy 60% greater than average. They are superforecasters.
In Superforecasting, Tetlock and his co-author Dan Gardner offer a fascinating insight into what we can learn from this elite group. They show how the methods used by these superforecasters enable them to outperform even professional intelligence analysts with access to classified data. And they offer practical advice on how we can all use these methods for our own benefit – whether in business, in international affairs, or in everyday life.
_________________________
'The techniques and habits of mind set out in this book are a gift to anyone who has to think about what the future might bring. In other words, to everyone.' Economist
'A terrific piece of work that deserves to be widely read . . . Highly recommended.' Independent
'The best thing I have read on predictions . . . Superforecasting is an indispensable guide to this indispensable activity.' The Times
Expert Political Judgment: How Good Is It? How Can We Know?
Since its original publication, Expert Political Judgment by New York Times bestselling author Philip Tetlock has established itself as a contemporary classic in the literature on evaluating expert opinion.
Tetlock first discusses arguments about whether the world is too complex for people to find the tools to understand political phenomena, let alone predict the future. He evaluates predictions from experts in different fields, comparing them to predictions by well-informed laity or those based on simple extrapolation from current trends. He goes on to analyze which styles of thinking are more successful in forecasting. Classifying thinking styles using Isaiah Berlin's prototypes of the fox and the hedgehog, Tetlock contends that the fox--the thinker who knows many little things, draws from an eclectic array of traditions, and is better able to improvise in response to changing events--is more successful in predicting the future than the hedgehog, who knows one big thing, toils devotedly within one tradition, and imposes formulaic solutions on ill-defined problems. He notes a perversely inverse relationship between the best scientific indicators of good judgment and the qualities that the media most prizes in pundits--the single-minded determination required to prevail in ideological combat.
Clearly written and impeccably researched, the book fills a huge void in the literature on evaluating expert opinion. It will appeal across many academic disciplines as well as to corporations seeking to develop standards for judging expert decision-making. Now with a new preface in which Tetlock discusses the latest research in the field, the book explores what constitutes good judgment in predicting future events and looks at why experts are often wrong in their forecasts.
Counterfactual Thought Experiments in World Politics
Political scientists often ask themselves what might have been if history had unfolded differently: if Stalin had been ousted as General Party Secretary or if the United States had not dropped the bomb on Japan. Although scholars sometimes scoff at applying hypothetical reasoning to world politics, the contributors to this volume--including James Fearon, Richard Lebow, Margaret Levi, Bruce Russett, and Barry Weingast--find such counterfactual conjectures not only useful, but necessary for drawing causal inferences from historical data. Given the importance of counterfactuals, it is perhaps surprising that we lack standards for evaluating them. To fill this gap, Philip Tetlock and Aaron Belkin propose a set of criteria for distinguishing plausible from implausible counterfactual conjectures across a wide range of applications.
The contributors to this volume make use of these and other criteria to evaluate counterfactuals that emerge in diverse methodological contexts including comparative case studies, game theory, and statistical analysis. Taken together, these essays go a long way toward establishing a more nuanced and rigorous framework for assessing counterfactual arguments about world politics in particular and about the social sciences more broadly.