"It is the somewhat gratifying lesson of Philip Tetlock's new book . . . that people who make prediction their business--people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables--are no better than the rest of us. When they're wrong, they're rarely held accountable, and they rarely admit it, either. . . . It would be nice if there were fewer partisans on television disguised as analysts and experts. . . . But the best lesson of Tetlock's book may be the one that he seems most reluctant to draw: Think for yourself." --Louis Menand, The New Yorker
"From one of the world's most highly regarded social scientists, a transformative book on the habits of mind that lead to the best predictions Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week's meals. Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts' predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught? In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people--including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer--who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They've beaten other benchmarks, competitors, and prediction markets. They've even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters." In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden's compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn't require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future--whether in business, finance, politics, international affairs, or daily life--and is destined to become a modern classic"-- Provided by publisher
Tetlock's latest project - an unprecedented, government-funded forecasting tournament involving over a million individual predictions - has since shown that some people do have real, demonstrable foresight.
Everyone would benefit from better foresight, whether in finance, policy-making, or daily planning. However, people often struggle to predict accurately. A landmark study by Wharton professor Philip Tetlock revealed that even experts' forecasts are only marginally better than chance. Yet some experts possess genuine foresight, prompting Tetlock to explore what underlies this talent and whether it can be taught. In collaboration with Dan Gardner, Tetlock presents a comprehensive examination of prediction, drawing on extensive research and a large-scale, government-funded forecasting tournament, the Good Judgment Project. The project enlisted tens of thousands of ordinary volunteers, including a Brooklyn filmmaker and a retired pipe installer, to forecast global events; the best of them outperformed benchmarks, prediction markets, and even intelligence analysts with access to classified information. These exceptional forecasters, termed 'superforecasters,' demonstrate that effective prediction relies not on advanced technology but on gathering evidence from diverse sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Through stories of both successful and failed forecasts, along with interviews with prominent decision makers, the authors lay out practical strategies for improving predictive ability across business, finance, politics, international affairs, and daily life.