Tag: Philip Tetlock

This Month at Cato Unbound—What’s Wrong with Expert Predictions

This month’s Cato Unbound looks at the failure of expert forecasting.

When I was very young, my father received a book of expert predictions edited by David Wallechinsky, Amy Wallace, and Irving Wallace, titled simply The Book of Predictions. How'd they do? Awfully.

Virtually no one predicted the peaceful end of the Soviet empire. The next big technology was still outer space, not information. Nuclear war and overpopulation vied with exotic environmental disasters to do us in. Want to print a document? Your computer can do that! Just walk to the end of your street, where you’ll find a device called a “printer.” I’ve kept the book, and I’ve been interested in the failure of expert prediction ever since.

This month at Cato Unbound, experts—sorry, we had to—Dan Gardner and Philip Tetlock lay out the evidence against forecasting, along with suggestions for how to improve it. But they conclude that many forms of forecasting, even those that once seemed just on the horizon, will perhaps always remain a dream:

Natural science has discovered in the past half-century that the dream of ever-growing predictive mastery of a deterministic universe may well be just that, a dream. There increasingly appear to be fundamental limits to what we can ever hope to predict. Take the earthquake in Japan. Once upon a time, scientists were confident that as their understanding of geology advanced, so would their ability to predict such disasters. No longer. As with so many natural phenomena, earthquakes are the product of what scientists call “complex systems,” or systems which are more than the sum of their parts. Complex systems are often stable not because there is nothing going on within them but because they contain many dynamic forces pushing against each other in just the right combination to keep everything in place. The stability produced by these interlocking forces can often withstand shocks, but even a tiny change in some internal condition at just the right spot and just the right moment can throw off the internal forces just enough to destabilize the system—and the ground beneath our feet that has been so stable for so long suddenly buckles and heaves in the violent spasm we call an earthquake. Barring new insights that shatter existing paradigms, it will forever be impossible to make time-and-place predictions in such complex systems. The best we can hope to do is get a sense of the probabilities involved. And even that is a tall order.

Human systems like economies are complex systems, with all that entails. And bear in mind that human systems are not made of sand, rock, snowflakes, and the other stuff that behaves so unpredictably in natural systems. They’re made of people: self-aware beings who see, think, talk, and attempt to predict each other’s behavior—and who are continually adapting to each other’s efforts to predict their behavior, adding layer after layer of new calculations and new complexity. All this adds new barriers to accurate prediction.