8 min read

Superforecasting by Philip E. Tetlock and Dan Gardner - Book Summary and Notes

Can normal people make accurate predictions about the future? Yes--with a lot of work, careful application of principles, and as long as you don't look too far into the future.

Review

Tetlock is best known for his research showing the average expert to be roughly as accurate as a dart-throwing chimpanzee. But Tetlock is actually optimistic about our ability to forecast. While most experts failed, some were able to beat chance--the superforecasters.

Through his research and work on the Good Judgment Project, Tetlock has studied what it takes to be a superforecaster. He's distilled a set of behaviors and mindsets that superforecasters use. The result is this book, showing how superforecasters make their decisions, where they make mistakes, and how you can apply the same techniques.

The book can get a little repetitive at times, and could have been shorter. Despite that, it is easy to read while still being informative and highly actionable. Strong recommendation.

I recommend this book for:

  1. Anyone interested in improving their own forecasts
  2. Recent readers of Nassim Taleb who are convinced all forecasting is BS

The Ten Commandments of Superforecasting

  1. Focus your time and effort on forecasts that will prove rewarding.
  2. Unpack problems to expose assumptions, catch mistakes, and correct biases.
  3. Consider the larger category before looking at the particular case.
  4. Revise your beliefs often, and in small increments, to reduce the risks of both over- and under-reacting to the news.
  5. Find merit in opposing viewpoints.
  6. Reject the illusion of certainty and learn to think in degrees of uncertainty.
  7. Avoid being either a blowhard or a waffler. Aim to be prudently decisive.
  8. Learn from experience, whether success or failure.
  9. Use precision questioning to bring out the best in others--and to let others bring out the best in you.
  10. Try, fail, analyze, and adjust. And try again.
  11. There are no universally correct commandments, including these. Question everything.

Buy Superforecasting on Amazon

Book Notes

The following are rough notes I took while reading. These are mostly paraphrased or quoted directly from the book. My own notes are in italics.

An Optimistic Skeptic

We are all forecasters. Changing jobs, getting married, buying a home--all are decided based on how we expect the future to unfold.

The news media delivers forecasts without reporting, or even asking, how good the forecasters really are.

Forecasting is a skill that can be cultivated. This book can show you how.

The average expert was roughly as accurate as a dart-throwing chimpanzee. This doesn't mean everyone failed to beat chance.

It's easiest to beat chance on short-range questions looking one year out; accuracy approaches chimpanzee levels three to five years out.

Thanks to the lack of rigor in so many forecasting domains, this opportunity is huge. And to seize it, all we have to do is set a clear goal--accuracy!--and get serious about measuring.

Two key conclusions of research:

  1. Foresight is real--superforecasters can measurably predict events three months to a year and a half in advance.
  2. Forecasting is the product of a particular way of thinking

A sixty-minute tutorial improved accuracy by 10%.

Superforecasting demands thinking that is open-minded, careful, curious, and--above all--self-critical.

Illusions of Knowledge

It was the absence of doubt that made medicine unscientific and caused it to stagnate for so long.

The only alternative to a controlled experiment is an uncontrolled experiment that produces merely the illusion of insight.

Dual-system model of the brain:

  • System 1: Automatic perceptual and cognitive operations - fast, constantly operating, designed to jump to conclusions from little evidence
  • System 2: Conscious thought - slow, takes effort

We have a compulsion to explain our actions, to conjure a plausible story. The problem is we move too fast from confusion and uncertainty to a clear and confident conclusion.

"declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true" - Daniel Kahneman

Bait and Switch: When faced with a hard question, we often replace it with an easy one.

The choice isn't system 1 or system 2; it's how to blend them in evolving situations.

Whether intuition generates delusion or insight depends on whether you work in a world full of valid cues you can unconsciously register for future use.

The cure: a tablespoon of doubt.

Keeping Score

A forecast without a timeline is absurd. And yet forecasters routinely make them.

Brier Scores: distance between what you forecast and what actually happened (lower is better)
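To make the scoring rule concrete, here's a minimal sketch (my own illustration, not from the book) of the original two-category Brier score Tetlock uses, where 0 is perfect, 2 is perfectly wrong, and mindlessly guessing 50% every time scores 0.5:

```python
def brier_score(forecasts, outcomes):
    """Original (two-category) Brier score for binary events.

    forecasts: probabilities assigned to "event happens" (0.0 to 1.0)
    outcomes:  1 if the event happened, 0 if it didn't

    0 is a perfect score, 2 is perfectly wrong, and always guessing
    50% scores 0.5 (the dart-throwing-chimp baseline).
    """
    return sum(
        (f - o) ** 2 + ((1 - f) - (1 - o)) ** 2  # squared error over both outcome categories
        for f, o in zip(forecasts, outcomes)
    ) / len(forecasts)

print(brier_score([0.9, 0.8, 0.1], [1, 1, 0]))  # 0.04 -- beats the baseline
print(brier_score([0.5, 0.5, 0.5], [1, 1, 0]))  # 0.5  -- the mindless benchmark
```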

Also need benchmarks and comparability.

Benchmarks: Can the forecaster do better than a mindless prediction (The weather in June in Phoenix is 100% hot and sunny)? Can the forecaster beat other forecasters?

"The fox knows many things but the hedgehog knows one big thing" - Archilochus

Foxes beat hedgehogs on both calibration and resolution.

The Hedgehog "knows one big thing". Think of that Big Idea like a pair of glasses the hedgehog never takes off.

When hedgehogs made forecasts on their own specialties, their accuracy declined.

The more famous an expert was, the less accurate he was. (Cautious aggregation of many perspectives makes for bad TV; confident Big Ideas make for good TV.)

Surowiecki's 'The Wisdom of Crowds': aggregating the judgments of many people who each know at least a little leads to impressive results.

Foxes aggregate many perspectives just like the crowd.

Aggregation doesn't come to us naturally
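As a toy illustration of why aggregation works (my own sketch, with made-up numbers): the average of many rough guesses usually lands closer to the truth than most of the individual guesses do.

```python
# Made-up guesses about a quantity whose true value is 100.
guesses = [72, 95, 130, 88, 110, 140, 65, 105, 98, 120]
true_value = 100

crowd_estimate = sum(guesses) / len(guesses)    # 102.3
crowd_error = abs(crowd_estimate - true_value)  # 2.3

# How many individuals beat the crowd? Here, only one of ten.
beat_crowd = sum(1 for g in guesses if abs(g - true_value) < crowd_error)
print(crowd_estimate, beat_crowd)  # 102.3 1
```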

Superforecasters

Common Bait and Switch: "Was it a good decision?" ≠ "Did it have a good outcome?"

IARPA: Intelligence Advanced Research Projects Activity. Funds research to make the intelligence community more effective.

IARPA held a forecasting tournament; the Good Judgment Project (staffed with ordinary people) beat teams from the intelligence community.

Are superforecasters just lucky? (With enough people flipping coins, someone will produce a long streak of heads.) No, because regression to the mean for superforecasters was slow, or even reversed.

Slow regression to the mean is more often seen in activities dominated by skill, while faster regression is more associated with chance.

Superforecasters are not infallible, but their results indicate skill, not just luck.

Supersmart?

Superforecasters score higher than 80% of the population on intelligence and knowledge tests.

The big jump was from the public to forecasters, not from forecasters to superforecasters--and both groups are still well below genius territory (if you're reading this, you probably have the right stuff).

Fermi estimation (How many piano tuners are there in Chicago?): break the question down into knowable and unknowable parts.
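Here's a quick sketch of what Fermi-izing the piano-tuner question might look like; every number below is my own rough assumption, chosen only to show the decomposition:

```python
# Fermi-izing "How many piano tuners are there in Chicago?"
# Every input is a guessable quantity -- the point is the decomposition.
population = 2_700_000                   # people in Chicago
people_per_household = 2.5
piano_ownership_rate = 1 / 20            # guess: 1 in 20 households has a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_year = 2 * 5 * 50  # 2 a day, 5 days a week, 50 weeks

pianos = population / people_per_household * piano_ownership_rate
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(round(tuners))  # ~108 -- a defensible ballpark, not a precise answer
```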

Find a base rate ("outside view") first--how common something is within a broader class. (How likely is it that the Renzettis have a pet? Don't focus on their heritage or how many people are in the family; find the base rate of pet ownership.)

When you do start investigating the "inside view" make it an investigation with specific questions (Fermi-ize it), don't amble.

Other ways to gain new perspectives:

  • Write it down
  • Assume your initial judgment is wrong
  • Tweak the wording

For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded.

Superquants?

Superforecasters have a way with numbers (they score high on numeracy tests).

Amos Tversky: Most people have three settings when dealing with probabilities: "gonna happen", "not gonna happen", "maybe"

Most people intuitively translate 80% as "going to happen" even if they can explain that there is a 20% chance it won't happen.

Uncertainty is an ineradicable element of reality

Epistemic uncertainty: something you don't know, but is in theory knowable.

Aleatory uncertainty: something you don't know, but is unknowable.

Forecasters who frequently use 50% are less accurate.

The more granular the prediction, the better the forecast.

Those who believe in fate undermine their ability to think probabilistically

A probabilistic thinker is more interested in "how" than "why"

Finding meaning in events is positively correlated with well being but negatively correlated with foresight. Is misery the price of accuracy?

Supernewsjunkies?

Unpack the question into components. Distinguish sharply between known and unknown. Leave no assumptions unscrutinized. Adopt the outside view. Then adopt the inside view that plays up the uniqueness of the problem. Explore the similarities and differences between your views and those of others. Synthesize all these different views. Express your judgment as precisely as you can.

Superforecasters update their predictions much more frequently than regular forecasters, and in smaller increments

Superforecasters' initial forecasts were at least 50% more accurate than those of regular forecasters.

Forecasters can under- or overreact to new information.

Our judgments about risk are driven more by our identities than by a careful weighing of evidence (people's views on gun control often correlate with their views on climate change, even though the issues are unrelated).

Superforecasters' advantage: they are not professionals or experts, so they have little ego invested in each forecast.

Dilution effect: irrelevant information can bias our thinking.

Bayesian belief-updating equation: Posterior Odds = Likelihood ratio * Prior Odds. Gradually get closer to the truth by constantly updating in proportion to the weight of the evidence.
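A minimal sketch of that rule in odds form (my own illustration; the numbers are invented):

```python
def bayes_update(prior_prob, likelihood_ratio):
    """Posterior Odds = Likelihood Ratio * Prior Odds, returned as a probability."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = likelihood_ratio * prior_odds
    return posterior_odds / (1 + posterior_odds)

# Start at 30%. New evidence is twice as likely in worlds where the event occurs.
print(round(bayes_update(0.30, 2.0), 3))  # 0.462 -- a measured update, not a lurch
```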

Superforecasters understand the principles but also know that their application requires nuanced judgments. They would rather break the rules than make a barbarous forecast.

Perpetual Beta

Superforecasters have a "growth mindset".

Need 'tacit knowledge'. Try and fail, but make sure your practice is informed, and get feedback.

Practice must be specific: to get better at forecasting political events, forecast political events.

People often assume that when a decision is followed by a good outcome, the decision was good, which isn't always true, and can be dangerous if it blinds us to the flaws in our thinking.

Superforecasters are "perpetual beta"--like a software program that will be used, analyzed, and improved without being released in a final version.

Portrait of the modal superforecaster:

  • Cautious
  • Humble
  • Nondeterministic
  • Actively Open-Minded
  • Intelligent and Knowledgeable, with a "need for cognition"
  • Reflective
  • Numerate
  • Pragmatic
  • Analytical
  • Dragonfly-eyed: Value diverse views
  • Probabilistic
  • Thoughtful Updaters
  • Good Intuitive Psychologists
  • A Growth Mindset
  • Grit

The strongest predictor of becoming a superforecaster is the degree of commitment to belief updating and self-improvement (a predictor roughly three times more powerful than its closest rival, intelligence).

Superteams

The team that bungled the Bay of Pigs was the team that performed brilliantly during the Cuban Missile Crisis.

Groups can be wise, mad, or both.

Practice "constructive confrontation"

Teams were 23% more accurate than individuals

Teams can cover more ground and each team member brings something different

Superteams beat prediction markets by 15% to 30%

Extremizing algorithm: taking the team estimate and pushing it closer to 0% or 100% results in more accurate forecasts. How much to extremize depends on team diversity.
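One common form of the transform (a sketch of the general technique; the book doesn't spell out the exact formula the Good Judgment Project used) maps the team average through an exponent that pushes it away from 50%. The rationale: each member holds some private information, so the raw average is underconfident.

```python
def extremize(p, a=2.0):
    """Push an aggregated probability toward 0 or 1.

    a = 1 leaves p unchanged; a > 1 extremizes. How aggressive to be
    (the choice of a) depends on how diverse the team is -- diverse
    members share less information, so their average is more underconfident.
    """
    return p ** a / (p ** a + (1 - p) ** a)

print(extremize(0.70))  # ~0.84: the team average, pushed toward certainty
```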

The Leader's Dilemma

How can a leader appear confident and inspire confidence if they see nothing as certain?

No plan survives contact with the enemy

Auftragstaktik: "Mission Command". Commanders tell subordinates what their goal is but not how to achieve it.

The fundamental message: think. If necessary, discuss or criticize your orders. If you absolutely must, disobey them. But once a decision has been made, forget uncertainty and complexity.

The humility required for good judgment is not self-doubt. It is intellectual humility. It's possible to think highly of yourself and be intellectually humble.

Are They Really so Super?

We are all vulnerable to 'What you see is all there is' biases.

We are one System 2 slip-up away from a blown forecast. Tetlock is optimistic that people can inoculate themselves to some degree against certain cognitive illusions.

Scope insensitivity: ignoring scope (say, the time frame) when making an estimate. Superforecasters were less susceptible.

Tetlock's sense is that some superforecasters are so well practiced in System 2 corrections that these techniques have become habitual.

Black Swans?

False dichotomy: "forecasting is feasible if you follow my formula" vs "forecasting is bunk"

History is not completely driven by black swans, as Taleb suggests (consider steady trends like life expectancy and annual growth).

If you have to plan for a future beyond the forecasting horizon, plan for surprise.

Humility should not obscure the fact that people can, with considerable effort, make accurate forecasts about at least some developments that really do matter.

What's next?

What if the Good Judgment Project was just testing tiny events that don't really matter? We can use question clustering to help predict big events with smaller questions.

All we have to do is get serious about keeping score.

Buy Superforecasting on Amazon

