Superforecasting
The practice of making highly accurate predictions through disciplined thinking, probability estimation, and continuous calibration.
Also known as: Super Forecasting, Good Judgment, Calibrated Forecasting
Category: Decision Science
Tags: decision-making, forecasting, intelligence, mental-models, probabilistic-thinking, thinking
Explanation
Superforecasting is the practice and science of making unusually accurate predictions about future events, as described by Philip Tetlock and Dan Gardner in their influential 2015 book, *Superforecasting: The Art and Science of Prediction*. The concept emerged from the Good Judgment Project, a multi-year forecasting tournament sponsored by the U.S. intelligence community (IARPA) that compared thousands of forecasters to determine what distinguishes the most accurate predictors.
**The discovery:**
Tetlock's earlier research, documented in *Expert Political Judgment* (2005), had shown that the average expert's predictions were barely better than chance and often worse than simple statistical algorithms. The Good Judgment Project set out to find whether some individuals could consistently beat these baselines. The answer was yes: a small group of 'superforecasters' was roughly 30% more accurate than intelligence analysts with access to classified information.
**What makes superforecasters different:**
- **Probabilistic thinking**: They assign precise numerical probabilities rather than vague verbal predictions ('likely,' 'possible'). They think in fine-grained odds, distinguishing between a 60% and a 65% chance
- **Incremental updating**: When new information arrives, they adjust their estimates in small, measured steps rather than overreacting or ignoring it. This mirrors Bayesian reasoning
- **Perspective-taking**: They actively seek out views that challenge their own assumptions and consider problems from multiple angles
- **Decomposition**: They break complex questions into smaller, more tractable sub-problems. Instead of asking 'Will Russia invade Ukraine?', they ask about troop movements, diplomatic signals, economic incentives, and historical patterns separately
- **Intellectual humility**: They acknowledge uncertainty, treat their beliefs as hypotheses to be tested rather than convictions to be defended, and are willing to change their minds
- **Growth mindset**: They treat forecasting as a skill to be practiced and improved, not an innate talent
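The incremental updating described above follows Bayes' rule: new evidence shifts a probability in proportion to how much more likely that evidence is under one hypothesis than the other. A minimal sketch (the numbers are illustrative assumptions, not figures from the book):

```python
def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update a probability given evidence, using the odds form of Bayes' rule.

    likelihood_ratio = P(evidence | event) / P(evidence | no event).
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start at a 40% estimate; new evidence is twice as likely
# if the event is going to happen than if it is not.
p = bayes_update(0.40, 2.0)
print(round(p, 3))  # 0.571 -- a measured step up, not a jump to certainty
```

Note how the update moves the estimate from 40% to about 57%, not to 90%: moderately informative evidence warrants a moderate revision, which is exactly the small-step adjustment superforecasters practice.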
**Dragonfly eye perspective:**
Tetlock describes the superforecaster's cognitive style as a 'dragonfly eye' — seeing the same problem from many perspectives simultaneously and integrating these views. This contrasts with the 'hedgehog' style of experts who view the world through a single overarching theory.
**Connection to collective intelligence:**
Superforecasting leverages collective intelligence through team-based forecasting. Tetlock found that teams of superforecasters performed even better than individual superforecasters, because good team processes (constructive disagreement, information sharing, accountability) further reduced bias and error. Prediction markets represent another mechanism for aggregating forecaster judgments.
**Practical applications:**
- **Strategic planning**: Using calibrated probability estimates for scenario planning and risk assessment
- **Decision-making**: Making better decisions under uncertainty by explicitly quantifying risks and likelihoods
- **Organizational forecasting**: Building internal prediction capabilities for business outcomes
- **Personal calibration**: Tracking predictions to improve one's own judgment over time
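Personal calibration tracking needs a scoring rule. Tetlock's tournaments used the Brier score, the mean squared error between stated probabilities and actual outcomes. A minimal tracker, with a hypothetical prediction log:

```python
def brier_score(forecasts: list[tuple[float, int]]) -> float:
    """Mean squared error between probabilities and outcomes (1 = happened, 0 = did not).

    0.0 is perfect; 0.25 is what always guessing 50% earns; lower is better.
    """
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical log: (stated probability, what actually happened)
log = [(0.80, 1), (0.30, 0), (0.90, 1), (0.60, 0)]
print(round(brier_score(log), 3))  # 0.125
```

Keeping such a log and re-scoring it over time is the feedback loop that makes calibration a trainable skill: confident misses (like the 60% forecast above) are penalized far more than cautious ones.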
**Key takeaway:**
Forecasting accuracy is a learnable skill, not a mystical gift. The techniques of superforecasting — probabilistic thinking, active open-mindedness, incremental updating, and continuous calibration — can be practiced and improved by anyone willing to invest the effort.