The Ellsberg Paradox
Why we fear the unknown more than the risky
The Setup
Imagine an urn containing 90 balls. You know exactly 30 are red. The remaining 60 are some mix of black and yellow—but you don't know the ratio. It could be 60 black and 0 yellow, or 0 black and 60 yellow, or anything in between.
You'll be paid $100 if the ball you draw matches your bet. Consider two pairs of gambles:
- Gamble A: bet on red, vs. Gamble B: bet on black
- Gamble C: bet on red or yellow, vs. Gamble D: bet on black or yellow
Which gamble would you prefer in each pair?
Daniel Ellsberg (yes, the Pentagon Papers whistleblower) discovered in 1961 that people's choices in this scenario violate the fundamental axioms of rational decision theory—yet feel completely reasonable.
The Paradox
Most people prefer A over B (a known 1/3 chance of red versus an unknown chance of black) and D over C (a known 2/3 chance of black-or-yellow versus an unknown chance of red-or-yellow). This seems reasonable: we prefer known odds!
But here's the contradiction:
Preferring A over B means you believe black has less than a 1/3 chance of being drawn. But then black + yellow should be less likely than red + yellow, so you should prefer C. Yet you preferred D! Your preferences are logically inconsistent: you can't simultaneously believe black is less likely than red AND that black + yellow is more likely than red + yellow.
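This inconsistency can be checked by brute force. Here is a minimal sketch in Python, assuming $100-or-nothing payoffs and sweeping every possible urn composition (the variable and function names are mine):

```python
# For each possible number of black balls, test whether a believer in
# that exact composition could strictly prefer A over B AND D over C.

PRIZE = 100

def expected_value(p_win, prize=PRIZE):
    return p_win * prize

some_belief_works = False
for n_black in range(61):                 # the unknown count: 0..60 black balls
    p_red = 30 / 90
    p_black = n_black / 90
    p_yellow = (60 - n_black) / 90

    prefers_a_over_b = expected_value(p_red) > expected_value(p_black)
    prefers_d_over_c = expected_value(p_black + p_yellow) > expected_value(p_red + p_yellow)

    if prefers_a_over_b and prefers_d_over_c:
        some_belief_works = True

print(some_belief_works)  # False: no single belief supports both preferences
```

The loop never finds a consistent belief: A over B forces the black probability below 1/3, while D over C forces it above 1/3.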
Ambiguity Aversion
The Ellsberg Paradox reveals that humans don't just dislike risk—we have a separate, powerful aversion to ambiguity (not knowing the probabilities at all).
Risk vs. Ambiguity
- Risk: Known probabilities (a fair coin: 50/50)
- Ambiguity: Unknown probabilities (a coin that might be biased, but you don't know how)
Standard expected utility theory treats these the same—a rational agent should assign subjective probabilities and maximize expected value. But humans systematically avoid ambiguity, even when it means accepting objectively worse odds.
Why Does This Violate Rationality?
The Sure-Thing Principle (one of the axioms of rational choice) says: if you prefer A to B when some event E happens, and you also prefer A to B when E doesn't happen, then you should prefer A to B regardless of E.
In the Ellsberg case:
- If the drawn ball is yellow: gambles C and D both pay $100, so you should be indifferent between them
- If the drawn ball is not yellow: C pays only on red and D pays only on black, which is exactly the A-versus-B choice
- So anyone who prefers A over B should prefer C over D, yet most people choose A and D
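The case analysis above can be written out directly. This sketch assumes the $100-or-nothing payoff from the setup; the `payoff` helper is illustrative:

```python
# Sure-thing case analysis for gambles C (red or yellow) and D (black or
# yellow), conditioning on whether the drawn ball is yellow.

def payoff(bet_colors, drawn, prize=100):
    """Illustrative helper: the gamble pays iff the drawn color is covered."""
    return prize if drawn in bet_colors else 0

C = {"red", "yellow"}
D = {"black", "yellow"}

# Case 1: the drawn ball IS yellow -> C and D pay identically.
assert payoff(C, "yellow") == payoff(D, "yellow")

# Case 2: the drawn ball is NOT yellow -> the yellow ticket is dead weight,
# and C vs. D collapses to red vs. black, i.e. the A-vs-B choice.
for drawn in ("red", "black"):
    assert payoff(C, drawn) == payoff({"red"}, drawn)    # C behaves like A
    assert payoff(D, drawn) == payoff({"black"}, drawn)  # D behaves like B

print("In every case, C vs. D reduces to A vs. B (or a tie)")
```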
Real-World Applications
Ambiguity aversion shows up wherever probabilities are hard to pin down. Investors overweight familiar domestic stocks (the "home bias"), people pay more to insure against poorly understood hazards, and patients often prefer treatments with well-documented risks over newer ones whose risk profiles are uncertain.
The Man Behind the Paradox
Daniel Ellsberg is better known for leaking the Pentagon Papers in 1971—classified documents revealing government deception about the Vietnam War. But before becoming a whistleblower, he was a brilliant decision theorist at RAND Corporation.
His 1961 paper "Risk, Ambiguity, and the Savage Axioms" challenged the foundations of rational choice theory. Though initially controversial, his insights are now fundamental to behavioral economics, earning him recognition as a pioneer in understanding how humans actually make decisions under uncertainty.
Interestingly, John Maynard Keynes described a version of this paradox as early as 1921, but it was Ellsberg who formalized it and explored its implications for economic theory.
Theoretical Resolutions
Maxmin Expected Utility
Instead of assuming a single probability distribution, assume the decision-maker considers a set of plausible distributions and evaluates each gamble by its worst case across that set. This "pessimistic" approach, formalized by Itzhak Gilboa and David Schmeidler in 1989, can rationalize ambiguity aversion.
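A minimal sketch of this idea in Python, assuming the urn from the setup and $100-or-nothing payoffs (the function name is mine):

```python
# Maxmin evaluation: score each gamble by its worst-case expected payoff
# over every feasible urn composition (0..60 black balls).

def worst_case_ev(win_colors, prize=100):
    worst = float("inf")
    for n_black in range(61):
        probs = {"red": 30 / 90, "black": n_black / 90, "yellow": (60 - n_black) / 90}
        ev = prize * sum(probs[c] for c in win_colors)
        worst = min(worst, ev)
    return worst

print(worst_case_ev({"red"}))               # A: 100 * 30/90 in every composition
print(worst_case_ev({"black"}))             # B: worst case has 0 black balls -> 0
print(worst_case_ev({"red", "yellow"}))     # C: worst case has 60 black balls
print(worst_case_ev({"black", "yellow"}))   # D: always 100 * 60/90
```

Under this criterion A beats B and D beats C, so the "paradoxical" choices come out as rational pessimism.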
Choquet Expected Utility
French mathematician Gustave Choquet developed a generalized integral that allows for non-additive probabilities ("capacities"); David Schmeidler later built it into a theory of choice that accommodates the kind of behavior seen in the Ellsberg experiment.
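For a $100-or-nothing gamble, the Choquet integral reduces to the prize times the capacity of the winning event, which makes the idea easy to sketch. The capacity values below are made up for illustration; the only requirements are monotonicity and that ambiguous events get discounted:

```python
# A toy non-additive capacity on the Ellsberg events. Events whose
# probability depends on the unknown composition are discounted below
# their "fair" additive share; the numbers are illustrative.

capacity = {
    frozenset(): 0.0,
    frozenset({"red"}): 1 / 3,               # unambiguous: exactly 30/90
    frozenset({"black"}): 0.2,               # ambiguous, discounted below 1/3
    frozenset({"yellow"}): 0.2,
    frozenset({"red", "black"}): 0.53,       # ambiguous
    frozenset({"red", "yellow"}): 0.53,      # ambiguous
    frozenset({"black", "yellow"}): 2 / 3,   # unambiguous: always 60/90
    frozenset({"red", "black", "yellow"}): 1.0,
}

def choquet_ev(win_colors, prize=100):
    # Binary payoff, so the Choquet integral is prize * capacity(win event).
    return prize * capacity[frozenset(win_colors)]

print(choquet_ev({"red"}), choquet_ev({"black"}))                       # A vs. B
print(choquet_ev({"black", "yellow"}), choquet_ev({"red", "yellow"}))   # D vs. C
```

Note the non-additivity: capacity(black) + capacity(yellow) = 0.4, strictly less than capacity(black or yellow) = 2/3. That gap is exactly what lets the model prefer both A and D without contradiction.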
The Lesson
The Ellsberg Paradox doesn't mean humans are irrational—it means our intuitive definition of "rational" was incomplete. We've since developed richer theories of decision-making that account for ambiguity, leading to better models in economics, AI, and policy design.