When Rational Beliefs Become Irrational Together
Imagine a fair lottery with 1,000 tickets. Exactly one ticket will win. You know this for certain.
Now consider any specific ticket, say ticket #472. What's the probability it will lose? 99.9%. Is it rational to believe it will lose? Most people say yes.
But here's the problem: this reasoning applies to every ticket. You can rationally believe each individual ticket will lose. Yet you KNOW one ticket must win!
The contradiction:
✓ Rational to believe: "Ticket 1 will lose"
✓ Rational to believe: "Ticket 2 will lose"
⋮
✓ Rational to believe: "Ticket 1000 will lose"
∴ Rational to believe: "NO ticket will win" (by conjoining all 1,000 beliefs). But one MUST win!
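The arithmetic behind the contradiction is easy to check. A minimal sketch (the 0.99 acceptance threshold is illustrative, not part of the original paradox):

```python
from fractions import Fraction

n = 1000
threshold = Fraction(99, 100)   # illustrative acceptance threshold
p_lose = Fraction(n - 1, n)     # each ticket loses with probability 999/1000

# Each individual belief "ticket i loses" clears the threshold...
assert p_lose > threshold

# ...but the conjunction "every ticket loses" is false with certainty:
# exactly one ticket wins, so p_all_lose = 0. The losses are not
# independent events, which is why no product of 0.999s applies.
p_all_lose = Fraction(0)
assert p_all_lose < threshold

# Even the naive independence calculation collapses: 0.999 ** 1000
# is only about 0.37, nowhere near belief-worthy.
naive_all_lose = float(p_lose) ** n
print(float(p_lose), naive_all_lose)
```

The gap between 0.999 per ticket and 0 for the conjunction is the paradox in numerical form.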
The paradox reveals a tension between three seemingly reasonable principles:
The sufficiency principle (the "Lockean thesis"): if something is very likely (say, > 99%), we're justified in believing it will happen. This seems basic to rational reasoning.
Closure under conjunction: if you rationally believe A, and rationally believe B, then you should rationally believe "A and B." Beliefs should combine coherently.
Consistency: rational belief sets must be consistent. You can't rationally believe something you know to be false.
All three principles seem correct, yet together they lead to irrationality. At least one must go.
In 1961, Henry E. Kyburg Jr. publishes "Probability and the Logic of Rational Belief," introducing the lottery paradox. He argues against deductive closure for rational acceptance.
Debate intensifies. Some philosophers defend closure; others follow Kyburg in rejecting it. The paradox becomes central to formal epistemology.
Keith DeRose and others connect the lottery paradox to the preface paradox (where an author believes each statement in their book but knows some must be wrong).
New solutions emerge: contextualism, interest-relative invariantism, and permissibility-based approaches. Debate continues actively.
Reject the sufficiency principle: only believe what has probability 1 (what's certain). But this makes almost all everyday beliefs irrational; you couldn't even rationally believe the sun will rise tomorrow.
Reject conjunction closure (Kyburg's solution): believing A and believing B doesn't require believing "A and B." Beliefs don't have to aggregate. But this seems to fragment rationality.
"Rational belief" means different things in different contexts. When focusing on one ticket, belief is rational; when considering all, it's not.
Permissibility-based approaches: you're permitted to believe each ticket loses, but permissions don't aggregate. You can hold each individual belief without being required to combine them.
Restricted closure: only close under conjunctions that preserve high probability. "A and B" inherits justification only if P(A∧B) is itself high enough.
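This restriction can be made concrete. Assuming independent claims each held at probability 0.999 and an acceptance threshold of 0.99 (both numbers illustrative), one can count how many conjuncts closure licenses before the conjunction drops below the threshold:

```python
THRESHOLD = 0.99   # illustrative acceptance threshold
P_CLAIM = 0.999    # probability of each individual claim

# Multiply claims together until the conjunction no longer clears
# the threshold (independence assumed for simplicity).
k, prob = 0, 1.0
while prob * P_CLAIM >= THRESHOLD:
    prob *= P_CLAIM
    k += 1

print(k)  # 10: only ten such claims can be safely conjoined
```

Under these numbers, restricted closure blocks the paradoxical step long before all 1,000 ticket-beliefs are combined.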
Knowledge-first approaches: you don't actually know that any given ticket will lose (you merely assign it high probability), and rational outright belief requires knowledge, not just high credence.
Consider an author who writes a long book. She believes each statement in the book is true (otherwise she wouldn't have written it). Yet she also believes, based on humility and experience, that the book contains at least one error.
This is structurally identical to the lottery paradox:
Both paradoxes show that rational beliefs about parts don't necessarily combine into rational beliefs about wholes.
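The preface paradox admits the same back-of-the-envelope check. With hypothetical numbers, say 500 statements each 99% likely to be true and treated as independent, the author's humility is numerically well founded:

```python
n_statements = 500   # hypothetical book length
p_true = 0.99        # hypothetical per-statement reliability

# Probability that every single statement is true (independence assumed)
p_error_free = p_true ** n_statements

# Probability the book contains at least one error
p_some_error = 1 - p_error_free
print(round(p_some_error, 3))  # ≈ 0.993: almost certainly some error
```

High confidence in each part coexists with near-certainty that the whole contains a flaw, exactly the structure of the lottery case.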
The lottery paradox is more than a puzzle; it challenges fundamental assumptions about rational thinking:
"The lottery paradox calls into question some of our most basic assumptions about rational belief."
— Stanford Encyclopedia of Philosophy
"It's rational to believe this ticket will lose. And this one. And this one. And... wait."
— Every lottery player, moments before the paradox strikes