Information Is Physical
In 1961, IBM physicist Rolf Landauer made a profound discovery: erasing information has a fundamental energy cost. This isn't an engineering limitation—it's a law of physics as fundamental as conservation of energy.
That cost is k_B T ln(2) per bit, where k_B is Boltzmann's constant and T is the temperature. This tiny amount, about 0.018 electron volts (3×10⁻²¹ joules) at room temperature, represents the absolute minimum energy required to erase one bit. No cleverness, no future technology can circumvent this limit.
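The limit is easy to evaluate numerically. A minimal Python sketch, assuming room temperature means T = 300 K:

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K (exact, 2019 SI definition)
EV = 1.602176634e-19  # joules per electron volt

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy (in joules) to erase one bit at a given temperature."""
    return K_B * temperature_k * math.log(2)

e_min = landauer_limit(300.0)      # room temperature, ~300 K
print(f"{e_min:.2e} J")            # ~2.87e-21 J
print(f"{e_min / EV:.4f} eV")      # ~0.0179 eV
```

At 300 K this gives roughly 2.9×10⁻²¹ J, or about 0.018 eV, matching the figures quoted in the text.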
Why Does Forgetting Cost Energy?
The key insight is logical irreversibility. When you erase a bit (reset it to 0), you lose information about what it was before. Whether it was 0 or 1, the result is the same: 0.
The Second Law of Thermodynamics requires that the total entropy of an isolated system never decrease. When we destroy information (decreasing the entropy of the memory), we must increase entropy elsewhere, as heat dumped into the environment.
The Mathematics
Before erasure, the bit has 2 possible states with equal probability. The Shannon entropy, written in thermodynamic units, is:

S_initial = -k_B Σ p_i ln(p_i) = -k_B [ (1/2) ln(1/2) + (1/2) ln(1/2) ] = k_B ln(2)
After erasure, there's only 1 state (certainty), so S_final = 0. The entropy decrease of the bit is k_B ln(2), which must appear as heat in the thermal reservoir.
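This bookkeeping can be checked directly. A short Python sketch, computing the Gibbs entropy of the bit before and after erasure:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum(p * ln p), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

s_initial = gibbs_entropy([0.5, 0.5])  # bit equally likely to be 0 or 1
s_final = gibbs_entropy([1.0])         # bit reset to 0 with certainty

delta_s = s_initial - s_final          # entropy decrease of the memory
print(delta_s / K_B)                   # ln(2) ~ 0.693

# The Second Law then requires at least Q = T * delta_s of heat
# to flow into the reservoir; at T = 300 K that is ~2.9e-21 J.
heat = 300.0 * delta_s
print(f"{heat:.2e} J")
```

The entropy drop is exactly k_B ln(2), and multiplying by the reservoir temperature recovers the Landauer bound.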
Exorcising Maxwell's Demon
For over a century, Maxwell's Demon seemed to violate the Second Law. The demon sorts fast and slow molecules, creating a temperature difference in a gas that started at equilibrium, apparently for free.
Landauer and Charles Bennett (1982) showed the resolution: the demon must remember which molecules went where. After enough sorting, the demon's memory fills up. To continue, it must erase its memory—and THIS is where the entropy increase happens!
Experimental Verification
In 2012, a team from France and Germany achieved the first direct measurement of Landauer's limit. Using a tiny silica bead trapped by laser light as a one-bit memory, they measured heat dissipation approaching k_B T ln(2) per erasure.
In 2018, researchers extended verification to the quantum realm, using molecular nanomagnets at 1 Kelvin. Each magnet acted as a qubit, and erasure heat matched Landauer's prediction.
Reversible Computing
Landauer's principle inspired reversible computing: computation that preserves information and theoretically dissipates zero heat. If every logical operation is invertible (like a Toffoli gate), no information is destroyed, and no Landauer heat is generated.
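The invertibility of the Toffoli gate can be verified exhaustively. A toy Python sketch (classical bits, not qubits):

```python
def toffoli(a: int, b: int, c: int) -> tuple:
    """Toffoli (CCNOT) gate: flips target c iff both controls a and b are 1."""
    return a, b, c ^ (a & b)

# The gate is its own inverse: applying it twice restores every input,
# so no information is ever destroyed and no Landauer heat must be paid.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

# Toffoli is also universal for classical logic: with the target set to 0
# it computes AND(a, b) while keeping the inputs around instead of erasing them.
assert toffoli(1, 1, 0)[2] == 1  # AND(1, 1) = 1
assert toffoli(1, 0, 0)[2] == 0  # AND(1, 0) = 0
print("Toffoli gate is reversible and computes AND")
```

An ordinary AND gate maps four input states onto a single output bit and so must erase information; the Toffoli gate sidesteps this by carrying its inputs through to the output.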
Modern CPUs dissipate about 10⁶ times more energy per operation than Landauer's limit. As transistors shrink, approaching this limit becomes increasingly important.
The Deeper Meaning
Landauer's principle unifies information theory and thermodynamics. It says information isn't just abstract—it has physical embodiment and physical consequences.
This connection runs deep. The Bekenstein bound limits information density (before a black hole forms). Hawking radiation may encode information about what fell into black holes. Even quantum mechanics' measurement problem involves information—when does the "erasure" of superposition occur?
Implications for Computing
At room temperature, Landauer's limit is about 3×10⁻²¹ J per bit. For perspective:
• A modern CPU erasing 10¹⁰ bits/second at the Landauer limit: ~30 picowatts
• Actual modern CPUs: ~100 watts (over 10¹² times higher!)
• Human brain (~10¹⁵ synapses): ~20 watts (remarkably efficient)
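A quick back-of-envelope check of the Landauer floor for a given erasure rate, assuming T = 300 K:

```python
import math

K_B = 1.380649e-23                      # Boltzmann constant, J/K
landauer_j = K_B * 300.0 * math.log(2)  # ~3e-21 J per bit at 300 K

bits_per_second = 1e10                  # assumed erasure rate
ideal_power = landauer_j * bits_per_second
print(f"Landauer floor: {ideal_power:.1e} W")  # ~3e-11 W, i.e. tens of picowatts

cpu_power = 100.0                       # watts, typical desktop CPU
print(f"Ratio: {cpu_power / ideal_power:.1e}")  # ~3e12
```

At 10¹⁰ erasures per second the thermodynamic floor is only about 30 picowatts, roughly twelve orders of magnitude below the power budget of a real chip.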
As we approach the physical limits of computation, Landauer's principle becomes the ultimate constraint on information processing.
Sources
- Landauer, R. (1961). "Irreversibility and Heat Generation in the Computing Process" - IBM Journal of Research and Development, 5(3), 183-191
- Bennett, C.H. (1982). "The Thermodynamics of Computation—a Review" - International Journal of Theoretical Physics, 21, 905-940
- Bérut, A. et al. (2012). "Experimental verification of Landauer's principle linking information and thermodynamics" - Nature, 483, 187-189
- Physics Today: "Information: From Maxwell's demon to Landauer's eraser"
- Quantum Zeitgeist: "The Landauer Limit: Why Erasing A Bit Generates Heat"