Seven interactive exhibits on the nature of information, from talking drums to quantum bits.
"The bit now joined the inch, the pound, the quart, and the minute
as a determinate quantity β a fundamental unit of measure."
The Kele people of the Congo encoded speech into drum patterns using only two tones, high and low. But with only two tones, hundreds of words sound identical. The solution: redundancy. Each word is wrapped in a stock phrase that disambiguates it.
"The moon looks down at the earth" means moon.
"The fowl, the little one that says kiokio" means fowl.
Messages expanded roughly 8× to become unambiguous, centuries before information theory explained why.
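A minimal sketch of the mechanism in Python. The two-tone patterns below are invented placeholders, not real Kele drum language; the point is only that two bare words collide on the same pattern, while their stock phrases do not.

```python
# Toy model of drum-language redundancy. The tone patterns are
# invented placeholders, not actual Kele phonology.
TONES = {
    "moon": "HLH", "fowl": "HLH",   # bare words collide on one pattern
    "the": "H", "looks": "LH", "down": "HL", "at": "L", "earth": "LLH",
    "little": "HHL", "one": "LL", "that": "HL", "says": "LH",
    "kiokio": "HLHLH",
}

# Stock phrases from the exhibit, minus punctuation for easy tokenizing.
STOCK_PHRASES = {
    "moon": "the moon looks down at the earth",
    "fowl": "the fowl the little one that says kiokio",
}

def encode(word: str) -> str:
    """The bare two-tone pattern: ambiguous on its own."""
    return TONES[word]

def encode_redundant(word: str) -> str:
    """Drum the whole stock phrase: much longer, but unambiguous."""
    return " ".join(TONES[w] for w in STOCK_PHRASES[word].split())

print(encode("moon") == encode("fowl"))                      # True
print(encode_redundant("moon") == encode_redundant("fowl"))  # False
```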
Type a word to see its drum encoding. Notice: the two-tone pattern is ambiguous without the surrounding phrase.
These words all share the same two-tone pattern. Click any to see its disambiguating phrase:
Shannon defined the bit (binary digit) as the fundamental unit of information: the amount of uncertainty resolved by a fair coin flip. Everything measurable can be expressed in bits.
"Information is surprise."
Click to explore the information content of different things:
Adjust the probability of a coin landing heads. A fair coin = 1 bit. A rigged coin carries less information per flip, because the outcome is less surprising.
How many equally likely options? We need log₂(n) bits to distinguish among n of them.
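The arithmetic behind all three controls fits in a few lines. A Python sketch (the function names are mine, not Shannon's notation): surprisal of a single outcome, average information of a biased coin, and the log₂(n) rule.

```python
import math

def surprisal(p: float) -> float:
    """Information content, in bits, of an outcome with probability p."""
    return -math.log2(p)

def coin_entropy(p_heads: float) -> float:
    """Average bits per flip of a coin landing heads with p_heads.
    Peaks at exactly 1 bit for a fair coin, falls off as it's rigged."""
    if p_heads in (0.0, 1.0):
        return 0.0          # a certain outcome carries no information
    p, q = p_heads, 1.0 - p_heads
    return p * surprisal(p) + q * surprisal(q)

def bits_for(n: int) -> float:
    """Bits needed to pick one of n equally likely options: log2(n)."""
    return math.log2(n)

print(coin_entropy(0.5))   # 1.0   (fair coin: maximum surprise)
print(coin_entropy(0.9))   # ~0.47 (rigged coin: less surprising)
print(bits_for(26))        # ~4.70 (one of 26 letters)
```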
In his 1937 master's thesis, possibly the most important master's thesis ever written, Claude Shannon showed that Boolean algebra maps directly onto electrical switching circuits. What a relay passes onward is "not really electricity but rather a fact: the fact of whether the circuit is open or closed."
Switches in series = AND. Switches in parallel = OR. An inverting relay = NOT.
From these three primitives, all computation follows.
Combine gates to add two single-bit numbers. Sum = A XOR B. Carry = A AND B.
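A minimal Python sketch of the mapping: the three primitives as functions, XOR composed from them, and the half adder the exhibit wires up.

```python
# Shannon's correspondence in miniature; comments note the relay view.
def AND(a: int, b: int) -> int: return a & b      # switches in series
def OR(a: int, b: int) -> int:  return a | b      # switches in parallel
def NOT(a: int) -> int:         return 1 - a      # inverting relay

def XOR(a: int, b: int) -> int:
    """a or b, but not both: built from the three primitives."""
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two one-bit numbers, returning (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
# 1 + 1 -> carry 1, sum 0: the two output bits read as binary 10
```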
Shannon showed that roughly half of English text is predictable, and therefore informationally redundant. He tested this by having his wife Betty guess the next letter in a Raymond Chandler novel. The result: each character carries only ~2.3 bits instead of the theoretical maximum of ~4.7 (log₂ of the 26-letter alphabet).
"You cn rd ths sntnc wth hlf th lttrs mssng."
Try guessing the next letter, just like Betty Shannon. How well can you predict English?
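One crude way to see the redundancy numerically, in Python: estimate per-character entropy from single-letter frequencies alone. This captures far less structure than the guessing game (which exploits spelling, grammar, and context), but the figure already drops below the 4.7-bit ceiling.

```python
import math
from collections import Counter

def letter_entropy(text: str) -> float:
    """Per-character entropy from single-letter frequencies alone.
    Context (digrams, words, grammar) removes far more uncertainty;
    Shannon's guessing game measures that deeper predictability."""
    letters = [c for c in text.lower() if c.isalpha()]
    n = len(letters)
    return -sum((k / n) * math.log2(k / n)
                for k in Counter(letters).values())

sample = "you can read this sentence with half the letters missing"
print(f"{letter_entropy(sample):.2f} bits/char, "
      f"ceiling log2(26) = {math.log2(26):.2f}")
```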
DNA uses a 4-symbol alphabet (A, T, C, G) arranged in 3-letter "words" called codons. 64 possible codons map to 20 amino acids, plus start and stop signals. It is, quite literally, a digital code.
"What lies at the heart of every living thing is not a fire, not warm breath, not a 'spark of life.' It is information, words, instructions." β Richard Dawkins
Schrödinger predicted this in 1943, calling it an "aperiodic crystal": a structure with enough irregularity to carry a message. Watson and Crick proved him right a decade later.
Enter a DNA sequence (A, T, C, G) and watch it translate into amino acids:
4 bases in each of 3 positions = 4³ = 64 codons → 20 amino acids. The redundancy is built in: most amino acids have multiple codons (error tolerance, just like the drums).
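A sketch of the lookup in Python, with a handful of real entries from the standard genetic code (the full table maps all 64 codons). The synonymous codons for Phe, Leu, and STOP show the built-in redundancy.

```python
# A fragment of the standard genetic code; the full table has 64 rows.
CODON_TABLE = {
    "ATG": "Met (start)",
    "TTT": "Phe", "TTC": "Phe",
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna: str) -> list[str]:
    """Read the sequence three letters at a time and look up each codon."""
    dna = dna.upper()
    return [CODON_TABLE.get(dna[i:i + 3], "?")   # "?" = codon not in
            for i in range(0, len(dna) - 2, 3)]  # this partial table

print(translate("ATGTTTCTATAA"))
# ['Met (start)', 'Phe', 'Leu', 'STOP']
```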
Shannon estimated the human genome at ~10⁵ bits in 1949, years before Watson & Crick. The actual figure: ~6.4 billion base pairs × 2 bits each ≈ 1.28 × 10¹⁰ bits. But with redundancy and compression, the meaningful information content is far less.
The history of information is not a story of increasing speed. Each new technology, from speech to writing to the internet, didn't just transmit ideas faster. It changed what humans could think.
"It was something like a thunder-clap in human history, which the bias of familiarity has converted into the rustle of papers on a desk." β Eric Havelock, on writing
This is the book that connects everything. It's the theoretical backbone, the explicit statement of what all the other books gesture at: that information is the substrate, pattern is the signal, and meaning emerges from structure.
Gleick writes like a novelist about mathematics and physics. The book traces a single thread across 400+ pages: the gradual human discovery that information is a thing, measurable, physical, and possibly the substrate of reality itself. From African drums to quantum computing, from Babbage to Shannon to Wheeler's "It from Bit," each chapter reveals another facet of the same diamond.
No wasted paragraphs. Every anecdote earns its place. The talking drums chapter alone β showing that illiterate drummers independently solved error-correction coding through centuries of cultural evolution β is worth the price of admission.
Deepest connection. Gödel's encoding (mapping mathematics onto itself) is the same move as Shannon's (mapping logic onto circuits). Strange loops everywhere: information about information, codes that encode codes.
Writing as cognitive technology = Papert's "objects to think with." Each information revolution creates new microworlds for thought. The illiterate Uzbeks who can't process syllogisms are Papert's thesis made devastating.
Each city is a compressed information packet: maximum meaning in minimum symbols. The Khan's chessboard mirrors Gleick's thesis: concrete → abstract → meaning recovered from the grain of wood.
Chapter 15 IS Postman's nightmare: the flood of information without meaning. The gap between information and knowledge is Postman's central crisis. Shannon deliberately excluded meaning; Postman spent his career insisting it was all that mattered.
Turing's universal machine is the theoretical foundation. Recursion, data as code, the idea that a few primitives build arbitrary complexity. Babbage → Turing → lambda calculus → Scheme.
Shannon's maze-solving mouse is literally a Braitenberg vehicle: 75 relays producing behavior that looks like learning, memory, goal-seeking. Simple mechanism, complex behavior.
"Every it β every particle, every field of force, even the space-time continuum itself β derives its function, its meaning, its very existence entirely from binary choices, bits." β John Archibald Wheeler