
Entropy

On information, disorder, and what we lose

In 1948, Claude Shannon published a paper that changed what information means. He asked: how do you measure the surprise in a message? His answer — entropy — is a function of probability distributions, not meaning. A message from a fair coin contains exactly one bit of entropy. A message from a loaded coin contains less. The more predictable the source, the less it has to say.
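
The arithmetic is small enough to check directly. A minimal sketch in Python, using only the standard library (the entropy helper here is an illustration, not Shannon's own notation):

import math

def entropy(probs):
    # Shannon entropy in bits: H = -Σ p(x) · log₂ p(x), skipping zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin   -> 1.0 bit
print(entropy([0.9, 0.1]))  # loaded coin -> 0.469 bits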

Ted Chiang's story “Exhalation” follows an alien anatomist who discovers that the air powering his civilization — and consciousness itself — flows from high pressure to low. Thought requires a pressure gradient. When equilibrium is reached, all minds will cease. It is a story about thermodynamic entropy: the universe winding irreversibly toward sameness. Shannon's entropy and Boltzmann's are siblings — both measure how many ways a system can be arranged. The link between them is not metaphor; it is mathematics.
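
Set the formulas side by side and the kinship is literal. Boltzmann counted equally likely arrangements; Gibbs generalized to arbitrary distributions; Shannon kept the same sum and changed the units:

S = k_B ln W  ·  Boltzmann: W equally likely arrangements
S = −k_B ∑ p(x) ln p(x)  ·  Gibbs: arbitrary distributions
H(X) = −∑ p(x) log₂ p(x)  ·  Shannon: the same sum, in bits

With p(x) = 1/W for every x, Gibbs reduces to Boltzmann; drop the constant k_B and switch to log base 2, and the sum is Shannon's H, so S = (k_B ln 2) · H.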

Entropy in Bits

H(X) = −∑ p(x) log₂ p(x)  ·  measured in bits

Source · Formula · H (bits)

Fair coin flip · H = −(0.5 log₂ 0.5 + 0.5 log₂ 0.5) · 1.000
  maximum uncertainty for two outcomes
Biased coin (90/10) · H = −(0.9 log₂ 0.9 + 0.1 log₂ 0.1) · 0.469
  less surprise; the outcome is mostly predictable
English text · H ≈ 1.0–1.5 bits/letter · ≈ 1.300
  redundancy of natural language
Uniformly random byte · H = log₂ 256 · 8.000
  maximum: every symbol equally likely
A single known outcome · H = −(1.0 log₂ 1.0) · 0.000
  zero entropy: no information in certainty
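
The English row is the odd one out: it is a measurement, not a formula. Letter frequencies alone give roughly 4 bits per letter; the 1.0–1.5 range comes from estimates like Shannon's 1951 guessing experiments, where context does most of the predicting. A sketch of the frequency-only estimate (the sample string is arbitrary):

import math
from collections import Counter

sample = "the fundamental problem of communication"
counts = Counter(c for c in sample if c.isalpha())
total = sum(counts.values())
freq_H = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(round(freq_H, 2))  # ≈ 3.83 bits/letter: frequencies only, no context
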
· · ·

From the Source

“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”

— Claude E. Shannon, A Mathematical Theory of Communication, 1948

Shannon was not interested in what messages mean, only in how many of them there are. The entropy formula he derived is identical in form to Boltzmann's formula for thermodynamic entropy, a resemblance Shannon mentioned to von Neumann, who reportedly said: call it entropy, because nobody knows what entropy is, so you will always win an argument.

“My thoughts are not diminished in themselves, but I know that those who come after will contemplate a version of themselves less vivid than what I experience now...”

— Ted Chiang, Exhalation (paraphrased)

Chiang's narrator, having dissected his own brain, realizes consciousness runs on pressure differentials in metallic air. The revelation is not that the universe ends — it is that the universe ends slowly, beautifully, and with full knowledge of itself. The entropy increases. The pressure equalizes. Thought continues until it cannot.

· · ·
For any isolated system, entropy does not decrease. Every message sent, every thought thought, every breath exchanged increases the entropy of the universe by a small but nonzero amount.
After Boltzmann, 1872. After Shannon, 1948. After Chiang, 2008. This page, by existing, contributes to the total.

What we lose is not gone; it is dispersed.
