Entropy: Mixing Gases and the Second Law
Problem
Two gases of different colors are separated by a barrier in a box. When the barrier is removed, the gases mix. Compute the entropy change using Boltzmann's formula S = k_B ln Ω, and show why the mixing is statistically irreversible.
Explanation
When you let a drop of ink fall into a glass of water, the ink spreads out and colors the water uniformly. You never see the ink spontaneously un-spread back into a drop. When you open a bottle of perfume in a room, the scent fills the air over a few minutes — you never see the molecules rush back into the bottle. When you mix cream into coffee, you can stir them together, but you can't un-mix them by stirring.
All of these everyday observations are expressions of the second law of thermodynamics: some processes happen spontaneously in one direction and never the reverse. The quantity that makes this precise is called entropy, and there is one equation — attributed to Ludwig Boltzmann and carved into his tombstone in Vienna — that tells you what entropy really is:
$$S = k_B \ln \Omega$$

where $k_B$ is Boltzmann's constant and $\Omega$ is the number of microstates that correspond to a given macrostate. That's a packed sentence, so let me unpack it carefully.
Microstates vs macrostates — the central distinction
Imagine a box divided in half by a removable barrier. On the left are 4 red balls. On the right are 4 blue balls. You take out the barrier.
- A microstate is the full, detailed description of every single ball: which position it's in, which direction it's moving. If you could see every ball individually, a microstate tells you exactly what the system looks like down to the last detail.
- A macrostate is a "big picture" description. Examples: "all 4 reds on the left, all 4 blues on the right" — or "reds and blues evenly mixed" — or "3 reds on the right and 2 blues on the left and…"
For a macrostate description, many different microstates can be consistent with it. The macrostate "evenly mixed" doesn't specify which specific red ball is where — it only says the overall distribution is 50/50. So many specific ball-by-ball arrangements all count as "evenly mixed."
Here's the key insight: some macrostates have way more microstates than others.
Let me count for a simple example. Suppose you have 4 red balls and 4 blue balls, 8 positions in the box (4 on each side). The macrostate "all 4 reds on the left" can only happen one way (given the box positions, each red ball has a fixed slot on the left). The macrostate "2 reds left, 2 reds right" has to choose which 2 of the 4 reds go on which side — $\binom{4}{2} = 6$ ways.
Scale this up to a box with $N$ particles that can be on either side. The number of microstates with exactly $n$ particles on the left is the binomial coefficient $\binom{N}{n}$. This distribution is sharply peaked at $n = N/2$ — the "evenly mixed" macrostate has overwhelmingly more microstates than any other.
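To make the distinction concrete, a few lines of Python (using the standard library's `math.comb`) tabulate the microstate count for each macrostate of a small box — a minimal sketch, with $N = 8$ chosen just so the numbers stay readable:

```python
from math import comb

N = 8  # total particles, each free to sit on either side of the box

# Microstate count for the macrostate "exactly n particles on the left"
counts = {n: comb(N, n) for n in range(N + 1)}

for n, omega in counts.items():
    print(f"n = {n}: {omega:3d} microstates  {'#' * omega}")

# Sanity checks: counts sum to 2^N, and the peak is the 50/50 macrostate
assert sum(counts.values()) == 2 ** N
assert max(counts, key=counts.get) == N // 2
```

Even at $N = 8$ the middle macrostate ($n = 4$, with 70 microstates) dwarfs the extremes ($n = 0$ or $8$, with 1 each), and the peak only sharpens as $N$ grows.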
Why mixing increases entropy
Plug into Boltzmann's formula: $S = k_B \ln \Omega$. Since $\Omega$ grows dramatically as you move from "separated" to "mixed," entropy grows correspondingly. That's the entire story. Mixing increases entropy because there are way more ways to be mixed than to be separated.
How much more? For $N$ red particles and $N$ blue particles mixing, the initial separated state has $\Omega_i = 1$ (each particle has its designated side — only one arrangement). The fully-accessible mixed state has $\Omega_f = \binom{2N}{N} \approx 4^N/\sqrt{\pi N}$ microstates (by Stirling's approximation). The ratio:

$$\frac{\Omega_f}{\Omega_i} = \binom{2N}{N} \approx \frac{4^N}{\sqrt{\pi N}}$$
For $N = 10$: about $2 \times 10^5$. For $N = 100$: about $9 \times 10^{58}$. For $N = 1$ mole ($N \approx 6 \times 10^{23}$), the number of microstates has roughly $4 \times 10^{23}$ digits. That's not "lots more microstates." That's an unimaginably larger number — so astronomical that ordinary intuitions about probability break down completely.
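You can check these magnitudes directly. The sketch below counts the digits of $\binom{2N}{N}$ exactly for small $N$ and switches to Stirling's approximation for large $N$ (the cutoff at $N = 1000$ is an arbitrary implementation choice):

```python
from math import comb, log10, pi

def ratio_digits(N):
    """Decimal digits in Omega_f/Omega_i = C(2N, N); the separated state has Omega_i = 1."""
    if N <= 1000:
        return len(str(comb(2 * N, N)))  # exact count for small N
    # Stirling: log10 C(2N, N) ~ 2N log10(2) - 0.5 log10(pi N)
    return int(2 * N * log10(2) - 0.5 * log10(pi * N)) + 1

print(ratio_digits(10))                   # C(20, 10) = 184756 -> 6 digits
print(ratio_digits(100))                  # C(200, 100) ~ 9e58  -> 59 digits
print(f"{ratio_digits(6 * 10**23):.3e}")  # ~3.6e23 digits for a mole
```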
Method 1: Entropy increase via the Boltzmann formula
Take the ratio of final to initial microstates and plug in:

$$\Delta S = k_B \ln \frac{\Omega_f}{\Omega_i} = k_B \ln \binom{2N}{N}$$
For $N = N_A$ (one mole of each gas):

$$\Delta S = k_B \ln \binom{2N_A}{N_A}$$

For the leading behavior, $\ln \binom{2N}{N} \approx 2N \ln 2$. Multiply by $k_B$ and use $N_A k_B = R$:

$$\Delta S \approx 2 N_A k_B \ln 2 = 2R \ln 2$$

Numerically: $\Delta S = 2R \ln 2 \approx 2 \times 8.314\ \text{J/(K·mol)} \times 0.693 \approx 11.5\ \text{J/K}$.
That's the entropy increase when 1 mole of red gas and 1 mole of blue gas mix in the ratio 1:1. A specific, measurable, finite number.
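As a numerical check of Method 1, the sketch below evaluates $k_B \ln \binom{2N}{N}$ at $N = N_A$; using the log-gamma function is an implementation choice to avoid constructing the astronomically large binomial coefficient as an integer:

```python
from math import lgamma, log

k_B = 1.380649e-23    # Boltzmann constant, J/K (exact SI value)
N_A = 6.02214076e23   # Avogadro constant, 1/mol (exact SI value)

def ln_comb(n, k):
    """ln C(n, k) via log-gamma, which stays finite even for n ~ 10^24."""
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

# Delta S = k_B ln(Omega_f/Omega_i) = k_B ln C(2N, N), with N = N_A per gas
delta_S = k_B * ln_comb(2 * N_A, N_A)
print(f"Boltzmann count: {delta_S:.3f} J/K")
print(f"2 R ln 2:        {2 * k_B * N_A * log(2):.3f} J/K")  # agrees to high accuracy
```

The subleading Stirling correction ($-\tfrac{1}{2}\ln \pi N$, of order $10^{-22}$ J/K after multiplying by $k_B$) is utterly negligible at mole scale.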
Method 2: Entropy increase via the thermodynamic formula
There's a completely different way to compute the same number, using the thermodynamic definition of entropy changes, that doesn't require combinatorics at all.
Before the barrier is removed, each gas occupies half the box — say volume $V$. After the barrier is removed, each gas has access to the entire box, volume $2V$. Each gas has effectively undergone an isothermal expansion from $V$ to $2V$ (the temperature stayed the same throughout, since the other gas was at the same temperature to begin with). For an isothermal expansion of an ideal gas:

$$\Delta S = nR \ln \frac{V_f}{V_i}$$
For each mole of each gas:

$$\Delta S = R \ln \frac{2V}{V} = R \ln 2$$

and the two gases together give $\Delta S = 2R \ln 2 \approx 11.5\ \text{J/K}$.
Same answer, completely different route. The combinatorial formula (counting microstates) and the thermodynamic formula (each gas expanding into the full volume) both give $\Delta S = 2R \ln 2$. This agreement is not a coincidence — it's a deep consistency check of statistical mechanics, and it's one of the reasons we believe Boltzmann's formula is fundamentally correct.
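Assuming ideal gases, Method 2 reduces to a one-liner:

```python
from math import log

R = 8.314462618  # gas constant, J/(K mol)

# Each gas expands isothermally from V to 2V when the barrier is removed:
# Delta S = n R ln(V_f/V_i) per gas, with n = 1 mole each
dS_per_gas = 1.0 * R * log(2)
dS_total = 2 * dS_per_gas  # both gases expand into the full box
print(f"{dS_total:.2f} J/K")  # matches the microstate count: ~11.53
```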
The probability of spontaneous unmixing
If entropy only tells us "mixed is more likely than separated," couldn't the system just happen to unmix by random chance? In principle yes. In practice the probability is so small that it never happens in the lifetime of the universe.
The probability of finding all particles of one color on their original side (spontaneously unmixing) is one arrangement out of the $\binom{2N}{N}$ equally likely ways to split the particles between the two halves:

$$P = \frac{1}{\binom{2N}{N}}$$
For various $N$:
- $N = 10$: $P \approx 5 \times 10^{-6}$ (about 5 in a million)
- $N = 20$: $P \approx 7 \times 10^{-12}$ (about 7 in a trillion)
- $N = 100$: $P \approx 10^{-59}$ (one in $10^{59}$ — a nonillion of nonillions)
- $N \approx 6 \times 10^{23}$ (Avogadro-scale): $P \sim 10^{-4 \times 10^{23}}$
For $N = 100$, if you watched the gas every nanosecond for the entire current age of the universe ($\sim 4 \times 10^{17}$ seconds), you would perform about $4 \times 10^{26}$ observations. The chance of seeing a single spontaneous unmixing is effectively zero to any practical precision.
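These probabilities are easy to reproduce; the age-of-the-universe figure ($\sim 4.35 \times 10^{17}$ s, i.e. $\sim 13.8$ billion years) is the only assumed input:

```python
from math import comb

def p_unmix(N):
    """Probability that all N reds sit on their original side:
    one arrangement out of C(2N, N) equally likely left/right splits."""
    return 1 / comb(2 * N, N)

for N in (10, 20, 100):
    print(f"N = {N:3d}: P = {p_unmix(N):.1e}")

# Watching every nanosecond for the age of the universe:
observations = 4.35e17 / 1e-9                 # ~4e26 looks
expected_events = observations * p_unmix(100)  # ~5e-33: it never happens
print(f"{expected_events:.1e} expected unmixing events")
```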
So the second law isn't strictly absolute — there's a microscopic probability of violation — but that probability is so outrageously small that the distinction is meaningless. The second law is a statistical certainty with more decimal places of reliability than any physical measurement could ever test.
The second law of thermodynamics
There are several equivalent formulations of the second law, but a clean one is:
The total entropy of an isolated system never decreases.
For reversible processes (idealized, infinitely slow, no friction), the equality holds: $\Delta S = 0$. For irreversible processes (everything in the real world), $\Delta S > 0$. Mixing is a prototypical irreversible process, as are heat flow from hot to cold, friction, diffusion, and every other spontaneous macroscopic phenomenon.
The second law sets the arrow of time in physics. Newton's laws and Maxwell's equations are time-symmetric — they look the same if you run the clock backwards. But the second law is not. It picks out a direction: "entropy is increasing" is the direction we call "forward in time." Why the universe has this preferred direction is a deep question, ultimately connected to the very low-entropy state the universe started in at the Big Bang.
Why this picture works so well
Notice something subtle: we don't need to know anything about the forces between particles, the shapes of the molecules, or any detailed physics. The Boltzmann formula just counts microstates. It's almost purely a combinatorial/geometric statement about the space of possibilities, not about what any specific particle is doing.
This is why statistical mechanics can make predictions about wildly different systems — gases, magnets, neutron stars, black holes — using the same underlying framework. As long as you can count the microstates (or at least estimate their growth rate with energy), you can get the thermodynamic properties without solving Newton's equations for particles.
Real-world examples
- Ink drop in water. Millions of ink molecules in a drop become distributed across trillions of water molecules. The entropy increases by roughly $R \ln(V_f/V_i)$ per mole of ink. For a 1 mL drop in a 1 L glass, that's $V_f/V_i \approx 1000$, so $\Delta S \approx R \ln 1000 \approx 57\ \text{J/(K·mol)}$ — substantial, and spontaneous.
- Perfume in a room. Initially localized in the bottle, eventually distributed through the room. Same math, larger volume ratio. Every time you open a bottle of perfume, you're performing a huge entropy increase that you'll never see reversed in your lifetime.
- Heat flowing from hot to cold. A cup of hot coffee on a cold table. Heat $Q$ flows from the coffee (temperature $T_h$) to the table and air (temperature $T_c$). The hot object loses $Q/T_h$ of entropy; the cold object gains $Q/T_c$. Since $T_h > T_c$, the gain is larger than the loss. Net $\Delta S = Q/T_c - Q/T_h > 0$. Heat flowing the other way — cold to hot — would require $\Delta S < 0$, which (by the second law) never happens spontaneously.
- Ice melting. The liquid water at 0°C has more microstates than the frozen ice at 0°C (molecules can translate, not just vibrate around fixed lattice points). At temperatures above freezing, the entropy gain from melting outweighs the energy cost, so melting proceeds spontaneously.
- Metabolism and life. Living organisms create local decreases in entropy — growing complex structures, organizing molecules, building proteins. But they pay for these decreases by exporting more entropy to the environment, mostly as heat. The total entropy of "organism + environment" always increases. Life exploits the second law; it doesn't violate it.
- The arrow of time. Why do we remember the past and not the future? Why do broken eggs never reassemble? The modern answer is that the universe started in a very low-entropy state (the hot, dense, uniform Big Bang) and is slowly evolving toward higher entropy. Our psychological experience of time moving "forward" is aligned with the direction of entropy increase.
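Two of the estimates above can be checked in a few lines; the coffee temperatures and heat amount are illustrative assumptions, not values fixed by the examples:

```python
from math import log

R = 8.314  # gas constant, J/(K mol)

# Coffee cooling: heat Q flows from hot (T_h) to cold (T_c).
# Q, T_h, T_c are illustrative assumptions.
Q, T_h, T_c = 100.0, 350.0, 290.0  # J, K, K
dS_net = Q / T_c - Q / T_h
print(f"heat flow: {dS_net:+.4f} J/K")  # positive -> allowed direction

# Ink drop: 1 mL of ink gaining access to ~1 L of water
dS_ink = R * log(1000)
print(f"ink drop:  {dS_ink:.1f} J/(K mol)")  # ~57 J/(K mol)
```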
Gibbs paradox and identical particles
Here's a subtle puzzle: what happens if the two "gases" being mixed are actually the same gas? Say you remove a barrier between two halves of a box, both containing pure oxygen at the same temperature and pressure. Does the entropy increase?
Classically, if you treated the O₂ molecules as distinguishable, the formula would say yes — just like mixing red and blue gases. But physically, nothing observable has changed by removing the barrier. The gas looks identical before and after. There's no way to extract work from the "mixing" or to un-mix by any measurement.
The resolution — due to Gibbs — is that identical particles are truly indistinguishable. Swapping two oxygen molecules doesn't produce a new microstate; it's literally the same microstate because there's no way, even in principle, to tell which molecule is which. When you count microstates correctly for identical particles (which requires quantum mechanical statistics), the "entropy of mixing" for two samples of the same gas is zero, as it must be physically.
This is called the Gibbs paradox and its resolution was one of the early hints that classical physics wasn't complete — that particles are fundamentally quantum objects. For distinguishable particles (different gases), the mixing entropy is real and measurable. For indistinguishable particles (the same gas), it's zero.
Common mistakes
- Calling entropy "disorder." This is a popular simplification but it can be misleading. Entropy is really about the number of microstates consistent with a macrostate, not about subjective "messiness." A well-organized filing cabinet has low entropy not because it's neat in a human-aesthetic sense but because most of its molecules are constrained to specific positions by the structure of the cellulose and ink.
- Thinking entropy "prefers" disorder. The second law doesn't enforce anything or "push" systems toward anything. It's a purely statistical statement: a random microstate is much more likely to be in a high-entropy macrostate because there are many more such microstates. No mysterious force — just combinatorics.
- Thinking entropy can't decrease locally. It can and does. Refrigerators decrease the entropy of their contents. Your body decreases its local entropy (growing, organizing). The second law only says the total entropy of an isolated system can't decrease. A local decrease is always paid for by a larger increase elsewhere.
- Confusing entropy with heat. They're related but distinct. Entropy has units of J/K (energy per temperature). Heat has units of J. The relation $\Delta S = Q_{\text{rev}}/T$ (for reversible heat transfer at temperature $T$) connects them. Don't call heat "entropy."
- Treating $S = k_B \ln \Omega$ as only applying to ideal gases. The formula is universal. It applies to any system where you can count (or estimate) the number of accessible microstates: solids, liquids, magnets, neutron stars, black holes. The formula is agnostic about what the system is.
- Assuming entropy is subjective. Some philosophical writing suggests that entropy depends on what the observer knows or cares about. In classical thermodynamics, there's a more objective definition: the microstates are real, and the count is a fact about the physical system, not about the observer's knowledge. (There's a more nuanced information-theoretic version where "subjective" plays a role, but that's a different conversation.)
Try it
- Slide the number of red and blue particles. At 10 particles per color you can almost count the microstates; at 100 the combinatorics are astronomical.
- Toggle the barrier. With the barrier in place, each gas is confined to its half and the entropy is at its initial value. Remove the barrier and the entropy jumps up to the mixed-state value and stays there — the system never spontaneously returns.
- Watch the entropy bar on the right side of the screen. It shows the current entropy relative to the initial separated state. Closed barrier: zero. Open barrier: $2R \ln 2 \approx 11.5$ J/K for 1 mole of each gas.
- Enable "Show probability of unmixing" to see the numerical probability of spontaneous separation at the current particle count. Even for just 20 particles, the probability is already tiny. For Avogadro-scale numbers, it's effectively zero.
- Enable the microstate histogram to see the binomial distribution of microstates by "how many reds are on the left." Notice how sharply peaked it is at the middle — that's why the system strongly prefers 50/50 splits.
- Try very unequal particle counts (say 5 reds vs 95 blues). The mixing entropy formula changes — see whether you can guess the pattern before the display shows it.
- Try exactly equal numbers (50 red, 50 blue). The peak of the distribution is sharpest, and the system is most committed to the mixed state.
Interactive Visualization