Entropy: Mixing Gases and the Second Law

April 12, 2026

Problem

Two gases of different colors are separated by a barrier in a box. When the barrier is removed, the gases mix. Compute the entropy change using Boltzmann's formula S = k_B ln Ω, and show why the mixing is statistically irreversible.

Explanation

When you let a drop of ink fall into a glass of water, the ink spreads out and colors the water uniformly. You never see the ink spontaneously un-spread back into a drop. When you open a bottle of perfume in a room, the scent fills the air over a few minutes — you never see the molecules rush back into the bottle. When you mix cream into coffee, you can stir it together, but you can't un-mix it by stirring.

All of these everyday observations are expressions of the second law of thermodynamics: some processes happen spontaneously in one direction and never the reverse. The quantity that makes this precise is called entropy, and there is one equation — attributed to Ludwig Boltzmann and carved into his tombstone in Vienna — that tells you what entropy really is:

S = k_B ln Ω

where k_B = 1.381 × 10^-23 J/K is Boltzmann's constant and Ω is the number of microstates that correspond to a given macrostate. That's a packed sentence, so let me unpack it carefully.

Microstates vs macrostates — the central distinction

Imagine a box divided in half by a removable barrier. On the left are 4 red balls. On the right are 4 blue balls. You take out the barrier.

  • A microstate is the full, detailed description of every single ball: which position it's in, which direction it's moving. If you could see every ball individually, a microstate tells you exactly what the system looks like down to the last detail.
  • A macrostate is a "big picture" description. Examples: "all 4 reds on the left, all 4 blues on the right" — or "reds and blues evenly mixed" — or "3 reds on the right and 2 blues on the left and…"

Many different microstates can be consistent with a single macrostate description. The macrostate "evenly mixed" doesn't specify which specific red ball is where — it only says the overall distribution is 50/50. So many specific ball-by-ball arrangements all count as "evenly mixed."

Here's the key insight: some macrostates have way more microstates than others.

Let me count for a simple example. Suppose you have 4 red balls and 4 blue balls, 8 positions in the box (4 on each side). The macrostate "all 4 reds on the left" can only happen one way (counting only which side each ball is on). The macrostate "2 reds left, 2 reds right" has to choose which 2 of the 4 reds go to each side — C(4, 2) = 6 ways.

Scale this up to a box with N particles that can be on either side. The number of microstates with exactly n_L particles on the left is the binomial coefficient C(N, n_L) = N! / (n_L! (N − n_L)!). This distribution is sharply peaked at n_L = N/2 — the "evenly mixed" macrostate has overwhelmingly more microstates than any other.
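This counting is easy to verify directly. A minimal Python sketch (standalone, standard library only, not tied to the article's visualization):

```python
from math import comb

# Microstates for N particles that can each sit on either side of the box.
# The macrostate "n_L particles on the left" contains C(N, n_L) microstates.
N = 8
for n_L in range(N + 1):
    print(n_L, comb(N, n_L))  # 1, 8, 28, 56, 70, 56, 28, 8, 1

# The count peaks at n_L = N/2 (here C(8, 4) = 70), and the peak grows
# far faster than the tails as N increases.
```

Even at N = 8 the middle macrostate already holds 70 of the 256 total microstates; the imbalance becomes overwhelming as N grows.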

Why mixing increases entropy

Plug into Boltzmann's formula: S = k_B ln Ω. Since Ω grows dramatically as you move from "separated" to "mixed," entropy S grows correspondingly. That's the entire story. Mixing increases entropy because there are way more ways to be mixed than to be separated.

How much more? For N red particles and N blue particles mixing, the initial separated state has Ω_i = 1 (each particle has its designated side — only one arrangement). The fully-accessible mixed state has Ω_f = C(2N, N) ≈ 4^N / √(πN) microstates (by Stirling's approximation). The ratio:

Ω_f / Ω_i ≈ 4^N / √(πN)

For N = 10: about 184,756. For N = 100: about 9 × 10^58. For N = 1 mole (≈ 6 × 10^23), the number of microstates has about 3.6 × 10^23 digits. That's not "lots more microstates." That's an unimaginably larger number — so astronomical that ordinary intuitions about probability break down completely.
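These figures can be reproduced with a few lines of Python. The exact count uses math.comb; the mole-scale digit count goes through log-gamma, since the number itself is far too large to store (a sketch, not part of the article):

```python
from math import comb, sqrt, pi, lgamma, log

def exact_ratio(N):
    """Exact microstate ratio for mixing: C(2N, N)."""
    return comb(2 * N, N)

def stirling_ratio(N):
    """Stirling approximation: 4^N / sqrt(pi * N)."""
    return 4.0 ** N / sqrt(pi * N)

print(exact_ratio(10))     # 184756
print(stirling_ratio(10))  # ~187079: already within about 1.3%

# At mole scale, count the decimal digits of C(2N, N) via log-gamma:
N_A = 6.022e23
digits = (lgamma(2 * N_A + 1) - 2 * lgamma(N_A + 1)) / log(10)
print(digits)  # ~3.6e23 digits
```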

Method 1: Entropy increase via the Boltzmann formula

Take the ratio of final to initial microstates and plug in:

ΔS = k_B ln(Ω_f / Ω_i) = k_B ln C(2N, N)

For N ≈ N_A = 6.022 × 10^23 (one mole of each gas):

ΔS = k_B ln C(2N_A, N_A) ≈ k_B ln(4^(N_A) / √(π N_A))

For the leading behavior, ln(4^(N_A)) = N_A ln 4 = 2 N_A ln 2. Multiply by k_B and use k_B N_A = R:

ΔS ≈ 2R ln 2

Numerically: 2 × 8.314 × 0.6931 ≈ 11.53 J/K.

That's the entropy increase when 1 mole of red gas and 1 mole of blue gas mix in the ratio 1:1. A specific, measurable, finite number.
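As a numerical check of this result (a sketch using standard constant values, with Stirling's approximation standing in for the uncomputable factorials):

```python
from math import log, pi

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = k_B * N_A        # gas constant, ~8.314 J/(mol K)

# ln C(2N_A, N_A) ~ 2 N_A ln 2 - (1/2) ln(pi N_A)   (Stirling)
ln_ratio = 2 * N_A * log(2) - 0.5 * log(pi * N_A)
delta_S = k_B * ln_ratio

print(delta_S)         # ~11.53 J/K
print(2 * R * log(2))  # ~11.53 J/K: the sqrt correction is utterly negligible
```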

Method 2: Entropy increase via the thermodynamic formula

There's a completely different way to compute the same number, using the thermodynamic definition of entropy change, which doesn't require combinatorics at all.

Before the barrier is removed, each gas occupies half the box — say volume V. After the barrier is removed, each gas has access to the entire box, volume 2V. Each gas has effectively undergone an isothermal expansion from V to 2V (the temperature stayed the same throughout, since the other gas was at the same temperature to begin with). For an isothermal expansion of an ideal gas:

ΔS_one gas = nR ln(V_f / V_i) = nR ln 2

For each mole of each gas:

ΔS_red = R ln 2

ΔS_blue = R ln 2

ΔS_total = 2R ln 2 ≈ 11.53 J/K

Same answer, completely different route. The combinatorial formula (counting microstates) and the thermodynamic formula (each gas expanding into the full volume) both give 2R ln 2. This agreement is not a coincidence — it's a deep consistency check of statistical mechanics, and it's one of the reasons we believe Boltzmann's formula S = k_B ln Ω is fundamentally correct.
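For particle numbers small enough to count exactly, you can watch the two routes converge. A Python sketch (the per-particle framing below is my own: the thermodynamic value for 2N particles is 2N k_B ln 2, since each particle's accessible volume doubles):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Method 1: k_B ln C(2N, N).  Method 2: 2N k_B ln 2.
for N in (10, 100, 1000):
    combinatorial = k_B * log(comb(2 * N, N))
    thermodynamic = 2 * N * k_B * log(2)
    print(N, combinatorial / thermodynamic)

# The ratio climbs toward 1 (about 0.87, 0.98, 0.997),
# because the sqrt(pi N) correction shrinks relative to 4^N.
```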

The probability of spontaneous unmixing

If entropy only tells us "mixed is more likely than separated," couldn't the system just happen to unmix by random chance? In principle yes. In practice the probability is so small that it never happens in the lifetime of the universe.

The probability of finding all N particles of one color on their original side (spontaneously unmixing) is:

P_sep = Ω_separated / Ω_total ≈ 1 / C(2N, N)

For various N:

  • N = 10: P ≈ 5.4 × 10^-6 (about 5 in a million)
  • N = 20: P ≈ 7.3 × 10^-12 (about 7 in a trillion)
  • N = 100: P ≈ 1.1 × 10^-59 (about one in 10^59)
  • N = 10^23 (Avogadro-scale): P ≈ 10^(-6 × 10^22)

For N = 10^23, if you watched the gas every nanosecond for the entire current age of the universe (≈ 10^17 seconds), you would make about 10^26 observations. The chance of seeing a single spontaneous unmixing is effectively zero to any practical precision.
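The quoted probabilities follow from a single line of combinatorics. A quick standalone Python check:

```python
from math import comb

def p_unmix(N):
    """Probability that a random microstate has all N particles of each
    color back on their original sides: 1 / C(2N, N)."""
    return 1 / comb(2 * N, N)

for N in (10, 20, 100):
    print(N, p_unmix(N))
# 10  ~5.4e-06
# 20  ~7.3e-12
# 100 ~1.1e-59
```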

So the second law isn't strictly absolute — there's a microscopic probability of violation — but that probability is so outrageously small that the distinction is meaningless. The second law is a statistical certainty with more decimal places of reliability than any physical measurement could ever test.

The second law of thermodynamics

There are several equivalent formulations of the second law, but a clean one is:

The total entropy of an isolated system never decreases.

ΔS_total ≥ 0

For reversible processes (idealized, infinitely slow, no friction), the equality holds: ΔS = 0. For irreversible processes (everything in the real world), ΔS > 0. Mixing is a prototypical irreversible process, as are heat flow from hot to cold, friction, diffusion, and every other spontaneous macroscopic phenomenon.

The second law sets the arrow of time in physics. Newton's laws and Maxwell's equations are time-symmetric — they look the same if you run the clock backwards. But the second law is not. It picks out a direction: "entropy is increasing" is the direction we call "forward in time." Why the universe has this preferred direction is a deep question, ultimately connected to the very low-entropy state the universe started in at the Big Bang.

Why this picture works so well

Notice something subtle: we don't need to know anything about the forces between particles, the shapes of the molecules, or any detailed physics. The Boltzmann formula S = k_B ln Ω just counts microstates. It's almost purely a combinatorial/geometric statement about the space of possibilities, not about what any specific particle is doing.

This is why statistical mechanics can make predictions about wildly different systems — gases, magnets, neutron stars, black holes — using the same underlying framework. As long as you can count the microstates (or at least estimate their growth rate with energy), you can get the thermodynamic properties without solving Newton's equations for 10^23 particles.

Real-world examples

  • Ink drop in water. Millions of ink molecules in a drop become distributed across trillions of water molecules. The entropy increases by roughly nR ln(V_glass / V_drop) per mole of ink. For a 1 mL drop in a 1 L glass, that's ln(1000) ≈ 6.9, so ΔS ≈ 6.9 nR — substantial, and spontaneous.
  • Perfume in a room. Initially localized in the bottle, eventually distributed through the room. Same math, larger volume ratio. Every time you open a bottle of perfume, you're performing a huge entropy increase that you'll never see reversed in your lifetime.
  • Heat flowing from hot to cold. A cup of hot coffee on a cold table. Heat Q flows from the coffee (temperature T_hot) to the table and air (temperature T_cold). The hot object loses Q/T_hot of entropy; the cold object gains Q/T_cold. Since T_cold < T_hot, the gain is larger than the loss. Net ΔS > 0. Heat flowing the other way — cold to hot — would require ΔS < 0, which (by the second law) never happens spontaneously.
  • Ice melting. The liquid water at 0°C has more microstates than the frozen ice at 0°C (molecules can translate, not just vibrate around fixed lattice points). At temperatures above freezing, the entropy gain from melting outweighs the energy cost, so melting proceeds spontaneously.
  • Metabolism and life. Living organisms create local decreases in entropy — growing complex structures, organizing molecules, building proteins. But they pay for these decreases by exporting more entropy to the environment, mostly as heat. The total entropy of "organism + environment" always increases. Life exploits the second law; it doesn't violate it.
  • The arrow of time. Why do we remember the past and not the future? Why do broken eggs never reassemble? The modern answer is that the universe started in a very low-entropy state (the hot, dense, uniform Big Bang) and is slowly evolving toward higher entropy. Our psychological experience of time moving "forward" is aligned with the direction of entropy increase.
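The entropy bookkeeping in the coffee example above takes only a few lines to check. The heat and temperatures below are illustrative values chosen for this sketch, not numbers from the text:

```python
# Heat Q flows from a hot body to a cold one; assume both are large
# enough that their temperatures barely change during the transfer.
Q = 100.0       # J transferred (illustrative)
T_hot = 350.0   # K, roughly hot coffee
T_cold = 290.0  # K, roughly the room

dS_hot = -Q / T_hot    # entropy lost by the coffee
dS_cold = Q / T_cold   # entropy gained by the surroundings
dS_total = dS_hot + dS_cold

print(dS_total)  # positive (~0.059 J/K), as the second law requires
```

Reversing the flow just flips both signs and makes dS_total negative, which is why heat never spontaneously runs from cold to hot.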

Gibbs paradox and identical particles

Here's a subtle puzzle: what happens if the two "gases" being mixed are actually the same gas? Say you remove a barrier between two halves of a box, both containing pure oxygen at the same temperature and pressure. Does the entropy increase?

Classically, if you treated the O₂ molecules as distinguishable, the formula would say yes — ΔS = 2R ln 2, just like mixing red and blue gases. But physically, nothing observable has changed by removing the barrier. The gas looks identical before and after. There's no way to extract work from the "mixing" or to un-mix by any measurement.

The resolution — due to Gibbs — is that identical particles are truly indistinguishable. Swapping two oxygen molecules doesn't produce a new microstate; it's literally the same microstate because there's no way, even in principle, to tell which molecule is which. When you count microstates correctly for identical particles (which requires quantum mechanical statistics), the "entropy of mixing" for two samples of the same gas is zero, as it must be physically.

This is called the Gibbs paradox and its resolution was one of the early hints that classical physics wasn't complete — that particles are fundamentally quantum objects. For distinguishable particles (different gases), the mixing entropy is real and measurable. For indistinguishable particles (the same gas), it's zero.

Common mistakes

  • Calling entropy "disorder." This is a popular simplification but it can be misleading. Entropy is really about the number of microstates consistent with a macrostate, not about subjective "messiness." A well-organized filing cabinet has low entropy not because it's neat in a human-aesthetic sense but because most of its molecules are constrained to specific positions by the structure of the cellulose and ink.
  • Thinking entropy "prefers" disorder. The second law doesn't enforce anything or "push" systems toward anything. It's a purely statistical statement: a random microstate is much more likely to be in a high-entropy macrostate because there are many more such microstates. No mysterious force — just combinatorics.
  • Thinking entropy can't decrease locally. It can and does. Refrigerators decrease the entropy of their contents. Your body decreases its local entropy (growing, organizing). The second law only says the total entropy of an isolated system can't decrease. A local decrease is always paid for by a larger increase elsewhere.
  • Confusing entropy with heat. They're related but distinct. Entropy has units of J/K (energy per temperature). Heat has units of J. The relation ΔS = Q/T (for reversible heat transfer at temperature T) connects them. Don't call heat "entropy."
  • Treating S = k_B ln Ω as only applying to ideal gases. The formula is universal. It applies to any system where you can count (or estimate) the number of accessible microstates: solids, liquids, magnets, neutron stars, black holes. The formula is agnostic about what the system is.
  • Assuming entropy is subjective. Some philosophical writing suggests that entropy depends on what the observer knows or cares about. In classical thermodynamics, there's a more objective definition: the microstates are real, and the count is a fact about the physical system, not about the observer's knowledge. (There's a more nuanced information-theoretic version where "subjective" plays a role, but that's a different conversation.)

Try it

  • Slide the number of red and blue particles. At 10 particles per color you can almost count the microstates; at 100 the combinatorics are astronomical.
  • Toggle the barrier. With the barrier in place, each gas is confined to its half and the entropy is at its initial value. Remove the barrier and the entropy jumps up to the mixed-state value and stays there — the system never spontaneously returns.
  • Watch the entropy bar on the right side of the screen. It shows the current entropy relative to the initial separated state. Closed barrier: zero. Open barrier: 2R ln 2 ≈ 11.5 J/K for 1 mole of each gas.
  • Enable "Show probability of unmixing" to see the numerical probability of spontaneous separation at the current particle count. Even for just 20 particles, the probability is already tiny. For Avogadro-scale numbers, it's effectively zero.
  • Enable the microstate histogram to see the binomial distribution of microstates by "how many reds are on the left." Notice how sharply peaked it is at the middle — that's why the system strongly prefers 50/50 splits.
  • Try very unequal particle counts (say 5 reds vs 95 blues). The mixing entropy formula changes — see whether you can guess the pattern before the display shows it.
  • Try exactly equal numbers (50 red, 50 blue). The peak of the distribution is sharpest, and the system is most committed to the mixed state.
