Markov Chains: State Diagrams and Steady State

April 13, 2026

Problem

Weather model: if sunny today, P(sunny tomorrow) = 0.8. If rainy today, P(sunny tomorrow) = 0.4. Draw the state diagram and find the long-run probability of sunny weather.

Explanation

What is a Markov chain?

A Markov chain is a random process that moves between a finite set of states step by step, where the next state depends only on the current state — not the full history. That "memoryless" property is called the Markov property.

The chain is fully described by a transition matrix $P$, where $P_{ij} = P(\text{next state} = j \mid \text{current state} = i)$. Each row sums to 1 (some next state must happen).

The weather model

States: S (sunny), R (rainy).

| From \ To | S   | R   |
|-----------|-----|-----|
| S         | 0.8 | 0.2 |
| R         | 0.4 | 0.6 |

So $P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}$.

(Entry by entry: $P_{SS} = 0.8$, $P_{SR} = 0.2$, $P_{RS} = 0.4$, $P_{RR} = 0.6$.)
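As a quick sanity check in Python (a sketch using NumPy; putting sunny at index 0 is our own convention, not part of the model):

```python
import numpy as np

# State order: index 0 = sunny (S), index 1 = rainy (R) -- our convention.
P = np.array([[0.8, 0.2],   # from S: P_SS, P_SR
              [0.4, 0.6]])  # from R: P_RS, P_RR

# Each row is a probability distribution over next states, so it sums to 1.
print(P.sum(axis=1))  # -> [1. 1.]
```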

Step-by-step — steady-state probabilities

The steady-state (or stationary) distribution $\pi = (\pi_S, \pi_R)$ satisfies $\pi P = \pi$ and $\pi_S + \pi_R = 1$. It's the long-run fraction of days spent in each state.

Step 1 — Write the balance equations. For each state, the flow in equals the flow out:

$$\pi_S = 0.8\,\pi_S + 0.4\,\pi_R$$
$$\pi_R = 0.2\,\pi_S + 0.6\,\pi_R$$

Step 2 — Simplify. Rearranging the first equation: $0.2\,\pi_S = 0.4\,\pi_R \implies \pi_S = 2\pi_R$.

Step 3 — Apply $\pi_S + \pi_R = 1$: $2\pi_R + \pi_R = 1 \implies \pi_R = \tfrac{1}{3}, \quad \pi_S = \tfrac{2}{3}$.

So in the long run, $\boxed{\pi_S = 2/3, \; \pi_R = 1/3}$ — about 67% of days are sunny.
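The same answer falls out numerically (a NumPy sketch, not part of the hand derivation): $\pi P = \pi$ says $\pi^\top$ is an eigenvector of $P^\top$ with eigenvalue 1, so:

```python
import numpy as np

P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# pi P = pi  <=>  P^T pi^T = pi^T: take the eigenvector for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
v = vecs[:, np.argmax(np.isclose(vals, 1.0))].real
pi = v / v.sum()  # normalize so the entries sum to 1

print(pi)  # approximately [0.667, 0.333], i.e. pi_S = 2/3, pi_R = 1/3
```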

Two-step transition

$P(\text{sunny in 2 days} \mid \text{sunny today}) = P_{SS}^2 + P_{SR}\,P_{RS} = 0.8^2 + 0.2 \cdot 0.4 = 0.64 + 0.08 = 0.72$. Equivalently, compute the $(S, S)$ entry of $P^2$.
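The matrix route is one line of NumPy (a sketch, same state ordering as above):

```python
import numpy as np

P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

P2 = P @ P       # two-step transition matrix
# Row S, column S: probability of sunny in 2 days given sunny today.
print(P2[0, 0])  # -> 0.72 (up to floating-point rounding)
```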

Verification

Plug $\pi$ back into $\pi P$:

  • $(2/3)(0.8) + (1/3)(0.4) = 8/15 + 2/15 = 10/15 = 2/3$ ✓
  • $(2/3)(0.2) + (1/3)(0.6) = 4/30 + 6/30 = 10/30 = 1/3$ ✓
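The same check in code (a NumPy sketch):

```python
import numpy as np

P = np.array([[0.8, 0.2],
              [0.4, 0.6]])
pi = np.array([2/3, 1/3])

# A stationary distribution is unchanged by one more step of the chain.
print(np.allclose(pi @ P, pi))  # -> True
```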

Common mistakes

  • Forgetting rows sum to 1. Every row is a probability distribution over next states.
  • Confusing "state" with "observation." A hidden-Markov model has hidden states and observations; a plain Markov chain's state is directly observable.
  • Assuming a unique steady state. You need the chain to be irreducible (every state reachable from every other) and aperiodic. Our weather chain is both.

Try it in the visualization

Adjust the two transition probabilities and watch the state diagram, the $P$ matrix, and the steady-state bar chart all update together. Step the chain one day at a time to watch it converge toward $\pi$.
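If you'd rather simulate than use the widget, here is a minimal sketch in pure Python (the seed and day count are arbitrary choices):

```python
import random

random.seed(1)  # arbitrary seed, just for reproducibility
P_SUNNY = {"S": 0.8, "R": 0.4}  # P(sunny tomorrow | today's state)

state, sunny_days, days = "S", 0, 100_000
for _ in range(days):
    state = "S" if random.random() < P_SUNNY[state] else "R"
    sunny_days += (state == "S")

print(sunny_days / days)  # close to 2/3, the steady-state value
```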

Interactive Visualization

*(Interactive widget: sliders for the two transition probabilities, defaulting to P(sunny | sunny) = 0.80 and P(sunny | rainy) = 0.40, plus a starting-state selector and simulation controls.)*