Symmetric Matrices and the Spectral Theorem

April 13, 2026

Problem

Diagonalize the symmetric matrix A = [[2,1],[1,2]] with orthogonal eigenvectors. Show the eigendirections are perpendicular.

Explanation

The spectral theorem (real, symmetric version)

If A is an n × n real symmetric matrix (A^T = A), then:

  1. A has n real eigenvalues (counted with algebraic multiplicity).
  2. Eigenvectors for distinct eigenvalues are orthogonal.
  3. A can be diagonalized by an orthogonal matrix Q: A = Q D Q^T

with Q^T Q = I and D diagonal, carrying the eigenvalues.

This is sometimes called the orthogonal diagonalization theorem.
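All three conclusions can be checked numerically. A minimal sketch using NumPy's `np.linalg.eigh`, which is designed for symmetric/Hermitian input (the random test matrix is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                 # symmetrize: A is now real symmetric

# eigh returns real eigenvalues and orthonormal eigenvectors
eigvals, Q = np.linalg.eigh(A)

assert np.all(np.isreal(eigvals))                  # 1. real eigenvalues
assert np.allclose(Q.T @ Q, np.eye(4))             # 3. Q is orthogonal
assert np.allclose(Q @ np.diag(eigvals) @ Q.T, A)  # A = Q D Q^T
```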

Step-by-step

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} — symmetric.

Step 1 — Eigenvalues.

Characteristic polynomial: (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 3)(\lambda - 1). \lambda_1 = 3, \quad \lambda_2 = 1

Both real (as the theorem promises). ✓
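As a cross-check, a small NumPy sketch confirms that the roots of the characteristic polynomial agree with the computed eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# roots of the characteristic polynomial λ² − 4λ + 3
roots = np.sort(np.roots([1.0, -4.0, 3.0]))
assert np.allclose(roots, [1.0, 3.0])

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
eigvals = np.linalg.eigvalsh(A)
assert np.allclose(eigvals, [1.0, 3.0])
```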

Step 2 — Eigenvectors.

For \lambda = 3: (A - 3I)\mathbf{v} = 0 \implies \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} \mathbf{v} = 0. Solution: \mathbf{v}_1 = (1, 1)^T.

For \lambda = 1: (A - I)\mathbf{v} = 0 \implies \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \mathbf{v} = 0. Solution: \mathbf{v}_2 = (1, -1)^T.

Check orthogonality: \mathbf{v}_1 \cdot \mathbf{v}_2 = 1 \cdot 1 + 1 \cdot (-1) = 0 ✓ (Exactly as the theorem guarantees.)
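A quick NumPy sketch confirming both eigenpairs and the orthogonality:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
v1 = np.array([1.0, 1.0])     # eigenvector for λ = 3
v2 = np.array([1.0, -1.0])    # eigenvector for λ = 1

assert np.allclose(A @ v1, 3 * v1)   # A v1 = 3 v1
assert np.allclose(A @ v2, 1 * v2)   # A v2 = 1 v2
assert np.isclose(v1 @ v2, 0.0)      # orthogonal, as the theorem guarantees
```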

Step 3 — Normalize.

\|\mathbf{v}_1\| = \sqrt{2}, \|\mathbf{v}_2\| = \sqrt{2}: \mathbf{q}_1 = \dfrac{1}{\sqrt{2}} (1, 1)^T, \quad \mathbf{q}_2 = \dfrac{1}{\sqrt{2}} (1, -1)^T

Step 4 — Assemble. Q = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}

Q is orthogonal (Q^T Q = I), so Q^{-1} = Q^T.
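A NumPy sketch verifying that the transpose really is the inverse:

```python
import numpy as np

Q = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

assert np.allclose(Q.T @ Q, np.eye(2))     # orthogonal: QᵀQ = I
assert np.allclose(np.linalg.inv(Q), Q.T)  # hence Q⁻¹ = Qᵀ, no inversion needed
```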

Step 5 — Verify A = Q D Q^T.

D Q^T = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 3 & 3 \\ 1 & -1 \end{pmatrix}.

Q D Q^T = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \cdot \dfrac{1}{\sqrt{2}} \begin{pmatrix} 3 & 3 \\ 1 & -1 \end{pmatrix} = \dfrac{1}{2} \begin{pmatrix} 4 & 2 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} = A ✓
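Reassembling the factors numerically is a cheap safeguard against transcription slips in hand computation (a NumPy sketch):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
Q = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
D = np.diag([3.0, 1.0])

# entrywise comparison of the reassembled product against A
assert np.allclose(Q @ D @ Q.T, A)
```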

Why symmetric matrices are special

  1. Real eigenvalues guaranteed. Non-symmetric matrices can have complex eigenvalues; symmetric ones can't.
  2. Orthogonal eigenvectors. Even across repeated eigenvalues, you can always choose an orthogonal eigenvector basis.
  3. Always diagonalizable. Never defective.
  4. Geometric structure. The level sets of \mathbf{x}^T A \mathbf{x} are aligned with the eigenvectors — the principal axes.

This is the "cleanest" diagonalization scenario in all of linear algebra.

Applications

  • Principal Component Analysis (PCA): diagonalize the sample covariance matrix; its (orthogonal) eigenvectors are the principal components.
  • Quadratic forms: \mathbf{x}^T A \mathbf{x} simplifies to \lambda_1 y_1^2 + \lambda_2 y_2^2 + \cdots in the eigenvector coordinate system.
  • Rigid-body mechanics: the inertia tensor is symmetric; its eigenvectors give the principal axes of rotation.
  • Fourier analysis: the discrete Fourier transform diagonalizes circulant matrices; the continuous version diagonalizes translation-invariant operators.
  • Spectral graph theory: the adjacency matrix and Laplacian are symmetric; their eigenvectors encode graph structure.
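To make the PCA bullet concrete, a toy sketch on synthetic 2-D data (the data and mixing matrix here are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 correlated 2-D samples, made by mixing independent Gaussians
X = rng.standard_normal((200, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])
X -= X.mean(axis=0)

# sample covariance matrix: symmetric by construction
C = X.T @ X / (len(X) - 1)

# its orthonormal eigenvectors are the principal components
variances, components = np.linalg.eigh(C)
assert np.allclose(components.T @ components, np.eye(2))
```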

Extension: Hermitian matrices

Over \mathbb{C}, the analogue is Hermitian matrices (A^* = A, where * denotes the conjugate transpose). The same conclusions hold: real eigenvalues and unitary diagonalization A = U D U^*.
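A small NumPy sketch with an invented 2×2 Hermitian matrix; note the eigenvalues come out real despite the complex entries:

```python
import numpy as np

H = np.array([[2.0, 1j], [-1j, 3.0]])   # Hermitian: H* = H
assert np.allclose(H, H.conj().T)

w, U = np.linalg.eigh(H)                # eigh also handles complex Hermitian input
assert np.all(np.isreal(w))             # real eigenvalues
assert np.allclose(U @ np.diag(w) @ U.conj().T, H)   # H = U D U*
```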

Generalization: normal matrices

The spectral theorem applies most broadly to normal matrices (A A^* = A^* A) over \mathbb{C}: they're unitarily diagonalizable. This class includes Hermitian, skew-Hermitian, and unitary matrices.
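A rotation matrix is unitary (hence normal) but not symmetric; a sketch confirming normality and its complex, unit-modulus eigenvalues:

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # real rotation matrix

assert np.allclose(R @ R.T, R.T @ R)   # normal: A A* = A* A
w = np.linalg.eigvals(R)
assert np.allclose(np.abs(w), 1.0)     # eigenvalues e^{±iθ} on the unit circle
assert not np.all(np.isreal(w))        # normal but not symmetric: complex λ allowed
```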

Common mistakes

  • Forgetting to normalize. Orthogonality isn't enough; Q must have unit-length columns to be orthogonal as a matrix.
  • Using P (general) instead of Q (orthogonal). You can diagonalize as A = P D P^{-1} with non-orthogonal eigenvectors, but the whole point of the spectral theorem is that symmetric matrices admit orthogonal diagonalization — which is cheaper and numerically stable.
  • Assuming repeated eigenvalues cause defects. Symmetric matrices are still diagonalizable even with repeated eigenvalues; you just pick an orthonormal basis for the eigenspace.
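The third point can be seen numerically with an invented 3×3 symmetric matrix whose eigenvalue 1 is repeated twice:

```python
import numpy as np

# eigenvalues are 4, 1, 1 (the all-ones matrix shifts I by 3, 0, 0) — no defect
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

w, Q = np.linalg.eigh(A)
assert np.allclose(w, [1.0, 1.0, 4.0])       # repeated eigenvalue 1
assert np.allclose(Q.T @ Q, np.eye(3))       # orthonormal basis, even in the eigenspace
assert np.allclose(Q @ np.diag(w) @ Q.T, A)  # fully diagonalized
```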

Try it in the visualization

The matrix acts on the unit circle, producing an ellipse; the eigenvectors emerge as the semi-major and semi-minor axes. Sliders move the off-diagonal entry; eigen-directions rotate in response while always staying perpendicular — confirming the spectral theorem.
