Singular Value Decomposition (SVD)

April 13, 2026

Problem

Decompose A = [[3,2],[2,3]] as A = UΣVᵀ. Show how the singular values reveal the geometry of the transformation.

Explanation

The SVD statement

Every $m \times n$ real matrix $A$ has a decomposition $A = U \Sigma V^T$

where

  • $U$ is $m \times m$ orthogonal (columns = left singular vectors),
  • $\Sigma$ is $m \times n$ diagonal with non-negative entries (singular values $\sigma_1 \ge \sigma_2 \ge \cdots \ge 0$),
  • $V$ is $n \times n$ orthogonal (columns = right singular vectors).

SVD exists for any matrix — no invertibility, no squareness, no symmetry required.
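This universality is easy to see in practice with `numpy.linalg.svd`, which factors a matrix of any shape. A minimal sketch (the 3×2 matrix here is just an illustrative example):

```python
import numpy as np

# SVD exists for any matrix, square or not: a 3x2 example (values chosen arbitrarily).
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])

U, s, Vt = np.linalg.svd(A)           # full SVD: U is 3x3, Vt is 2x2
Sigma = np.zeros(A.shape)             # rebuild the rectangular 3x2 Sigma from the vector s
Sigma[:len(s), :len(s)] = np.diag(s)

print(U.shape, Sigma.shape, Vt.shape)             # (3, 3) (3, 2) (2, 2)
print(np.allclose(U @ Sigma @ Vt, A))             # True
print(np.all(s >= 0), np.all(np.diff(s) <= 0))    # True True: non-negative, descending
```

Note that NumPy returns the singular values as a vector, so the rectangular $\Sigma$ has to be rebuilt explicitly before forming the full-size product.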

Step-by-step

$A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$ (symmetric — SVD simplifies).

Step 1 — $A^T A$. $A^T A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix} \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix} = \begin{pmatrix} 13 & 12 \\ 12 & 13 \end{pmatrix}$

Step 2 — Eigenvalues of $A^T A$.

Characteristic polynomial: $(13 - \lambda)^2 - 144 = \lambda^2 - 26\lambda + 169 - 144 = \lambda^2 - 26\lambda + 25$.

$\lambda = \dfrac{26 \pm \sqrt{676 - 100}}{2} = \dfrac{26 \pm 24}{2} = 25,\ 1$.
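The same roots drop out of the quadratic formula directly; a small stdlib check:

```python
import math

# Roots of the characteristic polynomial lambda^2 - 26*lambda + 25 = 0
b, c = -26.0, 25.0
disc = b * b - 4 * c                      # 676 - 100 = 576
lam1 = (-b + math.sqrt(disc)) / 2
lam2 = (-b - math.sqrt(disc)) / 2
print(lam1, lam2)                         # 25.0 1.0
```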

Step 3 — Singular values.

$\sigma_i = \sqrt{\lambda_i}$: $\sigma_1 = 5, \quad \sigma_2 = 1$

$\Sigma = \begin{pmatrix} 5 & 0 \\ 0 & 1 \end{pmatrix}$

Step 4 — Right singular vectors (eigenvectors of $A^T A$).

$\lambda = 25$: $(A^T A - 25I)\mathbf{v} = \mathbf{0}$: $\begin{pmatrix} -12 & 12 \\ 12 & -12 \end{pmatrix} \mathbf{v} = \mathbf{0} \implies \mathbf{v} = (1, 1)^T / \sqrt{2}$

$\lambda = 1$: $(A^T A - I)\mathbf{v} = \mathbf{0}$: $\begin{pmatrix} 12 & 12 \\ 12 & 12 \end{pmatrix} \mathbf{v} = \mathbf{0} \implies \mathbf{v} = (1, -1)^T / \sqrt{2}$

$V = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$
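These hand-computed vectors can be sanity-checked numerically: the columns of $V$ should be orthonormal and should be eigenvectors of $A^T A$ with eigenvalues 25 and 1. A sketch with NumPy:

```python
import numpy as np

A = np.array([[3.0, 2.0], [2.0, 3.0]])
V = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # columns v1, v2 from above

print(np.allclose(V.T @ V, np.eye(2)))                 # True: orthonormal columns
print(np.allclose(A.T @ A @ V[:, 0], 25 * V[:, 0]))    # True: eigenvalue 25
print(np.allclose(A.T @ A @ V[:, 1], V[:, 1]))         # True: eigenvalue 1
```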

Step 5 — Left singular vectors: $\mathbf{u}_i = A \mathbf{v}_i / \sigma_i$.

$\mathbf{u}_1 = A \mathbf{v}_1 / 5 = \dfrac{1}{5} \cdot \dfrac{1}{\sqrt{2}} \begin{pmatrix} 5 \\ 5 \end{pmatrix} = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix}$

$\mathbf{u}_2 = A \mathbf{v}_2 / 1 = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \end{pmatrix}$

$U = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$

(Same as $V$ here because $A$ is symmetric positive-definite.)
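The formula $\mathbf{u}_i = A\mathbf{v}_i / \sigma_i$ vectorizes nicely: form $AV$ and divide column $i$ by $\sigma_i$. A quick NumPy check that this recovers the $U$ above:

```python
import numpy as np

A = np.array([[3.0, 2.0], [2.0, 3.0]])
V = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
sigmas = np.array([5.0, 1.0])

U = (A @ V) / sigmas                      # broadcasting divides column i by sigma_i
print(np.allclose(U, V))                  # True: U = V since A is symmetric positive-definite
print(np.allclose(U.T @ U, np.eye(2)))    # True: columns orthonormal
```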

Verification: compute $U \Sigma V^T$ — it reproduces $A$ exactly.

$U \Sigma = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 5 & 1 \\ 5 & -1 \end{pmatrix}, \quad U \Sigma V^T = \dfrac{1}{2} \begin{pmatrix} 5 & 1 \\ 5 & -1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \dfrac{1}{2} \begin{pmatrix} 6 & 4 \\ 4 & 6 \end{pmatrix} = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$ ✓
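The same check in code, assembling $U$, $\Sigma$, $V$ from the values derived above:

```python
import numpy as np

U = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
Sigma = np.diag([5.0, 1.0])
V = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# The product should reproduce A = [[3, 2], [2, 3]] up to floating-point round-off.
print(np.allclose(U @ Sigma @ V.T, [[3, 2], [2, 3]]))   # True
```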

Geometric interpretation

Any linear map decomposes as:

  1. Rotate input via $V^T$ (align to principal axes).
  2. Stretch axis-by-axis by $\sigma_i$ (the singular values).
  3. Rotate output via $U$ (align to output directions).

For our matrix: the unit circle, passed through $A$, becomes an ellipse. The semi-axes are $\sigma_1 = 5$ (long) and $\sigma_2 = 1$ (short), aligned with $\mathbf{u}_1$ and $\mathbf{u}_2$ respectively.
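This is easy to confirm numerically: push sample points of the unit circle through $A$ and measure how far the images land from the origin. The extremes should be $\sigma_1 = 5$ and $\sigma_2 = 1$. A sketch:

```python
import numpy as np

A = np.array([[3.0, 2.0], [2.0, 3.0]])
theta = np.linspace(0, 2 * np.pi, 1001)              # sample the unit circle
circle = np.vstack([np.cos(theta), np.sin(theta)])
ellipse = A @ circle                                 # image points form an ellipse
radii = np.linalg.norm(ellipse, axis=0)

print(round(radii.max(), 6), round(radii.min(), 6))  # 5.0 1.0 (semi-axes = singular values)
```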

Why SVD is the most important decomposition in linear algebra

  • Exists for every matrix — no structural assumptions needed.
  • Reveals rank: $\operatorname{rank}(A)$ = number of non-zero singular values.
  • Best low-rank approximation (Eckart-Young): truncating the SVD at $k$ terms gives the optimal rank-$k$ approximation in both Frobenius and spectral norm.
  • Pseudoinverse: $A^+ = V \Sigma^+ U^T$ with $\Sigma^+$ inverting the non-zero singular values. Solves least-squares even when $A^T A$ is singular.
  • Condition number: $\sigma_1 / \sigma_n$ measures sensitivity of linear systems to perturbations.
  • Underlies PCA, latent-semantic analysis, recommender systems, denoising, and compression.
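As a concrete taste of Eckart-Young on our worked example: the best rank-1 approximation of $A$ keeps only the $\sigma_1$ term, and the approximation error equals the dropped $\sigma_2 = 1$. A sketch with NumPy:

```python
import numpy as np

A = np.array([[3.0, 2.0], [2.0, 3.0]])
U, s, Vt = np.linalg.svd(A)
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])             # keep only the sigma_1 = 5 term

print(np.allclose(A1, [[2.5, 2.5], [2.5, 2.5]]))    # True: the optimal rank-1 approximation
print(round(np.linalg.norm(A - A1, 2), 6))          # 1.0 = sigma_2, the spectral-norm error
print(np.linalg.matrix_rank(A1))                    # 1
```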

Compact/thin SVD

For an $m \times n$ matrix of rank $r$, the thin SVD keeps only the first $r$ singular values: $A = U_r \Sigma_r V_r^T$

where $U_r$ is $m \times r$, $\Sigma_r$ is $r \times r$, and $V_r$ is $n \times r$. Stores and computes with far fewer numbers when $r \ll \min(m, n)$.
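In NumPy, `full_matrices=False` gives this economical form (it keeps $\min(m, n)$ columns; truncating further to the first $r$ gives the rank-$r$ version). An illustrative tall-matrix example:

```python
import numpy as np

# A tall 5x2 matrix (arbitrary example values): the thin SVD stores a 5x2 U instead of 5x5.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0],
              [2.0, 0.0],
              [0.0, 1.0]])

Ur, s, Vrt = np.linalg.svd(A, full_matrices=False)
print(Ur.shape, s.shape, Vrt.shape)     # (5, 2) (2,) (2, 2)
print(np.allclose(Ur * s @ Vrt, A))     # True: Ur * s scales column i by sigma_i
```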

Common mistakes

  • Taking square roots of negatives. Singular values are always non-negative (unlike eigenvalues). They come from eigenvalues of $A^T A$, which is positive semi-definite.
  • Mixing up $U$ and $V$. $U$'s columns span the column space; $V$'s span the row space.
  • Confusing SVD with eigendecomposition. SVD exists for arbitrary matrices; eigendecomposition requires a square, diagonalizable matrix. For symmetric matrices they're closely related.
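The symmetric-matrix relationship in the last bullet can be made precise: for a symmetric matrix, the singular values are the absolute values of the eigenvalues. A small check on an illustrative matrix with one negative eigenvalue:

```python
import numpy as np

# B has eigenvalues 3 and -1, so its singular values are |3| = 3 and |-1| = 1.
B = np.array([[1.0, 2.0], [2.0, 1.0]])
eigvals = np.linalg.eigvalsh(B)                  # ascending: [-1., 3.]
svals = np.linalg.svd(B, compute_uv=False)       # descending: [3., 1.]

print(np.allclose(np.sort(np.abs(eigvals))[::-1], svals))   # True
```

For positive-definite matrices (like the $A$ in this problem) the absolute values change nothing, which is why its eigendecomposition and SVD coincide.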

Try it in the visualization

Apply $A$ to a unit circle, producing an ellipse. Display $U$, $\Sigma$, $V^T$ as three sequential transformations: the circle rotates, stretches, and rotates again. Semi-axis lengths are the singular values; the major axis direction is $\mathbf{u}_1$.

Singular Value Decomposition (SVD) | MathSpin