Module V · Article III
The Spectral Theorem and Its Applications
Euclidean and Unitary Spaces
General Spectral Theorem
For a normal operator $A$ ($AA^* = A^*A$) on a finite-dimensional unitary space: there exists an orthonormal basis of eigenvectors. Normal operators include Hermitian, skew-Hermitian, and unitary operators.
This is a unified "framework" result for the most important classes of operators.
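A minimal numerical sketch of the theorem, assuming NumPy: build a Hermitian (hence normal) matrix and check that `eigh` produces an orthonormal eigenbasis that diagonalizes it.

```python
import numpy as np

# Build a random Hermitian (hence normal) matrix: A = A*
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2

lam, Q = np.linalg.eigh(A)  # eigh returns an orthonormal eigenbasis

# Columns of Q are orthonormal: Q* Q = I
assert np.allclose(Q.conj().T @ Q, np.eye(4))
# A is diagonal in that basis: Q* A Q = diag(lambda_1, ..., lambda_n)
assert np.allclose(Q.conj().T @ A @ Q, np.diag(lam))
```

The same check works for any normal matrix; for a Hermitian one the eigenvalues `lam` are additionally real.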
Functions of Operators
If $A = QDQ^{-1}$ ($Q$ is orthogonal/unitary, so $Q^{-1} = Q^*$), then $f(A) = Q f(D) Q^{-1} = Q \operatorname{diag}(f(\lambda_1),...,f(\lambda_n)) Q^{-1}$.
This is an elegant way to compute: matrix roots ($f(t) = \sqrt{t}$), exponentials ($f(t) = e^t$), logarithms, and other functions.
The square root of a matrix: if $A \geq 0$ (all $\lambda_i \geq 0$), then $\sqrt{A} = Q \operatorname{diag}(\sqrt{\lambda_1},...) Q^\top$.
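The recipe above can be sketched numerically, assuming NumPy: form a positive semidefinite matrix, apply $f(t) = \sqrt{t}$ to its eigenvalues, and verify the defining property of the square root.

```python
import numpy as np

# A = M^T M is symmetric positive semidefinite, so all eigenvalues >= 0
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M.T @ M

lam, Q = np.linalg.eigh(A)  # orthonormal eigenbasis, lam >= 0
# sqrt(A) = Q diag(sqrt(lambda_i)) Q^T; clip guards tiny negative round-off
sqrtA = Q @ np.diag(np.sqrt(np.clip(lam, 0, None))) @ Q.T

# Defining property of the matrix square root
assert np.allclose(sqrtA @ sqrtA, A)
```

Replacing `np.sqrt` with `np.exp` or `np.log` in the same line computes the matrix exponential or logarithm of $A$ by the identical mechanism.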
Courant–Fischer Principle
For a symmetric matrix $A$ with eigenvalues $\lambda_1 \leq \lambda_2 \leq \ldots \leq \lambda_n$:
$ \lambda_k = \min_{\dim U = k}\ \max_{v \in U,\ \|v\| = 1} (Av, v). $
This is a minimax characterization of eigenvalues—used in variational problems and in estimating the spectrum of perturbed operators.
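The extreme cases $k = 1$ and $k = n$ are easy to probe numerically, assuming NumPy: every Rayleigh quotient $(Av, v)$ over unit vectors lies between $\lambda_1$ and $\lambda_n$.

```python
import numpy as np

# Symmetric test matrix with sorted eigenvalues lam[0] <= ... <= lam[-1]
rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2
lam = np.sort(np.linalg.eigvalsh(A))

# Rayleigh quotients (Av, v) of random unit vectors all fall in [lam[0], lam[-1]]
V = rng.standard_normal((5, 1000))
V /= np.linalg.norm(V, axis=0)
rayleigh = np.einsum('ij,ij->j', V, A @ V)
assert lam[0] - 1e-12 <= rayleigh.min()
assert rayleigh.max() <= lam[-1] + 1e-12
```

The full minimax statement additionally requires optimizing over $k$-dimensional subspaces $U$, which this random sampling only hints at.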
Applications of SVD
Recommendation systems: user × movie matrix → SVD → latent factors (genres, styles).
Image compression: image = pixel matrix → SVD → keep the first $k$ singular values and vectors. For $k = 50$ out of $1000$, quality is often $> 95\%$ at $< 10\%$ of the file size.
Text analysis (LSA): document × term matrix → SVD → latent semantic axes.
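A toy version of this compression, assuming NumPy: truncate the SVD of a nearly low-rank matrix to its top-$k$ singular triples and measure the relative reconstruction error.

```python
import numpy as np

# A matrix that is rank 15 plus small noise, mimicking a compressible image
rng = np.random.default_rng(3)
L = rng.standard_normal((100, 15)) @ rng.standard_normal((15, 80))
X = L + 0.01 * rng.standard_normal((100, 80))

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 15
Xk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]  # best rank-k approximation (Eckart-Young)

# Storage drops from 100*80 numbers to k*(100 + 80 + 1), yet the error is tiny
rel_err = np.linalg.norm(X - Xk) / np.linalg.norm(X)
assert rel_err < 0.05
```

By the Eckart–Young theorem, no rank-$k$ matrix approximates $X$ better in the Frobenius norm, so the error is exactly the energy in the discarded singular values.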
Pseudoinverse matrix: $A^+ = V\Sigma^+ U^\top$ ($\sigma_i^+ = 1/\sigma_i$ for $\sigma_i \neq 0$). Solves least squares: $x = A^+b$.
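The least-squares claim can be checked directly, assuming NumPy: build $A^+$ from the SVD as above and compare against NumPy's built-in solver.

```python
import numpy as np

# Overdetermined system: more equations than unknowns
rng = np.random.default_rng(4)
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1 / s) @ U.T  # A^+ = V Sigma^+ U^T (here all sigma_i != 0)

# x = A^+ b minimizes ||Ax - b||; it matches NumPy's least-squares solution
x = A_pinv @ b
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```

For a rank-deficient $A$, the reciprocals $1/\sigma_i$ are taken only for $\sigma_i \neq 0$ (zeros stay zero), which `np.linalg.pinv` handles automatically.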