Eigenvalues and Eigenvectors
Special Directions
What happens to most vectors when multiplied by a matrix? They change both their length and direction. But some vectors change only their length — remaining on the same line. These special vectors are fundamental.
An eigenvector of a matrix A is a nonzero vector v such that Av = λv, where λ is an eigenvalue.
Physically: an eigenvector is an "invariant direction" of the transformation. Scaling without rotation.
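The definition Av = λv can be checked numerically. A minimal sketch (the matrix and vector below are illustrative choices, not from the text):

```python
import numpy as np

# A symmetric 2×2 matrix with an obvious invariant direction.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v = (1, 1) is an eigenvector: A stretches it without rotating it.
v = np.array([1.0, 1.0])
Av = A @ v

lam = Av[0] / v[0]                 # the scaling factor λ along v
assert np.allclose(Av, lam * v)    # Av = λv holds
print(lam)                         # 3.0
```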
Characteristic Polynomial
Av = λv has a nonzero solution v ⟺ (A − λE)v = 0 has a nonzero solution ⟺ det(A − λE) = 0.
Characteristic polynomial: $p_A(\lambda) = \det(A - \lambda E)$ — a degree n polynomial in λ.
Eigenvalues are roots of $p_A(\lambda)$. By the fundamental theorem of algebra over ℂ, there are exactly n (counted with multiplicity).
For a real symmetric matrix, all eigenvalues are real.
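For a 2×2 matrix, $p_A(\lambda) = \lambda^2 - \operatorname{tr}(A)\,\lambda + \det A$, so the roots can be compared directly against a library eigenvalue routine. A sketch with an assumed symmetric example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # real symmetric, so eigenvalues must be real

# p_A(λ) = λ² − tr(A)·λ + det(A) for a 2×2 matrix: here λ² − 4λ + 3
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)

eigs = np.linalg.eigvalsh(A)   # eigvalsh: for symmetric/Hermitian matrices
print(sorted(roots), sorted(eigs))   # both give [1.0, 3.0]
```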
Diagonalization
A matrix A is diagonalizable if there exists an invertible matrix P such that $P^{-1}AP = \operatorname{diag}(\lambda_1, ..., \lambda_n)$. The columns of P are the eigenvectors, the diagonal elements are the eigenvalues.
A is diagonalizable ⟺ A has n linearly independent eigenvectors.
A sufficient condition: n distinct eigenvalues.
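Since the example matrix above has two distinct eigenvalues (1 and 3), the sufficient condition applies and $P^{-1}AP$ must come out diagonal. A sketch:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and a matrix whose columns are eigenvectors
lams, P = np.linalg.eig(A)
D = np.diag(lams)

# P⁻¹AP = diag(λ₁, λ₂): A is diagonalizable (2 distinct eigenvalues)
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```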
Applications
Exponentiation: if $A = PDP^{-1}$, then $A^k = PD^kP^{-1}$. $D^k$ is diagonal: we simply raise the diagonal elements to the k-th power.
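The identity $A^k = PD^kP^{-1}$ can be verified against direct repeated multiplication. A sketch, reusing an assumed example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, P = np.linalg.eig(A)

k = 10
# A^k = P D^k P⁻¹; D^k just raises the diagonal entries to the k-th power
Ak = P @ np.diag(lams**k) @ np.linalg.inv(P)

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```

For large k this costs one diagonalization plus elementwise powers, instead of k matrix multiplications.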
This is a powerful trick: it solves linear recurrences, systems of ODEs, and Markov chains.
Fibonacci numbers: $F(n+1) = F(n) + F(n-1)$. Diagonalizing the matrix $[[1,1],[1,0]]$ yields Binet's formula through its eigenvalues φ = (1+√5)/2 (the golden ratio) and ψ = (1−√5)/2.
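Both routes can be compared directly: powers of the Fibonacci matrix versus Binet's closed form built from the eigenvalues φ and ψ. A sketch:

```python
import numpy as np

F = np.array([[1, 1],
              [1, 0]])

def fib_matrix(n):
    # [[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]]
    return int(np.linalg.matrix_power(F, n)[0, 1])

phi = (1 + 5**0.5) / 2   # eigenvalues of the Fibonacci matrix
psi = (1 - 5**0.5) / 2

def fib_binet(n):
    # Binet's formula; rounding absorbs floating-point error for small n
    return round((phi**n - psi**n) / 5**0.5)

assert all(fib_matrix(n) == fib_binet(n) for n in range(20))
print(fib_matrix(10))   # 55
```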
Principal Component Analysis (PCA): eigenvectors of the covariance matrix are principal components of the data.
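A minimal PCA sketch on synthetic data (the data-generation recipe below is an assumption for illustration): the top eigenvector of the covariance matrix should recover the direction along which the points are stretched.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
X[:, 1] = X[:, 0] + 0.3 * X[:, 1]   # y ≈ x: data lies roughly along (1, 1)

C = np.cov(X, rowvar=False)      # 2×2 covariance matrix (rows = samples)
lams, vecs = np.linalg.eigh(C)   # eigh: symmetric, eigenvalues ascending

pc1 = vecs[:, -1]                # eigenvector of the largest eigenvalue
print(pc1)                       # ≈ ±(0.7, 0.7): the stretch direction
```

The sign of an eigenvector is arbitrary, so only the line it spans is meaningful.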