Euclidean Space and Orthogonalization
Euclidean and Unitary Spaces
Scalar Product
Let us add to a vector space the notions of length and angle by introducing the scalar product.
A Euclidean space is a real vector space $V$ with a scalar product $(\cdot, \cdot): V \times V \to \mathbb{R}$ satisfying:
- Symmetry: $(u, v) = (v, u)$
- Bilinearity: linearity in each argument
- Positive definiteness: $(v, v) \geq 0$ and $(v, v) = 0 \iff v = 0$
Norm: $|v| = \sqrt{(v, v)}$. Angle: $\cos \theta = \frac{(u, v)}{|u| \cdot |v|}$.
Cauchy–Bunyakovsky inequality: $|(u, v)| \leq |u| \cdot |v|$.
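These definitions can be checked numerically. A minimal sketch using NumPy, with the vectors chosen here purely for illustration:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

dot = u @ v              # scalar product (u, v)
norm_u = np.sqrt(u @ u)  # |u| = sqrt((u, u))
norm_v = np.sqrt(v @ v)

# Angle between u and v
cos_theta = dot / (norm_u * norm_v)

# Cauchy–Bunyakovsky: |(u, v)| <= |u| * |v|
assert abs(dot) <= norm_u * norm_v
```

The inequality guarantees that `cos_theta` always lies in $[-1, 1]$, so the angle is well defined.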
Orthogonality
Vectors $u, v$ are orthogonal ($u \perp v$) if $(u, v) = 0$.
Pythagoras’ theorem: if $u \perp v$, then $|u + v|^2 = |u|^2 + |v|^2$.
Orthonormal basis: $(e_i, e_j) = \delta_{ij}$ (Kronecker delta). In an orthonormal basis: $(u, v) = \sum_i u_i v_i$ (the standard scalar product in $\mathbb{R}^n$).
Gram–Schmidt Orthogonalization Process
Given linearly independent $v_1, ..., v_n$, we build an orthonormal basis $e_1, ..., e_n$:
$e_1 = \frac{v_1}{|v_1|}$.
$u_2 = v_2 - (v_2, e_1)e_1; \quad e_2 = \frac{u_2}{|u_2|}$.
$u_k = v_k - \sum_{i=1}^{k-1} (v_k, e_i)e_i; \quad e_k = \frac{u_k}{|u_k|}$.
Geometrically: each new vector is the original minus its projection onto the subspace already constructed.
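The steps above can be sketched directly in code. This is a straightforward (not numerically hardened) implementation of the process, with example vectors chosen here for illustration:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors v_1, ..., v_n."""
    basis = []
    for v in vectors:
        # Subtract the projection onto the subspace spanned so far:
        # u_k = v_k - sum_i (v_k, e_i) e_i
        u = v - sum((v @ e) * e for e in basis)
        basis.append(u / np.linalg.norm(u))  # e_k = u_k / |u_k|
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
e1, e2 = gram_schmidt(vs)

# The result is orthonormal: (e1, e2) = 0 and |e1| = |e2| = 1
assert abs(e1 @ e2) < 1e-12
assert abs(e1 @ e1 - 1) < 1e-12
```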
QR decomposition: $A = QR$, where $A$ is the matrix with columns $v_i$, $Q$ has orthonormal columns $e_i$, and $R$ is upper triangular. This is the Gram–Schmidt algorithm in matrix form; it is used in numerical methods and least squares.
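In practice one calls a library routine rather than running Gram–Schmidt by hand; `numpy.linalg.qr` returns the same factorization (the columns of $Q$ may differ from hand-computed $e_i$ by sign). The matrix below is an illustrative example:

```python
import numpy as np

# Columns of A are the original vectors v_1, v_2
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Reduced QR: Q has orthonormal columns, R is upper triangular
Q, R = np.linalg.qr(A)

assert np.allclose(Q.T @ Q, np.eye(2))  # columns of Q are orthonormal
assert np.allclose(Q @ R, A)            # the factorization recovers A
```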
Orthogonal Complements and Projection
The orthogonal complement of a subspace $U$: $U^\perp = \{v \in V : (v, u) = 0 \ \forall u \in U\}$.
$V = U \oplus U^\perp$ — direct sum (for finite-dimensional Euclidean spaces).
Projection of $v$ onto $U$: $P_U v$ is the unique vector in $U$ closest to $v$: $|v - P_U v| = \min\{|v - u| : u \in U\}$.
Least squares method: when $Ax = b$ has no exact solution, we seek $x^*$ that minimizes $|Ax - b|^2$. Solution: $A^{\mathrm{T}} A x^* = A^{\mathrm{T}} b$ (the normal equations). Then $Ax^*$ is the projection of $b$ onto $\mathrm{Im}\, A$.
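A small sketch of this, with an overdetermined system invented here as an example; the normal equations are solved directly and cross-checked against NumPy's least-squares routine:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns, no exact solution in general
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([0.0, 1.0, 1.0])

# Normal equations: A^T A x* = A^T b
x_star = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer via the library routine
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_star, x_lstsq)

# A x* is the projection of b onto Im A:
# the residual b - A x* is orthogonal to every column of A
residual = b - A @ x_star
assert np.allclose(A.T @ residual, 0.0)
```

Solving the normal equations is fine for small problems; for ill-conditioned $A$, QR-based routines such as `lstsq` are numerically preferable, which is one reason the QR decomposition above matters.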