Module VI·Article I·~2 min read
Tensor Product and Tensors
Tensor Algebra
What is a Tensor
A tensor is a multidimensional array of numbers that transforms in a “proper way” under a change of basis. Scalars are tensors of rank 0, vectors are tensors of rank 1, matrices are tensors of rank 2.
Multilinear form — a function $f: V_1 \times \ldots \times V_k \to F$, linear in each argument. A tensor of type $(p, q)$ has $p$ contravariant and $q$ covariant indices.
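As a quick sanity check, a bilinear form can be sketched in numpy as $f(v, w) = v^\top M w$ for some matrix $M$ (the matrix here is a made-up example), and linearity in each argument verified numerically:

```python
import numpy as np

# A bilinear form f: V x W -> R represented by a matrix M (illustrative values):
# f(v, w) = v^T M w is a type-(0, 2) tensor.
M = np.array([[1.0, 2.0],
              [0.0, 3.0]])

def f(v, w):
    return v @ M @ w

w = np.array([2.0, -1.0])
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Linearity in the first argument: f(a*v1 + b*v2, w) = a*f(v1, w) + b*f(v2, w)
print(np.isclose(f(2 * v1 + 3 * v2, w), 2 * f(v1, w) + 3 * f(v2, w)))
```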
Tensor Product
If $V$ and $W$ are vector spaces, their tensor product $V \otimes W$ is a new vector space of dimension $\dim V \cdot \dim W$.
Elements: $v \otimes w$ (simple tensors) and their linear combinations.
If $\{e_i\}$ is a basis of $V$ and $\{f_j\}$ is a basis of $W$, then $\{e_i \otimes f_j\}$ is a basis of $V \otimes W$.
Tensor product of operators: $(A \otimes B)(v \otimes w) = Av \otimes Bw$.
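In coordinates, the tensor product of vectors is the outer product, and the tensor product of operators is the Kronecker product. A minimal numpy sketch (example values are arbitrary) checks the defining identity $(A \otimes B)(v \otimes w) = Av \otimes Bw$:

```python
import numpy as np

v = np.array([1.0, 2.0])         # v in V, dim V = 2
w = np.array([3.0, 4.0, 5.0])    # w in W, dim W = 3

vw = np.outer(v, w)              # v ⊗ w as a (2, 3) array: dim(V ⊗ W) = 2 * 3 = 6
print(vw.shape)                  # (2, 3)

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])       # operator on V
B = np.eye(3)                    # operator on W

AB = np.kron(A, B)               # A ⊗ B as a 6x6 matrix

# Verify (A ⊗ B)(v ⊗ w) = Av ⊗ Bw on flattened simple tensors:
lhs = AB @ np.kron(v, w)
rhs = np.kron(A @ v, B @ w)
print(np.allclose(lhs, rhs))     # True
```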
Tensors in Physics
Stress tensor $\sigma_{ij}$ — the force per unit area in the $j$-direction acting on a plane whose normal points in the $i$-direction. A symmetric tensor of rank 2.
Metric tensor $g_{ij}$ in Riemannian geometry defines distances and angles. In special relativity: $g = \mathrm{diag}(-1, 1, 1, 1)$ (Minkowski metric).
Riemann curvature tensor $R^i{}_{jkl}$ — a tensor of type $(1,3)$ that describes the curvature of space in general relativity.
Contraction of Tensors
Contraction over a pair of indices generalizes the matrix trace. The tensor $T^i{}_j$ contracts to a scalar $T = T^i{}_i = \sum_i T^i{}_i$.
Matrix multiplication $C_{ik} = \sum_j A_{ij}B_{jk}$ is contraction over $j$. In Einstein notation (summation convention), with consistent index placement: $C^i{}_k = A^i{}_j B^j{}_k$.
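Both contractions above can be checked with `np.einsum`, which implements exactly the summation convention (the matrices here are random test data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Contraction of T^i_j over the pair i = j is the trace:
T = A
print(np.isclose(np.einsum('ii->', T), np.trace(T)))   # True

# Matrix multiplication C_ik = sum_j A_ij B_jk as contraction over j:
C = np.einsum('ij,jk->ik', A, B)
print(np.allclose(C, A @ B))                           # True
```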
Symmetric and Skew-Symmetric Tensors
A tensor $T$ is symmetric in the pair of indices $i,j$ if $T_{ij\ldots} = T_{ji\ldots}$. It is skew-symmetric (antisymmetric) if $T_{ij\ldots} = -T_{ji\ldots}$.
Exterior forms (differential forms) are skew-symmetric tensors. Exterior product $dx \wedge dy = dx \otimes dy - dy \otimes dx$.
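The wedge formula can be verified numerically: taking $dx$ and $dy$ as the standard dual basis covectors of $\mathbb{R}^2$, their antisymmetrized tensor product gives a skew-symmetric matrix (a small sketch, not part of the original text):

```python
import numpy as np

# dx and dy as the standard dual basis covectors on R^2:
dx = np.array([1.0, 0.0])
dy = np.array([0.0, 1.0])

# dx ∧ dy = dx ⊗ dy - dy ⊗ dx, written with outer products:
wedge = np.outer(dx, dy) - np.outer(dy, dx)
print(wedge)                          # [[ 0.  1.]
                                      #  [-1.  0.]]
print(np.allclose(wedge.T, -wedge))   # True: skew-symmetric
```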