Lyapunov Functions and Stability Criteria
Stability Criteria: From Routh to Lyapunov
Why Algebraic Criteria Are Needed
The eigenvalues of the matrix $A$ determine stability. But for a high-order matrix, computing the roots of the characteristic polynomial is laborious. In the 19th century, mathematicians therefore searched for algebraic criteria that allow one to check stability without finding the roots explicitly.
Historical problem: a steam engine governor designer needs to know whether the engine will be stable for given parameters. He cannot solve a sixth-degree polynomial equation by hand. The Routh (1877) and Hurwitz (1895) criteria give the answer as a handful of inequalities on the polynomial's coefficients.
The Routh–Hurwitz Criterion
For the polynomial $P(\lambda) = \lambda^n + a_{n-1} \lambda^{n-1} + \dots + a_1 \lambda + a_0$, we construct the Hurwitz matrix:
$ H_n = \begin{bmatrix} a_{n-1} & a_{n-3} & a_{n-5} & \dots \\ 1 & a_{n-2} & a_{n-4} & \dots \\ 0 & a_{n-1} & a_{n-3} & \dots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix} $
Hurwitz’s theorem: All roots of $P(\lambda)$ have negative real parts (the system is stable) if and only if all the leading principal minors $\Delta_1, \Delta_2, \dots, \Delta_n$ of the matrix $H_n$ are positive.
For n = 2: $P = \lambda^2 + a_1\lambda + a_0$. Conditions: $a_1 > 0$ and $a_0 > 0$.
For n = 3: $P = \lambda^3 + a_2\lambda^2 + a_1\lambda + a_0$. Conditions: $a_2 > 0$, $a_0 > 0$, and $a_2 a_1 > a_0$.
Example: $P(\lambda) = \lambda^3 + 3\lambda^2 + 2\lambda + 1$. $a_2 = 3 > 0$ ✓, $a_0 = 1 > 0$ ✓, $a_2 a_1 = 3 \cdot 2 = 6 > 1 = a_0$ ✓. All conditions are satisfied → all roots lie in the left half-plane; the polynomial is stable.
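A minimal sketch in Python with NumPy (the function name and indexing convention are our own, not from the text) that builds $H_n$ for a monic polynomial and checks the leading principal minors:

```python
import numpy as np

def hurwitz_stable(coeffs):
    """Routh-Hurwitz test for P = l^n + a_{n-1} l^{n-1} + ... + a_0.

    coeffs: [1, a_{n-1}, ..., a_0], highest degree first (monic).
    Returns True iff all leading principal minors of H_n are positive.
    """
    n = len(coeffs) - 1
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            k = 2 * j - i + 1            # H[i, j] holds a_{n-k}
            if 0 <= k <= n:
                H[i, j] = coeffs[k]
    minors = [np.linalg.det(H[:m, :m]) for m in range(1, n + 1)]
    return all(d > 0 for d in minors)

print(hurwitz_stable([1, 3, 2, 1]))  # the example above: True
print(hurwitz_stable([1, 1, 2, 3]))  # a2*a1 = 2 < 3 = a0: False
```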
Routh Table: Algorithmic Form
The Routh method is a more convenient algorithm (especially when computing by hand). We construct a table:
Row 1 ($\lambda^n$): $1$, $a_{n-2}$, $a_{n-4}$, ... Row 2 ($\lambda^{n-1}$): $a_{n-1}$, $a_{n-3}$, $a_{n-5}$, ... Rows 3–(n+1): write the two rows directly above as $(q_1, q_2, q_3, \dots)$ and $(p_1, p_2, p_3, \dots)$; the $j$-th entry of the new row is $(p_1 q_{j+1} - q_1 p_{j+1}) / p_1$.
The system is stable if and only if all entries of the first column of the table are positive (for a monic polynomial; in general, all of the same sign). The number of sign changes in the first column equals the number of roots in the right half-plane.
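A sketch of the table in code, assuming the regular case (no zero ever appears in the first column, so no special ε-rules are needed):

```python
def routh_first_column(coeffs):
    """First column of the Routh table; coeffs highest degree first."""
    n = len(coeffs) - 1
    width = n // 2 + 1
    # Rows 1 and 2: coefficients taken alternately, padded with zeros.
    rows = [list(coeffs[0::2]), list(coeffs[1::2])]
    rows = [r + [0.0] * (width - len(r)) for r in rows]
    for _ in range(n - 1):               # rows 3 .. n+1
        upper, lower = rows[-2], rows[-1]
        new = [(lower[0] * upper[j + 1] - upper[0] * lower[j + 1]) / lower[0]
               for j in range(width - 1)] + [0.0]
        rows.append(new)
    return [r[0] for r in rows]

col = routh_first_column([1, 3, 2, 1])   # lambda^3 + 3 lambda^2 + 2 lambda + 1
sign_changes = sum(a * b < 0 for a, b in zip(col, col[1:]))
print(col, "-> roots in the right half-plane:", sign_changes)
```

For the example above the first column is $1, 3, 5/3, 1$: no sign changes, so no roots in the right half-plane, in agreement with the Hurwitz check.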
Lyapunov’s Theorem on Linearization
Lyapunov's direct method applies to nonlinear systems as they stand. But for hyperbolic equilibria, there is an elegant connection with linearization.
Theorem (Lyapunov, 1892): Consider $\mathbf{x}' = A\mathbf{x} + g(\mathbf{x})$, where $|g(\mathbf{x})| = o(|\mathbf{x}|)$ (the nonlinear terms vanish faster than the linear ones). Then:
- If all eigenvalues of $A$ have negative real parts → $x^* = 0$ is asymptotically stable for the nonlinear system.
- If at least one eigenvalue has a positive real part → $x^* = 0$ is unstable for the nonlinear system.
- If the maximum real part of the eigenvalues is $0$ → linearization says nothing; nonlinear analysis is needed.
Physical meaning: “Small nonlinearity” does not affect the type of stability of hyperbolic equilibria. That is why linear models work near stable operating points.
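To see the theorem in action, here is a sketch that checks both equilibria of a damped pendulum (our illustrative choice, not from the text) via the eigenvalues of the Jacobian:

```python
import numpy as np

# Damped pendulum: x1' = x2, x2' = -sin(x1) - 0.5 * x2.
# Equilibria: (0, 0) (hanging) and (pi, 0) (inverted).
def jacobian(x1):
    return np.array([[0.0, 1.0],
                     [-np.cos(x1), -0.5]])

for x1, label in [(0.0, "hanging"), (np.pi, "inverted")]:
    eigs = np.linalg.eigvals(jacobian(x1))
    re_max = eigs.real.max()
    if re_max < 0:
        verdict = "asymptotically stable"
    elif re_max > 0:
        verdict = "unstable"
    else:
        verdict = "marginal: linearization says nothing"
    print(label, eigs.round(3), "->", verdict)
```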
The Critical Case: Marginal Stability
Lyapunov’s theorem on linearization “stays silent” for marginal eigenvalues. Let us consider two examples:
Example 1: $\dot{x} = -x^3$. Linearization: $f'(0) = 0$. For the original system take $V = x^2/2$: then $\dot{V} = x\dot{x} = -x^4 < 0$ for $x \neq 0$. Asymptotically stable, despite the neutral linearization.
Example 2: $\dot{x} = x^3$. Linearization: $f'(0) = 0$, exactly as before. Take $V = x^2/2$: now $\dot{V} = x\dot{x} = x^4 > 0$ for $x \neq 0$, so $|x|$ grows along trajectories. Unstable, though the linearization is identical.
Conclusion: when the real parts of eigenvalues are zero, higher-order nonlinear terms are fundamentally important.
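A quick numerical check of both examples (forward Euler; the step size, horizon, and initial condition are arbitrary choices):

```python
# Forward Euler for x' = sign * x**3 from x(0) = 0.1: identical zero
# linearization, opposite fates.
def simulate(sign, x0=0.1, dt=1e-3, t_end=60.0):
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * sign * x**3
        if abs(x) > 1e6:              # stop once blow-up is obvious
            return float("inf")
    return x

print("x' = -x^3:", simulate(-1))     # creeps toward 0, like t**-0.5
print("x' = +x^3:", simulate(+1))     # escapes to infinity (near t = 50)
```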
Stability of Systems with Time-Varying Coefficients: Perron’s Paradox
For a constant matrix $A$, the condition $\operatorname{Re} \lambda_i < 0$ for all eigenvalues guarantees asymptotic stability. For systems $x' = A(t)x$ with time-varying coefficients, negative real parts of the instantaneous eigenvalues guarantee nothing!
Perron’s example: $A(t) = \begin{bmatrix} -1 + 1.5 \cos^2 t & 1 - 1.5 \sin t \cos t \\ -1 - 1.5 \sin t \cos t & -1 + 1.5 \sin^2 t \end{bmatrix}$. The instantaneous eigenvalues satisfy $\operatorname{Re} \lambda = -0.25 < 0$ for every $t$. Yet $x(t) = e^{0.5t} (\cos t, -\sin t)^\mathsf{T}$ solves the system and grows exponentially!
Lyapunov exponents handle the time-varying case: $\sigma = \limsup_{T \to \infty} \frac{1}{T} \ln\|x(T)\|$ along a solution $x(t)$. The linear system is asymptotically stable if all of its Lyapunov exponents are negative. Computing Lyapunov exponents for nonlinear systems is a central task of chaos theory.
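Perron's example is easy to verify numerically. A sketch that integrates the system with classical RK4 (step size and horizon are arbitrary choices) and compares the measured growth rate against the instantaneous eigenvalues:

```python
import numpy as np

def A(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[-1 + 1.5 * c * c,  1 - 1.5 * s * c],
                     [-1 - 1.5 * s * c, -1 + 1.5 * s * s]])

# Classical RK4 for x' = A(t) x, starting from x(0) = (1, 0).
x, t, dt, T = np.array([1.0, 0.0]), 0.0, 1e-3, 30.0
while t < T:
    k1 = A(t) @ x
    k2 = A(t + dt / 2) @ (x + dt / 2 * k1)
    k3 = A(t + dt / 2) @ (x + dt / 2 * k2)
    k4 = A(t + dt) @ (x + dt * k3)
    x = x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt

print("instantaneous Re(lambda):", np.linalg.eigvals(A(0.0)).real)  # -0.25, -0.25
print("measured growth exponent:", np.log(np.linalg.norm(x)) / T)   # ~ +0.5
```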
Question to ponder: How can a control systems designer use the Routh–Hurwitz criterion to select PID controller parameters? Why is this criterion preferable to directly computing eigenvalues at the initial design stage?
The Nyquist Criterion and Stability Margin
The Nyquist criterion is a frequency-domain test: it assesses the stability of a closed-loop system from the open-loop Nyquist plot, without computing the roots of the characteristic equation. For a system that is stable in open loop, the closed loop is stable if the Nyquist plot does not encircle the point $(-1, 0)$. The key concepts are the gain margin and the phase margin, which measure how far the plot stays from that point. A phase margin $PM \geq 45^\circ$ is a practical standard in the design of aviation and robotics systems. Connection with frequency analysis: the peak value of the closed-loop transfer function, $|H(j\omega)|_{\max}$, reflects the stability margin: the smaller the PM, the higher the peak and the more oscillatory the transient response.
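A sketch of how the margins are read off a frequency response alone; the open-loop transfer function here is a hypothetical example, not from the text:

```python
import numpy as np

# Hypothetical open loop L(s) = 4 / (s (s + 1) (s + 2)): margins found
# from the frequency response, with no root computation.
def L(w):
    s = 1j * w
    return 4.0 / (s * (s + 1) * (s + 2))

w = np.logspace(-2, 2, 100_000)
mag = np.abs(L(w))
phase = np.degrees(np.unwrap(np.angle(L(w))))

i_c = np.argmin(np.abs(mag - 1.0))        # gain crossover: |L| = 1
pm = 180.0 + phase[i_c]                   # phase margin, degrees
i_pi = np.argmin(np.abs(phase + 180.0))   # phase crossover: arg L = -180 deg
gm_db = -20.0 * np.log10(mag[i_pi])       # gain margin, dB

print(f"phase margin = {pm:.1f} deg at w = {w[i_c]:.2f} rad/s")
print(f"gain margin  = {gm_db:.1f} dB at w = {w[i_pi]:.2f} rad/s")
```

With these numbers the margins come out small (roughly $11^\circ$ and 3.5 dB): the loop is stable, but by the rule of thumb above its transient will be quite oscillatory.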