Theorems on the Mean and Local Properties of Functions
Theorems on the Mean as a Bridge
Theorems on the mean value connect the local properties of a function (the values of the derivative at a point) with the global ones (the change of the function on an interval). They are the workhorses of mathematical analysis, used in proofs of hundreds of other results.
Rolle's Theorem
If $f$ is continuous on $[a, b]$, differentiable on $(a, b)$, and $f(a) = f(b)$, then there exists a point $c \in (a, b)$ such that $f'(c) = 0$.
Geometric meaning: if the function returns to its starting point, at some moment it "turns around"—the derivative becomes zero.
Proof: by Weierstrass's theorem, $f$ attains a maximum and a minimum on $[a, b]$. If both are attained at the endpoints, then, since $f(a) = f(b)$, the maximum equals the minimum, so $f \equiv \mathrm{const}$ and $f' \equiv 0$. Otherwise at least one extremum lies at an interior point, and by Fermat's theorem the derivative there is zero.
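A quick sanity check of the statement (not a substitute for the proof): the sketch below takes $f(x) = \sin x$ on $[0, \pi]$, where $f(0) = f(\pi) = 0$, and locates a zero of $f'$ by bisection. All names here (`bisect_root`, the choice of function and interval) are illustrative.

```python
import math

def bisect_root(g, lo, hi, tol=1e-12):
    """Find a zero of g on [lo, hi] by bisection, assuming g changes sign there."""
    glo = g(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) == 0 or hi - lo < tol:
            return mid
        if glo * g(mid) < 0:
            hi = mid
        else:
            lo, glo = mid, g(mid)
    return 0.5 * (lo + hi)

# f satisfies Rolle's hypotheses on [0, pi]: f(0) = f(pi) = 0
f = math.sin
df = math.cos                       # f'
c = bisect_root(df, 0.1, math.pi - 0.1)
print(c, df(c))                     # c is close to pi/2, f'(c) is close to 0
```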
Lagrange's Theorem (Mean Value Theorem)
If $f$ is continuous on $[a, b]$ and differentiable on $(a, b)$, then there exists $c \in (a, b)$ such that $f(b) - f(a) = f'(c) \cdot (b - a)$.
Equivalently: $(f(b) - f(a))/(b - a) = f'(c)$. The average rate of change equals the instantaneous rate at some moment.
Physical meaning: if a car travels 100 km in two hours (average speed 50 km/h), then at some moment its instantaneous speed was exactly 50 km/h.
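The point $c$ from Lagrange's theorem can also be found numerically. Below is a minimal sketch: for an assumed example $f(x) = x^3$ on $[0, 2]$ it locates, by bisection, a point where the derivative equals the average slope; the helper name `mvt_point` is made up for illustration.

```python
def mvt_point(f, df, a, b, tol=1e-12):
    """Locate c in (a, b) with f'(c) = (f(b) - f(a)) / (b - a) by bisection."""
    slope = (f(b) - f(a)) / (b - a)
    g = lambda x: df(x) - slope      # Lagrange's theorem guarantees g has a zero
    lo, hi = a, b
    if g(lo) > 0:                    # orient the bracket so that g(lo) <= 0 <= g(hi)
        lo, hi = hi, lo
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if abs(hi - lo) < tol:
            break
        if g(mid) <= 0:
            lo = mid
        else:
            hi = mid
    return mid

c = mvt_point(lambda x: x**3, lambda x: 3 * x**2, 0.0, 2.0)
print(c, 3 * c**2)                   # c is about 2/sqrt(3) = 1.1547..., and f'(c) is about 4 = (8 - 0)/2
```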
Corollary 1: If $f'(x) = 0$ on the interval, then $f = \mathrm{const}$.
Corollary 2: If $f'(x) > 0$ on the interval, then $f$ is increasing. If $f'(x) < 0$, then $f$ is decreasing.
This is the foundation for studying functions for monotonicity.
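For instance, Corollary 2 turns a monotonicity question into a sign analysis of $f'$. A small sketch with SymPy (the function $x^3 - 3x$ is just an example) does this sign analysis symbolically:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = x**3 - 3 * x
df = sp.diff(f, x)                                    # f'(x) = 3x^2 - 3

increasing = sp.solve_univariate_inequality(df > 0, x)
decreasing = sp.solve_univariate_inequality(df < 0, x)
print(increasing)   # roughly: x < -1 or x > 1  -> f is increasing there
print(decreasing)   # roughly: -1 < x < 1       -> f is decreasing there
```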
Cauchy's Theorem (Generalized Mean Value Theorem)
If $f$ and $g$ are continuous on $[a, b]$, differentiable on $(a, b)$, $g'(x) \neq 0$ on $(a, b)$, then there exists $c \in (a, b)$ such that $(f(b) - f(a))/(g(b) - g(a)) = f'(c)/g'(c)$.
This generalizes Lagrange's theorem (for $g(x) = x$ we recover it). Cauchy's theorem is a key step in the proof of L'Hospital's rule.
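A small numerical check of Cauchy's formula for an illustrative pair (here $f = \sin$, $g = \cos$ on $[0, \pi/3]$, chosen so that the point $c$ can be found by hand):

```python
import math

f, df = math.sin, math.cos
g, dg = math.cos, lambda x: -math.sin(x)
a, b = 0.0, math.pi / 3

ratio = (f(b) - f(a)) / (g(b) - g(a))    # left-hand side of Cauchy's formula
c = math.pi / 6                          # for this pair, c = pi/6 works
print(ratio, df(c) / dg(c))              # both are about -sqrt(3) = -1.732...
```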
L'Hospital's Rule
If $\lim_{x \to a} f(x) = \lim_{x \to a} g(x) = 0$ (or both $\to \infty$), $g'(x) \neq 0$ near $a$, and the limit $\lim_{x \to a} f'(x)/g'(x) = L$ exists, then $\lim_{x \to a} f(x)/g(x) = L$.
The rule allows one to resolve indeterminacies $0/0$ and $\infty/\infty$ by differentiating the numerator and denominator.
Examples:
- $\lim_{x \to 0} \sin(x)/x = \lim_{x \to 0} \cos(x)/1 = 1$ ✓
- $\lim_{x \to \infty} \ln(x)/x = \lim_{x \to \infty} (1/x)/1 = 0$
Indeterminacies $0 \cdot \infty$, $\infty - \infty$, $1^\infty$, $0^0$, $\infty^0$ are reduced to $0/0$ or $\infty/\infty$ by algebraic transformations.
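These limits are easy to cross-check with a computer algebra system. A short sketch using SymPy (assuming SymPy is available) confirms the two examples above and one $0 \cdot \infty$ case:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
print(sp.limit(sp.sin(x) / x, x, 0))           # 1
print(sp.limit(sp.log(x) / x, x, sp.oo))       # 0
print(sp.limit(x * sp.log(x), x, 0, dir='+'))  # 0  (0 * oo rewritten as log(x)/(1/x))
```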
Taylor's Formula
The mean value theorems are a special case of a more general result. Taylor's formula with the remainder in the Lagrange form:
$ f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)^2/2! + \ldots + f^{(n)}(a)(x-a)^n/n! + R_n(x), $
where $R_n(x) = f^{(n+1)}(c) (x-a)^{n+1}/(n+1)!$ for some $c$ between $a$ and $x$.
Taylor's formula is a "polynomial approximation": we approximate a complicated function by a polynomial, estimating the error. This is exactly how calculators and computers work when computing $\sin$, $\cos$, $\exp$.
Example: $\sin(x) \approx x - x^3/6 + x^5/120$ with error at most $|x|^7/7! = |x|^7/5040$.
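One can verify this error bound numerically. The sketch below compares the degree-5 Taylor polynomial with `math.sin` at a few sample points (the points are arbitrary):

```python
import math

def sin_taylor(x):
    """Degree-5 Taylor polynomial of sin at 0."""
    return x - x**3 / 6 + x**5 / 120

for x in (0.1, 0.5, 1.0):
    err = abs(math.sin(x) - sin_taylor(x))
    bound = abs(x)**7 / 5040          # |x|^7 / 7!, the Lagrange-form remainder bound
    print(x, err, bound, err <= bound)
```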
Applications of the Mean Value Theorems
The mean value theorems are the foundation of proofs of inequalities. For example: $|\sin(x) - \sin(y)| \leq |x - y|$ follows from Lagrange's theorem, since $|\sin'(c)| = |\cos(c)| \leq 1$.
In numerical methods, Lagrange's theorem is used to estimate the error of interpolation, quadrature formulas, and numerical differentiation schemes.
L'Hospital's Rule in Detail
L'Hospital's rule applies to the indeterminacies $0/0$ and $\infty/\infty$, but it requires care. The most common mistake is applying it without checking the hypotheses: both limits must be $0$, or both must be infinite. Also, the rule uses the ratio of the derivatives, not the derivative of the ratio.
Example with iteration: $\lim_{x \to 0} (e^x - 1 - x)/x^2$. Both numerator and denominator tend to $0$. Applying the rule once gives $\lim (e^x - 1)/(2x)$, which is again $0/0$. Applying it a second time gives $\lim e^x/2 = 1/2$.
Indeterminacy $1^\infty$: $\lim_{x \to 0} (\cos x)^{1/x^2}$. Write: $L = \lim \exp[(1/x^2)\ln \cos x]$. By L'Hospital: $\lim \ln(\cos x)/x^2 = \lim(-\sin x/\cos x)/(2x) = \lim(-\tan x)/(2x) = -1/2$. Consequently, $L = e^{-1/2}$.
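Both answers are easy to cross-check numerically; the sample points in the sketch below are arbitrary:

```python
import math

# (e^x - 1 - x) / x^2 should approach 1/2, and (cos x)^(1/x^2) should approach e^(-1/2)
for x in (1e-1, 1e-2, 1e-3):
    a = (math.exp(x) - 1 - x) / x**2
    b = math.cos(x) ** (1 / x**2)
    print(x, a, b)

print(0.5, math.exp(-0.5))   # targets: 0.5 and about 0.6065
```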
Monotonicity and Estimates via the Mean Value Theorems
Lagrange's theorem gives concrete numerical estimates. Example: prove that $|\sin a - \sin b| \leq |a - b|$ for all $a, b$. By Lagrange's theorem: $\sin a - \sin b = \cos c \cdot (a - b)$ for some $c$. Since $|\cos c| \leq 1$, we obtain $|\sin a - \sin b| \leq |a - b|$—the Lipschitz inequality for sine.
Thought-provoking question: How can Lagrange's theorem be used to prove the inequality $\ln(1 + x) \leq x$ for all $x > -1$?
Generalized Cauchy Mean Value Theorem
Cauchy's theorem: if $f$ and $g$ are continuous on $[a, b]$, differentiable on $(a, b)$, and $g'(x) \neq 0$ on $(a, b)$, then there exists $c \in (a, b)$ such that $(f(b) - f(a))/(g(b) - g(a)) = f'(c)/g'(c)$. For $g(x) = x$ this is Lagrange's theorem. Cauchy's theorem underlies the rigorous proof of L'Hospital's rule.
Estimates via the derivative: $|f(b) - f(a)| \leq \max |f'| \cdot |b - a|$ is the "Lipschitz" estimate. In ODE theory, the condition $|f(x, y_1) - f(x, y_2)| \leq L|y_1 - y_2|$ (the Lipschitz condition) guarantees uniqueness of the solution to the Cauchy problem. In machine learning, loss functions satisfying a Lipschitz condition are well suited to optimization by gradient methods with predictable convergence rates.
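To illustrate the Lipschitz-type estimate, the sketch below crudely estimates $\max |f'|$ on a grid (the helper `lipschitz_bound`, the function, and the interval are illustrative assumptions) and checks the increment bound:

```python
import math

def lipschitz_bound(df, a, b, n=10_000):
    """Crude estimate of max |f'| on [a, b] by sampling on a uniform grid."""
    return max(abs(df(a + (b - a) * k / n)) for k in range(n + 1))

f, df = math.sin, math.cos
a, b = 0.3, 2.1
L = lipschitz_bound(df, a, b)                 # here max |cos| on [0.3, 2.1] is cos(0.3)
print(abs(f(b) - f(a)), L * abs(b - a))       # |f(b) - f(a)| <= L * |b - a|
```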