Limit of a Numerical Sequence: Definition and Properties
Intuition and Rigor
The concept of the limit is the heart of mathematical analysis. Intuitively, we understand that the sequence 1/2, 1/4, 1/8, 1/16, ... “tends to zero”—its terms become arbitrarily small. But what does “arbitrarily” mean? How can we make this statement mathematically rigorous?
The answer was given by Augustin-Louis Cauchy in the first half of the 19th century, and Karl Weierstrass gave it its final form: the limit of a sequence via ε-N.
Definition of the Limit (ε-N)
A number $a$ is called the limit of the sequence $\{a_n\}$ if for any $\varepsilon > 0$ there exists a natural number $N$ such that for all $n > N$ the inequality $|a_n - a| < \varepsilon$ holds.
Notation: $\lim_{n \to \infty} a_n = a$.
Let us break down this definition word by word. “For any $\varepsilon > 0$”—we specify an arbitrarily small error. “There exists $N$”—starting from a certain index. “For all $n > N$, $|a_n - a| < \varepsilon$”—all subsequent terms lie within the $\varepsilon$-neighborhood of the number $a$.
Example: Let us prove that $\lim_{n \to \infty} \frac{1}{n} = 0$.
Let $\varepsilon > 0$ be given. Take $N = \lceil 1/\varepsilon \rceil$ (the least integer not less than $1/\varepsilon$). Then for all $n > N$ we have $n > 1/\varepsilon$, hence $1/n < \varepsilon$, that is, $|1/n - 0| < \varepsilon$. The definition is satisfied. ∎
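The argument above can be checked numerically; a minimal sketch, where `N_for` is a hypothetical helper computing the index $N = \lceil 1/\varepsilon \rceil$ from the proof:

```python
import math

def N_for(eps):
    # the proof's choice: N = ceil(1/eps), so n > N implies 1/n < eps
    return math.ceil(1 / eps)

for eps in (0.1, 0.01, 1e-6):
    N = N_for(eps)
    # verify |1/n - 0| < eps for a sample of indices n > N
    assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 1001))
    print(eps, N)
```

Of course, finitely many checks do not replace the proof; the code only illustrates how the definition pairs each $\varepsilon$ with a concrete $N$.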
Boundedness and Monotonicity
A convergent sequence is always bounded—this is an important theorem. The converse is not true: the sequence $(-1)^n$ is bounded ($|a_n| \leq 1$), but does not converge.
Weierstrass Theorem: A monotonic bounded sequence has a limit.
This is an existence theorem—it guarantees that the limit exists without giving an explicit formula for it. A monotonically increasing bounded sequence converges to its supremum; a monotonically decreasing one, to its infimum.
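The theorem can be watched in action. A sketch using the recurrence $a_{n+1} = \sqrt{2 + a_n}$ with $a_1 = 0$ (a standard textbook example, not taken from this article): the sequence is increasing, bounded above by 2, and so must converge—in fact, to 2.

```python
import math

a = 0.0
for _ in range(60):
    nxt = math.sqrt(2 + a)
    assert nxt <= 2            # bounded above by 2
    assert nxt >= a - 1e-12    # increasing (up to floating-point rounding)
    a = nxt
print(a)   # approaches the supremum, 2
```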
Arithmetic of Limits
If $\lim a_n = a$ and $\lim b_n = b$, then:
- $\lim (a_n + b_n) = a + b$
- $\lim (a_n \cdot b_n) = a \cdot b$
- $\lim (a_n / b_n) = a / b$ (provided $b \neq 0$, so that $b_n \neq 0$ from some index on)
These rules make the calculation of limits convenient. For example:
$ \lim \frac{3n^2 + 2n + 1}{5n^2 - n + 3} = \lim \frac{3 + 2/n + 1/n^2}{5 - 1/n + 3/n^2} = \frac{3}{5}. $
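A quick numerical check of this computation; a minimal sketch:

```python
def a(n):
    # terms of (3n^2 + 2n + 1) / (5n^2 - n + 3)
    return (3 * n**2 + 2 * n + 1) / (5 * n**2 - n + 3)

for n in (10, 1000, 10**6):
    print(n, a(n))   # the values approach 3/5 = 0.6
```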
Most Important Limits
Second remarkable limit: $\lim_{n \to \infty} (1 + 1/n)^n = e \approx 2.71828...$
This is the definition of the number $e$—the base of the natural logarithm. The number $e$ is fundamental in mathematics, physics, and finance (continuous compounding of interest).
Limit of a geometric progression: If $|q| < 1$, then $\lim q^n = 0$. If $|q| > 1$, the terms grow without bound and the sequence diverges (for $q = 1$ the limit is $1$; for $q = -1$ the sequence oscillates and diverges).
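Both limits are easy to observe numerically; a small sketch:

```python
import math

# (1 + 1/n)^n approaches e as n grows
for n in (10, 1000, 10**6):
    print(n, (1 + 1 / n) ** n)
print("e =", math.e)

# |q| < 1: the powers q^n tend to 0
q = 0.5
print(q ** 50)
```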
The Squeeze Theorem
If $a_n \leq c_n \leq b_n$ for all $n > N$, and $\lim a_n = \lim b_n = L$, then $\lim c_n = L$.
This theorem (the “two policemen” theorem, also known as the sandwich theorem) is often applied where direct computation of the limit is difficult.
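As an illustration, take $c_n = \sin(n)/n$ (an assumed example, not from the article): since $-1/n \leq \sin(n)/n \leq 1/n$ and both bounds tend to 0, the squeeze theorem gives $\lim \sin(n)/n = 0$. A sketch:

```python
import math

for n in (10, 1000, 10**6):
    c = math.sin(n) / n
    # squeezed: -1/n <= sin(n)/n <= 1/n
    assert -1 / n <= c <= 1 / n
    print(n, c)
```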
Divergent Sequences
Not every sequence has a limit. The sequence $(-1)^n$ diverges—it “jumps” between $-1$ and $1$. The sequence $n^2$ goes to infinity. Divergence can also be “structured”: $(-1)^n$ has two subsequential limits: $-1$ and $1$.
Cauchy Sequence
A sequence $\{a_n\}$ is called fundamental (Cauchy) if for any $\varepsilon > 0$ there exists $N$ such that for all $m, n > N$ the inequality $|a_n - a_m| < \varepsilon$ holds.
Cauchy's criterion: a sequence converges if and only if it is fundamental. This is a powerful tool: to prove convergence, it is not necessary to know the limit—it suffices to show that the terms “group together” around each other.
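A sketch of the criterion in action, using the partial sums $s_n = \sum_{k=1}^{n} 2^{-k}$ (an assumed example): the tails satisfy $|s_m - s_n| < 2^{-n}$, so the sequence is fundamental—and it indeed converges (to 1), without our needing to know that in advance.

```python
# partial sums s_n of the series 1/2 + 1/4 + 1/8 + ...
s = [0.0]
for k in range(1, 61):
    s.append(s[-1] + 0.5 ** k)

# Cauchy check: for eps = 1e-9 take N = 30, since 1/2^30 < 1e-9;
# then |s_m - s_n| < 1/2^n <= 1/2^31 < eps for all m >= n > N
eps, N = 1e-9, 30
for n in range(N + 1, 61):
    for m in range(n, 61):
        assert abs(s[m] - s[n]) < eps
print(s[60])   # close to the sum of the series, 1
```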
Infinitesimal and Infinite Sequences
A sequence is called infinitesimal if its limit is zero. Infinitesimals are the building blocks of mathematical analysis. Their sums, products, and ratios describe the behavior of functions near a given point.
Understanding the limits of sequences is the first step towards the limits of functions, derivatives, and integrals.
Limits in Algorithms and Computation
The concept of the limit is not purely theoretical. Iterative algorithms in computer science and numerical analysis are sequences converging to a solution. Newton's method for finding a root of the equation $f(x) = 0$ constructs the sequence $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$. If the function is sufficiently smooth and the initial approximation is close enough to the root, this sequence converges to the root at a rate precisely described by the theory of limits.
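A minimal sketch of the iteration (the helper `newton` and its parameters are illustrative, not from the article):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    # iterate x_{n+1} = x_n - f(x_n) / f'(x_n) until the step is tiny
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# a root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=2.0)
print(root)
```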
In financial mathematics, the compound interest formula $(1 + r/n)^n$ as $n \to \infty$ tends to $e^r$—continuous compounding. It is precisely the limit of the sequence that determines how much a financial instrument is worth under continuous compounding.
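Numerically (the rate $r = 0.05$ is an assumed value for illustration):

```python
import math

r = 0.05   # assumed annual rate
for n in (1, 12, 365, 10**6):
    print(n, (1 + r / n) ** n)   # approaches e^r as n grows
print("e^r =", math.exp(r))      # continuous compounding
```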
Question for thought: The sequence $a_{n+1} = (a_n + 2/a_n)/2$ with $a_1 = 2$ converges. To what? (Hint: suppose the limit $L$ exists, and find $L$ from the equation $L = (L + 2/L)/2$.)
Rate of Convergence and Numerical Algorithms
In numerical analysis, the rate of convergence determines the practical value of an algorithm.
- Linear convergence: $\varepsilon_{n+1} \approx q \cdot \varepsilon_n$ with $q < 1$—the error is multiplied by $q$ at each iteration.
- Quadratic convergence: $\varepsilon_{n+1} \approx C \cdot \varepsilon_n^2$—each iteration roughly doubles the number of correct digits.

Newton’s method $x_{n+1} = x_n - f(x_n)/f'(x_n)$ converges quadratically near a simple root. The iteration $x_{n+1} = (x_n + 2/x_n)/2$ is Newton’s method for $f(x) = x^2 - 2$. Starting with $x_1 = 2$, already $x_5$ gives about 12 correct digits of $\sqrt{2}$, and $x_6$ is accurate to machine precision. In machine learning, the rate of convergence of gradient descent matters in the same way: adaptive methods (Adam, RMSProp) accelerate convergence compared to basic SGD.
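The quadratic convergence is visible directly; a sketch:

```python
import math

x = 2.0                      # x_1
for i in range(2, 7):
    x = (x + 2 / x) / 2      # Newton step for f(x) = x^2 - 2
    err = abs(x - math.sqrt(2))
    print(i, x, err)         # the error roughly squares at each step
```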