§ CALCULUS · 14 MIN READ · Updated 2026-05-13

Series and Sequences: Convergence Tests Explained

The beautiful subject most calculus students skim — and the convergence tests that solve almost any series.

"The integers are the work of God; all else is the work of man."
Leopold Kronecker, attributed (1886)

A sequence is an ordered list of numbers: $a_1, a_2, a_3, \ldots$ A series is the sum of the terms of a sequence: $\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \cdots$. Surprisingly, some infinite series sum to finite values. Determining which ones do — and computing their sums — is the subject of this article.

This article covers what sequences and series are, the formal definition of convergence, the geometric series (the most important example), the major convergence tests with worked examples, power series and Taylor expansion, and the common pitfalls.

Sequences

A sequence is a function from the positive integers to the real numbers: $a: \mathbb{N} \to \mathbb{R}$. Notation: $\{a_n\}$ or $(a_n)_{n=1}^{\infty}$.

Examples:

  • $a_n = \frac{1}{n}$: $1, \frac{1}{2}, \frac{1}{3}, \frac{1}{4}, \ldots$
  • $a_n = (-1)^n$: $-1, 1, -1, 1, \ldots$
  • $a_n = 2^n$: $2, 4, 8, 16, \ldots$

Convergence of a sequence. A sequence $\{a_n\}$ converges to a limit $L$ if for every $\varepsilon > 0$ there is an $N$ such that $|a_n - L| < \varepsilon$ for all $n > N$; in short, $\lim_{n \to \infty} a_n = L$. Otherwise it diverges.

The first sequence above converges to 0. The third diverges to infinity. The second neither converges nor diverges to infinity — it oscillates.

Series

A series is the sum of a sequence's terms. We define the partial sum:

$$S_N = \sum_{n=1}^{N} a_n = a_1 + a_2 + \cdots + a_N$$

The series converges if the sequence of partial sums converges. Otherwise it diverges.

Critical distinction. A sequence and a series are different. The sequence $\{1/n\}$ converges to 0. The series $\sum_{n=1}^{\infty} \frac{1}{n}$ diverges to infinity (this is the harmonic series, a classic).

The geometric series

The most important series in mathematics. For $|r| < 1$:

$$\sum_{n=0}^{\infty} r^n = 1 + r + r^2 + r^3 + \cdots = \frac{1}{1 - r}$$

For $|r| \ge 1$, the series diverges.

Proof sketch. The partial sum is $S_N = 1 + r + r^2 + \cdots + r^N$. Multiply by $r$: $rS_N = r + r^2 + \cdots + r^{N+1}$. Subtract: $S_N - rS_N = 1 - r^{N+1}$, so $S_N = \frac{1 - r^{N+1}}{1 - r}$. As $N \to \infty$, if $|r| < 1$, then $r^{N+1} \to 0$ and $S_N \to \frac{1}{1 - r}$.

Example 1: $\sum_{n=0}^{\infty} \left(\frac{1}{2}\right)^n = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = \frac{1}{1 - 1/2} = 2$.
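Example 1 is easy to check numerically. A minimal sketch (function names are illustrative): the partial sums of $\sum (1/2)^n$ close in on the value $2$ predicted by the formula.

```python
# Partial sums of the geometric series (1/2)**0 + (1/2)**1 + (1/2)**2 + ...
# The formula 1 / (1 - r) predicts a limit of 2 for r = 1/2.
def geometric_partial_sum(r: float, terms: int) -> float:
    """Sum r**0 + r**1 + ... + r**(terms - 1)."""
    return sum(r**n for n in range(terms))

print(geometric_partial_sum(0.5, 10))   # 1.998046875
print(geometric_partial_sum(0.5, 50))   # indistinguishable from 2.0
```

Ten terms already land within $1/512$ of the limit, reflecting how fast the remainder $r^{N+1}$ shrinks.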

The geometric series appears in compound interest, present-value calculations, probability problems (waiting times, geometric distribution), and many physics applications.

The harmonic series and the divergence test

The harmonic series is

$$\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$$

This diverges to infinity, even though the terms get arbitrarily small. The proof: group terms as

$$1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \cdots$$

Each parenthesized group is at least $\frac{1}{2}$ (a group of $2^k$ terms each at least $2^{-(k+1)}$), and there are infinitely many groups.

This is one of the most counterintuitive results in calculus. Terms going to zero is necessary but not sufficient for convergence.
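The slow growth is visible in numbers. A sketch (names are illustrative) comparing harmonic partial sums with $\ln n$:

```python
import math

# Harmonic partial sums grow without bound, but only like ln(n):
# S_n - ln(n) approaches the Euler-Mascheroni constant 0.5772...
def harmonic_partial_sum(n: int) -> float:
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(n, round(harmonic_partial_sum(n), 4), round(math.log(n), 4))
```

A hundred thousand terms barely pass 12, yet the sums never stop growing.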

Divergence test (the contrapositive): If $\lim_{n \to \infty} a_n \neq 0$ (or the limit does not exist), then $\sum a_n$ diverges.

This is useful only to rule out convergence. If $\lim_{n \to \infty} a_n = 0$, you cannot conclude anything — you need other tests.

The convergence tests

A small set of tests handles almost any series you encounter.

The ratio test

If $L = \lim_{n \to \infty} \left|\frac{a_{n+1}}{a_n}\right|$ exists:

  • $L < 1$: series converges (absolutely).
  • $L > 1$: series diverges.
  • $L = 1$: test is inconclusive.

Example 2: Test $\sum_{n=1}^{\infty} \frac{n}{2^n}$.

Compute the ratio:

$$\left|\frac{a_{n+1}}{a_n}\right| = \frac{(n+1)/2^{n+1}}{n/2^n} = \frac{n+1}{2n}$$

As $n \to \infty$, this approaches $\frac{1}{2}$. Since $\frac{1}{2} < 1$, the series converges.
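The limiting ratio can be watched directly. A sketch using $\sum n/2^n$ as a sample series (an illustrative choice) whose successive ratios settle at $1/2$:

```python
# Ratio test, numerically: successive ratios a_{n+1} / a_n for the
# sample series a_n = n / 2**n settle down to 1/2 < 1, so it converges.
def term(n: int) -> float:
    return n / 2**n

ratios = [term(n + 1) / term(n) for n in range(1, 200)]
print(ratios[0], ratios[-1])  # 1.0 at first, then close to 0.5
```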

The root test

If $L = \lim_{n \to \infty} \sqrt[n]{|a_n|}$ exists:

  • $L < 1$: converges.
  • $L > 1$: diverges.
  • $L = 1$: inconclusive.

Useful when $a_n$ contains $n$th powers.

Example 3: Test $\sum_{n=1}^{\infty} \left(\frac{n}{2n + 1}\right)^n$.

Here $\sqrt[n]{|a_n|} = \frac{n}{2n + 1} \to \frac{1}{2}$. Since $\frac{1}{2} < 1$, the series converges.
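The same numeric check works here. A sketch with the sample series $a_n = \left(\frac{n}{2n+1}\right)^n$ (an illustrative choice):

```python
# Root test, numerically: the nth roots |a_n|**(1/n) for
# a_n = (n / (2n + 1))**n approach 1/2 < 1, so the series converges.
def term(n: int) -> float:
    return (n / (2 * n + 1)) ** n

roots = [term(n) ** (1.0 / n) for n in range(1, 200)]
print(roots[-1])  # close to 0.5
```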

The comparison test

If $0 \le a_n \le b_n$ for all $n$:

  • If $\sum b_n$ converges, so does $\sum a_n$.
  • If $\sum a_n$ diverges, so does $\sum b_n$.

In words: compare to a known series. The harder version is the limit comparison test: if $\lim_{n \to \infty} \frac{a_n}{b_n} = c$ with $0 < c < \infty$, then $\sum a_n$ and $\sum b_n$ either both converge or both diverge.

Example 4: Test $\sum_{n=1}^{\infty} \frac{1}{2^n - 1}$.

Compare to $\sum_{n=1}^{\infty} \frac{1}{2^n}$. The ratio $\frac{1/(2^n - 1)}{1/2^n} = \frac{2^n}{2^n - 1} \to 1$. So both series converge (since $\sum_{n=1}^{\infty} \frac{1}{2^n}$ converges to $1$).

The integral test

If $f$ is positive, decreasing, and continuous for $x \ge 1$, and $a_n = f(n)$, then $\sum_{n=1}^{\infty} a_n$ converges if and only if $\int_1^{\infty} f(x)\,dx$ converges.

Example 5 (the p-series): Test $\sum_{n=1}^{\infty} \frac{1}{n^p}$ for various $p$.

By the integral test, compare to $\int_1^{\infty} \frac{dx}{x^p}$. This integral converges if $p > 1$ and diverges if $p \le 1$. So the p-series converges for $p > 1$ and diverges for $p \le 1$.

This is one of the most-used convergence facts in practice.
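The dividing line at $p = 1$ shows up clearly in numbers. A sketch comparing partial sums for $p = 2$ and $p = 1$ (the value $\pi^2/6$ for $p = 2$ is the classical Basel-problem sum):

```python
import math

# Partial sums of the p-series: sum of 1/n**p.
# For p = 2 they level off near pi**2 / 6 = 1.6449...;
# for p = 1 (the harmonic series) they keep growing.
def p_series_partial_sum(p: float, n: int) -> float:
    return sum(1.0 / k**p for k in range(1, n + 1))

print(p_series_partial_sum(2, 100000))  # ~ 1.64492
print(p_series_partial_sum(1, 100000))  # ~ 12.09 and still climbing
```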

The alternating series test

For an alternating series $\sum_{n=1}^{\infty} (-1)^{n+1} b_n$ (where $b_n > 0$): if $\{b_n\}$ is decreasing and $\lim_{n \to \infty} b_n = 0$, the series converges.

Example 6: $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots$ (the alternating harmonic series).

The series converges (alternating series test) even though the series of absolute values (the harmonic series) diverges. This is called conditional convergence.
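Numerically, the partial sums of the alternating harmonic series close in (slowly) on the classical value $\ln 2 \approx 0.6931$. A sketch:

```python
import math

# Partial sums of the alternating harmonic series 1 - 1/2 + 1/3 - ...
# The alternating series error bound gives |S - S_n| <= 1/(n + 1).
def alt_harmonic_partial_sum(n: int) -> float:
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

print(alt_harmonic_partial_sum(100000), math.log(2))
```

Note how slow this is compared with the geometric series: a hundred thousand terms buy only about five digits.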

Absolute vs conditional convergence

A series $\sum a_n$ converges absolutely if $\sum |a_n|$ converges. It converges conditionally if $\sum a_n$ converges but $\sum |a_n|$ diverges.

Absolute convergence is stronger. Absolutely convergent series can be rearranged without changing the sum. Conditionally convergent series can be rearranged to converge to any real number (Riemann rearrangement theorem) — a striking and counterintuitive result.
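The rearrangement theorem is concrete enough to simulate. A greedy sketch (the target value 1.5 is an arbitrary illustration) that reorders the terms of the alternating harmonic series to approach a chosen sum:

```python
# Riemann rearrangement, greedily: while the running total is below the
# target, spend positive terms 1, 1/3, 1/5, ...; while above it, spend
# negative terms -1/2, -1/4, -1/6, ... Every term is used exactly once,
# yet the rearranged sum approaches the target instead of ln(2).
def rearranged_sum(target: float, steps: int) -> float:
    total = 0.0
    pos, neg = 1, 2  # next unused odd / even denominator
    for _ in range(steps):
        if total < target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_sum(1.5, 100000))  # close to 1.5
```

The error after each crossing of the target is bounded by the size of the last term used, which shrinks to zero — the mechanism behind the theorem.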

Power series

A power series is a series of the form

$$\sum_{n=0}^{\infty} c_n (x - a)^n$$

where the $c_n$ are coefficients and $a$ is a center point. The radius of convergence is the value $R$ such that the series converges for $|x - a| < R$ and diverges for $|x - a| > R$.

The radius can be computed by the ratio test: $R = \lim_{n \to \infty} \left|\frac{c_n}{c_{n+1}}\right|$, when this limit exists.

Taylor series

Any "nice enough" function can be written as a Taylor series around a point $a$:

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x - a)^n = f(a) + f'(a)(x - a) + \frac{f''(a)}{2!}(x - a)^2 + \cdots$$

This is one of the most useful tools in applied mathematics. It approximates complicated functions by polynomials.

Common Taylor series (around $a = 0$):

  • $e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}$ (all $x$)
  • $\sin x = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!}$ (all $x$)
  • $\cos x = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{(2n)!}$ (all $x$)
  • $\frac{1}{1 - x} = \sum_{n=0}^{\infty} x^n$ ($|x| < 1$)
  • $\ln(1 + x) = \sum_{n=1}^{\infty} \frac{(-1)^{n+1} x^n}{n}$ ($-1 < x \le 1$)

Calculators compute $\sin$, $\cos$, $e^x$, and other transcendental functions by evaluating Taylor series to sufficient precision.
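A few terms already give excellent accuracy near the center. A sketch approximating $e^x$ by truncating its Taylor series:

```python
import math

# Truncated Taylor series for e**x around 0: the sum of x**n / n!
# for n = 0, 1, ..., terms - 1.
def exp_taylor(x: float, terms: int) -> float:
    return sum(x**n / math.factorial(n) for n in range(terms))

print(exp_taylor(1.0, 15), math.exp(1.0))  # both ~ 2.718281828
```

Fifteen terms match `math.exp` to about twelve digits at $x = 1$; farther from the center, more terms are needed.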

Common pitfalls

Pitfall 1 — Confusing the divergence test with a sufficient condition. If $\lim_{n \to \infty} a_n = 0$, you cannot conclude convergence. The harmonic series is the famous counterexample.

Pitfall 2 — Misapplying the ratio test. It's inconclusive when $L = 1$. Don't conclude anything in that case — try another test.

Pitfall 3 — Rearranging conditionally convergent series. The order matters. If you rearrange a conditionally convergent series, you can change its sum. Be careful in computations.

Pitfall 4 — Forgetting the radius of convergence for Taylor series. $\ln(1 + x)$'s Taylor series only works for $|x| < 1$ (and $x = 1$). Using it outside that interval gives nonsense.


Frequently asked

Is the sum of an infinite series always finite?
No — that's the whole point of convergence. Some series sum to finite values (converge), some don't (diverge). Determining which is the work.
Why does the harmonic series diverge if its terms go to zero?
Because the terms don't go to zero *fast enough*. Even though $1/n \to 0$, the cumulative sum grows logarithmically: $S_n \approx \ln n$, which diverges. The convergence depends on the rate at which terms decrease, not just whether they decrease.
What's the difference between a sequence and a series?
A sequence is a list. A series is a sum. The sequence $\{1, 1/2, 1/4, 1/8, \ldots\}$ converges to 0. The series $1 + 1/2 + 1/4 + 1/8 + \cdots$ converges to 2.
Why are Taylor series important?
They let you approximate complicated functions with polynomials, which are computable. They're used in calculators, in physics (small-angle approximations), in scientific computing, and as theoretical tools in analysis.
Can a divergent series have a meaningful sum?
In the strict sense, no — divergent means no sum. But there are extended notions (Cesàro summation, Abel summation, zeta regularization) that assign values to some divergent series. These are useful in physics but not standard calculus.



Cited works & further reading

  • Stewart, J. (2020). Calculus: Early Transcendentals, 9th edition. Cengage. — Chapter 11.
  • Spivak, M. (2008). Calculus, 4th edition. Publish or Perish. — Chapters 22–23.
  • Apostol, T. (1991). Calculus, Volume I. Wiley. — Series chapter.

About the author

Tim Sheludyakov writes the Stoa library.

By Tim Sheludyakov · Edited 2026-05-13
