
Sequent Calculus and Gentzen's Theorem

Proof Theory


Motivation: Symmetry and Proof Normalization

Natural deduction is asymmetric: hypotheses stand on the left, a single conclusion on the right, and each connective comes with unmatched introduction and elimination rules. The sequent calculus (Gentzen, 1935) is a symmetric system in which proofs have an explicit structure convenient for metatheorems. The main result, the cut elimination theorem, states that any proof can be "normalized": every "borrowing of a lemma" can be removed.

Sequents and the LK Calculus

Sequent: Γ ⊢ Δ (read as "from the hypotheses Γ, at least one of Δ follows"). Γ and Δ are multisets of formulas.

Axiom: φ ⊢ φ (identity sequent).

Structural rules:

  • Weakening: from Γ ⊢ Δ → Γ, φ ⊢ Δ (and analogously on the right).
  • Contraction: from Γ, φ, φ ⊢ Δ → Γ, φ ⊢ Δ.
  • Exchange: permutation of formulas in Γ or Δ.

Logical rules (example for ∧ on the right): from Γ ⊢ Δ, φ and Γ ⊢ Δ, ψ → Γ ⊢ Δ, φ∧ψ.

Cut rule: from Γ ⊢ Δ, φ and φ, Π ⊢ Λ → Γ, Π ⊢ Δ, Λ. This is "using the lemma φ".
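As an informal sketch (not part of the calculus itself), sequents can be modeled as pairs of multisets and the Cut rule as an operation on them; the representation and names below are illustrative assumptions:

```python
from collections import Counter

# A sequent Γ ⊢ Δ modeled as a pair of multisets of formulas (here: strings).
# Illustrative sketch only, not a formal implementation of LK.

def cut(left_seq, right_seq, phi):
    """Cut rule: from Γ ⊢ Δ, φ and φ, Π ⊢ Λ derive Γ, Π ⊢ Δ, Λ."""
    (gamma, delta), (pi, lam) = left_seq, right_seq
    assert delta[phi] >= 1 and pi[phi] >= 1, "cut formula must occur on both sides"
    new_delta = delta - Counter({phi: 1})  # remove one occurrence of φ on the right
    new_pi = pi - Counter({phi: 1})        # and one occurrence on the left
    return (gamma + new_pi, new_delta + lam)

# "Using the lemma φ": combine A ⊢ B, φ with φ, C ⊢ D into A, C ⊢ B, D.
s1 = (Counter(["A"]), Counter(["B", "phi"]))   # A ⊢ B, φ
s2 = (Counter(["phi", "C"]), Counter(["D"]))   # φ, C ⊢ D
print(cut(s1, s2, "phi"))
```

Note that the cut formula φ disappears from the conclusion: this is exactly what cut elimination must compensate for, since φ need not be a subformula of the resulting sequent.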

Cut Elimination Theorem (Hauptsatz)

Theorem (Gentzen, 1935): Any proof in LK can be transformed into a proof without the Cut rule—in normal form.

Corollary (the subformula property): in a cut-free proof, every formula is a subformula of the sequent being proved. One cannot "invent" auxiliary formulas that do not occur in the end-sequent.

Proof idea: double induction on the rank of the cut (the complexity of the cut formula φ) and on the number of cuts. The key case is a "principal" cut, where φ has just been introduced by a logical rule on both premises; such a cut is transformed locally into cuts on the subformulas of φ, which have smaller rank.

Applications

Consistency of PA: Gentzen (1936), using transfinite induction up to ε₀ = ω^{ω^{ω^{...}}}, proved the consistency of Peano arithmetic PA. This does not contradict Gödel’s theorem: PA does not prove Con(PA) inside itself, but an external system (with ε₀-induction) can.

Automated proof search: by the subformula property, proof search becomes a finite problem (the number of candidate formulas is bounded). This underlies tableau methods (Beth) and logic programming (Prolog).
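To make the finiteness concrete, here is a minimal backward (root-first) proof search for a cut-free propositional fragment of LK (∧, ∨, →; negation omitted). The tuple encoding of formulas and the rule selection are simplifications for illustration, not a production prover:

```python
# Backward proof search without Cut: every premise of every rule contains
# only subformulas of its conclusion, so the search space is finite.
# Formulas: atoms are strings; compounds are ('and', a, b), ('or', a, b), ('imp', a, b).

def provable(gamma, delta):
    """Is the sequent Γ ⊢ Δ derivable cut-free? Γ, Δ are tuples of formulas."""
    atoms_l = {f for f in gamma if isinstance(f, str)}
    atoms_r = {f for f in delta if isinstance(f, str)}
    if atoms_l & atoms_r:                      # axiom φ ⊢ φ (with weakening)
        return True
    for i, f in enumerate(gamma):              # decompose a formula on the left
        if isinstance(f, str):
            continue
        rest, op = gamma[:i] + gamma[i+1:], f[0]
        if op == 'and':   # ∧L: both conjuncts become hypotheses
            return provable(rest + (f[1], f[2]), delta)
        if op == 'or':    # ∨L: two premises, one per disjunct
            return provable(rest + (f[1],), delta) and provable(rest + (f[2],), delta)
        if op == 'imp':   # →L
            return provable(rest, delta + (f[1],)) and provable(rest + (f[2],), delta)
    for i, f in enumerate(delta):              # decompose a formula on the right
        if isinstance(f, str):
            continue
        rest, op = delta[:i] + delta[i+1:], f[0]
        if op == 'and':   # ∧R: two premises
            return provable(gamma, rest + (f[1],)) and provable(gamma, rest + (f[2],))
        if op == 'or':    # ∨R: both disjuncts stay available
            return provable(gamma, rest + (f[1], f[2]))
        if op == 'imp':   # →R
            return provable(gamma + (f[1],), rest + (f[2],))
    return False

print(provable((('and', 'p', 'q'),), (('or', 'p', 'q'),)))  # φ∧ψ ⊢ φ∨ψ is derivable
print(provable((), (('imp', 'p', 'q'),)))                   # ⊢ p→q is not
```

The search always terminates because every recursive call strictly decreases the total size of the sequent; with a Cut rule available, the prover would have to guess an arbitrary cut formula and termination would be lost.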

Numerical Example

Problem: Derive φ ∧ ψ ⊢ φ ∨ ψ in LK without the Cut rule.

Step 1. Axiom: φ ⊢ φ.

Step 2. ∨-introduction on the right: φ ⊢ φ ∨ ψ.

Step 3. Axiom: ψ ⊢ ψ.

Step 4. ∨-introduction on the right: ψ ⊢ φ ∨ ψ.

Step 5. ∧-introduction on the left (∧L₁), applied to the sequent from step 2: φ ∧ ψ ⊢ φ ∨ ψ. (Applying ∧L₂ to step 4 gives an alternative derivation; either branch alone suffices, since the left ∧-rule takes a single premise.) ■

Proof tree (schematically): from the axiom φ ⊢ φ, rule ∨R₁ yields φ ⊢ φ∨ψ, and ∧L₁ then yields φ∧ψ ⊢ φ∨ψ. Symmetrically, the axiom ψ ⊢ ψ with ∨R₂ and ∧L₂ gives a second, independent derivation.

Subformula Property check: all formulas in the proof (φ, ψ, φ∧ψ, φ∨ψ) are subformulas of the sequent being proved φ∧ψ ⊢ φ∨ψ ✓.
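The check above can be mechanized. A sketch, with 'p' standing for φ and 'q' for ψ (the tuple encoding is an assumption for illustration):

```python
# Verifying the Subformula Property for the worked example φ∧ψ ⊢ φ∨ψ.
# Atoms are strings; binary compounds are ('and', a, b) / ('or', a, b).

def subformulas(f):
    """All subformulas of f, including f itself."""
    if isinstance(f, str):
        return {f}
    return {f} | subformulas(f[1]) | subformulas(f[2])

end_sequent = [('and', 'p', 'q'), ('or', 'p', 'q')]       # φ∧ψ ⊢ φ∨ψ
allowed = set().union(*map(subformulas, end_sequent))

# Every formula occurring in the cut-free proof tree above:
proof_formulas = ['p', 'q', ('and', 'p', 'q'), ('or', 'p', 'q')]
print(all(f in allowed for f in proof_formulas))  # True
```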

Additional Aspects

Gentzen’s sequent calculus LK expresses proofs via sequents of the form Γ ⊢ Δ (from premises Γ, at least one formula of Δ is derivable). All rules are symmetric: for each logical operator there is an introduction rule on the left and one on the right. The main result is the Hauptsatz (1935): the cut rule (corresponding to the transitivity of entailment, "if A entails B, and B entails C, then A entails C") is redundant, and any proof can be transformed into cut-free form. This yields the subformula property and, in particular, the consistency of the calculus: the empty sequent has no cut-free proof. In practice, cut elimination is the basis of automatic provers: tableau methods, focused proofs, and deep inference all exploit structural normal forms for efficient proof search.

Real-World Application

Program verification with dependent types (Lean, Isabelle): via the Curry–Howard correspondence, cut elimination means that proofs can be "normalized" and therefore checked effectively. Without such a normalization result, a proof checker could loop indefinitely while verifying the correctness of a proof.
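The link between cut elimination and normalization can be sketched via the Curry–Howard correspondence: a cut on φ corresponds to a β-redex (a lemma proved once, then used), and eliminating the cut is β-reduction. The term encoding below is a toy assumption, not the representation used by Lean or Isabelle:

```python
# Proof terms as nested tuples: ('lam', x, body) abstracts a hypothesis x
# (a lemma awaiting its proof); a pair (f, a) applies f to the argument a.

def substitute(term, var, value):
    """Substitute value for var (variables are assumed unique, so no capture)."""
    if term == var:
        return value
    if isinstance(term, tuple):
        return tuple(substitute(t, var, value) for t in term)
    return term

def reduce_step(term):
    """One β-step: (('lam', x, body), arg) → body[x := arg]; else unchanged."""
    if isinstance(term, tuple) and len(term) == 2 and \
       isinstance(term[0], tuple) and term[0][0] == 'lam':
        _, x, body = term[0]
        return substitute(body, x, term[1])
    return term

# A trivial "lemma" λx.x applied to a proof 'p': one cut, one reduction step.
proof_with_cut = (('lam', 'x', 'x'), 'p')
print(reduce_step(proof_with_cut))  # 'p': the detour through the lemma is gone
```

Gentzen's local cut-reduction steps correspond to exactly this kind of rewrite, which is why termination of cut elimination and strong normalization of typed λ-calculi are two views of one theorem.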

Gentzen’s LK calculus constructs proofs as structured trees of sequents, which allows for a rigorous proof of the normalization theorem without reference to semantics. This theorem opened the door to modern automated mathematics: tableau systems, focused proofs, focused linear logic, as well as most interactive theorem provers, are based on it.

Connection to Other Areas of Mathematics

Sequent calculus has become the standard language for the logical part of many areas of mathematics. In model theory, Gentzen’s ideas echo in the work of Tarski and Seidenberg on quantifier elimination for real closed fields: structural transformations of formulas are organized via analogues of cut elimination, which allows control over the complexity of proofs of the elementary character of solutions. In algebra and field theory, Gentzen’s result is combined with Gödel’s completeness theorem and Henkin’s technique: model construction often relies on cut-free proof trees, which facilitates the computation of consistent theory extensions.

In functional analysis and topology, sequent systems underlie proof-theoretic analysis in the Hilbert–Bernays tradition: proofs of theorems like Hahn–Banach or Tychonoff are analyzed from the perspective of extracting constructive content via proof mining methods (Kreisel, Kohlenbach). Here, cut elimination acts as the mechanism that separates purely logical steps from uses of the axiom of choice or completeness.

In probability theory, the connection arises via logics of probabilistic reasoning: sequent systems for logics with measures (work of Fagin and Halpern) use modified cut rules, and their partial elimination yields effective procedures for deriving constraints on probabilities. In numerical methods, the logical foundations of algorithm verification (Floyd–Hoare logic, Pratt’s dynamic logic, the PVS system) use variations of sequent calculi: the correctness proof of an integration scheme or a finite element method is formalized as a cut-free tree, and the subformula property guarantees that no hidden side conditions appear during verification.

Historical Background and Development of the Idea

The first forerunners of the sequent approach appeared with Skolem and Hilbert in the 1920s, but the term itself and the strict LK system were introduced by Gerhard Gentzen in his 1934–1935 papers in “Mathematische Zeitschrift”. The motivation came from Hilbert’s program: to give a finitary analysis of classical mathematics and to prove the consistency of arithmetic. Gentzen proposed two parallel formalisms, natural deduction and sequent calculus, but it was the latter that allowed him to formulate the Hauptsatz. Already in 1936, in the same series of works, he applied cut elimination to prove the consistency of Peano arithmetic via transfinite induction up to ε₀. After World War II, Gentzen’s ideas were reinterpreted in the works of Lorenzen, Schütte, and Tait. Lorenzen developed dialogical and operational interpretations of sequents; Schütte, in his book “Beweistheorie” (1960), systematically developed the cut-elimination technique for broad classes of theories. In the 1960s and 1970s, the works of Dag Prawitz and Jean-Yves Girard connected proof normalization with the typed λ-calculus: Girard showed the connection of cut elimination with normalization in typed λ-systems, and his later linear logic refined the structural rules of LK.
