Module III·Article III·~5 min read
Eulerian and Hamiltonian Graphs. Flows in Networks
Graph Theory
Eulerian and Hamiltonian Paths
Eulerian Paths
Eulerian path — a trail that traverses each edge of the graph exactly once. An Eulerian cycle is an Eulerian path that starts and ends at the same vertex.
Euler's Theorem (1736): A graph has an Eulerian cycle if and only if it is connected and all vertices have even degree.
An Eulerian path (not cycle) exists ⟺ the graph is connected and exactly 2 vertices have odd degree.
The Königsberg bridges problem: seven bridges connect four land areas (two river banks and two islands) — is it possible to cross each bridge exactly once? All four land areas have odd degree → impossible.
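Euler's criterion reduces to counting odd-degree vertices. A minimal sketch (function and graph names are illustrative; connectivity is assumed, as in the theorem):

```python
from collections import defaultdict

def euler_classification(edges):
    """Classify a connected undirected (multi)graph by Euler's criterion.

    Returns 'cycle' if all degrees are even, 'path' if exactly two
    vertices have odd degree, and 'none' otherwise.
    """
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    odd = sum(1 for d in degree.values() if d % 2 == 1)
    if odd == 0:
        return "cycle"
    if odd == 2:
        return "path"
    return "none"

# The Königsberg multigraph: 4 land areas, 7 bridges.
# Degrees are 5, 3, 3, 3 — four odd vertices, so no Eulerian route.
konigsberg = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
              ("A", "D"), ("B", "D"), ("C", "D")]
```

Running `euler_classification(konigsberg)` yields `"none"`, matching Euler's 1736 answer.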
Hamiltonian Paths
Hamiltonian path — a path passing through every vertex exactly once.
Unlike the Eulerian case, there is no simple criterion: the problem of deciding whether a Hamiltonian cycle exists is NP-complete.
Sufficient condition (Dirac's theorem): If each vertex has degree ≥ n/2 (n ≥ 3), then a Hamiltonian cycle exists.
The Traveling Salesman Problem (TSP): a minimum-weight Hamiltonian cycle in a complete weighted graph. NP-hard; heuristic algorithms: 2-opt, Lin–Kernighan.
Flows in Networks
Network: a digraph with a source s and sink t, capacities c(u,v) ≥ 0.
Flow f: a function f(u,v) with 0 ≤ f(u,v) ≤ c(u,v) and conservation of flow at every vertex (except s and t).
Ford–Fulkerson Theorem (Maximum Flow = Minimum Cut): the value of the maximum flow equals the minimum cut capacity c(S,T) over all s–t cuts.
Edmonds–Karp algorithm (BFS): O(VE²). Dinic's algorithm: O(V²E).
Applications: transportation networks, task distribution, maximum matching (bipartite graphs).
Ford–Fulkerson Algorithm: Details and Correctness
Idea: Search for an augmenting path from s to t in the residual network G_f, where residual capacity r(u,v) = c(u,v) − f(u,v) for forward edges and r(v,u) = f(u,v) for backward. Increase the flow along the path by the minimum residual capacity.
Ford–Fulkerson Theorem: A flow is maximum if and only if there is no augmenting path in G_f. Equivalently: max flow = min cut (the max-flow/min-cut theorem).
Edmonds–Karp version (BFS for path selection): O(VE²). With irrational capacities, the basic Ford–Fulkerson algorithm may not converge, but the BFS version always terminates.
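The residual-network idea above can be sketched as an Edmonds–Karp implementation (BFS for shortest augmenting paths). The graph encoding as a dict of dicts is a convention chosen here, not prescribed by the text:

```python
from collections import defaultdict, deque

def edmonds_karp(capacity, s, t):
    """Maximum flow via Edmonds-Karp: BFS augmenting paths in the
    residual network. capacity[u][v] = c(u, v). Runs in O(V E^2)."""
    # Residual capacities r(u, v); reverse edges start at 0.
    res = defaultdict(lambda: defaultdict(int))
    for u in capacity:
        for v, c in capacity[u].items():
            res[u][v] += c
            res[v][u] += 0
    flow = 0
    while True:
        # BFS for a shortest augmenting path s -> t in G_f.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, r in res[u].items():
                if r > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow  # no augmenting path: the flow is maximum
        # Bottleneck = minimum residual capacity along the path.
        bottleneck, v = float("inf"), t
        while parent[v] is not None:
            bottleneck = min(bottleneck, res[parent[v]][v])
            v = parent[v]
        # Augment: decrease forward residuals, increase backward ones.
        v = t
        while parent[v] is not None:
            res[parent[v]][v] -= bottleneck
            res[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

# Small example: the cut {s} has capacity 3 + 2 = 5, and a flow of
# value 5 exists, so max flow = min cut = 5.
cap = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2}, "b": {"t": 3}}
```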
Eulerian Circuits: Fleury's Algorithm
Fleury's algorithm constructs an Eulerian path greedily: start at a vertex of odd degree (or any vertex if all degrees are even), at each step choose an edge that is not a bridge (when an alternative exists), and delete the traversed edge. This finds an Eulerian path in O(E²) time. The more efficient Hierholzer algorithm runs in O(E): build cycles greedily, then stitch them together.
Chinese postman problem: find the shortest closed route for a postman covering each street at least once. If the graph is Eulerian, the answer is an Eulerian cycle (optimal). Otherwise, edges must be "duplicated" between odd-degree vertices: pair them up by a minimum-total-weight perfect matching on shortest-path distances. Solvable in O(V³).
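The "duplicate edges between odd vertices" step can be sketched end to end. This illustrative version uses Floyd–Warshall for distances and brute-force recursion for the matching (fine for a handful of odd vertices; the O(V³) bound quoted above requires a proper matching algorithm instead):

```python
def chinese_postman(n, edges):
    """Length of an optimal Chinese postman tour.

    n: number of vertices (0..n-1); edges: list of (u, v, w) with
    weight w > 0. Assumes the graph is connected. Answer = total edge
    weight + min-weight perfect matching of odd-degree vertices under
    shortest-path distances.
    """
    INF = float("inf")
    dist = [[INF] * n for _ in range(n)]
    deg = [0] * n
    total = 0
    for i in range(n):
        dist[i][i] = 0
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)
        dist[v][u] = min(dist[v][u], w)
        deg[u] += 1
        deg[v] += 1
        total += w
    for k in range(n):                      # Floyd-Warshall, O(V^3)
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    odd = tuple(v for v in range(n) if deg[v] % 2 == 1)

    def best_matching(vs):
        """Brute-force min-weight perfect matching on vertices vs."""
        if not vs:
            return 0
        u, rest = vs[0], vs[1:]
        return min(dist[u][v] + best_matching(rest[:i] + rest[i + 1:])
                   for i, v in enumerate(rest))

    return total + best_matching(odd)
```

For a triangle with unit weights the graph is Eulerian and the answer is just the total weight 3; for the path 0–1–2 the two odd endpoints are matched at distance 2, giving 2 + 2 = 4.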
Hamiltonian Paths and the Traveling Salesman Problem
TSP (Traveling Salesman Problem) — NP-hard: for n cities, a brute-force search over all routes takes O(n!). Exact methods (branch and bound, Bellman-Held-Karp dynamic programming O(n²2ⁿ)) are practical up to n ≈ 20-30. For large n — approximate algorithms: nearest neighbor algorithm (greedy, not optimal), 2-opt (iterative improvement), Christofides algorithm (guaranteed ≤ 3/2 of optimum for metric instances).
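The Bellman–Held–Karp O(n²2ⁿ) dynamic program mentioned above indexes states by (set of visited cities, last city). A compact sketch using bitmask subsets:

```python
from itertools import combinations

def held_karp(dist):
    """Exact TSP tour length via Bellman-Held-Karp DP, O(n^2 * 2^n).

    dist: n x n matrix of pairwise distances; the tour starts and
    ends at city 0. Practical only up to n around 20.
    """
    n = len(dist)
    # dp[(S, j)] = shortest path from 0 through the cities in
    # bitmask S, ending at city j (0 is never inside S).
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = sum(1 << j for j in subset)
            for j in subset:
                prev = S ^ (1 << j)
                dp[(S, j)] = min(dp[(prev, k)] + dist[k][j]
                                 for k in subset if k != j)
    full = (1 << n) - 2                     # every city except 0
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

# Classic 4-city instance: the optimal tour 0-1-3-2-0 has length
# 10 + 25 + 30 + 15 = 80.
example = [[0, 10, 15, 20],
           [10, 0, 35, 25],
           [15, 35, 0, 30],
           [20, 25, 30, 0]]
```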
Edge Coloring (Chromatic Index)
Chromatic index χ'(G) — minimal number of colors for a proper edge coloring (adjacent edges have different colors). Vizing's Theorem (1964): Δ ≤ χ'(G) ≤ Δ+1 (Δ — maximum degree). Class 1: χ'(G) = Δ. Class 2: χ'(G) = Δ+1. Determining the class is NP-hard in general. For bipartite graphs: χ'(G) = Δ (König's theorem). Application: scheduling, coloring time intervals in telecommunications.
Perfect Graph Theory
Perfect graph: χ(H) = ω(H) for every induced subgraph H (ω — clique number). Weak Perfect Graph Theorem (Lovász, 1972): G is perfect ⟺ the complement of G is perfect. Strong Perfect Graph Theorem (Chudnovsky, Robertson, Seymour, Thomas, 2006): G is perfect ⟺ it contains no odd hole (induced odd cycle of length ≥ 5) and no complement of an odd hole.
Perfect graphs include: bipartite, chordal, comparability, and interval graphs. For perfect graphs, χ(G) = ω(G) can be computed in polynomial time via semidefinite programming (the Grötschel–Lovász–Schrijver ellipsoid method, using the Lovász theta function).
Random Graphs: Erdős–Rényi Model
G(n,p) — random graph on n vertices, each edge independently with probability p. Threshold theory: if p < (1−ε)ln(n)/n the graph is almost surely not connected, if p > (1+ε)ln(n)/n — almost surely connected. Triangle threshold: p ~ n^{−1}. Hamiltonian cycle threshold: p ~ ln(n)/n. These "threshold functions" were discovered by Erdős and Rényi (1960) and led to the "evolution of random graphs."
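The connectivity threshold is easy to observe empirically: sample G(n,p) below and above ln(n)/n and test connectivity. A sketch using union-find (seed and trial counts are arbitrary choices for a quick experiment):

```python
import random

def gnp_connected(n, p, rng):
    """Sample one G(n, p) graph and test connectivity via union-find."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    components = n
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:            # edge present w.p. p
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    components -= 1
    return components == 1

rng = random.Random(1)
n, trials = 100, 50
# Threshold ln(100)/100 = 0.046: sample well below and well above it.
below = sum(gnp_connected(n, 0.02, rng) for _ in range(trials)) / trials
above = sum(gnp_connected(n, 0.09, rng) for _ in range(trials)) / trials
```

With these parameters `below` comes out near 0 and `above` near 1, matching the Erdős–Rényi threshold behaviour.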
Expander Graphs and Their Applications
Expander: a family of d-regular graphs Gₙ with |V|→∞ whose spectral gap λ₁−λ₂ ≥ ε > 0 (λ₁ = d is the largest eigenvalue of the adjacency matrix). Expanders mix well: random walks on them converge quickly. Applications: derandomization of algorithms (expanders replace randomness), error-correcting codes (Sipser–Spielman expander codes), network algorithms. Construction: Ramanujan graphs (Lubotzky–Phillips–Sarnak) — explicit expanders built via number theory and modular forms.
Numerical Example: Connectivity Threshold in G(n,p)
Problem: G(n=100, p). Find p* and check the probability of isolated vertices at p=0.046.
Step 1: Critical value: p* = ln(n)/n = ln(100)/100 = 4.605/100 = 0.0461.
Step 2: P(a specific vertex is isolated) = (1−p)^{99} ≈ e^{−99p}. At p=0.046: e^{−99·0.046}=e^{−4.554}≈0.0105.
Step 3: E[number of isolated vertices] = 100·0.0105 ≈ 1.05. Exactly at the threshold we expect about 1 isolated vertex.
Step 4: At p=0.07 (above threshold): e^{−99·0.07}=e^{−6.93}≈0.0010, so E[isolated]≈0.1 — the graph is almost surely connected. For G(n,p) the second eigenvalue of the adjacency matrix A concentrates near 2√(np(1−p)) (Füredi–Komlós theorem): at n=100, p=0.07 this is ≈2√(6.51)≈5.1, whereas the largest eigenvalue ≈(n−1)p≈6.93. The spectral gap is positive — the graph is connected with high probability.
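The arithmetic in Steps 1–4 can be checked directly. Note that (1−p)^{n−1} is the exact isolation probability, while e^{−(n−1)p} used above is an approximation; the two differ slightly at p near the threshold:

```python
import math

n = 100
p_star = math.log(n) / n                 # Step 1: p* = ln(100)/100 ≈ 0.0461

def expected_isolated(n, p):
    """Exact E[# isolated vertices] = n * (1 - p)^(n - 1)."""
    return n * (1 - p) ** (n - 1)

at_threshold = expected_isolated(n, 0.046)   # ≈ 0.94 exact
approx = n * math.exp(-(n - 1) * 0.046)      # ≈ 1.05 via e^{-99p}, as in Step 3
above = expected_isolated(n, 0.07)           # ≈ 0.08: isolated vertices vanish
```

Both the exact and approximate values at p = 0.046 are of order 1, confirming "about one isolated vertex at the threshold"; at p = 0.07 the expectation drops below 0.1.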