Module VII·Article II·~5 min read
Poisson Process and Continuous-Time Markov Chains
Stochastic Processes
The Poisson process is a standard model of events that occur randomly over time (calls to a call center, radioactive decay, transactions). Continuous-time Markov chains (CTMCs) describe systems that evolve in continuous time over a discrete state space.
Poisson Process
Definition: {N(t), t ≥ 0} is a Poisson process with rate λ if:
- N(0) = 0
- Increments on non-overlapping intervals are independent
- N(t + s) − N(t) ~ Poisson(λs) for all t ≥ 0, s > 0 (stationary increments)
Inter-event intervals: $T_i$ = time between the (i−1)-th and i-th events ~ Exp(λ), independent.
Superposition: merging two independent Poisson processes with rates λ₁ and λ₂ yields a Poisson process with rate λ₁+λ₂. Thinning: keeping each event independently with probability p yields a Poisson process with rate pλ.
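Both properties are easy to check numerically. A minimal sketch (standard library only; the rates, horizon, and thinning probability are arbitrary choices): simulate a Poisson process from its exponential inter-event gaps, then estimate the rates of a merged and of a thinned process.

```python
import random

def poisson_process(rate, horizon, rng):
    """Sample event times on [0, horizon] via i.i.d. Exp(rate) gaps."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
horizon = 10_000.0
lam1, lam2 = 2.0, 3.0

# Superposition: merging two independent processes gives rate lam1 + lam2.
merged = sorted(poisson_process(lam1, horizon, rng) + poisson_process(lam2, horizon, rng))
print(len(merged) / horizon)      # close to lam1 + lam2 = 5.0

# Thinning: keeping each event with probability p gives rate p * lam1.
p = 0.4
thinned = [t for t in poisson_process(lam1, horizon, rng) if rng.random() < p]
print(len(thinned) / horizon)     # close to p * lam1 = 0.8
```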
Continuous-Time Markov Chains (CTMC)
Transition rates (generator matrix Q): $q_{ij}$ = transition rate from i to j for i ≠ j. $q_{ii} = -\sum_{j \neq i} q_{ij}$. Kolmogorov equations: $P'(t) = P(t)Q$ (forward) or $P'(t) = QP(t)$ (backward). Solution: $P(t) = e^{tQ}$.
Stationary distribution: $\pi Q = 0$. Detailed balance: $\pi_i q_{ij} = \pi_j q_{ji}$ → reversibility.
M/M/1 queue model: Poisson arrivals (λ), exponential service (μ). For ρ = λ/μ < 1: stationary $\pi_n = (1-ρ)ρ^n$. $E[N] = ρ/(1-ρ)$. $E[W] = 1/(μ-λ)$ (mean delay).
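The stationary formulas are easy to cross-check against each other. A short sketch (with λ = 0.8, μ = 1 as example values; the geometric series is truncated far in the tail):

```python
lam, mu = 0.8, 1.0
rho = lam / mu

# Stationary distribution pi_n = (1 - rho) * rho^n, truncated in the tail.
pi = [(1 - rho) * rho**n for n in range(2000)]

EN = sum(n * p for n, p in enumerate(pi))
print(round(EN, 6))         # rho / (1 - rho) = 4.0
print(round(EN / lam, 6))   # E[W] = E[N] / lam = 1 / (mu - lam) = 5.0
```

The second line is Little's identity E[N] = λ·E[W] applied in reverse; it recovers the mean delay formula exactly.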
Exercise: (a) Poisson process with λ = 3 events/hour. Find $P(N(2)=5)$, $E[N(4)]$, and $P($interval $>$ 30 min$)$. (b) For M/M/1 with λ = 0.8, μ = 1: find π₀, $E[N]$, $E[W]$. What happens as λ→μ? (c) Bank with 2 cashiers (M/M/2), λ = 1.5, μ = 1. Using the Erlang C formula, find $P_{wait}$ and $E[W]$.
Nonhomogeneous and Compound Poisson Processes
Nonhomogeneous Poisson process: The rate λ(t) depends on time. $N(t)\sim Poisson(Λ(t))$, where $Λ(t)=\int_0^t λ(s)ds$ is the cumulative intensity. Inter-event intervals are not exponential. Application: server load (peak during business hours), seismology.
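A standard way to sample such a process is thinning (Lewis-Shedler): generate a homogeneous process at a dominating rate λ_max and accept each candidate point with probability λ(t)/λ_max. A sketch with a hypothetical daily load profile (the rate function and all constants are illustrative assumptions):

```python
import math
import random

def nhpp_thinning(lam_fn, lam_max, horizon, rng):
    """Sample a nonhomogeneous Poisson process on [0, horizon] by thinning."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)              # candidates at the dominating rate
        if t > horizon:
            return times
        if rng.random() < lam_fn(t) / lam_max:     # accept w.p. lam(t) / lam_max
            times.append(t)

# Hypothetical load over a 24-hour day, peaking at midday (t = 12).
lam = lambda t: 2.0 + 1.5 * math.sin(math.pi * t / 24.0)   # events/hour

events = nhpp_thinning(lam, 3.5, 24.0, random.Random(0))
# E[N(24)] = Lambda(24) = 48 + 72/pi, roughly 70.9
print(len(events))
```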
Compound Poisson process: $Z(t) = \sum_{i=1}^{N(t)} Y_i$ where N is Poisson, $Y_i \sim$ i.i.d. $E[Z(t)] = λt·E[Y]$, $Var[Z(t)] = λt·E[Y^2]$. Used in insurance: total payouts over time t. Lévy process — generalization to infinite divisibility.
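The two moment formulas are easy to validate by simulation. A sketch with Exp(1) jumps, so that E[Y] = 1 and E[Y²] = 2 (λ, t, and the jump distribution are illustrative choices):

```python
import random

lam, t = 2.0, 5.0
rng = random.Random(1)

def sample_Z():
    """One draw of Z(t): sum of Exp(1) jumps at Poisson(lam) event times."""
    total, clock = 0.0, 0.0
    while True:
        clock += rng.expovariate(lam)
        if clock > t:
            return total
        total += rng.expovariate(1.0)   # Y_i ~ Exp(1)

zs = [sample_Z() for _ in range(20000)]
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
print(round(mean, 2))   # theory: lam * t * E[Y]   = 10
print(round(var, 2))    # theory: lam * t * E[Y^2] = 20
```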
Queueing Theory: Extensions and Practice
PASTA Theorem (Poisson Arrivals See Time Averages): Poisson arrivals find the system in its stationary state. That is, the fraction of time when the system is in state n equals the probability that an arriving client finds n clients in the system. This simplifies queueing analysis.
Little’s formula: $L = λ·W$. Number of customers in the system = arrival rate × mean time in system. Universal for stationary systems without assumptions on distributions.
M/G/1 queue: Pollaczek-Khinchine formula: $E[N_q] = λ^2E[S^2]/(2(1−ρ))$, where S is the service time and ρ = λ·E[S]. The queue length depends only on the first two moments of S and grows linearly in $E[S^2]$, so high service-time variance inflates the queue.
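The moment dependence is easy to see by plugging in two service distributions with the same mean. A small sketch (λ chosen arbitrarily):

```python
def pk_queue_length(lam, ES, ES2):
    """Pollaczek-Khinchine mean queue length E[N_q] for M/G/1."""
    rho = lam * ES
    assert rho < 1, "queue must be stable"
    return lam**2 * ES2 / (2 * (1 - rho))

lam = 0.5
# Same mean service time E[S] = 1, different second moments:
print(pk_queue_length(lam, 1.0, 1.0))  # -> 0.25 (deterministic service, Var[S] = 0)
print(pk_queue_length(lam, 1.0, 2.0))  # -> 0.5  (exponential service, Var[S] = 1)
```

Doubling the second moment doubles the mean queue at fixed ρ, which is exactly the linear dependence on $E[S^2]$.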
Continuous Markov Chains: More Details
For a CTMC with generator Q: transition rates $q_{ij} \geq 0$ for $i \neq j$, $q_{ii} = -\sum_{j\neq i} q_{ij}$. Forward Kolmogorov equations: $dP/dt = PQ$. Backward equations: $dP/dt = QP$. Embedded Markov chain: the chain observed at jump times, with transition matrix $P_{ij} = q_{ij}/(−q_{ii})$ for $i \neq j$.
Reversible chains: detailed balance $\pi_i q_{ij} = \pi_j q_{ji}$ makes them easier to analyze. Physical systems in thermal equilibrium are reversible (microscopic reversibility). Irreversible chains arise in telecommunication networks and biochemical reaction networks.
Computational Complexity of Queueing Theory Problems
Analytical solutions exist for a limited class: M/M/1, M/M/c, M/G/1, G/M/1. For queueing networks: Jackson's theorem — networks of M/M/1 nodes have product-form stationary distributions. Discrete Event Simulation (DES): for general G/G/c queues it is the only practical method. SimPy is a Python library for building such models. Applications: logistics, manufacturing, cloud systems.
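A full DES framework like SimPy is overkill for a single queue: per-customer waiting times in a G/G/1 system obey the Lindley recursion $W_{n+1} = \max(0, W_n + S_n - A_{n+1})$, which gives a simulator in a few lines. A sketch for M/M/1 (λ = 0.8, μ = 1 are arbitrary example values), checked against $E[W] = 1/(μ−λ)$:

```python
import random

def mm1_mean_sojourn(lam, mu, n_customers, seed=0):
    """Mean time in system for M/M/1, via the Lindley recursion on waiting times."""
    rng = random.Random(seed)
    wq, total = 0.0, 0.0                     # wq = wait in queue of current customer
    for _ in range(n_customers):
        service = rng.expovariate(mu)
        total += wq + service                # sojourn time of this customer
        interarrival = rng.expovariate(lam)  # gap to the next arrival
        wq = max(0.0, wq + service - interarrival)
    return total / n_customers

print(mm1_mean_sojourn(0.8, 1.0, 500_000))   # theory: 1 / (mu - lam) = 5.0
```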
Markov Decision Processes and Applications in AI
Reinforcement Learning (RL): agent in an environment — MDP. Policy π: state → distribution over actions. Expected discounted reward $V^{π}(s) = E\left[\sum γ^t r_t|s_0=s, π\right]$. Bellman equation: $V^{π}(s) = \sum_a π(a|s)[R(s,a) + γ\sum_{s'} P(s'|s,a)V^{π}(s')]$. Q-learning: $Q(s,a) ← Q(s,a) + α[r + γ \max_{a'} Q(s',a') − Q(s,a)]$. With decaying α and sufficient exploration: Q → Q* (optimal value function).
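The update rule fits in a few lines of tabular code. A sketch on a toy deterministic chain MDP (the environment, γ, α, and ε are all illustrative assumptions; for this chain the optimal values are $Q^*(s, \text{right}) = γ^{2-s}$):

```python
import random

# Toy MDP: states 0..3, state 3 terminal. Action 1 moves right,
# action 0 moves left (clipped at 0). Reward 1 on entering state 3.
def step(s, a):
    s2 = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == 3 else 0.0), (s2 == 3)

gamma, alpha, eps = 0.9, 0.1, 0.2
rng = random.Random(0)
Q = [[0.0, 0.0] for _ in range(4)]

for _ in range(5000):                        # episodes, each starting at state 0
    s = 0
    while True:
        # epsilon-greedy action selection
        a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: Q[s][x])
        s2, r, done = step(s, a)
        target = r if done else r + gamma * max(Q[s2])
        Q[s][a] += alpha * (target - Q[s][a])   # Q-learning update
        if done:
            break
        s = s2

print(round(Q[2][1], 3))   # theory: Q*(2, right) = 1.0
print(round(Q[0][1], 3))   # theory: Q*(0, right) = gamma^2 = 0.81
```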
Random Walks: Recurrence and Transience
Pólya’s theorem: the simple random walk on $\mathbb{Z}^d$ is recurrent ($p_{return} = 1$) for $d \le 2$ and transient for $d \ge 3$. For the walk on $\mathbb{Z}$: the return time to zero satisfies $P(\tau_0 < ∞) = 1$ but $E[\tau_0] = ∞$ (null recurrence). Applications: Harris-recurrence analysis of MCMC chains, percolation analysis, diffusion in physics.
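The dichotomy shows up clearly in simulation. A sketch estimating the fraction of walks that return to the origin within a fixed step budget (walk counts and budget are arbitrary; for d = 3 the true return probability is about 0.34):

```python
import random

def return_frequency(d, n_walks, n_steps, seed=0):
    """Fraction of simple random walks on Z^d returning to 0 within n_steps."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(n_walks):
        pos = [0] * d
        for _ in range(n_steps):
            pos[rng.randrange(d)] += rng.choice((-1, 1))   # one unit step along a random axis
            if not any(pos):                               # back at the origin
                returned += 1
                break
    return returned / n_walks

for d in (1, 2, 3):
    print(d, return_frequency(d, 1000, 1000))
# d = 1, 2: the frequency creeps toward 1 as n_steps grows (recurrence)
# d = 3:   the frequency stays near 0.34 no matter how long we wait (transience)
```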
Covariance Structures in Continuous Processes
For process X(t) with given covariance function $K(s,t) = Cov(X(s), X(t))$: wide-sense stationary process if $K(s, t) = K(s - t)$ (depends only on lag). Spectral density: $S(ω) = \int K(τ)e^{−iωτ}dτ$ (Wiener-Khinchin theorem). For autoregressive processes: $S(ω) = \frac{σ^2}{|1−\sum a_k e^{−iωk}|^2}$. Applications in signal processing, time series analysis.
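The Wiener-Khinchin pair can be verified directly for an AR(1) process, whose autocovariance is $K(τ) = σ^2 a^{|τ|}/(1−a^2)$: summing the (truncated) transform of K should recover $σ^2/|1−ae^{−iω}|^2$. A sketch (a and σ² are arbitrary example values):

```python
import cmath

a, sigma2 = 0.6, 1.0

def K(tau):
    """AR(1) autocovariance."""
    return sigma2 * a ** abs(tau) / (1 - a**2)

def spectrum_from_K(omega, T=200):
    # Discrete-time Wiener-Khinchin: S(w) = sum_tau K(tau) e^{-i w tau},
    # truncated at |tau| = T (a^200 is negligible for a = 0.6).
    return sum(K(t) * cmath.exp(-1j * omega * t) for t in range(-T, T + 1)).real

def spectrum_formula(omega):
    return sigma2 / abs(1 - a * cmath.exp(-1j * omega)) ** 2

for w in (0.0, 1.0, 2.5):
    print(round(spectrum_from_K(w), 6), round(spectrum_formula(w), 6))
```

At ω = 0 both sides give $σ^2/(1−a)^2 = 6.25$, the familiar low-frequency peak of a positively correlated AR(1) process.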
Numerical Example: Poisson Process and CTMC
Problem: Requests arrive as a Poisson process with λ=3/hour. (a) $P($first request after 30 min$)$. (b) $E[$time between 3rd and 5th$]$. (c) CTMC with two states λ₁₂=2, λ₂₁=3: stationary π.
Step 1: Inter-arrival $T_1\sim Exp(λ=3)$. $P(T_1 > 0.5$ hr$)=e^{−3·0.5}=e^{−1.5}\approx 0.223$.
Step 2: Between 3rd and 5th request — total interval $T_4 + T_5$ (two i.i.d. Exp(3)). $E[T_4+T_5]=2/3$ hr $\approx 40$ min. $Var=2/9$ hr$^2$.
Step 3: CTMC: rate matrix $Q=[[−2, 2],[3,−3]]$. Balance equation: $\pi_1·λ_{12}=π_2·λ_{21} \rightarrow 2\pi_1 = 3\pi_2 \rightarrow \pi_1 = 3\pi_2/2$. Normalization: $\pi_1+\pi_2=1 \rightarrow \pi_2=0.4, \pi_1=0.6$.
Step 4: Spectral gap: $λ_{12}+λ_{21}=5$ per hour (the magnitude of the nonzero eigenvalue of Q), which sets the mixing rate. Mean time in state 1: $1/λ_{12}=0.5$ hr. $P(N(t)=k) = \frac{(λt)^k e^{−λt}}{k!}$: $P(N(1)=3)=\frac{3^3 e^{−3}}{6}=27·0.0498/6\approx 0.224$.
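The arithmetic above can be replayed in a few lines (standard library only):

```python
import math

lam = 3.0
# (a) P(T1 > 0.5 hr) for T1 ~ Exp(3)
print(round(math.exp(-lam * 0.5), 3))       # -> 0.223
# (b) E[T4 + T5] = 2/lam hours, in minutes
print(round(2 / lam * 60, 1))               # -> 40.0
# (c) Two-state CTMC with q12 = 2, q21 = 3: pi solves pi1*q12 = pi2*q21
q12, q21 = 2.0, 3.0
pi1 = q21 / (q12 + q21)
print(pi1, 1 - pi1)                         # -> 0.6 0.4
# P(N(1) = 3) for a Poisson process with rate 3
print(round(lam**3 * math.exp(-lam) / math.factorial(3), 3))   # -> 0.224
```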