> [!theorem]
>
> Let $\alpha > 0$ and, for $\lambda > 0$, let $\pi_\lambda$ denote the [[Poisson Distribution|Poisson distribution]] with parameter $\lambda$. Then there exists a [[Lévy Process]] $N$ such that $N_t \sim \pi_{\alpha t}$ for every $t > 0$, known as the **simple Poisson process** with rate $\alpha$.
>
> *Proof*. Let $\seq{\theta_n}$ be a family of independent, [[Exponential Random Variable|exponentially distributed]] random variables with parameter $\alpha$ (with density $\alpha e^{-\alpha x} \cdot \one_{\bracs{x > 0}}$).
>
> Assume without loss of generality that $\theta_n > 0$ for all $n \in \nat$. Let $T_0 = 0$ and $T_n = \sum_{k \le n}\theta_k$; then $T_n$ is [[Gamma Distribution|gamma distributed]] (with density $\frac{\alpha^n}{(n - 1)!}x^{n - 1}e^{-\alpha x} \one_{\bracs{x > 0}}$).
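>
> As an illustration (not part of the proof), here is a minimal simulation sketch of this construction with an arbitrary rate $\alpha = 2$: sample the interarrival times $\theta_k$, form the arrival times $T_n$, and compare the moments of $T_n$ with those of the gamma distribution.
>
> ```python
> import numpy as np
>
> rng = np.random.default_rng(0)
> alpha, n, samples = 2.0, 5, 100_000                           # illustrative values only
>
> theta = rng.exponential(scale=1 / alpha, size=(samples, n))   # interarrival times θ_k ~ Exp(α)
> T = theta.cumsum(axis=1)                                      # arrival times T_1, ..., T_n
>
> # T_n is gamma distributed with mean n/α = 2.5 and variance n/α² = 1.25
> print(T[:, -1].mean(), T[:, -1].var())
> ```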
>
> Let $N_t = \max\bracs{n \ge 0: T_n \le t}$, so that $N_0 = 0$. Since $t \mapsto N_t$ is a nondecreasing step function and is right-continuous, it is [[RCLL]]. Firstly, if $N_t(\omega) = 0$, then $T_1(\omega) > t$, and if $N_t(\omega) = n \ge 1$, then $T_n(\omega) \le t$ but $T_{n + 1}(\omega) > t$. Therefore
> $
> \bp\bracs{N_t = 0} = \bp\bracs{\theta_1 > t} = e^{-\alpha t}
> $
> and
> $
> \bp\bracs{N_t = n} = \bp\bracs{T_n \le t} - \bp\bracs{T_{n + 1} \le t} = \int_0^t\left(\frac{\alpha^n x^{n - 1}}{(n - 1)!} - \frac{\alpha^{n + 1}x^{n}}{n!}\right)e^{-\alpha x}\,dx = \frac{(\alpha t)^n}{n!}e^{-\alpha t}
> $
> where the last equality holds because the integrand is the derivative of $x \mapsto \frac{(\alpha x)^n}{n!}e^{-\alpha x}$; hence $N_t \sim \pi_{\alpha t}$. Lastly, let $\bracs{t_j}_0^k$ be a strictly increasing sequence of times with $t_0 = 0$, and let $\bracs{n_j}_0^k$ be a nondecreasing sequence of nonnegative integers with $n_0 = 0$; then we require
> $
> \begin{align*}
> \bp\bracs{\bigcap_{j = 1}^k\bracs{N_{t_j} = n_j}} &= \bp\bracs{\bigcap_{j = 1}^k\bracs{N_{t_j} - N_{t_{j - 1}} = n_{j} - n_{j - 1}}} \\
> &= \prod_{j = 1}^k\frac{[\alpha(t_{j} - t_{j - 1})]^{n_j - n_{j - 1}}}{(n_j - n_{j - 1})!}e^{-\alpha(t_j - t_{j - 1})}
> \end{align*}
> $
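>
> As an aside (not part of the proof), this product formula can be checked numerically against the construction above. A minimal sketch with arbitrary illustrative values $\alpha = 2$, $(t_1, t_2) = (1, 2.5)$, $(n_1, n_2) = (2, 5)$:
>
> ```python
> import numpy as np
> from math import exp, factorial
>
> rng = np.random.default_rng(1)
> alpha, t1, t2, n1, n2 = 2.0, 1.0, 2.5, 2, 5                     # illustrative values only
> samples = 200_000
>
> theta = rng.exponential(scale=1 / alpha, size=(samples, 50))    # 50 interarrivals suffice for these t
> T = theta.cumsum(axis=1)                                        # arrival times T_1, T_2, ...
> N_t1 = (T <= t1).sum(axis=1)                                    # N_t = #{n : T_n <= t}
> N_t2 = (T <= t2).sum(axis=1)
>
> lhs = np.mean((N_t1 == n1) & (N_t2 == n2))                      # estimate of P(N_{t1} = n1, N_{t2} = n2)
> rhs = (exp(-alpha * t1) * (alpha * t1) ** n1 / factorial(n1)
>        * exp(-alpha * (t2 - t1)) * (alpha * (t2 - t1)) ** (n2 - n1) / factorial(n2 - n1))
> print(lhs, rhs)                                                 # should agree up to Monte Carlo error
> ```
>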
> To verify this, note that by the characterization of the events $\bracs{N_t = n}$ above,
> $
> \begin{align*}
> \bp\bracs{\bigcap_{j = 1}^k\bracs{N_{t_j} = n_j}} = \bp\bracs{\bigcap_{j = 1}^k\bracs{T_{n_j} \le t_j < T_{n_j + 1}}}
> \end{align*}
> $
> Since the $T_n$ are sums of independent exponential random variables, the event on the right-hand side depends only on $\bracs{\theta_j}_1^{n_k + 1}$, which by independence have joint density
> $
> F(x) = \alpha^{n_k + 1} \prod_{m = 1}^{n_k + 1} e^{-\alpha x_m}\one_{\bracs{x_m > 0}}
> $
> The probability in question is therefore the integral of $F$ over the region $\bracs{x: \sum_{m = 1}^{n_j}x_m \le t_j < \sum_{m = 1}^{n_j + 1}x_m \text{ for } j = 1, \ldots, k}$. Integrating out the last coordinate $x_{n_k + 1}$ first (the binding constraint on it is $x_{n_k + 1} > t_k - \sum_{m = 1}^{n_k}x_m$) contributes a factor $e^{-\alpha(t_k - \sum_{m = 1}^{n_k}x_m)}$, which cancels the remaining exponentials, so
> $
> \int F(x)\,dx_1 \cdots dx_{n_k}\,dx_{n_k + 1} = e^{-\alpha t_k}\alpha^{n_k}\,\text{Vol}(B)
> $
> where
> $
> B = \bracs{x \in (0, \infty)^{n_k}: \sum_{m = 1}^{n_j}x_m \le t_j < \sum_{m = 1}^{n_j + 1}x_m \text{ for } j < k, \text{ and } \sum_{m = 1}^{n_k}x_m \le t_k}
> $
> Mapping $x$ to its partial sums $s_m = \sum_{i \le m}x_i$ preserves volume, and under this map $B$ becomes the set of increasing sequences $0 < s_1 < \cdots < s_{n_k}$ with exactly $n_j - n_{j - 1}$ of the $s_m$ in each interval $(t_{j - 1}, t_j]$, so
> $
> \text{Vol}(B) = \prod_{j = 1}^k\frac{(t_j - t_{j - 1})^{n_j - n_{j - 1}}}{(n_j - n_{j - 1})!}
> $
> Multiplying by $e^{-\alpha t_k}\alpha^{n_k} = \prod_{j = 1}^k\alpha^{n_j - n_{j - 1}}e^{-\alpha(t_j - t_{j - 1})}$ yields the desired probability.
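
The last step, computing $\text{Vol}(B)$, can be sanity-checked numerically. The sketch below is only an illustration with the same arbitrary values as above ($k = 2$, $(t_1, t_2) = (1, 2.5)$, $(n_1, n_2) = (2, 5)$; $\alpha$ plays no role in the volume): it estimates $\text{Vol}(B)$ by uniform sampling in a box containing $B$ and compares the result with the product formula.

```python
import numpy as np

rng = np.random.default_rng(2)
t = [1.0, 2.5]                                        # t_1, t_2 (illustrative choices)
n = [2, 5]                                            # n_1, n_2, so B sits inside (0, t_2)^5
samples = 1_000_000

x = rng.uniform(0.0, t[-1], size=(samples, n[-1]))    # uniform points in the box (0, t_k)^{n_k}
s = x.cumsum(axis=1)                                  # partial sums s_m = x_1 + ... + x_m

# membership in B: s_{n_1} <= t_1 < s_{n_1 + 1} and s_{n_2} <= t_2
in_B = (s[:, n[0] - 1] <= t[0]) & (t[0] < s[:, n[0]]) & (s[:, n[1] - 1] <= t[1])
vol_estimate = in_B.mean() * t[-1] ** n[-1]           # fraction of the box times its volume

# Vol(B) = prod_j (t_j - t_{j-1})^(n_j - n_{j-1}) / (n_j - n_{j-1})!
vol_formula = (t[0] ** 2 / 2) * ((t[1] - t[0]) ** 3 / 6)
print(vol_estimate, vol_formula)                      # both ≈ 0.28, up to Monte Carlo error
```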