> [!definition]
>
> Let $X = \bracs{X_t: t \ge 0}$ be a [[Stochastic Process|stochastic process]] on $(\Omega, \cf, \bp)$. $X$ is a **Lévy process** if
> 1. $X(\omega) \in D([0, \infty))$ almost surely, that is, the sample paths are [[RCLL]].
> 2. $X_0 = 0$.
> 3. $X$ has independent and homogeneous increments, that is, for any $0 \le s \le t$, $X_t - X_s$ is [[Probabilistic Independence|independent]] of $\bracs{X_r: r \le s}$, and the distribution of $X_t - X_s$ depends only on $t - s$.
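
Standard Brownian motion $B$ and the rate-$\lambda$ Poisson process $N$ are the two basic examples, with $\ev\braks{e^{i\xi B_t}} = e^{-t\xi^2/2}$ and $\ev\braks{e^{i\xi N_t}} = e^{\lambda t(e^{i\xi} - 1)}$ for $\xi \in \real$. As a quick illustration of the definition, a Lévy process can be simulated on a uniform time grid by summing i.i.d. increments drawn from $\mu_{1/n}$; the minimal sketch below assumes $d = 1$ and Gaussian increments $\mu_{1/n} = N(0, 1/n)$, i.e. standard Brownian motion.

```python
import numpy as np

# Minimal sketch: build a Levy path on a uniform grid by summing i.i.d.
# increments drawn from mu_{1/n}. Here mu_{1/n} = N(0, 1/n), so the result is
# a standard Brownian motion; the grid size n and horizon T are arbitrary.
rng = np.random.default_rng(0)
n, T = 1_000, 1.0
dt = T / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)    # i.i.d. with law mu_{dt}
X = np.concatenate(([0.0], np.cumsum(increments)))   # X_0 = 0
t = np.linspace(0.0, T, n + 1)
```
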
> [!theorem]
>
> Let $X$ be a Lévy process and let $\mu$ be the distribution of $X_1$. Then $\mu$ is [[Infinitely Divisible Distribution|infinitely divisible]]. Moreover, if $\ell_{\wh \mu}$ denotes the principal logarithm of $\wh \mu$ and $\mu_t$ is the distribution of $X_t$, then $\wh \mu_t = e^{t\ell_{\wh \mu}}$ for every $t > 0$.
>
> *Proof*. Let $n \in \nat$. Then $X_1 = \sum_{j = 1}^n (X_{j/n} - X_{(j-1)/n})$ is a sum of $n$ independent, identically distributed random variables, each with distribution $\mu_{1/n}$, so $\mu = \mu_{1/n}^{*n}$ and $\mu$ is infinitely divisible. In particular $\wh \mu_{1/n} = e^{\frac{1}{n}\ell_{\wh \mu}}$, and taking powers gives $\wh \mu_q = e^{q\ell_{\wh \mu}}$ for every rational $q > 0$. For irrational $t$, take rationals $q_k \downarrow t$; right-continuity of the sample paths gives $X_{q_k} \to X_t$ almost surely, so the characteristic functions converge and $\wh \mu_t = e^{t\ell_{\wh \mu}}$.
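
As a sanity check on the identity $\wh \mu = (\wh \mu_{1/n})^n$, the sketch below compares a Monte Carlo estimate of $\ev\braks{e^{i\xi X_1}}$ with the $n$-th power of an estimate of $\ev\braks{e^{i\xi X_{1/n}}}$. It assumes the Brownian case $\mu = N(0, 1)$, where both quantities should be close to $e^{-\xi^2/2}$; the values of $\xi$, $n$ and the sample size are arbitrary.

```python
import numpy as np

# Sketch, assuming the Brownian case mu = N(0, 1), mu_{1/n} = N(0, 1/n):
# Monte Carlo check that E[exp(i xi X_1)] ~ (E[exp(i xi X_{1/n})])**n,
# both of which should be close to exp(-xi**2 / 2).
rng = np.random.default_rng(1)
xi, n, m = 2.0, 4, 200_000
X_1 = rng.normal(0.0, 1.0, size=m)                 # samples from mu
X_1n = rng.normal(0.0, np.sqrt(1.0 / n), size=m)   # samples from mu_{1/n}
lhs = np.mean(np.exp(1j * xi * X_1))
rhs = np.mean(np.exp(1j * xi * X_1n)) ** n
print(lhs, rhs, np.exp(-xi**2 / 2))
```
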
> [!theorem]
>
> Let $(\Omega, \cf, \{\cf_t\}, \bp)$ be a [[Filtration|filtered]] probability space and $\bracs{X_t: t \ge 0}$ be a $\{\cf_t\}$-adapted process with RCLL sample paths such that $X_0 = 0$. Let $\mu \in \mathcal I(\real^d)$ be an infinitely divisible law, and $\ell_{\wh \mu}$ be the principal logarithm of its [[Characteristic Function|characteristic function]]. For any $\xi \in \real^d$, define
> $
> E_t^\xi = \exp\braks{i\angles{\xi, X_t} - t\ell_{\wh \mu}(\xi)}
> $
> Then $\{X_t\}$ is a Lévy process associated with $\mu$ if and only if for all $\xi \in \real^d$, $E_t^\xi$ is a $\{\cf_t\}$-[[Martingale|martingale]].
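>
> For instance, if $X = B$ is a standard Brownian motion in $\real$, then $\ell_{\wh \mu}(\xi) = -\frac{\xi^2}{2}$ and $E_t^\xi = \exp\braks{i\xi B_t + \frac{t\xi^2}{2}}$, the classical complex exponential martingale.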
>
> *Proof*. Suppose that $X$ is a Lévy process associated with $\mu$. Since $E_t^\xi$ is a continuous transformation of an RCLL adapted process, it is RCLL and adapted, hence progressively measurable. Moreover, for each fixed $t$, $|E_t^\xi| = e^{-t \Re \ell_{\wh \mu}(\xi)}$ is a deterministic constant, so $E_t^\xi$ is bounded and integrable.
>
> To verify the martingale property, let $0 \le s \le t$. Using that $X_s$ is $\cf_s$-measurable, that $X_t - X_s$ is independent of $\cf_s$, and that $X_t - X_s$ has distribution $\mu_{t - s}$,
> $
> \begin{align*}
> \ev\braks{E_t^\xi|\cf_s} &= \ev\braks{e^{i\angles{\xi, X_t}}|\cf_s} \cdot e^{-t\ell_{\wh \mu}(\xi)} \\
> &= \ev\braks{e^{i\angles{\xi, X_t - X_s}}e^{i\angles{\xi, X_s}}|\cf_s} \cdot e^{-t\ell_{\wh \mu}(\xi)} \\
> &= e^{i\angles{\xi, X_s} - t\ell_{\wh \mu}(\xi)}\cdot \ev\braks{e^{i\angles{\xi, X_t - X_s}}|\cf_s}\\
> &= e^{i\angles{\xi, X_s} - t\ell_{\wh \mu}(\xi)}\cdot \ev\braks{e^{i\angles{\xi, X_t - X_s}}} \\
> &= e^{i\angles{\xi, X_s} - t\ell_{\wh \mu}(\xi)}e^{(t - s)\ell_{\wh \mu}(\xi)} \\
> &= e^{i\angles{\xi, X_s} - s\ell_{\wh \mu}(\xi)}
> \end{align*}
> $
>
> Now suppose that for all $\xi \in \real^d$, $E_t^\xi$ is a martingale. Then $\ev\braks{E_t^\xi} = \ev\braks{E_0^\xi} = 1$ for every $t \ge 0$, from which we can extract the characteristic function of $X_t$:
> $
> \ev\braks{e^{i\angles{\xi, X_t}}} = e^{t\ell_{\wh \mu}(\xi)}
> $
> In particular, the distribution $\mu_t$ of $X_t$ satisfies $\wh \mu_t = e^{t\ell_{\wh \mu}}$, and $X_1$ has distribution $\mu$. It remains to check that the increments are independent and stationary. Let $0 \le r \le s \le t$ and $\xi_1, \xi_2 \in \real^d$. Since $X_r$ is $\cf_s$-measurable, the tower property gives
> $
> \begin{align*}
> \ev\braks{e^{i\angles{\xi_1, X_t - X_s}}e^{i\angles{\xi_2, X_r}}}
> &= \ev\braks{\ev\braks{e^{i\angles{\xi_1, X_t - X_s}}e^{i\angles{\xi_2, X_r}}|\cf_s}} \\
> &= \ev\braks{e^{i\angles{\xi_2, X_r}}\ev\braks{e^{i\angles{\xi_1, X_t - X_s}}|\cf_s}}
> \end{align*}
> $
> For the inner conditional expectation, multiply and divide by $e^{t\ell_{\wh \mu}(\xi_1)}$ and use that $X_s$ is $\cf_s$-measurable together with the martingale property:
> $
> \begin{align*}
> \ev\braks{e^{i\angles{\xi_1, X_t - X_s}}|\cf_s} &= e^{t\ell_{\wh \mu}(\xi_1)}e^{-i\angles{\xi_1, X_s}}\ev\braks{e^{i\angles{\xi_1, X_t} - t\ell_{\wh \mu}(\xi_1)}|\cf_s} \\
> &= e^{t\ell_{\wh \mu}(\xi_1)}e^{-i\angles{\xi_1, X_s}} \cdot \ev\braks{E_t^{\xi_1}|\cf_s} \\
> &= e^{t\ell_{\wh \mu}(\xi_1)}e^{-i\angles{\xi_1, X_s}} \cdot E_s^{\xi_1} \\
> &= e^{(t - s)\ell_{\wh \mu}(\xi_1)}
> \end{align*}
> $
> Plugging this back in, and noting that the case $\xi_2 = 0$ gives $\ev\braks{e^{i\angles{\xi_1, X_t - X_s}}} = e^{(t - s)\ell_{\wh \mu}(\xi_1)}$, we obtain
> $
> \ev\braks{e^{i\angles{\xi_1, X_t - X_s}}e^{i\angles{\xi_2, X_r}}} = e^{(t - s)\ell_{\wh \mu}(\xi_1)}\ev\braks{e^{i\angles{\xi_2, X_r}}} = \ev\braks{e^{i\angles{\xi_1, X_t - X_s}}}\ev\braks{e^{i\angles{\xi_2, X_r}}}
> $
> Thus $X_t - X_s$ is independent of $X_r$ for every $r \le s$ (the same argument with finitely many $\xi$'s gives independence from $\bracs{X_r: r \le s}$), and its characteristic function depends only on $t - s$, with $X_t - X_s \sim \mu_{t - s}$. Hence $X$ has independent and homogeneous increments, and since it has RCLL paths with $X_0 = 0$ by assumption, it is a Lévy process associated with $\mu$.
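
As a numerical illustration of the constant expectation $\ev\braks{E_t^\xi} = 1$ that the martingale property implies, the sketch below again takes $X$ to be a standard Brownian motion, so that $\ell_{\wh \mu}(\xi) = -\frac{\xi^2}{2}$; the number of paths, the grid and the value of $\xi$ are arbitrary choices.

```python
import numpy as np

# Sketch, assuming X is a standard Brownian motion (ell(xi) = -xi**2 / 2):
# estimate E[E_t^xi] = E[exp(i xi X_t + t xi**2 / 2)] on a grid of times and
# check that it stays close to 1, as the martingale property implies.
rng = np.random.default_rng(2)
xi, n_paths, n_steps, T = 1.5, 100_000, 50, 1.0
dt = T / n_steps
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
t = dt * np.arange(1, n_steps + 1)
E = np.exp(1j * xi * X + t * xi**2 / 2)      # E_t^xi along each path
print(np.abs(E.mean(axis=0) - 1.0).max())    # small, up to Monte Carlo error
```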