> [!theorem]
>
> Let $M \in \mathfrak{M}_0(\real^d)$ be a finite [[Measure Space|measure]] with $M(\bracs{0}) = 0$. Then there exists a [[Lévy Process]] for the compound [[Poisson Distribution|Poisson distribution]] $\pi_{M}$.
>
> *Proof*. By normalising with respect to $M(\real^d)$, write $M = \alpha \nu$ where $\alpha = M(\real^d) > 0$ and $\nu$ is a [[Probability|probability]] measure. Let $\bracs{\theta_m: m \ge 1}$, $\bracs{T_n: n \ge 0}$, and $\bracs{N_t: t \ge 0}$ be as in the [[Simple Poisson Jump Process|simple Poisson jump process]]. That is, the $\theta_m$ are i.i.d. and exponentially distributed with parameter $\alpha$, $T_n = \sum_{k \le n}\theta_k$, and
> $
> N_t(\omega) = \max\bracs{n \ge 0: T_n \le t}
> $
>
> Let $\seq{Y_k}$ be a family of i.i.d. random variables on the same probability space with distribution $\nu$, such that $\seq{Y_k}$ is independent of $\seq{\theta_k}$. Let $X_0 = 0$ and let
> $
> X_t = \sum_{k = 1}^{N_t}Y_k
> $
> be the sum of $\seq{Y_k}$ randomly stopped at $N_t$. We now claim that $\bracs{X_t: t \ge 0}$ is a Lévy process whose increments satisfy $X_t - X_s \sim \pi_{(t - s)M}$ for $0 \le s \le t$.
>
> Since $t \mapsto N_t(\omega)$ is piecewise constant and right continuous, so is $t \mapsto X_t(\omega)$.
>
> It is sufficient to show that for any strictly increasing sequence $\seqf{t_j}$ with $t_0 = 0$, the joint distribution of $(X_{t_j} - X_{t_{j- 1}})_{j \in [n]}$ is equal to the product
> $
> \prod_{j = 1}^n \pi_{(t_{j} - t_{j - 1})M}
> $
> Let $\seqf{\xi_j} \subset \real^d$; we can express the joint [[Characteristic Function|characteristic function]] as
> $
> \varphi(\xi_1, \cdots, \xi_n) = \ev\braks{\exp\paren{i\sum_{j = 1}^n\angles{X_{t_j} - X_{t_{j - 1}}, \xi_j}}}
> $
> Summing over all possible values of $(N_{t_1}, \cdots, N_{t_n})$,
> $
> \varphi(\cdots) = \sum_{0 = n_0 \le n_1 \le \cdots \le n_n}\ev\braks{\exp(\ell); N_{t_1} = n_1, \cdots, N_{t_n} = n_n}
> $
> where, on each such event, the exponent is
> $
> \ell = i\sum_{j = 1}^n \braks{\angles{\sum_{p = n_{j - 1} + 1}^{n_j}Y_p, \xi_j}}
> $
> and the conditioning events can be rewritten as
> $
> N_{t_j} - N_{t_{j - 1}} = n_{j} - n_{j- 1}
> $
> By independence and the [[Factorisation Formula|factorisation formula]],
> $
> \begin{align*}
> \varphi(\cdots) &= \sum_{\cdots} \braks{\prod_{j = 1}^n(\wh \nu(\xi_j))^{n_j - n_{j - 1}}} \\
> &\times \braks{\prod_{j = 1}^n \frac{e^{-\alpha(t_j - t_{j - 1})}(\alpha (t_j - t_{j - 1}))^{n_j - n_{j - 1}}}{(n_j - n_{j - 1})!}}
> \end{align*}
> $
> By regrouping the two products and summing over the increments $n_j - n_{j - 1} \ge 0$ iteratively,
> $
> \begin{align*}
> \varphi(\cdots) &= \sum_{\cdots}\prod_{j = 1}^n\frac{e^{-\alpha (t_{j} - t_{j - 1})}}{(n_j - n_{j - 1})!}\paren{\alpha \wh \nu(\xi_j)(t_j - t_{j - 1})}^{n_j - n_{j- 1}} \\
> &= \prod_{j = 1}^n e^{-\alpha(t_j - t_{j - 1})}e^{\alpha(t_j - t_{j - 1})\wh \nu(\xi_j)} \\
> &= \prod_{j = 1}^n \exp\braks{(t_j - t_{j- 1})\int (e^{i\angles{x, \xi_j}} - 1) dM(x)} \\
> &= \prod_{j = 1}^n \wh \pi_{(t_j - t_{j - 1})M}(\xi_j)
> \end{align*}
> $
> which is exactly the product claimed above, so the increments are independent with $X_{t_j} - X_{t_{j - 1}} \sim \pi_{(t_j - t_{j - 1})M}$.
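The construction in the proof is easy to simulate. Below is a minimal sketch (not part of the source) that samples a path of $X$ on $[0, t]$ exactly as above: exponential inter-arrival times with parameter $\alpha$, followed by a randomly stopped sum of i.i.d. draws from $\nu$; the choices $\nu = N(0, 1)$ on $\real$ and $\alpha = 2$ are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_poisson_path(t, alpha, sample_nu, rng):
    """Sample one path of X on [0, t]: jump times T_n and values X_{T_n}.

    alpha     -- total mass M(R^d), i.e. the jump rate
    sample_nu -- function n -> n i.i.d. samples from the normalised jump law nu
    """
    # Exponential(alpha) inter-arrival times theta_m, accumulated until they pass t.
    jump_times = []
    s = rng.exponential(1.0 / alpha)
    while s <= t:
        jump_times.append(s)
        s += rng.exponential(1.0 / alpha)
    n_jumps = len(jump_times)      # N_t
    jumps = sample_nu(n_jumps)     # Y_1, ..., Y_{N_t}
    values = np.cumsum(jumps)      # X at each jump time (X_0 = 0, constant between jumps)
    return np.array(jump_times), values

# Purely illustrative choices: nu = standard normal on R, alpha = 2.
times, values = compound_poisson_path(
    t=5.0, alpha=2.0, sample_nu=lambda n: rng.normal(size=n), rng=rng)
print(len(times), values[-1] if len(values) else 0.0)
```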
# Properties of Finite Compound Poisson Processes
Let $X = \bracs{X_t: t \ge 0}$ be a compound Poisson process associated with $M \in \mathfrak{M}_0$.
> [!theorem]
>
> For any $t \ge 0$ and $\omega \in \Omega$, let $j(t, dy, X(\omega))$ be the jump measure associated with $X(\omega)$. Then for almost every $\omega \in \Omega$, $j(t, dy, X(\omega)) \in \mathfrak{M}_0(\real^d)$.
>
> *Proof*. Since for each $\omega \in \Omega$, the value of $X_t$ is a randomly stopped sum, the number of jumps is bounded by the number of terms in the sum:
> $
> j(t, \real^d, X({\omega})) = \sum_{s \in J(t, X(\omega))} \delta_{X_s(\omega) - X_{s-}(\omega)}(\real^d) \le N_t(\omega)
> $
> Since $N_t$ is finite for almost every $\omega \in \Omega$, so is $j(t, \real^d, X(\omega))$.

> [!theorem]
>
> Let $\varphi$ be a bounded Borel function, then the integral
> $
> \int_{\real^d}\varphi(y)j(t, dy, X(\omega)) = \sum_{k = 1}^{N_t(\omega)}\varphi(Y_k(\omega))
> $
> is finite valued for almost every $\omega \in \Omega$. Therefore, for each $t \ge 0$, the mapping $\omega \mapsto \int_{\real^d}\varphi(y)j(t, dy, X(\omega))$ is a random variable, and together they form a stochastic process.

> [!theorem]
>
> Let $\varphi$ be the identity map, then
> $
> \int_{\real^d} y j(t, dy, X) = \sum_{k = 1}^{N_t}Y_k = X_t
> $
> we recover the process as an integral.

> [!theorem]
>
> If $\varphi$ is the absolute value function, then
> $
> \norm{X}_{V, [0, t]} = \sum_{k = 1}^{N_t}\abs{Y_k} = \int \abs{y}j(t, dy, X)
> $
> we recover the total variation of the path on $[0, t]$.

> [!theorem]
>
> Suppose that $\int \abs{x} dM(x) < \infty$, then
> $
> \begin{align*}
> \ev\braks{\norm{X}_{V, [0, t]}} &= \ev\braks{\sum_{k = 1}^{N_t}\abs{Y_k}} = \sum_{j = 0}^\infty\ev\braks{\sum_{k = 1}^j\abs{Y_k}; N_t = j} \\
> &= \ev\braks{N_t} \cdot \ev\braks{\abs{Y_1}} \\
> &= \alpha t \int_{\real^d}\abs{y}d\nu(y) = t\int_{\real^d}\abs{y}dM(y) < \infty
> \end{align*}
> $
> so the total variation is integrable with $\norm{X}_{V, [0, t]} \in L^1(\bp)$. Since $\abs{X_t} \le \norm{X}_{V, [0, t]}$, we have $X_t \in L^1(\bp)$ as well, and the same conditioning argument gives
> $
> \ev(X_t) = \ev\braks{N_t} \cdot \ev\braks{Y_1} = t\int_{\real^d}ydM(y)
> $

> [!theorem]
>
> Suppose that $\int \abs{x}^2 dM(x) < \infty$, then
> $
> \braks{\int_{\real^d}\abs{x}dM(x)}^2 \le \int_{\real^d}\abs{x}^2 dM(x) \cdot M(\real^d) < \infty
> $
> and
> $
> \begin{align*}
> \ev\braks{\abs{X_t}^{2}} &= \ev\braks{\angles{X_t, X_t}} \\
> &= \ev\braks{\sum_{j, k = 1}^{N_t}\angles{Y_j, Y_k}} \\
> &= \ev\braks{N_t}\cdot \ev\braks{\abs{Y_1}^2} + \ev\braks{N_t^2 - N_t} \cdot \abs{\ev\braks{Y_1}}^2 \\
> &= t \int \abs{x}^2 dM(x) + t^2\abs{\int_{\real^d}xdM(x)}^2 < \infty
> \end{align*}
> $
> so $X_t \in L^2(\bp)$.
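Both computations above use, in their final step, the first two factorial moments of $N_t$, which is Poisson with parameter $\alpha t$; writing $\lambda = \alpha t$ (a symbol used here only for brevity),
$
\ev\braks{N_t} = \lambda, \qquad \ev\braks{N_t^2 - N_t} = \sum_{n = 2}^\infty n(n - 1)e^{-\lambda}\frac{\lambda^n}{n!} = \lambda^2\sum_{n = 2}^\infty e^{-\lambda}\frac{\lambda^{n - 2}}{(n - 2)!} = \lambda^2
$
Substituting these and converting $d\nu$ to $dM / \alpha$ recovers the expressions $t\int\abs{y}dM(y)$ and $t\int\abs{x}^2dM(x) + t^2\abs{\int x \, dM(x)}^2$ above.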
> [!theorem]
>
> Let $\Gamma \in \cb(\real^d)$ be a Borel set, and for any $t \ge 0$, let
> $
> j(t, \Gamma, X) = \sum_{k = 1}^{N_t}\one_{\Gamma}(Y_k)
> $
> then the mapping $t \mapsto j(t, \Gamma, X)$ is a non-negative, non-decreasing, piecewise constant, and [[RCLL]] function. Moreover, $\bracs{j(t, \Gamma, X): t \ge 0}$ is a [[Simple Poisson Jump Process|simple Poisson jump process]] with rate $M(\Gamma)$.
>
> *Proof*. Let $0 \le s \le t$ and $n \in \nat$; we compute the distribution of the increment directly:
> $
> \begin{align*}
> &\bp\bracs{j(t, \Gamma, X) - j(s, \Gamma, X) = n} \\
> &= \sum_{k = 0}^\infty \sum_{l = n}^\infty \bp\bracs{j(t, \Gamma, X) - j(s, \Gamma, X) = n, N_s = k, N_t - N_s = l} \\
> &= \sum_{k = 0}^\infty \sum_{l = n}^\infty\bp\bracs{\abs{\bracs{p \in (k, k + l]: Y_p \in \Gamma}} = n, N_s = k, N_t - N_s = l} \\
> &= \sum_{k = 0}^\infty\sum_{l = n}^\infty \bp\bracs{N_s = k} \cdot \bp\bracs{N_t - N_s = l} \\
> &\cdot \bp\bracs{\abs{\bracs{p: k + 1 \le p \le k + l, Y_p \in \Gamma}} = n} \\
> &= \sum_{l = n}^\infty e^{-\alpha(t - s)}\frac{(\alpha(t - s))^l}{l!} \cdot {l \choose n}\bp\bracs{Y_1 \in \Gamma}^n \cdot \bp\bracs{Y_1 \not\in \Gamma}^{l - n} \\
> &= \frac{e^{-\alpha(t - s)}(\alpha(t - s))^n (\nu(\Gamma))^n}{n!}\sum_{l = n}^\infty\frac{\braks{\alpha(t - s)(1 - \nu(\Gamma))}^{l - n}}{(l - n)!} \\
> &= e^{-(t - s)M(\Gamma)}\frac{((t - s)M(\Gamma))^n}{n!}
> \end{align*}
> $
> which is the Poisson distribution with parameter $(t - s)M(\Gamma)$.
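This rate is easy to check by simulation; the following is a minimal sketch (not from the source), with the purely illustrative choices $\nu = N(0, 1)$ on $\real$, $\alpha = 2$, $t = 3$, and $\Gamma = [1, \infty)$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Monte Carlo check: j(t, Gamma, X) should be Poisson with mean t * M(Gamma)
# = t * alpha * nu(Gamma). All concrete choices below are illustrative only.
alpha, t = 2.0, 3.0
nu_gamma = 0.5 * (1.0 - erf(1.0 / sqrt(2.0)))   # nu([1, inf)) for a standard normal
counts = []
for _ in range(20000):
    n_t = rng.poisson(alpha * t)          # N_t ~ Poisson(alpha * t)
    y = rng.normal(size=n_t)              # Y_1, ..., Y_{N_t} ~ nu
    counts.append(np.sum(y >= 1.0))       # j(t, Gamma, X)
counts = np.asarray(counts)
print(counts.mean(), counts.var(), t * alpha * nu_gamma)   # mean and variance ~ t * M(Gamma)
```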
> [!theorem]
>
> Let $A, B \in \cb(\real^d)$, then for any $0 \le r \le s \le t$ and $m, n \in \nat$,
> $
> \begin{align*}
> &\bp\bracs{j(r, A, X) = m, j(t, B, X) - j(s, B, X) = n} \\
> &= e^{-rM(A)} \frac{\braks{rM(A)}^{m}}{m!} \cdot e^{-(t - s)M(B)}\frac{\braks{(t - s)M(B)}^n}{n!}
> \end{align*}
> $
> In other words, $\bracs{j(r, A, X): 0 \le r \le s}$ is independent from $j(t, B, X) - j(s, B, X)$.
>
> *Proof*. Firstly, write the probability as
> $
> \begin{align*}
> \sum_{k = m}^\infty \sum_{p = 0}^\infty \sum_{q = n}^\infty \bp\bracs{j(r, A, X) = m, j(t, B, X) - j(s, B, X) = n; N_r = k, N_s - N_r = p, N_t - N_s = q}
> \end{align*}
> $
>
> The remaining computation proceeds exactly as in the previous proof: by the independence of $N_r$, $N_s - N_r$, $N_t - N_s$ and of $\seq{Y_k}$, each summand factors, and summing the binomial and Poisson series yields the claimed product.

> [!theorem]
>
> Let $\seq{E_n}$ be a family of pairwise disjoint Borel sets, then $\seq{(j(t, E_n, X), t \ge 0)}$ is an independent collection of processes.
>
> *Proof (for a pair of processes)*. Let $m \ne n$ and $0 \le s \le t$; we show that $j(t, E_m, X)$ and $j(s, E_n, X)$ are independent. Consider the joint characteristic function
> $
> \ev\braks{\exp\braks{i\xi_1j(t, E_m, X) + i\xi_2j(s, E_n, X)}}
> $
> By the same argument as in the previous theorem, $j(t, E_m, X) - j(s, E_m, X)$ is independent of the pair $(j(s, E_m, X), j(s, E_n, X))$, so the expectation factors as
> $
> \begin{align*}
> &\ev\braks{\exp[i\xi_1(j(t, E_m, X) - j(s, E_m, X))]} \\
> &\times\ev\braks{\exp[i\xi_1j(s, E_m, X) + i\xi_2j(s, E_n, X)]} \\
> &= \exp((t - s)M(E_m)(e^{i\xi_1} - 1)) \\
> &\times \sum_{l \in \nat}\ev\braks{\exp\paren{i\xi_1\sum_{j = 1}^l\one_{E_m}(Y_j) + i\xi_2\sum_{j = 1}^l\one_{E_n}(Y_j)}; N_s = l}
> \end{align*}
> $
> where the sum can be decomposed as
> $
> \sum_{l \in \nat}\braks{\int e^{i\xi_1\one_{E_m}(y) + i\xi_2\one_{E_n}(y)}d\nu(y)}^l \cdot e^{-\alpha s}\frac{(\alpha s)^l}{l!}
> $
> where, since $E_m$ and $E_n$ are disjoint,
> $
> \begin{align*}
> \int e^{i\xi_1\one_{E_m}(y) + i\xi_2\one_{E_n}(y)}d\nu(y) &= e^{i\xi_1} \nu(E_m) + e^{i\xi_2}\nu(E_n) + (1 - \nu(E_m) - \nu(E_n))
> \end{align*}
> $
> and the sum is the Taylor expansion for
> $
> \begin{align*}
> &\exp\braks{-\alpha s + \alpha s\braks{e^{i\xi_1} \nu(E_m) + e^{i\xi_2}\nu(E_n) + (1 - \nu(E_m) - \nu(E_n))}} \\
> &=
> \exp\braks{\alpha s\braks{(e^{i\xi_1} - 1) \nu(E_m) + (e^{i\xi_2} - 1)\nu(E_n)}}\\
> &=
> \exp\braks{s\braks{(e^{i\xi_1} - 1) M(E_m) + (e^{i\xi_2} - 1)M(E_n)}}
> \end{align*}
> $
> Combining with the earlier exponential yields
> $
> \begin{align*}
> &\exp\braks{(t - s)M(E_m)(e^{i\xi_1} - 1) + s\braks{(e^{i\xi_1} - 1) M(E_m) + (e^{i\xi_2} - 1)M(E_n)}} \\
> &= \exp\braks{tM(E_m)(e^{i\xi_1} - 1) + sM(E_n)(e^{i\xi_2} - 1)}
> \end{align*}
> $
> so the joint characteristic function is the product of the characteristic functions of $j(t, E_m, X)$ and $j(s, E_n, X)$, and the two are independent.
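The same kind of Monte Carlo check illustrates the independence over disjoint sets; again the concrete choices ($\nu = N(0, 1)$, $\alpha = 2$, $E_1 = (1, \infty)$, $E_2 = (-\infty, -1)$) are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Jump counts in the two disjoint sets should be independent Poisson variables,
# so their sample covariance should be close to zero.
alpha, t = 2.0, 4.0
c1, c2 = [], []
for _ in range(20000):
    y = rng.normal(size=rng.poisson(alpha * t))   # Y_1, ..., Y_{N_t}
    c1.append(np.sum(y > 1.0))                    # j(t, E_1, X)
    c2.append(np.sum(y < -1.0))                   # j(t, E_2, X)
c1, c2 = np.asarray(c1), np.asarray(c2)
print(np.cov(c1, c2)[0, 1], c1.mean(), c2.mean())
```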
# Grouping and Decomposition
> [!theorem]
>
> Let $\bracs{M_k} \subset \mathfrak{M}_0$ be a family of mutually singular measures, and $\bracs{X^k}$ be mutually independent compound Poisson processes associated with $\bracs{M_k}$, defined on the same probability space. If
> $
> X = \sum_{k = 1}^K X^k
> $
> then $X$ is a compound Poisson process associated with $\sum_{k = 1}^KM_k$. Moreover,
> $
> j(t, dy, X) = \sum_{k = 1}^Kj(t, dy, X^k)
> $
>
> *Proof*. From the sum, $X_0 = 0$ and $t \mapsto X_t$ is RCLL almost surely. Let $\xi \in \real^d$; then by the independence of $\bracs{X^k}$,
> $
> \begin{align*}
> \ev\braks{e^{i\xi X_t}} &= \prod_{k = 1}^K \ev\braks{e^{i\xi X_t^k}} \\
> &= \exp\braks{\sum_{k = 1}^K t\int (e^{i\xi x} - 1)dM_k(x)}\\
> &= \exp\braks{t\int (e^{i\xi x} - 1)dM(x)}
> \end{align*}
> $
> On the other hand,
> $
> \begin{align*}
> j(t, B, X) &= \sum_{s \in J(t, X)}\one_B(X_s - X_{s-}) \\
> &= \sum_{s \in J(t, X)}\sum_{k = 1}^K\one_{B \cap \supp{M_k}}(X_s - X_{s-})\\
> &= \sum_{s \in J(t, X)}\sum_{k = 1}^K\one_{B \cap \supp{M_k}}(X_s^k - X_{s-}^k) \\
> &= \sum_{k = 1}^K j(t, B, X^k)
> \end{align*}
> $

> [!theorem]
>
> Let $\bracs{M_k} \subset \mathfrak{M}_0$ be a family of mutually singular measures, and $M = \sum_{k = 1}^KM_k$. Let $X$ be a compound Poisson process associated with $M$. Then there exists a family $\bracs{X^k}$ of independent compound Poisson processes associated with $\bracs{M_k}$, and $X = \sum_{k = 1}^KX^k$.
>
> *Proof*. Let $j(t, dy, X)$ be the jump measure corresponding to $X$, then for any $t \ge 0$ and $\omega \in \Omega$,
> $
> X_t(\omega) = \int yj(t, dy, X(\omega)) = \sum_{k = 1}^{K}\int y[j(t, dy, X(\omega))|_{\supp{M_k}}]
> $
> Let
> $
> X^k_t(\omega) = \int_{\supp{M_k}} y j(t, dy, X(\omega))
> $
> From here, suppose that $0 \le r \le s \le t$ and $\xi_1, \xi_2 \in \real^d$, then
> $
> \begin{align*}
> &\ev\braks{e^{i\xi_1 X_r^{k}}e^{i\xi_2(X_t^k - X_s^k)}} \\
> &= \sum_{m \in \nat}\sum_{n \in \nat}\ev\braks{e^{i\xi_1 X_r^{k}}e^{i\xi_2(X_t^k - X_s^k)}; N_r = m, N_{t} - N_s = n}
> \end{align*}
> $
> where, writing $\Gamma_k = \supp{M_k}$, on the event $\bracs{N_r = m, N_t - N_s = n}$,
> $
> \begin{align*}
> e^{i\xi_1 X_r^k} &= e^{i\xi_1 \sum_{p = 1}^mY_p\one_{\Gamma_k}(Y_p)} \\
> e^{i\xi_2(X_t^k - X_s^k)} &= e^{i\xi_2\sum_{q = N_s + 1}^{N_s + n}Y_q\one_{\Gamma_k}(Y_q)}
> \end{align*}
> $
> which, by independence, allows writing the summand as
> $
> \braks{\int_{\Gamma_k} e^{i\xi_1 y}\,d\nu(y) + 1 - \nu(\Gamma_k)}^m \cdot \braks{\int_{\Gamma_k} e^{i\xi_2 y}\,d\nu(y) + 1 - \nu(\Gamma_k)}^n \cdot \bp\bracs{N_r = m} \cdot \bp\bracs{N_t - N_s = n}
> $
> Summing the two series separately against the Poisson weights, the sum is equal to
> $
> \exp\braks{r\int (e^{i\xi_1 y} - 1)dM_k(y)} \cdot \exp\braks{(t - s)\int (e^{i\xi_2 y} - 1)dM_k(y)}
> $