> [!theorem]
>
> Let $M \in \mathfrak{M}_2(\real^d) \setminus \mathfrak{M}_1(\real^d)$, and $j(t, dy, \omega)$ be the [[Jump Measure|jump measure]] driven by $M$. Let
> $
> V_t(\omega) = \int_{\real^d}\abs{y}j(t, dy,\omega)
> $
> then for almost every $\omega$,
> $
> V_t(\omega) = \infty \quad \forall t > 0
> $
> *Proof*. Let $A_0 = \ol{B(0, 1)}^c$ and $A_k = \ol{B(0, 2^{-k + 1})} \setminus \ol{B(0, 2^{-k})}$ for $k \ge 1$, let $M_k = M|_{A_k}$ (each $M_k$ is finite since $M \in \mathfrak{M}_2(\real^d)$), and let $X^{(k)}$ be independent compound Poisson processes for $\pi_{0, 0, M_k}$, then
> $
> V_t(\omega) = \int_{\real^d}\abs{y}j(t, dy, \omega) = \sum_{k = 0}^\infty\int \abs{y}j(t, dy, X^{(k)}(\omega))
> $
> where $V^{(k)}_t = \int_{\real^d}\abs{y}j(t, dy, X^{(k)}(\omega))$. For each $k \ge 1$,
> $
> \begin{align*}
> \var{V_t^{(k)}} &= \ev\braks{\paren{V_t^{(k)}}^2} - \paren{\ev\braks{V_t^{(k)}}}^2
> \end{align*}
> $
> Write $X_t^{(k)} = \sum_{p = 1}^{N_t}Y_p$, where $N_t$ is a Poisson process with rate $\alpha = M(A_k)$ and the $Y_p$ are i.i.d. with distribution $\nu = M_k/\alpha$, so
> $
> \begin{align*}
> \ev\braks{\paren{V_t^{(k)}}^2} &= \ev\braks{\paren{\sum_{p = 1}^{N_t}\abs{Y_p}}^2} \\
> \paren{\ev\braks{V_t^{(k)}}}^2 &= \paren{\ev\braks{\sum_{p = 1}^{N_t}\abs{Y_p}}}^2
> \end{align*}
> $
> Expanding the square, conditioning on $N_t$, and using $\ev\braks{N_t} = \alpha t$ and $\ev\braks{N_t^2 - N_t} = \alpha^2t^2$, the difference becomes
> $
> \begin{align*}
> &\ev\braks{N_t^2 - N_t}\paren{\ev\braks{\abs{Y_1}}}^2 +
> \ev\braks{N_t}\norm{Y_1}_2^2 -\paren{\ev\braks{N_t}\ev\braks{\abs{Y_1}}}^2 \\
> &= \alpha^2t^2\paren{\int \abs{y}d\nu(y)}^2 + \alpha t \int\abs{y}^2d\nu(y) - \alpha^2t^2 \paren{\int \abs{y}d\nu(y)}^2 \\
> &= t\int\abs{y}^2dM_k(y)
> \end{align*}
> $
> Summing over $k \ge 1$ and using $M \in \mathfrak{M}_2(\real^d)$,
> $
> \sum_{k \ge 1}\var{V_t^{(k)}} = t \int_{\ol{B(0, 1)}}\abs{y}^2 dM(y) < \infty
> $
> Since the processes are independent, Kolmogorov's convergence theorem for sums of independent mean-zero random variables implies that $\sum_{k \ge 1}\paren{V_t^{(k)} - \ev\braks{V_t^{(k)}}}$ converges almost surely. Now,
> $
> \begin{align*}
> &\limv{k}\braks{\sum_{j = 1}^kV_t^{(j)} - \sum_{j = 1}^k\ev\braks{V_t^{(j)}}} \\
> &= \limv{k}\braks{\int_{\ol{B(0, 1)} \setminus \ol{B(0, 2^{-k})}}\abs{y}j(t, dy) - t\int_{\ol{B(0, 1)} \setminus \ol{B(0, 2^{-k})}}\abs{y}dM(y)}
> \end{align*}
> $
> Since $M \notin \mathfrak{M}_1(\real^d)$, the compensator term tends to $\infty$ while the difference converges almost surely to a finite limit, so $\int_{\ol{B(0, 1)}}\abs{y}j(t, dy) = \infty$ almost surely, and hence $V_t = \infty$ almost surely. As $V_t$ is nondecreasing in $t$, applying this along a sequence $t_n \downto 0$ gives a single null set outside of which $V_t = \infty$ for all $t > 0$.
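As a numerical sanity check on this divergence, the following is a minimal simulation sketch (not part of the note). It assumes the one-dimensional measure $dM(y) = \abs{y}^{-2}\,dy$ on $0 < \abs{y} \le 1$, which lies in $\mathfrak{M}_2(\real) \setminus \mathfrak{M}_1(\real)$; the constants $M(A_k) = 2^k$ and $\ev\braks{V_t^{(k)}} = 2t\ln 2$ below are specific to this choice. It simulates the compound Poisson pieces $X^{(k)}$ on the annuli $A_k$ and tracks both the partial sums $\sum_{j \le k}V_t^{(j)}$ and their compensated versions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example measure (not from the note): dM(y) = |y|^{-2} dy on 0 < |y| <= 1.
# It lies in M_2 \ M_1 since int |y|^2 dM = 2 < infinity while int |y| dM = infinity.
t = 1.0
K = 20  # annuli A_k = {2^{-k} < |y| <= 2^{-k+1}}, k = 1, ..., K

def sample_annulus_jumps(k, n):
    """Draw n jumps from nu_k = M|_{A_k} / M(A_k) by inverting the CDF of |y|."""
    u = rng.uniform(size=n)
    radii = 2.0 ** (-(k - 1)) / (2.0 - u)    # |y| has density proportional to y^{-2} on (2^{-k}, 2^{-k+1}]
    signs = rng.choice([-1.0, 1.0], size=n)  # M is symmetric, so the sign is +-1 with probability 1/2
    return signs * radii

total_variation, compensated = 0.0, 0.0
for k in range(1, K + 1):
    alpha_k = 2.0 ** k                       # M(A_k) for this measure
    mean_k = 2.0 * t * np.log(2.0)           # E[V_t^{(k)}] = t * int_{A_k} |y| dM(y)
    n_jumps = rng.poisson(alpha_k * t)       # number of jumps of the compound Poisson piece X^{(k)} on [0, t]
    v_k = np.abs(sample_annulus_jumps(k, n_jumps)).sum()
    total_variation += v_k                   # sum_{j <= k} V_t^{(j)}: grows without bound (about 2 t ln 2 per annulus)
    compensated += v_k - mean_k              # sum_{j <= k} (V_t^{(j)} - E[V_t^{(j)}]): converges
    print(f"k = {k:2d}   sum of V_t^(j) = {total_variation:8.3f}   compensated sum = {compensated:8.3f}")
```

The first column grows roughly linearly in $k$, while the compensated column stabilizes, mirroring the two terms in the final display of the proof.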
> [!theorem]
>
> Let $M \in \mathfrak{M}_2 \setminus \mathfrak{M}_1$, $t \ge 0$, and $\omega \in \Omega$. Let $j(t, dy, \omega)$ be the jump measure driven by $M$. For each $r > 0$, define
> $
> X_0^{(r)} = 0 \quad X_t^{(r)} = \int_{\real^d \setminus \ol{B(0, r)}}yj(t, dy) - t\int_{\ol{B(0, 1)} \setminus \ol{B(0, r)}}ydM(y)
> $
> then the limit $X_t = \lim_{r \to 0}X^{(r)}_t$ exists for each $t \ge 0$, almost surely uniformly on bounded time intervals, and the limit process has RCLL paths.
>
> *Proof*. For each $r$, the restriction $M^{(r)} = M|_{\ol{B(0, r)}^c}$ is of class $\mathfrak{M}_0$, so $X^{(r)}$ is a Lévy process.
>
> Let $0 < r' < r$, and let $Y_t = X^{(r')}_t - X^{(r)}_t$, which is equal to
> $
> Y_t = \int_{\ol{B(0, r)} \setminus \ol{B(0, r')}}yj(t, dy) - t\int_{\ol{B(0, r)} \setminus \ol{B(0, r')}} y dM(y)
> $
> Since $Y_t$ is a difference of Lévy processes, it has RCLL paths, which allows evaluating the uniform norm along dyadic rational times as
> $
> \norm{Y}_{u, [0, t]} = \lim_{n \to \infty}\max_{0 \le m \le 2^n}\abs{Y_{m2^{-n}t}}
> $
> Thus for any $\eps > 0$, using independent and homogeneous increments,
> $
> \begin{align*}
> \bp\bracs{\norm{Y}_{u, [0, t]} > \eps} &= \limv{n}\bp\bracs{\max_{0 \le m \le 2^n}\abs{Y_{m2^{-n}t}} > \eps} \\
> &\le \limv{n}\sum_{j = 1}^d\bp\bracs{\max_{0 \le m \le 2^n}\abs{\pi_j \circ Y_{m2^{-n}t}} > \eps/\sqrt{d}} \\
> &= \limv{n}\sum_{j = 1}^d\bp\bracs{\max_{0 \le m \le 2^n}\abs{\pi_j \circ \sum_{l = 1}^m(Y_{l2^{-n}t} - Y_{(l-1)2^{-n}t})} > \eps/\sqrt{d}}
> \end{align*}
> $
> The increments $Y_{l2^{-n}t} - Y_{(l-1)2^{-n}t}$ are independent with mean zero, so by Kolmogorov's maximal inequality each probability is bounded by $\frac{d}{\eps^2}\ev\braks{\paren{\pi_j \circ Y_t}^2}$, and summing over $j$,
> $
> \bp\bracs{\norm{Y}_{u, [0, t]} > \eps} \le \frac{d}{\eps^2}\ev\braks{\abs{Y_t}^2} = \frac{dt}{\eps^2}\int_{\ol{B(0, r)} \setminus \ol{B(0, r')}}\abs{y}^2dM(y)
> $
> Since $M \in \mathfrak{M}_2(\real^d)$, there exists $\seq{r_n}$ with $r_n \downto 0$ and $\int_{\ol{B(0, r_n)} \setminus \ol{B(0, r_{n + 1})}}\abs{y}^{2}dM(y) < 2^{-n}$. Taking $r = r_n$ and $r' = r_{n + 1}$,
> $
> \bp\bracs{\norm{X^{(r_n)} - X^{(r_{n + 1})}}_{u, [0, t]} > \eps} \le \frac{dt}{\eps^2}2^{-n}
> $
> Let $\eps = 2^{-n/4}$, then the right-hand side becomes $dt\,2^{-n/2}$, which is summable, so by the Borel-Cantelli lemma
> $
> \bp\bracs{\norm{X^{(r_n)} - X^{(r_{n + 1})}}_{u, [0, t]} > 2^{-n/4} \quad i.o.} = 0
> $
> Therefore the sequence $\seq{X^{(r_n)}}$ is almost surely Cauchy with respect to the uniform norm, and converges uniformly on bounded time intervals. The same estimate with arbitrary $r < r_n$ in place of $r_{n + 1}$ shows that the full family $X^{(r)}$ converges to the same limit as $r \to 0$. Since each $X^{(r_n)}$ is RCLL and the convergence is uniform, the limit is also RCLL.

> [!theorem]
>
> Let $M$ and $X$ be as above, then $X_t$ is a [[Lévy Process]] associated with $\pi_{0, 0, M}$.
>
> *Proof*. Independence and homogeneity of the increments are obtained by working through the joint characteristic functions of the approximating processes $X^{(r)}$ and passing to the limit.
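Returning to the construction of $X^{(r)}$, here is a small simulation sketch (again an illustration under an assumed measure, not part of the note). It uses the one-sided measure $dM(y) = y^{-2}\,dy$ on $0 < y \le 1$, for which $M(A_k) = 2^{k-1}$ and each refinement of the cutoff adds a compensator increment $t\ln 2$; it compares the raw truncated jump integral with the compensated process $X_t^{(r)}$ as $r = 2^{-k} \to 0$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-sided measure (not from the note): dM(y) = y^{-2} dy on 0 < y <= 1,
# again in M_2 \ M_1, and chosen asymmetric so that the compensator actually matters.
# X_t^{(r)} = int_{y > r} y j(t, dy)  -  t * int_{r < y <= 1} y dM(y).
t = 1.0
N = 20  # truncation levels r = 2^{-k}, k = 1, ..., N

def sample_annulus_jumps(k, n):
    """Draw n jump sizes in (2^{-k}, 2^{-k+1}] with density proportional to y^{-2}."""
    u = rng.uniform(size=n)
    return 2.0 ** (-(k - 1)) / (2.0 - u)        # inverse-CDF sampling of the jump size

uncompensated, compensated = 0.0, 0.0
for k in range(1, N + 1):                       # refining the cutoff from 2^{-k+1} to 2^{-k} adds the annulus A_k
    alpha_k = 2.0 ** (k - 1)                    # M(A_k) for this measure
    jumps = sample_annulus_jumps(k, rng.poisson(alpha_k * t))
    uncompensated += jumps.sum()                # int_{y > 2^{-k}} y j(t, dy): drifts off to infinity
    compensated += jumps.sum() - t * np.log(2)  # X_t^{(2^{-k})}: settles down as the cutoff shrinks
    print(f"r = 2^-{k:<2d}   uncompensated = {uncompensated:8.3f}   X_t^(r) = {compensated:8.3f}")
```

The compensated column fluctuates but converges (its increments have variance $t\,2^{-k}$), while the uncompensated column increases by about $t\ln 2$ at every refinement, which is exactly why the drift correction is needed when $M \notin \mathfrak{M}_1$.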
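For the characteristic-function step in the last proof, the following standard computation is a sketch not spelled out in the note; it assumes the Lévy-Khintchine form of $\pi_{0, 0, M}$ with truncation at the unit ball, matching the compensator used above. Since $X^{(r)}$ is a compensated compound Poisson process for $M|_{\ol{B(0, r)}^c}$,
$
\ev\braks{e^{i\langle \xi, X_t^{(r)}\rangle}} = \exp\paren{t\int_{\real^d \setminus \ol{B(0, r)}}\paren{e^{i\langle \xi, y\rangle} - 1 - i\langle \xi, y\rangle\mathbf{1}_{\ol{B(0, 1)}}(y)}dM(y)}
$
and since $\abs{e^{ix} - 1 - ix} \le x^2/2$, the integrand is dominated near $0$ by $\tfrac{1}{2}\abs{\xi}^2\abs{y}^2$, which is $M$-integrable on $\ol{B(0, 1)}$ because $M \in \mathfrak{M}_2(\real^d)$; letting $r \to 0$ then gives the exponent of $\pi_{0, 0, M}$ for the limit $X_t$.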