> [!theoremb] Theorem
>
> Let $X_1, X_2, \cdots$ be [[Probabilistic Independence|independent]], identically distributed [[Random Variable|random variables]] with common [[Mean|mean]] $\mu$ and [[Variance|variance]] $\sigma^2$, and let $S(n) = X_1 + \cdots + X_n$ be the corresponding [[Random Sample|random sample]]. Then for any $-\infty \le a \lt b \le \infty$,
> $
> \limv{n}P\paren{
> a \le \frac{S(n) - n\mu}{\sigma\sqrt{n}} \le b
> } =
> \frac{1}{\sqrt{2\pi}}\int_{a}^{b}e^{-\frac{1}{2}z^2}dz
> $
> In the limit, the distribution of the standardised random sample is exactly the [[Normal Distribution|standard normal distribution]].
>
> *Proof*. Instead of working with the distributions directly, first examine their [[Moment Generating Function|moment generating functions]], then use the theorem that if $M_X(t) = M_Y(t)$ for all $t$, then $X$ and $Y$ are identically distributed.
>
> First find the moment generating function of a normal random variable $X = \mu + \sigma Z$, where $Z$ is standard normal.
> $
> \begin{align*}
> M_X(t) &= \frac{1}{\sqrt{2\pi}}e^{\mu t}\int_{-\infty}^{\infty}e^{\sigma tz}\exp\paren{-\frac{1}{2}z^2}dz \\
> &= \frac{1}{\sqrt{2\pi}}e^{\mu t}\exp\paren{\frac{1}{2}\sigma^2t^2}\int_{-\infty}^{\infty}\exp\paren{-\frac{1}{2}(z - \sigma t)^2}dz \\
> &= \exp\paren{\mu t + \frac{1}{2}\sigma^2 t^2} = \exp\paren{\frac{1}{2}t^2} \quad (\mu = 0,\ \sigma = 1)
> \end{align*}
> $
>
> Consider the independent identically distributed random variables defined by:
> $
> T_j = X_j - \mu \quad j \in \nat, j \ge 1
> $
> Then $\ev(T_j) = 0$ and $\ev(T_j^2) = \var{T_j} = \sigma^2$. Define
> $
> Y(n) = \frac{1}{\sigma \sqrt{n}}\sum_{j = 1}^{n}T_j = \frac{S(n) - n\mu}{\sigma \sqrt{n}}
> $
> Using the fact that,
> $
> \begin{align*}
> M_{Y(n)}(t) &= \paren{M\paren{\frac{t}{\sigma \sqrt{n}}}}^n \quad \forall t \in \real
> \end{align*}
> $
> where $M$ is the common moment generating function for the $T$s.
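>
> This identity follows from independence, since the expectation of a product of independent terms factorises:
> $
> M_{Y(n)}(t) = \ev\braks{\exp\paren{\frac{t}{\sigma\sqrt{n}}\sum_{j = 1}^{n}T_j}} = \prod_{j = 1}^{n}\ev\braks{\exp\paren{\frac{t}{\sigma\sqrt{n}}T_j}} = \paren{M\paren{\frac{t}{\sigma\sqrt{n}}}}^n
> $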
> Recall also the series expansion of the moment generating function:
> $
> M(t) = \sum_{m = 0}^{\infty}\frac{t^m}{m!}\ev(T_j^m)
> $
> $
> \begin{align*}
> M\paren{\frac{t}{\sigma\sqrt{n}}} &= 1 + 0\cdot\frac{t}{\sigma \sqrt{n}} + \frac{1}{2}\frac{t^2}{\sigma^2 n}\sigma^2
> + \sum_{m = 3}^{\infty}n^{-m/2}\frac{t^m}{m!}\frac{\ev(T_j^{m})}{\sigma^m} \\
> &= 1 + \frac{1}{2}\frac{t^2}{n} + \paren{
> \frac{1}{n}\times \sum_{m = 3}^{\infty}n^{-(m - 2)/2}\frac{t^m}{m!}\frac{\ev(T_{j}^{m})}{\sigma^m}
> }
> \end{align*}
> $
> But then,
> $
> \begin{align*}
> M_{Y(n)}(t) = \paren{1 + \frac{1}{2}\frac{t^2}{n} + \paren{
> \frac{1}{n}\times \sum_{m = 3}^{\infty}n^{-(m - 2)/2}\frac{t^m}{m!}\frac{\ev(T_{j}^{m})}{\sigma^m}
> }}^n
> \end{align*}
> $
> Taking the limit, the terms with $m \ge 3$ vanish since each carries a factor $n^{-(m - 2)/2} \to 0$, and $\paren{1 + c_n/n}^n \to e^c$ whenever $c_n \to c$, so
> $
> \begin{align*}
> \limv{n}M_{Y(n)}(t) &= \limv{n}\paren{1 + \frac{1}{2}\frac{t^2}{n} + \paren{
> \frac{1}{n}\times \sum_{m = 3}^{\infty}n^{-(m - 2)/2}\frac{t^m}{m!}\frac{\ev(T_{j}^{m})}{\sigma^m}
> }}^n \\
> &= \limv{n}\paren{1 + \frac{1}{2}\frac{t^2}{n}}^n \\
> &= e^{\frac{1}{2}t^2} = M_Z(t)
> \end{align*}
> $
> Since the moment generating function of $Y(n)$ converges pointwise to that of the standard normal, $Y(n)$ converges in distribution to $Z \sim N(0, 1)$. $\blacksquare$
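
As a numerical sanity check of the theorem (not part of the proof), the convergence can be illustrated with a short Monte Carlo sketch in stdlib Python. The choice of Exponential(1) variables (so $\mu = \sigma = 1$), the sample size $n = 64$, and the evaluation point $t = 0.5$ are all arbitrary:

```python
import math
import random

random.seed(0)

def standardised_sum(n, mu=1.0, sigma=1.0):
    """Y(n) = (S(n) - n*mu) / (sigma * sqrt(n)) for Exponential(1) draws."""
    s = sum(random.expovariate(1.0) for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

def normal_interval(a, b):
    """P(a <= Z <= b) for Z ~ N(0, 1), via the error function."""
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2)))
    return cdf(b) - cdf(a)

n, reps = 64, 40_000
ys = [standardised_sum(n) for _ in range(reps)]

# Empirical P(-1 <= Y(n) <= 1) against the normal integral (~0.6827).
empirical = sum(1 for y in ys if -1.0 <= y <= 1.0) / reps
exact = normal_interval(-1.0, 1.0)

# Monte Carlo estimate of M_{Y(n)}(t) against exp(t^2 / 2).
t = 0.5
mgf_estimate = sum(math.exp(t * y) for y in ys) / reps
```

Larger $n$ tightens the agreement of both estimates with their normal limits.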

> [!theorem]
>
> Let $\seq{X_n} \subset \bigcap_{p \ge 1}L^p$ be a family of independent, identically distributed random variables with $\ev(X_n) = 0$ and $\ev(X_n^2) = 1$. Let $S_n = \sum_{k \le n}X_k$, let $\check{S}_n = S_n/\sqrt{n}$, and let $L_m = \limv{n}\ev(\check{S}_n^m)$. Then
> $
> L_{2k - 1} = 0 \quad L_{2k} = 1 \cdot 3 \cdot 5 \cdots (2k - 1) \quad k \in \nat
> $
> So if $Z \sim N(0, 1)$, then $\limv{n}\ev(\check S_n^m) = \ev(Z^m)$. If $\phi \in \real[x]$, then
> $
> \limv{n}\ev(\phi(\check S_n)) = \frac{1}{\sqrt{2\pi}}\int \phi(x)e^{-x^2/2}dx
> $
>
> *Proof, by induction on $m$.* Suppose that $L_k$ exists for all $k \le m$. Then, expanding via the binomial theorem,
> $
> \begin{align*}
> \ev\braks{S_n^{m+1}} &= \sum_{j = 1}^{n}\ev(X_jS_n^m) \\
> &=n\ev(X_1S_n^m) \\
> &= n\ev\braks{X_1\paren{X_1 +\sum_{j =2}^n X_j}^m} \\
> &= n\sum_{k = 0}^m{m \choose k}\ev\braks{X_1^{k + 1} \paren{\sum_{j = 2}^{n}X_j}^{m - k}}
> \end{align*}
> $
> By independence,
> $
> \begin{align*}
> \ev\braks{S_n^{m + 1}} &= \underbrace{n\ev(X_1)\ev(S_{n-1}^m)}_{0} + nm\ev(X_1^2)\ev(S_{n - 1}^{m - 1}) + n\sum_{k = 2}^m{m \choose k}\ev(X_1^{k + 1})\ev(S_{n-1}^{m-k})
> \end{align*}
> $
> Dividing by $n^{(m + 1)/2}$ yields
> $
> \begin{align*}
> \ev\braks{\check{S}_n^{m + 1}} &= n^{-(m + 1)/2}\ev(S_n^{m + 1}) \\
> &= \frac{(n - 1)^{(m-1)/2}}{n^{(m-1)/2}}m\ev\braks{(\check S_{n - 1})^{m - 1}} \\
> &+ \sum_{k = 2}^m \frac{(n-1)^{(m-k)/2}}{n^{(m-1)/2}}{m \choose k} \ev(X_1^{k + 1})\ev(\check S_{n-1}^{m-k})
> \end{align*}
> $
> Taking the limit yields that $L_{m + 1} = mL_{m - 1}$, since for $k \ge 2$ the factor $(n - 1)^{(m - k)/2}/n^{(m - 1)/2} \to 0$. With the base cases $L_0 = 1$ and $L_1 = 0$, the recursion gives $L_{2k - 1} = 0$ and $L_{2k} = 1 \cdot 3 \cdot 5 \cdots (2k - 1)$. $\blacksquare$
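
The recursion $L_{m + 1} = mL_{m - 1}$, together with the base cases $L_0 = 1$ and $L_1 = 0$, can be unrolled mechanically; a small Python sketch recovering the claimed limit moments:

```python
def limit_moments(max_m):
    """Unroll L_{m+1} = m * L_{m-1} with base cases L_0 = 1, L_1 = 0."""
    L = [1, 0]
    for m in range(1, max_m):
        L.append(m * L[m - 1])
    return L

L = limit_moments(8)
odd = [L[m] for m in range(1, 9, 2)]   # odd limit moments all vanish
even = [L[m] for m in range(2, 9, 2)]  # double factorials 1, 3, 15, 105
```

These match the moments of $Z \sim N(0, 1)$: $\ev(Z^{2k}) = 1 \cdot 3 \cdot 5 \cdots (2k - 1)$ and $\ev(Z^{2k - 1}) = 0$.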

> [!theorem]
>
> Let $\seq{X_n} \subset \bigcap_{p \ge 1}L^p$ be a family of independent, identically distributed random variables with mean $\mu = \ev(X_1)$ and variance $\sigma^2 = \var{X_1}$. Let $\check{S}_n = (S_n - n\mu)/\sqrt{n\sigma^2}$, and let $L_m = \limv{n}\ev(\check{S}_n^m)$. Then
> $
> L_{2k - 1} = 0 \quad L_{2k} = 1 \cdot 3 \cdot 5 \cdots (2k - 1) \quad k \in \nat
> $
> So if $Z \sim N(0, 1)$, then $\limv{n}\ev(\check S_n^m) = \ev(Z^m)$. If $\phi \in \real[x]$ and $\td S_n = \sigma \check S_n$, then
> $
> \limv{n}\ev(\phi(\td S_n)) = \frac{1}{\sqrt{2\pi \sigma^2}}\int \phi(x)e^{-x^2/2\sigma^2}dx
> $
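
As a quick numerical spot-check of this theorem (not a proof), a stdlib-Python sketch with Uniform(0, 1) variables ($\mu = 1/2$, $\sigma^2 = 1/12$) and the arbitrary test polynomial $\phi(x) = x^3 + 2x^2$, for which $\ev(\phi(Z)) = 2$:

```python
import math
import random

random.seed(1)

# Hypothetical test polynomial: E[phi(Z)] = E[Z^3] + 2 E[Z^2] = 0 + 2 = 2.
phi = lambda x: x**3 + 2 * x**2

def standardised(n):
    """(S_n - n*mu) / sqrt(n * sigma^2) for Uniform(0, 1) draws."""
    s = sum(random.random() for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n / 12)

n, reps = 30, 40_000
estimate = sum(phi(standardised(n)) for _ in range(reps)) / reps  # ~ 2
```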

> [!theorem]
>
> Let $\seq{X_n}$ be a family of independent, identically distributed $\real^d$-valued random variables on $(\Omega, \cf, \bp)$ such that
> 1. $\ev(X_n) = m \in \real^d$.
> 2. $C = (C_{i, j})$ is the covariance matrix of $X_1$.
>
> Denote
> - $S_n = \sum_{k \le n}X_k$
> - $\check S_n = (S_n - nm)/\sqrt{n}$
> - $\mu_n$ as the distribution of $\check S_n$.
>
> Then $\mu_n$ converges to $\gamma_{0, C}$ [[Weak Convergence of Measures|weakly]].
>
>
> *Proof*. Let $\wh \mu_n$ be the [[Characteristic Function|characteristic function]] of $\mu_n$, then it's sufficient to show that $\wh \mu_n \to e^{-\xi C \xi/2}$ pointwise.
> $
> \begin{align*}
> \wh \mu_n(\xi) &= \int \exp\paren{i \angles{\xi, x}}d\mu_n \\
> &= \ev\braks{e^{i\angles{X_1 - m, \xi}/\sqrt{n}}}^n
> \end{align*}
> $
> Here, let $\rho$ be the distribution of $X_1 - m$; then $\wh \rho(0) = 1$, $\nabla \wh \rho(0) = 0$ and $-\nabla^2\wh \rho(0) = C$. Via a Taylor expansion,
> $
> \begin{align*}
> \hat \rho\paren{\xi/\sqrt{n}} &= 1 + \nabla \wh \rho(0) \cdot \frac{\xi}{\sqrt{n}} + \frac{1}{2}\paren{\frac{\xi}{\sqrt{n}}\cdot\nabla^2\wh \rho(0) \frac{\xi}{\sqrt{n}}} + o(1/n)\\
> \hat \rho\paren{\xi/\sqrt{n}}^n &= \braks{1 - \frac{1}{2n}\xi C \xi + o(1/n)}^n \to e^{-\xi C \xi/2}
> \end{align*}
> $
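
The pointwise convergence of $\wh \mu_n$ can also be checked numerically. A stdlib-Python sketch with the arbitrary choice $X = (U, U + V)$ for $U, V$ iid Uniform(−1, 1), so that $m = 0$ and $C_{1,1} = 1/3$, $C_{1,2} = C_{2,1} = 1/3$, $C_{2,2} = 2/3$, evaluated at one arbitrary $\xi$:

```python
import cmath
import math
import random

random.seed(2)

# Example vector: X = (U, U + V), U, V iid Uniform(-1, 1),
# so m = 0 and C = [[1/3, 1/3], [1/3, 2/3]].
C = [[1 / 3, 1 / 3], [1 / 3, 2 / 3]]

def check_sn(n):
    """One sample of the normalised sum S_n / sqrt(n) (here m = 0)."""
    su = sv = 0.0
    for _ in range(n):
        u = random.uniform(-1.0, 1.0)
        v = random.uniform(-1.0, 1.0)
        su += u
        sv += u + v
    return su / math.sqrt(n), sv / math.sqrt(n)

xi = (0.7, -0.4)
n, reps = 50, 20_000
samples = [check_sn(n) for _ in range(reps)]

# Empirical characteristic function of mu_n at xi...
emp = sum(cmath.exp(1j * (xi[0] * x + xi[1] * y)) for x, y in samples) / reps

# ...against the Gaussian limit exp(-<xi, C xi>/2).
quad = sum(xi[i] * C[i][j] * xi[j] for i in range(2) for j in range(2))
limit = math.exp(-quad / 2)
```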