> [!theorem]
>
> Let $\seqf{X_k}$ be a family of [[Random Variable|random variables]]. Then they are [[Probabilistic Independence|independent]] if and only if the [[Expectation|expectation]] can be pulled out of products
> $
> \ev\braks{\prod_{k \in [n]}f_k(X_k)} = \prod_{k \in [n]}\ev(f_k(X_k))
> $
> for any bounded [[Borel Measurable Function|Borel-measurable]] functions $\seqf{f_k}$.
>
> The above equation is called the *factorisation formula*. Note that the formula does not apply directly to the identity function, since it is unbounded; that case is handled under integrability assumptions by the theorem further below.
>
> ### Product Implies Independence
>
> First suppose that the factorisation formula holds for all bounded Borel-measurable functions. Let $\seqf{E_k} \subset \cf$ with each $E_k \in \sigma(X_k)$. Since $\sigma(X_k) = \bracs{X_k^{-1}(B) : B \in \cb(\real)}$, we can write each $E_k = \bracs{X_k \in B_k}$ for some $B_k \in \cb(\real)$. Applying the factorisation formula to the (bounded) indicator functions $\chi_{B_k}$ gives
> $
> \begin{align*}
> \bp\bracs{\bigcap_{k \in [n]}E_k} &= \bp\bracs{\bigcap_{k \in [n]}\bracs{X_k \in B_k}} \\
> &= \ev\braks{\prod_{k \in [n]}\chi_{B_k}(X_k)} \\
> &= \prod_{k \in [n]}\ev(\chi_{B_k}(X_k)) \\
> &= \prod_{k \in [n]}\bp\bracs{X_k \in B_k} \\
> &= \prod_{k \in [n]}\bp\bracs{E_k}
> \end{align*}
> $
>
>
> ### Independence Implies Product
>
> Now suppose that the random variables are mutually independent. Let $\cs_n$ be the collection of bounded Borel-measurable functions $f$ such that
> $
> \ev\braks{f(X_n) \cdot \prod_{k = 1}^{n - 1}\chi_{\bracs{X_k \in B_k}}} = \ev(f(X_n)) \cdot \prod_{k = 1}^{n - 1}\bp\bracs{X_k \in B_k}
> $
> for all $\bracs{B_k}_1^{n - 1}$. By independence of the $X_k$, $\cs_n$ contains all indicator functions of Borel sets. Moreover, $\cs_n$ is a vector space by linearity of expectation, and is closed under monotone limits by [[Monotone Convergence Theorem|MCT]].
>
> By the [[Monotone Class Argument]], $\cs_n$ contains all bounded Borel-measurable functions.
>
> Now, for any $0 \le k < n$, define $\cs_{n - k}$ to be the collection of all bounded Borel-measurable functions $f$ such that
> $
> \ev\braks{f(X_{n - k}) \cdot \paren{\prod_{i = 1}^{k}g_i(X_{n - i})} \cdot \paren{\prod_{i = 1}^{n - k - 1}\chi_{\bracs{X_i \in B_i}}}}
> $
> is equal to
> $
> \ev(f(X_{n - k})) \cdot \paren{\prod_{i = 1}^{k}\ev(g_i(X_{n - i}))} \cdot \prod_{i = 1}^{n - k - 1}\bp\bracs{X_i \in B_i}
> $
> for any $\bracs{g_i}_1^k \subset L^+ \cap L^\infty$ and any $\bracs{B_i}_1^{n - k - 1}$. By the same arguments, $\cs_{n - k}$ contains all bounded Borel-measurable functions whenever $\cs_{n - k + 1}$ does. Writing each $g_i = g_i^+ - g_i^-$ as a difference of functions in $L^+ \cap L^\infty$ and applying linearity again, the equality extends to arbitrary bounded Borel-measurable $\bracs{g_i}_1^k$. Taking $k = n - 1$ with $f = f_n$ and $g_i = f_{n - i}$ yields the factorisation formula.
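The factorisation formula can be sanity-checked numerically. A minimal sketch for two independent discrete random variables, computing both sides exactly against the product measure (the pmfs and the bounded functions `f`, `g` below are arbitrary illustrative choices, not from the text):

```python
from itertools import product

# Joint law of two independent discrete RVs: p(x, y) = p_X(x) * p_Y(y).
p_X = {0: 0.2, 1: 0.5, 2: 0.3}
p_Y = {-1: 0.6, 3: 0.4}

def f(x):  # a bounded Borel-measurable function
    return min(x, 1)

def g(y):  # a bounded Borel-measurable function (an indicator)
    return 1.0 if y > 0 else 0.0

# E[f(X) g(Y)], summed over the product measure
lhs = sum(f(x) * g(y) * px * py
          for (x, px), (y, py) in product(p_X.items(), p_Y.items()))

# E[f(X)] * E[g(Y)]
rhs = (sum(f(x) * px for x, px in p_X.items())
       * sum(g(y) * py for y, py in p_Y.items()))

assert abs(lhs - rhs) < 1e-9  # the factorisation formula holds exactly here
```

Independence is built in by construction of the joint pmf, so the two sides agree up to floating-point error.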

> [!theorem]
>
> Let $\seqf{X_i}$ be mutually independent random variables. If any of the following holds:
> 1. $\seqf{X_i} \subset L^+$
> 2. $\seqf{X_i} \subset L^1$
>
> then
> $
> \ev\braks{\prod_{i = 1}^nX_i} = \prod_{i = 1}^n\ev(X_i)
> $
> *Proof*. It suffices to consider two variables; the general case follows by induction. Let $X, Y$ be independent random variables. First suppose that both are non-negative. Truncating each variable replaces the (unbounded) identity function with a bounded Borel-measurable one, so the factorisation formula applies and
> $
> \begin{align*}
> \ev(XY) &= \limv{n}\ev((X \cdot\chi_{\bracs{X \le n}}) \cdot (Y \cdot \chi_\bracs{Y \le n})) \\
> &= \limv{n}\ev(X \cdot\chi_{\bracs{X \le n}}) \cdot \ev(Y \cdot \chi_\bracs{Y \le n}) \\
> &= \ev(X) \cdot \ev(Y)
> \end{align*}
> $
> by [[Monotone Convergence Theorem|MCT]].
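>
> The factorisation step is legitimate because, since $X \ge 0$, each truncation is a bounded Borel-measurable function of $X$:
> $
> X \cdot \chi_{\bracs{X \le n}} = \phi_n(X), \qquad \phi_n(x) = x \cdot \chi_{[0, n]}(x),
> $
> and likewise for $Y$; the factorisation formula then applies at each fixed $n$, while the truncated products increase monotonically to $XY$.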
>
> Now suppose that $X, Y \in L^1$. Then
> $
> \abs{XY} = (X^+ + X^-)(Y^+ + Y^-)
> $
> Since $X^\pm$ and $Y^\pm$ are non-negative measurable functions of $X$ and $Y$ respectively, each pair $(X^\pm, Y^\pm)$ is independent, so by the non-negative case above,
> $
> \begin{align*}
> \ev(\abs{XY}) &= \ev(X^+Y^+) + \ev(X^+Y^-) + \ev(X^-Y^+) + \ev(X^-Y^-) \\
> &= \ev(X^+) \ev(Y^+) + \ev(X^+)\ev(Y^-) \\
> &+ \ev(X^-)\ev(Y^+) + \ev(X^-)\ev(Y^-) < \infty
> \end{align*}
> $
> so $XY \in L^1$. Expanding with linearity yields
> $
> \begin{align*}
> \ev(XY) &= \ev((X^+ - X^-)(Y^+ - Y^-)) \\
> &= \ev(X^+Y^+) - \ev(X^+Y^-) - \ev(X^-Y^+) + \ev(X^-Y^-) \\
> &= \ev(X^+) \ev(Y^+) - \ev(X^+)\ev(Y^-) \\
> &- \ev(X^-)\ev(Y^+) + \ev(X^-)\ev(Y^-) \\
> &= \ev(X) \cdot \ev(Y)
> \end{align*}
> $