> [!definition]
>
> Let $(\Omega, \cf, \bp)$ be a probability space and $\bracs{A_i}_{i \in I}$ be a collection of events. Then $\bracs{A_i}_{i \in I}$ are **mutually independent** if
> $
> \bp\bracs{\bigcap_{j \in J}A_j} = \prod_{j \in J}\bp\bracs{A_j} \quad \forall J \subseteq I, J\ \text{finite}
> $
> that is, the probability of any one of the events occurring given any (finite) combination of the others is equal to the probability of it occurring by itself.
>
> The events are $k$-wise independent if the above holds for all subsets $J$ of size at most $k$, and are *pairwise* independent if $k = 2$.
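>
> To see why the condition must hold for every finite subset and not just pairs, consider the classic two-coin example: with two fair flips, the events "first flip is heads", "second flip is heads", and "both flips agree" are pairwise independent but not mutually independent. The following Python sketch (illustrative names only) checks the product rule for every subset directly:
>
> ```python
> from itertools import combinations
> from fractions import Fraction
>
> # Sample space: two fair coin flips, each outcome with probability 1/4.
> omega = ["HH", "HT", "TH", "TT"]
> prob = {w: Fraction(1, 4) for w in omega}
>
> # A = first flip heads, B = second flip heads, C = the two flips agree.
> events = {
>     "A": {w for w in omega if w[0] == "H"},
>     "B": {w for w in omega if w[1] == "H"},
>     "C": {w for w in omega if w[0] == w[1]},
> }
>
> def P(event):
>     return sum(prob[w] for w in event)
>
> # Check the product rule for every subset of {A, B, C} of size 2 or 3.
> for k in (2, 3):
>     for names in combinations(events, k):
>         inter = set(omega)
>         rhs = Fraction(1)
>         for name in names:
>             inter &= events[name]
>             rhs *= P(events[name])
>         print(names, P(inter) == rhs)
> # Every pair passes, but ("A", "B", "C") fails: pairwise independent,
> # yet not mutually independent.
> ```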
> [!definition]
>
> Let $(\Omega, \cf, \bp)$ be a probability space and $\seqi{G} \subset \pow{\cf}$ be a family of collections of events. The family $\seqi{G}$ is **independent** if every selection $f \in \prod_{i \in I}G_i$, taking one event from each $G_i$, is *mutually independent*.
> [!theorem]
>
> Let $(\Omega, \cf, \bp)$ be a probability space and $\mathcal{P}, \mathcal{Q}$ be independent $\pi$-systems in $\cf$, then $\sigma(\mathcal{P})$ and $\sigma(\mathcal{Q})$ are independent.
>
> *Proof*. Let $A \in \mathcal{P}$ and define the measures $\mu_A, \bp_A$ on $\sigma(\mathcal{Q})$ by
> $
> \mu_A(B) = \bp\bracs{A} \cdot \bp\bracs{B} \quad \bp_A(B) = \bp\bracs{A \cap B}
> $
> then $\mu_A(B) = \bp_A(B)$ for all $B \in \mathcal{Q}$ and $\mu_A(\Omega) = \bp\bracs{A} = \bp_A(\Omega)$. By [[Dynkin's Uniqueness Theorem]], $\mu_A = \bp_A$, and
> $
> \bp\bracs{A \cap B} = \bp\bracs{A} \cdot \bp\bracs{B}
> $
> for all $A \in \mathcal{P}$ and $B \in \sigma(\mathcal{Q})$.
>
> Now for any $B \in \sigma(\mathcal{Q})$, define $\nu^B$ and $\bp^B$ on $\sigma(\mathcal{P})$ by
> $
> \nu^B(A) = \bp\bracs{A} \cdot \bp\bracs{B} \quad \bp^B(A) = \bp\bracs{A \cap B}
> $
> then $\nu^B = \bp^B$ on $\mathcal{P}$ and $\nu^B(\Omega) = \bp\bracs{B} = \bp^B(\Omega)$. By Dynkin's Uniqueness Theorem again, $\nu^B = \bp^B$ on $\sigma(\mathcal{P})$, and we have the independence as desired.
> [!definition]
>
> Let $(\Omega, \cf, \bp)$ be a probability space. A family of [[Random Variable|random variables]] $\seqi{X}$ over this space is **mutually independent** if their generated $\sigma$-fields are mutually independent. Specifically, for any finite $J \subset I$ and Borel sets $(B_j, j \in J)$, we have
> $
> \bp\bracs{X_j \in B_j, j \in J} = \prod_{j \in J}\bp\bracs{X_j \in B_j}
> $
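>
> As a concrete instance of the factorisation, the following Python sketch (illustrative setup: two fair dice on a product sample space) verifies $\bp\bracs{X \in B_1, Y \in B_2} = \bp\bracs{X \in B_1}\bp\bracs{Y \in B_2}$ for one choice of sets:
>
> ```python
> from fractions import Fraction
> from itertools import product
>
> # Joint sample space of two fair dice; each pair of faces has probability 1/36.
> omega = list(product(range(1, 7), repeat=2))
> prob = Fraction(1, 36)
>
> X = lambda w: w[0]   # value of the first die
> Y = lambda w: w[1]   # value of the second die
>
> def P(pred):
>     return sum(prob for w in omega if pred(w))
>
> B1 = {1, 2, 3}       # Borel set for X: "at most 3"
> B2 = {2, 4, 6}       # Borel set for Y: "even"
>
> joint = P(lambda w: X(w) in B1 and Y(w) in B2)
> print(joint == P(lambda w: X(w) in B1) * P(lambda w: Y(w) in B2))   # True
> ```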
> [!theorem]
>
> Let $X, Y$ be independent random variables with distributions $\mu$ and $\nu$ respectively, then $X + Y$ has distribution [$\mu * \nu$](Convolution%20of%20Measures).
>
> *Proof*. Let $A \times B$ be the product of a $\mu$-measurable set and a $\nu$-measurable set. Then by independence,
> $
> \bp\bracs{(X, Y) \in A \times B} = \mu(A) \cdot \nu(B) = (\mu \times \nu)(A \times B)
> $
> Since such rectangles form a $\pi$-system generating the product $\sigma$-field, by [[Dynkin's Uniqueness Theorem]], $\bp\bracs{(X, Y) \in E} = (\mu \times \nu)(E)$ for any $E$ in the product $\sigma$-field.
>
> Let $A \in \cb(\real)$ be a Borel set, then by the [[Fubini-Tonelli Theorem]],
> $
> \begin{align*}
> \mu_{X + Y}(A) &= \bp\bracs{X + Y \in A} \\
> &= \int \chi_A (X + Y)\,d\bp \\
> &= \int \chi_A (x + y)\, d(\mu \times \nu)(x, y) \\
> &= \int \int \chi_A (x + y)\, d\mu(x)\, d\nu(y) \\
> &= \int \mu(A - y)\, d\nu(y) = (\mu * \nu)(A)
> \end{align*}
> $
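>
> For discrete laws the convolution is a finite sum, $(\mu * \nu)(\{s\}) = \sum_{x}\mu(\{x\})\,\nu(\{s - x\})$, so the theorem can be sanity-checked numerically. A small Python sketch (assuming two fair dice) compares the convolution formula with the law of $X + Y$ computed directly from the equally likely outcomes:
>
> ```python
> from collections import Counter
> from fractions import Fraction
>
> # Laws of two independent fair dice, as discrete measures mu and nu.
> mu = {k: Fraction(1, 6) for k in range(1, 7)}
> nu = {k: Fraction(1, 6) for k in range(1, 7)}
>
> # Convolution formula: (mu * nu)({s}) = sum_x mu({x}) * nu({s - x}).
> conv = {s: sum(mu[x] * nu.get(s - x, 0) for x in mu) for s in range(2, 13)}
>
> # Law of X + Y computed directly from the 36 equally likely outcomes.
> direct = Counter()
> for x in range(1, 7):
>     for y in range(1, 7):
>         direct[x + y] += Fraction(1, 36)
>
> print(all(conv[s] == direct[s] for s in range(2, 13)))   # True
> ```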
> [!theorem]
>
> If $X$ and $Y$ are independent, then the [[Expectation|expected value]] of their product can be separated:
> $
> \ev(XY) = \ev(X)\ev(Y)
> $
> Their [[Covariance|covariance]] and [[Pearson Correlation Coefficient|correlation coefficient]] are 0:
> $
> \cov{X, Y} = \rho(X, Y) = 0
> $
> The [[Variance|variance]] of their sum is the sum of their variances:
> $
> \var{X + Y} = \var{X} + \var{Y}
> $
> *Proof*. Suppose $X$ and $Y$ are discrete, taking values $x_j$ ($j = 1, \dots, n$) and $y_k$ ($k = 1, \dots, m$) with marginal probabilities $p_j = \bp\bracs{X = x_j}$ and $q_k = \bp\bracs{Y = y_k}$. By independence the joint probabilities factor as $p_{jk} = p_j q_k$, so
> $
> \begin{align*}
> \ev(XY) &= \sum_{j = 1}^{n}\sum_{k = 1}^{m}x_j y_k p_{jk} \\
> &= \sum_{j = 1}^{n}\sum_{k = 1}^{m}x_j y_k p_j q_k \\
> &= \sum_{j = 1}^{n}x_jp_j \cdot \sum_{k = 1}^{m}y_kq_k \\
> &= \ev(X) \ev(Y)
> \end{align*}
> $
>
> Since $\cov{X, Y} = \ev(XY) - \ev(X)\ev(Y)$ and $\ev(XY) = \ev(X)\ev(Y)$, we have $\cov{X, Y} = 0$, and as a result, $\rho(X, Y) = 0$.
>
> Since $\var{X + Y} = \var{X} + 2\cov{X, Y} + \var{Y}$ and $\cov{X, Y} = 0$, $\var{X + Y} = \var{X} + \var{Y}$.
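>
> These identities can be verified exactly for a small discrete example. The Python sketch below (illustrative distributions, exact arithmetic via fractions) builds an independent joint law $p_{jk} = p_j q_k$ and checks each one:
>
> ```python
> from fractions import Fraction
> from itertools import product
>
> # Independent discrete variables: X uniform on {0, 1, 2}, Y uniform on {1, 4}.
> px = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
> py = {1: Fraction(1, 2), 4: Fraction(1, 2)}
> joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}   # p_jk = p_j * q_k
>
> def E(f):
>     return sum(f(x, y) * p for (x, y), p in joint.items())
>
> EX, EY, EXY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)
> var = lambda f, m: E(lambda x, y: (f(x, y) - m) ** 2)
>
> print(EXY == EX * EY)              # E(XY) = E(X)E(Y)
> print(EXY - EX * EY == 0)          # Cov(X, Y) = 0
> print(var(lambda x, y: x + y, EX + EY)
>       == var(lambda x, y: x, EX) + var(lambda x, y: y, EY))   # Var(X+Y) = Var(X) + Var(Y)
> ```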
> [!theorem]
>
> If $X$ and $Y$ are independent, then the [[Moment Generating Function|moment generating function]] of their sum is the product of their moment generating functions:
> $
> M_{X + Y}(t) = M_X(t)M_Y(t)
> $
>
> *Proof.* Since $X$ and $Y$ are independent, so are $e^{tX}$ and $e^{tY}$, and the previous theorem applies:
>
> $
> M_{X + Y}(t) = \ev\paren{e^{t(X + Y)}} = \ev\paren{e^{tX}e^{tY}}
> = \ev\paren{e^{tX}}\ev\paren{e^{tY}} = M_X(t)M_Y(t)
> $
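>
> A quick numerical sanity check in Python (illustrative distributions): build the law of $X + Y$ from independence and compare its moment generating function against the product $M_X(t)M_Y(t)$ at a few values of $t$.
>
> ```python
> import math
>
> # Independent discrete variables: X ~ Bernoulli(1/3), Y uniform on {0, 1, 2}.
> X = {0: 2/3, 1: 1/3}
> Y = {0: 1/3, 1: 1/3, 2: 1/3}
>
> def mgf(pmf, t):
>     return sum(p * math.exp(t * x) for x, p in pmf.items())
>
> # Law of X + Y, using independence: joint probabilities factorise.
> Z = {}
> for x, px in X.items():
>     for y, py in Y.items():
>         Z[x + y] = Z.get(x + y, 0) + px * py
>
> # M_{X+Y}(t) agrees with M_X(t) * M_Y(t) (up to floating point) at each t tested.
> ts = (-1.0, -0.5, 0.0, 0.5, 1.0, 2.0)
> print(all(math.isclose(mgf(Z, t), mgf(X, t) * mgf(Y, t)) for t in ts))   # True
> ```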