> [!definition]
>
> $
> S(n) = X_1 + X_2 + \cdots X_n
> $
> A [[Set|set]] $X_1, X_2, \cdots, X_n$ of [[Probabilistic Independence|independent]] and identically distributed (drawn from the same [[Population|population]]/[[Probability Distribution|probability distribution]]) [[Random Variable|random variables]] constitute a random [[Sample|sample]] of size $n$ for that population.
>
> Suppose that each $X_j$ has [[Expectation|expected value]] $\ev(X_j) = \mu$ and [[Variance|variance]] $\var(X_j) = \sigma^2$ for $1 \le j \le n$; then $\ev(S(n)) = n\mu$ and $\var(S(n)) = n\sigma^2$.
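
As a quick numerical sanity check (an illustration, not part of the note), the following sketch simulates many samples of size $n$ from $\mathrm{Uniform}(0, 1)$, where $\mu = 1/2$ and $\sigma^2 = 1/12$, and compares the empirical mean and variance of $S(n)$ against $n\mu$ and $n\sigma^2$. NumPy and the particular distribution are assumptions of this sketch.

```python
import numpy as np

# Simulate many random samples of size n from Uniform(0, 1),
# whose mean is mu = 1/2 and variance is sigma^2 = 1/12 (assumed here
# purely for illustration), and form S(n) for each sample.
rng = np.random.default_rng(0)
n, trials = 10, 200_000
mu, sigma2 = 1 / 2, 1 / 12

samples = rng.uniform(0.0, 1.0, size=(trials, n))
s_n = samples.sum(axis=1)  # one realization of S(n) per row

# Empirical moments of S(n) should approach n*mu and n*sigma^2.
print(f"E(S(n))  : empirical {s_n.mean():.4f}  vs  n*mu      = {n * mu:.4f}")
print(f"Var(S(n)): empirical {s_n.var():.4f}  vs  n*sigma^2 = {n * sigma2:.4f}")
```
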
> [!theorem]
>
> If $X_1$, $X_2$, $\cdots$, $X_n$ are independent, identically distributed random variables with a common [[Moment Generating Function|moment generating function]] $M(t)$, then
> $
> M_{S(n)}(t) = \paren{M(t)}^n
> $
>
> *Proof*. By induction on $n$.
>
> Base case: $S(1) = X_1$, so $M_{S(1)}(t) = M(t)$.
>
> Inductive step: suppose $M_{S(n)}(t) = \paren{M(t)}^n$. Since $S(n + 1) = S(n) + X_{n + 1}$ with $S(n)$ and $X_{n + 1}$ [[Probabilistic Independence|independent]], the moment generating function of the sum factors:
> $
> M_{S(n + 1)}(t) = M_{S(n)}(t) \cdot M_{X_{n + 1}}(t) = \paren{M(t)}^n \cdot M(t) = \paren{M(t)}^{n + 1}
> $
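
The identity can also be spot-checked symbolically, as a hedged illustration rather than part of the note: for Bernoulli variables, $M(t) = 1 - p + pe^t$ and $S(n)$ is binomial, so summing the binomial pmf against $e^{tk}$ should reproduce $\paren{M(t)}^n$. SymPy and the concrete choice $n = 3$ are assumptions of this sketch.

```python
import sympy as sp

# For X_i ~ Bernoulli(p), the common MGF is M(t) = 1 - p + p*e^t,
# and S(n) ~ Binomial(n, p). Verify M_{S(n)}(t) = (M(t))^n for n = 3
# by computing E[e^{t S(n)}] directly from the binomial pmf.
t, p = sp.symbols("t p", positive=True)
n = 3

M = 1 - p + p * sp.exp(t)  # common MGF of each X_i
M_S = sum(sp.binomial(n, k) * p**k * (1 - p)**(n - k) * sp.exp(t * k)
          for k in range(n + 1))

# Expanding the difference yields 0, i.e. M_{S(n)}(t) == (M(t))^n.
print(sp.expand(M_S - M**n))
```
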