> [!definition]
>
> The binomial [[Random Variable|random variable]] $B(n, p)$ represents the number of successes $x$ with [[Probability|probability]] $p$ in $n$ [[Bernoulli Process|Bernoulli trials]] (or a [[Random Sample|random sample]] of [[Bernoulli Random Variable|Bernoulli random variables]]). It has the following [[Probability Mass Function|probability mass function]]:
> $
> p_{B(n, p)}(x) = {n \choose x}p^{x}(1 - p)^{n - x}
> $
By the [[Binomial Theorem|Binomial Theorem]], the total probability sums to 1: $\sum_{x=0}^{n}{n \choose x}p^{x}(1-p)^{n-x} = (p + (1 - p))^{n} = 1$.
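As a quick numeric sanity check (a Python sketch; `binomial_pmf` is an illustrative helper name, not part of this note), the PMF computed with `math.comb` sums to 1 over $x = 0, \dots, n$:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(B(n, p) = x): choose the x success positions, weight by p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# The PMF over all x = 0..n should sum to 1 (Binomial Theorem).
n, p = 10, 0.3
total = sum(binomial_pmf(x, n, p) for x in range(n + 1))
print(f"{total:.10f}")  # ≈ 1.0 up to floating-point rounding
```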
> [!theorem] Derivation
>
> Consider the [[Probability|probability]] of getting a particular ordered outcome, say $P(sfsf)$. By [[Conditional Probability|conditional probability]], $P(sfsf) = P(sfs|f)P(f)$, and because the trials are [[Probabilistic Independence|independent]] ($P(A|B) = P(A)$), this equals $P(sfs)P(f)$. Repeating the argument, $P(sfs) = P(sf|s)P(s) = P(sf)P(s)$ and $P(sf) = P(s|f)P(f) = P(s)P(f)$, which means that $P(sfsf) = P(s)P(f)P(s)P(f) = P(s)^2 P(f)^2$.
>
> Note that the probability of getting any ordered [[Math/Collections/List|list]] of $x$ successes and $(n - x)$ failures is equal to $P(s)^x P(f)^{n-x}$. To calculate the probability of getting a [[Set|set]] (ignoring order) of the same composition, multiply this probability by the number of [[Combination|combinations]] of positions for the successes, ${n \choose x}$, to obtain $p(x) = {n \choose x}P(s)^x P(f)^{n-x}$, or $p(x) = {n \choose x}p^x (1 - p)^{n-x}$.
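The counting argument can be checked by brute force (a Python sketch with arbitrarily chosen $n = 5$, $x = 2$, $p = 0.3$): among all $2^n$ ordered outcomes, exactly ${n \choose x}$ contain $x$ successes, each with probability $p^x(1-p)^{n-x}$.

```python
from itertools import product
from math import comb, isclose

n, x, p = 5, 2, 0.3

# Every ordered outcome with x successes has the same probability p^x (1-p)^(n-x).
per_outcome = p**x * (1 - p) ** (n - x)

# Enumerate all 2^n ordered outcomes and keep those with exactly x successes.
matching = [seq for seq in product("sf", repeat=n) if seq.count("s") == x]

assert len(matching) == comb(n, x)  # the combination count from the derivation
total = len(matching) * per_outcome
assert isclose(total, comb(n, x) * p**x * (1 - p) ** (n - x))
print(len(matching), total)
```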

> [!theorem]
>
> $
> \ev(B(n, p)) = np \qquad \var(B(n, p)) = npq
> $
> where $q = 1 - p$.
>
> *Proof.*
>
> A binomial random variable is a [[Random Sample|random sample]] (sum) of $n$ independent Bernoulli variables, so by linearity of [[Expectation|expectation]], $\mu = n\ev(X) = np$.
>
> The [[Variance|variance]] of the [[Bernoulli Random Variable|Bernoulli distribution]] is $\ev(X^2) - \mu^2 = p - p^2 = p(1-p)$. Since the trials are independent draws from the Bernoulli distribution, the variance of the sum is the sum of the variances:
> $
> \var\paren{\sum X} = n\var(X) = np(1 - p) = npq
> $
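Both identities can be verified exactly from the PMF alone (a Python sketch with arbitrarily chosen $n = 8$, $p = 0.25$):

```python
from math import comb, isclose

n, p = 8, 0.25
q = 1 - p
pmf = [comb(n, x) * p**x * q ** (n - x) for x in range(n + 1)]

# Exact first and second moments computed directly from the PMF.
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x**2 * px for x, px in enumerate(pmf)) - mean**2

assert isclose(mean, n * p)     # E(B(n, p)) = np
assert isclose(var, n * p * q)  # Var(B(n, p)) = npq
print(mean, var)
```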

> [!theorem] [[Moment Generating Function]]
>
> $
> M_X(t) = \paren{pe^{t} + q}^n
> $
> *Proof.* Using the binomial theorem:
> $
> \begin{align*}
> M_X(t) &= \ev(e^{tX}) \\
> &= \sum_{j = 0}^{n}e^{tx_j}p_j \\
> &= \sum_{j = 0}^{n}e^{tj}p_j \\
> &= \sum_{j = 0}^{n}e^{tj}{n \choose j}p^{j}q^{n - j} \\
> &= \sum_{j = 0}^{n}{n \choose j}e^{tj}p^{j}q^{n - j} \\
> &= \sum_{j = 0}^{n}{n \choose j}\paren{pe^{t}}^jq^{n - j} \\
> &= (pe^{t} + q)^{n}
> \end{align*}
> $
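The closed form can be checked against the defining sum $\ev(e^{tX})$ directly (a Python sketch with arbitrarily chosen $n = 6$, $p = 0.4$, $t = 0.7$):

```python
from math import comb, exp, isclose

n, p, t = 6, 0.4, 0.7
q = 1 - p

# Left side: E(e^{tX}) computed term by term from the binomial PMF.
mgf_sum = sum(exp(t * j) * comb(n, j) * p**j * q ** (n - j) for j in range(n + 1))

# Right side: the closed form (p e^t + q)^n from the theorem.
mgf_closed = (p * exp(t) + q) ** n

assert isclose(mgf_sum, mgf_closed)
print(mgf_sum)
```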