> [!definition]
>
> A **multi-index** is an ordered $n$-tuple of non-negative integers. If $\alpha = (\alpha_1, \cdots, \alpha_n)$, then we set
> $
> \abs{\alpha} = \sum_{k \in [n]}\alpha_k \quad \alpha! = \prod_{k \in [n]}\alpha_k!
> $
> and
> $
> \partial^\alpha = \prod_{k \in [n]}\paren{\frac{\partial}{\partial x_{k}}}^{\alpha_k}
> $
> If $x = (x_1, \cdots, x_n) \in \real^n$, then denote
> $
> x^\alpha = \prod_{k \in [n]}x_k^{\alpha_k}
> $
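The notation above can be made concrete with a few helper functions; a minimal Python sketch (the names `mi_abs`, `mi_factorial`, `mi_power` are illustrative, not from the note):

```python
from math import factorial, prod

def mi_abs(alpha):
    """|alpha|: the sum of the entries."""
    return sum(alpha)

def mi_factorial(alpha):
    """alpha!: the product of the entrywise factorials."""
    return prod(factorial(a) for a in alpha)

def mi_power(x, alpha):
    """x^alpha: the product of x_k ** alpha_k."""
    return prod(xk ** ak for xk, ak in zip(x, alpha))

alpha = (2, 0, 3)
print(mi_abs(alpha))                      # 5
print(mi_factorial(alpha))                # 2! * 0! * 3! = 12
print(mi_power((2.0, 5.0, 1.0), alpha))   # 2^2 * 5^0 * 1^3 = 4.0
```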

> [!theorem] Product Rule
>
> Let $f, g$ be sufficiently smooth [[Function|functions]] and $\alpha$ a multi-index. Then
> $
> \partial^\alpha(fg) = \sum_{\beta + \gamma = \alpha}\frac{\alpha!}{\beta!\gamma!}(\partial^\beta f)(\partial^\gamma g)
> $
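A quick sanity check of the theorem on polynomials, sketched in pure Python with a `{exponent-tuple: coefficient}` representation (all helper names here are assumptions of this sketch, not from the note):

```python
from itertools import product as iproduct
from math import factorial

def diff(poly, j):
    """Single partial derivative d/dx_j of a polynomial dict."""
    out = {}
    for exps, c in poly.items():
        if exps[j] > 0:
            new = list(exps); new[j] -= 1
            out[tuple(new)] = out.get(tuple(new), 0) + c * exps[j]
    return out

def dalpha(poly, alpha):
    """Apply partial^alpha by iterating single derivatives."""
    for j, a in enumerate(alpha):
        for _ in range(a):
            poly = diff(poly, j)
    return poly

def mul(p, q):
    """Multiply two polynomial dicts."""
    out = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            e = tuple(a + b for a, b in zip(e1, e2))
            out[e] = out.get(e, 0) + c1 * c2
    return out

def mfact(alpha):
    """alpha! = product of entrywise factorials."""
    r = 1
    for a in alpha:
        r *= factorial(a)
    return r

# f = x^2 y, g = x y^3, and alpha = (2, 1), i.e. d^2/dx^2 d/dy
f, g, alpha = {(2, 1): 1}, {(1, 3): 1}, (2, 1)

lhs = dalpha(mul(f, g), alpha)

# Right-hand side: sum over beta + gamma = alpha
rhs = {}
for beta in iproduct(*(range(a + 1) for a in alpha)):
    gamma = tuple(a - b for a, b in zip(alpha, beta))
    coeff = mfact(alpha) // (mfact(beta) * mfact(gamma))
    for e, c in mul(dalpha(f, beta), dalpha(g, gamma)).items():
        rhs[e] = rhs.get(e, 0) + coeff * c

print(lhs == rhs)  # True: both sides are 24 * x * y^3
```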
> ### Induction Argument
>
> Let $f, g \in C^{\text{enough}}(\real^n)$ be functions and $k \in \nat$. If
> $
> \partial^\alpha(fg) = \sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!}(\partial^\gamma f)(\partial^\delta g) \quad \forall \alpha: \abs{\alpha} = k
> $
> then the same claim holds for all $\beta: \abs{\beta} = k + 1$.
>
> *Proof*. Denote $e_j$ as the vector with $1$ on its $j$-th entry and $0$s everywhere else. If $\abs{\beta} = k + 1$, then there exists $j$ and $\alpha: \abs{\alpha} = k$ such that $\beta = \alpha + e_j$, and $\partial^\beta = \frac{\partial}{\partial x_j}\partial^\alpha$.
>
> Apply the induction hypothesis, then push the partial derivative inside the sum by linearity:
> $
> \begin{align*}
> \partial^\beta(fg) &= \frac{\partial }{\partial x_j}\partial^\alpha(fg) \\
> &= \frac{\partial }{\partial x_j}\sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!}(\partial^\gamma f)(\partial^\delta g) \\
> &= \sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!} \cdot \frac{\partial }{\partial x_j}(\partial^\gamma f)(\partial^\delta g)
> \end{align*}
> $
> where, by the single-variable product rule,
> $
> \frac{\partial }{\partial x_j}(\partial^\gamma f)(\partial^\delta g) = (\partial^{\gamma + e_j} f)(\partial^\delta g) + (\partial^{\gamma}f)(\partial^{\delta + e_j} g)
> $
> Expanding every summand this way produces duplicate derivative pairs across the two resulting sums: $(\partial^{\gamma'} f)(\partial^{\delta'} g)$ appears in the first sum when $\gamma' = \gamma + e_j$ and in the second when $\delta' = \delta + e_j$. Collecting these duplicates and combining their coefficients[^1], we regroup as
> $
> \begin{align*}
> &\sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!} \cdot \frac{\partial }{\partial x_j}(\partial^\gamma f)(\partial^\delta g) \\
> &= \sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!}(\partial^{\gamma + e_j} f)(\partial^\delta g) + \sum_{\gamma + \delta = \alpha}\frac{\alpha!}{\gamma!\delta!}(\partial^{\gamma}f)(\partial^{\delta + e_j} g) \\
> &= \sum_{\gamma + \delta = \beta}\frac{\beta!}{\gamma!\delta!}(\partial^{\gamma} f)(\partial^\delta g)
> \end{align*}
> $
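The final step merges the coefficients of the duplicate terms via a Pascal-type identity for multi-index binomial coefficients: for $\beta = \alpha + e_j$, $\frac{\beta!}{\gamma!\delta!} = \frac{\alpha!}{(\gamma - e_j)!\delta!} + \frac{\alpha!}{\gamma!(\delta - e_j)!}$, with a term read as $0$ when a multi-index has a negative entry. A numerical check of this identity (the sketch, including the zero-for-negative-entries convention, is an illustration, not from the note):

```python
from itertools import product as iproduct
from math import factorial

def mfact(m):
    """m! = product of entrywise factorials."""
    r = 1
    for a in m:
        r *= factorial(a)
    return r

def binom(top, gamma, delta):
    """top!/(gamma! delta!), taken to be 0 on negative entries."""
    if any(a < 0 for a in gamma) or any(a < 0 for a in delta):
        return 0
    return mfact(top) // (mfact(gamma) * mfact(delta))

alpha, j = (1, 2), 0
beta = tuple(a + (k == j) for k, a in enumerate(alpha))  # beta = alpha + e_j

ok = True
for gamma in iproduct(*(range(b + 1) for b in beta)):
    delta = tuple(b - g for b, g in zip(beta, gamma))
    g_minus = tuple(g - (k == j) for k, g in enumerate(gamma))  # gamma - e_j
    d_minus = tuple(d - (k == j) for k, d in enumerate(delta))  # delta - e_j
    ok = ok and (binom(beta, gamma, delta)
                 == binom(alpha, g_minus, delta) + binom(alpha, gamma, d_minus))
print(ok)  # True
```

This is the multi-index analogue of Pascal's rule $\binom{n+1}{k} = \binom{n}{k-1} + \binom{n}{k}$, applied in the $j$-th coordinate only.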
[^1]: [[Multi-Index Binomial Detail]]