> [!definition]
>
> Let $\cx = \prod_{k = 1}^{n}\cx_k$ be a product of [[Banach Space|Banach spaces]], $U_i \subset \cx_i$ be [[Open Set|open]] for each $i$, $\cy$ be a Banach space, and let
> $
> f: \prod_{k = 1}^{n}U_k \to \cy
> $
> be a [[Function|function]]. Let $x = (x_1, \cdots, x_n) \in \prod_{k = 1}^{n}U_k$ and define the *partial map*
> $
> x_i \mapsto f(x_1, \cdots, x_i, \cdots, x_n)
> $
> with $x_j$ fixed for all $j \ne i$. If this map is [[Derivative|differentiable]] at $x_i$, then its derivative is the **partial derivative** of $f$ at the point $x$, denoted $D_if(x)$. If it exists, then
> $
> D_if(x) = \lambda: \cx_i \to \cy
> $
> is the unique [[Bounded Linear Map|bounded linear map]] $\lambda \in L(\cx_i, \cy)$ such that
> $
> f(x_1, \cdots, x_i + h, \cdots, x_n) - f(x_1, \cdots, x_n) = \lambda h + o(h)
> $
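>
> For a quick sanity check, take $f = B$ where $B: \cx_1 \times \cx_2 \to \cy$ is a continuous bilinear map. For fixed $x_2$, the partial map $x_1 \mapsto B(x_1, x_2)$ is itself a bounded linear map, and
> $
> B(x_1 + h, x_2) - B(x_1, x_2) = B(h, x_2)
> $
> with no error term, so $D_1B(x_1, x_2)h = B(h, x_2)$ and, by symmetry, $D_2B(x_1, x_2)k = B(x_1, k)$.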
> [!theorem]
>
> Let $\cx_1, \cx_2, \cy$ be Banach spaces, $\cx = \cx_1 \times \cx_2$, $U_1 \subset \cx_1$ and $U_2 \subset \cx_2$ be open sets, and $f: U_1 \times U_2 \to \cy$ be a function. Then $f \in C^p$ if and only if $D_1f$ and $D_2f$ exist and are of class $C^{p - 1}$, in which case,
> $
> Df(x)(v_1, v_2) = D_1f(x)v_1 + D_2f(x)v_2
> $
> *Proof*. Let $(x, y) \in U_1 \times U_2$, and let $h = (h_1, h_2) \in \cx$ be small enough that the points below remain in $U_1 \times U_2$. Suppose that the partial derivatives exist and are continuous. Then by the [[Mean Value Theorem|mean value theorem]]
> $
> \begin{align*}
> &f(x + h_1, y + h_2) - f(x, y) \\
> &= f(x + h_1, y + h_2) - f(x + h_1, y) \\
> &+ f(x + h_1, y) - f(x, y) \\
> &= \int_0^1D_2f(x + h_1, y + th_2)h_2dt \\
> &+ \int_0^1D_1 f(x + th_1, y)h_1 dt
> \end{align*}
> $
> where we define the error term
> $
> \psi(h_1, th_2) = D_2f(x + h_1, y + th_2) - D_2f(x, y)
> $
> Since $D_2f$ is continuous, $\lim_{h \to 0}\psi(h_1, th_2) = 0$ uniformly in $t \in [0, 1]$. Rewrite the first term as
> $
> \begin{align*}
> &\int_0^1D_2f(x + h_1, y + th_2)h_2dt \\
> &= \int_0^1 D_2f(x, y)h_2dt + \int_0^1\psi(h_1, th_2)h_2dt
> \end{align*}
> $
> where the error term
> $
> \begin{align*}
> \abs{\int_0^1\psi(h_1, th_2)h_2dt} &\le \underbrace{\sup_{t \in [0, 1]}\abs{\psi(h_1, th_2)}}_{\to 0} \cdot \abs{h_2} = o(h)
> \end{align*}
> $
> is little-oh of $h$. Applying the same technique to the second integral and combining the results yields
> $
> f(x + h_1, y + h_2) - f(x, y) = D_1f(x, y)h_1 + D_2f(x, y)h_2 + o(h)
> $
> which shows that $Df(x, y)$ exists and is given by
> $
> Df(x, y)(h_1, h_2) = D_1f(x, y)h_1 + D_2f(x, y)h_2
> $
> If each partial derivative is of class [$C^p$](Space%20of%20Continuously%20Differentiable%20Functions), then since $(\lambda_1, \lambda_2) \mapsto \lambda_1 \circ \pi_1 + \lambda_2 \circ \pi_2$ is a bounded linear map $L(\cx_1, \cy) \times L(\cx_2, \cy) \to L(\cx, \cy)$, $Df$ is also of class $C^p$, making $f$ of class $C^{p + 1}$. Conversely, if $f$ is of class $C^p$, then $D_if(x, y) = Df(x, y) \circ \iota_i$, where $\iota_1v_1 = (v_1, 0)$ and $\iota_2v_2 = (0, v_2)$ are the canonical inclusions, so each partial derivative is of class $C^{p - 1}$.
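>
> For instance, for a continuous bilinear map $B: \cx_1 \times \cx_2 \to \cy$, the partial derivatives $D_1B(x_1, x_2) = B(\cdot, x_2)$ and $D_2B(x_1, x_2) = B(x_1, \cdot)$ are continuous in $(x_1, x_2)$, so the theorem gives $B \in C^1$ with
> $
> DB(x_1, x_2)(v_1, v_2) = B(v_1, x_2) + B(x_1, v_2)
> $
> which is the product rule.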
> [!theorem] Partial Derivatives Commute
>
> Let $U \subset \cx_1 \times \cx_2$ be open, $f: U \to \cy$ be a map such that $D_1f$, $D_2f$, $D_1D_2f$, $D_2D_1f$ exist and are all continuous. Then $D_1D_2f = D_2D_1f$, in the sense that $D_1D_2f(x, y)(h_1, h_2) = D_2D_1f(x, y)(h_2, h_1)$ for all $(x, y) \in U$, $h_1 \in \cx_1$ and $h_2 \in \cx_2$.
>
> *Proof*. Let $(x, y) \in U$, and let $h = (h_1, h_2) \in \cx_1 \times \cx_2$ be small enough that the points below remain in $U$. Consider the difference
> $
> \Delta = f(x + h_1, y + h_2) - f(x + h_1, y) - f(x, y + h_2) + f(x, y)
> $
> where if
> $
> g(u) = f(u, y + h_2) - f(u, y)
> $
> then
> $
> \Delta = g(x + h_1) - g(x)
> $
> By the mean value theorem,
> $
> g(x + h_1) - g(x) = \int_0^1 Dg(x + th_1)h_1dt
> $
> where
> $
> \begin{align*}
> Dg(x + th_1) \cdot h_1 &= D_1f(x + th_1, y + h_2) \cdot h_1 - D_1f(x + th_1, y) \cdot h_1 \\
> &= \braks{D_1f(x + th_1, y + h_2) - D_1f(x + th_1, y)} \cdot h_1
> \end{align*}
> $
> Since $D_2D_1f$ exists and is continuous, we can apply the mean value theorem again
> $
> \begin{align*}
> &D_1f(x + th_1, y + h_2) - D_1f(x + th_1, y) \\
> &= \int_0^1 D_2 D_1f(x + th_1, y + sh_2) \cdot h_2 ds
> \end{align*}
> $
> In other words,
> $
> \begin{align*}
> g(x + h_1) - g(x) &= \int_0^1 Dg(x + th_1)h_1dt \\
> &= \int_0^1\braks{\int_0^1D_2D_1f(x + th_1, y + sh_2)h_2ds}h_1dt \\
> &= \int_0^1\int_0^1 D_2D_1f(x + th_1, y + sh_2)(h_2, h_1)dsdt
> \end{align*}
> $
> Since $D_2D_1f$ is continuous, we can again introduce and bound the error term
> $
> \psi(th_1, sh_2) = D_2D_1f(x + th_1, y + sh_2) - D_2D_1f(x, y)
> $
> which satisfies $\lim_{h \to 0}\psi(th_1, sh_2) = 0$ uniformly in $s, t \in [0, 1]$. Integrating the error yields
> $
> \begin{align*}
> \abs{\int_0^1 \psi(th_1, sh_2)h_2ds} &\le \abs{h_2} \sup_{s \in [0, 1]}\abs{\psi(th_1, sh_2)} \\
> \abs{\int_0^1\int_0^1 \psi(th_1, sh_2)(h_2, h_1)dsdt} &\le \abs{h_1} \cdot \abs{h_2} \sup_{s, t}\abs{\psi(th_1, sh_2)}
> \end{align*}
> $
> Finally, this gives
> $
> \Delta = D_2D_1f(x, y)(h_2, h_1) + \int_0^1\int_0^1 \psi(th_1, sh_2)(h_2, h_1)dsdt
> $
> Repeating the argument with the roles of the two variables exchanged also yields
> $
> \Delta = D_1D_2f(x, y)(h_1, h_2) + \int_0^1\int_0^1 \psi'(th_1, sh_2)(h_1, h_2)dsdt
> $
> where $\psi'(th_1, sh_2) = D_1D_2f(x + th_1, y + sh_2) - D_1D_2f(x, y)$ is the corresponding error term. Subtracting the two expressions for $\Delta$ and applying the bounds above gives
> $
> D_2D_1f(x, y)(h_2, h_1) - D_1D_2f(x, y)(h_1, h_2) = o(\abs{h_1} \cdot \abs{h_2})
> $
> Since the left-hand side is bilinear in $(h_1, h_2)$, replacing $(h_1, h_2)$ with $(th_1, th_2)$ and letting $t \to 0$ shows that it vanishes for all $h_1, h_2$, hence $D_2D_1f(x, y)(h_2, h_1) = D_1D_2f(x, y)(h_1, h_2)$.
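>
> As a concrete illustration, for a continuous bilinear map $B: \cx_1 \times \cx_2 \to \cy$ we have $D_1B(x_1, x_2) = B(\cdot, x_2)$, which is linear in $x_2$, so
> $
> D_2D_1B(x_1, x_2)(h_2, h_1) = B(h_1, h_2) = D_1D_2B(x_1, x_2)(h_1, h_2)
> $
> in agreement with the theorem.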
# Difference Quotient
> [!theorem]
>
> Let $U \subset \real^d$ be an open subset, $\cy$ be a Banach space, and $f: U \to \cy$ be a map of class $C^1$. Then, as $h \to 0$,
> $
> \frac{f(x + he_j) - f(x)}{h} \to \partial^jf(x)
> $
> [[Uniform Convergence on Compact Sets|uniformly on compact sets]].
>
> *Proof*. Let $K \subset U$ be compact. Since $U$ is open, there exists $r \in (0, 1]$ such that $K' = K + \bracs{te_j: t \in [-r, r]}$ is contained in $U$, and $K'$ is compact as well. Since $f \in C^1$, $\partial^j f|_{K'}$ is uniformly continuous. Given $\eps > 0$, there exists $\delta \in (0, r)$ such that $\norm{\tau_y \partial^j f - \partial^j f}_u < \eps$ on $K$ for all $y$ with $\norm{y} < \delta$. In which case, by the mean value theorem, for every $x \in K$ and $0 < \abs{h} < \delta$,
> $
> \abs{\frac{f(x + he_j) - f(x)}{h} - \partial^jf(x)} = \abs{\int_0^1 \braks{\partial^jf(x + the_j) - \partial^jf(x)}dt} \le \sup_{t \in [0, 1]}\abs{\partial^jf(x + the_j) - \partial^jf(x)} < \eps
> $
> As the above holds for all $\eps > 0$, and the choice of $\delta$ does not depend on $x \in K$, the convergence is uniform on $K$.
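>
> For example, with $f(x) = \abs{x}^2$ on $\real^d$,
> $
> \frac{f(x + he_j) - f(x)}{h} = 2x_j + h \to 2x_j = \partial^jf(x)
> $
> and the error $\abs{h}$ does not depend on $x$, so the convergence is uniform on all of $\real^d$, in particular on compact sets.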