> [!definition]
>
> Let $a \in \real^n$ be a point in the [[Euclidean Space]]. A [[Derivation|derivation]] at $a$ is a [[Linear Functional|linear functional]] $\omega: C^{\infty}(\real^n) \to \real$ such that
> $
> \omega (fg) = f(a) \cdot \omega(g) + g(a) \cdot \omega(f)
> $
> for all $f, g \in C^{\infty}(\real^n)$.

> [!theorem]
>
> Let $a \in \real^n$, $\omega$ be a derivation at $a$, and $f, g \in C^\infty(\real^n)$, then
> 1. If $f$ is constant, then $\omega f = 0$.
> 2. If $f(a) = g(a) = 0$, then $\omega(fg) = 0$.
>
> *Proof*. For the first claim, suppose $f \equiv 1$ is the constant function $1$, so that $f = f \cdot f$. Then
> $
> \omega(f) = \omega (f \cdot f) = f(a) \omega(f) + f(a)\omega(f) = 2\omega (f)
> $
> so $\omega(f) = 0$. By linearity, $\omega$ vanishes on all constant functions. For the second claim, if $f(a) = g(a) = 0$, then $\omega(fg) = f(a)\omega(g) + g(a)\omega(f) = 0$.
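>
> For example, with $\pi_i$ the $i$-th projection map, the functions $f = \pi_i - a^i$ and $g = \pi_j - a^j$ both vanish at $a$, so the second claim gives
> $
> \omega\big((\pi_i - a^i)(\pi_j - a^j)\big) = 0
> $
> This observation is what drives the surjectivity argument below.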

> [!theorem]
>
> Let $a \in \real^n$, then
> - For each concrete tangent vector $v_a \in \real_a^n$, the [[Directional Derivative|directional derivative]] $D_v|_a: C^\infty(\real^n) \to \real$ is a derivation at $a$.
> - The map $v_a \mapsto D_v|_a$ is an [[Isomorphism|isomorphism]] from $\real_a^n$ to the space of derivations at $a$, which can now be referred to as $T_a\real^n$, the [[Tangent Space|tangent space]].
>
> *Proof*. The first claim follows from linearity of differentiation together with the product rule.
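>
> Explicitly, for $f, g \in C^\infty(\real^n)$, applying the single-variable product rule along the line $t \mapsto a + tv$ gives
> $
> \begin{align*}
> D_v|_a(fg) &= \frac{d}{dt}\Big|_{t = 0}f(a + tv)g(a + tv) \\
> &= f(a) \cdot D_v|_a(g) + g(a) \cdot D_v|_a(f)
> \end{align*}
> $
> which is exactly the Leibniz rule required of a derivation.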
>
> ### Injective
>
> Let $v_a \in \real_a^n$ be such that $D_v|_a = 0$. Let $v_a = \sum_{i = 1}^nc_ie_i|_a$ be its representation in the standard basis, and let $\pi_i$ be the $i$-th projection map. Since $D_v|_a(\pi_j) = c_j$ for each $j$, the assumption $D_v|_a = 0$ forces $c_j = 0$ for all $j$, so $v_a = 0$. Therefore the map is injective.
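>
> The computation behind this is that $D_v|_a$ recovers the components of $v_a$ from the projections: since $\frac{\partial \pi_j}{\partial x^i} = \delta_{ij}$,
> $
> D_v|_a(\pi_j) = \sum_{i = 1}^nc_i\frac{\partial \pi_j}{\partial x^i}(a) = c_j
> $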
>
> ### Surjective
>
> Let $w \in T_a\real^n$ be any derivation and $f \in C^\infty(\real^n)$. Writing $h = x - a$, so that each component $h^i = \pi_i - a^i$ is a smooth function vanishing at $a$, [[Taylor's Formula]] with integral remainder gives
> $
> \begin{align*}
> f(a + h) &= f(a) + Df(a)(h) + \int_0^1(1 - t)D^2f(a + th)h^{(2)}dt
> \end{align*}
> $
> where in $\real^n$, the bilinear map from the integral remainder acting on $h^{(2)} = (h, h)$ expands as
> $
> \begin{align*}
> &\int_0^1(1 - t)D^2f(a + th)h^{(2)}dt \\
> &= \sum_{i \in [n]}\sum_{j \in [n]}h^i \cdot h^j {\int_0^1(1 - t)\frac{\partial^2f}{\partial x^i\partial x^j}(a + th)dt}
> \end{align*}
> $
> Each term of this sum is a product of two smooth functions, $h^i$ and $h^j \cdot \int_0^1(1 - t)\frac{\partial^2f}{\partial x^i\partial x^j}(a + th)dt$, both of which vanish at $x = a$ (where $h = 0$), so $w$ annihilates the entire remainder by the second claim of the previous theorem. Since $w$ also annihilates the constant term $f(a)$, only the linear term survives:
>
> $
> \begin{align*}
> w(f) &= w(Df(a)(h)) \\
> &= \sum_{i \in [n]}\frac{\partial f}{\partial x^i}(a) \cdot w(h^i)
> \end{align*}
> $
> since $Df(a)(h) = \sum_{i \in [n]}\frac{\partial f}{\partial x^i}(a)h^i$ is a linear combination of the functions $h^i$, whose coefficients are the [[Partial Derivative|partial derivatives]] of $f$ at $a$. As $w$ vanishes on constants, $w(h^i) = w(\pi_i)$, so setting $v_a = \sum_{i = 1}^nw(\pi_i)e_i|_a$ gives $w(f) = D_v|_a(f)$ for all $f \in C^\infty(\real^n)$. Hence $w = D_v|_a$ and the map is surjective. $\blacksquare$
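The Leibniz rule at the heart of this proof can be sanity-checked numerically. The sketch below is illustrative, not part of the formal argument: a central-difference approximation stands in for the exact directional derivative, and all names are hypothetical. It verifies $D_v|_a(fg) = f(a) \cdot D_v|_a(g) + g(a) \cdot D_v|_a(f)$ for a sample pair of smooth functions on $\real^2$:

```python
import math

def directional_derivative(f, a, v, eps=1e-6):
    """Central-difference approximation of D_v|_a f = d/dt f(a + t v) at t = 0."""
    fp = f([a_i + eps * v_i for a_i, v_i in zip(a, v)])
    fm = f([a_i - eps * v_i for a_i, v_i in zip(a, v)])
    return (fp - fm) / (2 * eps)

# Sample smooth functions on R^2 and an arbitrary point / direction
f = lambda x: x[0] ** 2 * x[1]
g = lambda x: math.sin(x[0]) + x[1]
a = [1.0, 2.0]
v = [0.5, -1.0]

# Leibniz rule: D_v|_a(fg) = f(a) * D_v|_a(g) + g(a) * D_v|_a(f)
lhs = directional_derivative(lambda x: f(x) * g(x), a, v)
rhs = f(a) * directional_derivative(g, a, v) + g(a) * directional_derivative(f, a, v)
print(abs(lhs - rhs) < 1e-4)  # prints: True
```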