> [!theorem]
>
> Let $\cx$ be a [[Normed Vector Space|normed space]], $\cy$ be a [[Banach Space|Banach space]], $\cm \subset \cx$ be a subspace. Let
> $
> T: \cm \to \cy
> $
> be a [[Bounded Linear Map|bounded linear map]]. Then there exists a unique bounded linear extension $\ol{T}: \ol{\cm} \to \cy$ with $\normn{\ol{T}} = \norm{T}$.
>
> *Proof*. Let $x \in \ol{\cm}$ and let $\seq{x_n} \subset \cm$ with $x_n \to x$ be a supporting [[Sequence|sequence]]. Define $\ol{T}x = \limv{n}Tx_n$; the limit exists since $T$ is bounded, so $\seq{Tx_n}$ is [[Cauchy Sequence|Cauchy]], and $\cy$ is complete. Moreover, if $\seq{x_n'} \subset \cm$ is another sequence with $x_n' \to x$, then
> $
> \norm{Tx_n - Tx_n'} = \norm{T(x_n - x'_n)} \le \norm{T} \cdot \norm{x_n - x_n'} \to 0
> $
> as $n \to \infty$. Therefore the extension is well-defined.
>
> Let $y \in \ol{\cm}$ with supporting sequence $\seq{y_n} \subset \cm$, then
> $
> \begin{align*}
> \ol{T}(\alpha x + y) &= \limv{n}T(\alpha x_n + y_n) \\
> &= \alpha \limv{n}Tx_n + \limv{n}T y_n \\
> &= \alpha \ol{T}x + \ol{T}y
> \end{align*}
> $
> so $\ol T$ is linear, where $\seq{\alpha x_n + y_n}$ serves as a supporting sequence for $\alpha x + y$ since $\alpha x_n + y_n \to \alpha x + y$.
>
> Lastly, let $x \in \ol\cm$ with supporting sequence $\seq{x_n} \subset \cm$, then
> $
> \normn{\ol Tx} = \limv{n}\norm{Tx_n} \le \limv{n}\norm{T} \cdot \norm{x_n} = \norm{T} \cdot \norm{x}
> $
> so $\normn{\ol T} \le \norm{T}$. Since $\ol T$ agrees with $T$ on $\cm$, we also have $\normn{\ol T} \ge \norm{T}$, hence $\normn{\ol T} = \norm{T}$.
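
The extension procedure above can be illustrated numerically. As a minimal sketch (the spaces and map here are illustrative choices, not part of the theorem): take $\cx = \ell^1$, $\cm = c_{00}$ the finitely supported sequences (dense in $\ell^1$), and $T$ the summation functional $Tx = \sum_k x_k$, which is bounded with $\norm T = 1$; the extension $\ol T$ evaluates the full sum as a limit along truncations.

```python
# Illustrative setup (an assumption, not from the note): X = l^1, M = c_00,
# T x = sum_k x_k with ||T|| = 1, and x_k = 2^{-k} as the point to extend to.

def T(x):
    """Summation functional on a finitely supported sequence (a list)."""
    return sum(x)

def truncation(n):
    """Supporting sequence: the first n coordinates of x_k = 2^{-k}."""
    return [2.0 ** (-k) for k in range(1, n + 1)]

# T x_n along the supporting sequence x_n = truncation(n).
values = [T(truncation(n)) for n in range(1, 60)]

# (T x_n) is Cauchy: consecutive differences decay geometrically.
diffs = [abs(b - a) for a, b in zip(values, values[1:])]
assert all(d2 <= d1 for d1, d2 in zip(diffs, diffs[1:]))

# The extension's value T-bar x = lim_n T x_n equals sum_k 2^{-k} = 1.
assert abs(values[-1] - 1.0) < 1e-12

# Well-definedness: a different supporting sequence (truncate at 2n)
# yields the same limit.
values2 = [T(truncation(2 * n)) for n in range(1, 60)]
assert abs(values2[-1] - 1.0) < 1e-12
print("extension value ~", values[-1])
```

Both supporting sequences produce the same value, matching the well-definedness argument in the proof.
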

> [!theorem]
>
> Let $\cx$ and $\cy$ be [[Topological Vector Space|topological vector spaces]], with $\cy$ [[Complete Metric Space|complete]] and [[Hausdorff Space|Hausdorff]]. Let $\cm \subset \cx$ be a dense subspace and $T: \cm \to \cy$ a continuous linear map. Then there exists a unique continuous linear extension $\ol T: \cx \to \cy$.
>
> *Proof*. Let $x \in \cx$ and let $\net{x} \subset \cm$ be a net that converges to $x$, then $\net{x}$ is Cauchy. Since $\angles{x_\alpha - x_\beta}_{(\alpha, \beta) \in A^2}$ converges to $0$ and $T$ is continuous and linear, $\angles{Tx_\alpha - Tx_\beta}_{(\alpha, \beta) \in A^2}$ converges to $0$ as well. As $\cy$ is complete and Hausdorff, there exists a unique $y \in \cy$ such that $\net{Tx}$ converges to $y$. Suppose that $\netb{x} \subset \cm$ also converges to $x$, then $\angles{x_\alpha - x_\beta}_{(\alpha, \beta) \in A \times B}$ converges to $0$. Therefore $\angles{Tx_\alpha - Tx_\beta}_{(\alpha, \beta) \in A \times B}$ converges to $0$ as well, so the limit does not depend on the chosen net and $\ol T x = y$ is well-defined. Linearity follows as in the previous proof, and uniqueness holds because two continuous maps that agree on the dense subspace $\cm$ agree on all of $\cx$.
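
The uniqueness step can be recorded as a one-line computation. The following is a sketch in plain LaTeX (writing $\overline{T}$ for the extension and introducing $S$ as a hypothetical second continuous linear extension of $T$):

```latex
\begin{align*}
Sx &= S\left(\lim_\alpha x_\alpha\right)
   && \text{$\langle x_\alpha \rangle \subset \mathcal{M}$ converges to $x$} \\
   &= \lim_\alpha S x_\alpha
   && \text{$S$ is continuous} \\
   &= \lim_\alpha T x_\alpha
   && \text{$S$ restricts to $T$ on $\mathcal{M}$} \\
   &= \lim_\alpha \overline{T} x_\alpha
   && \text{$\overline{T}$ restricts to $T$ on $\mathcal{M}$} \\
   &= \overline{T}\left(\lim_\alpha x_\alpha\right) = \overline{T}x
   && \text{$\overline{T}$ continuous; limits unique as $\mathcal{Y}$ is Hausdorff}
\end{align*}
```

Hausdorffness of $\cy$ is what makes each limit above a single well-defined point, so the chain of equalities forces $S = \overline{T}$.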