> [!definition]
>
> Let $X$ be a [[Banach Space|Banach space]]. For any $\rho \ge 0$ denote
> $
> M_\rho = \bracs{x \in X: \norm{x} \le \rho}
> $
> A collection $\Sigma \subset X$ satisfies the *reverse Lipschitz* criterion if there exists $\gamma \in (0, 1)$ such that for every $\varepsilon > 0$ and every $x \in M_\varepsilon$, there exists an approximating element $\phi \in \Sigma$ with $\norm{x - \phi} \le \gamma\varepsilon$.
>
> The precision of the approximation scales with the norm of the target: the error is always at most a fixed fraction $\gamma$ of the target's norm.
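
For concreteness, here is an illustrative instance of the criterion (an assumed example, not part of the definition): in $X = \mathbb{R}^2$ with the Euclidean norm, take $\Sigma$ to be the union of seven lines through the origin.

```latex
% Assumed example: seven lines through the origin, spaced pi/7 apart.
\Sigma = \bracs{t\paren{\cos\tfrac{k\pi}{7}, \sin\tfrac{k\pi}{7}} : t \in \mathbb{R},\ 0 \le k \le 6}
% Orthogonally projecting any x onto the nearest of these lines leaves an
% error of at most \sin(\pi/14)\norm{x}, since the angle between x and the
% nearest line is at most half the spacing, pi/14. Hence the criterion
% holds with gamma = sin(pi/14) < 1.
```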

> [!theorem] Method of Successive Approximations (Same Space)
>
> Let $X$ be a Banach space and $\Sigma \subset X$ be a set satisfying the reverse Lipschitz criterion. Then for every $x \in X$ with $x \ne 0$, there exists a [[Sequence|sequence]] $\seq{\phi_n} \subset \Sigma$ such that $\sum_{n = 1}^{\infty}\phi_n = x$.
>
> *Proof*. Let $x_0 = x$. For $n \in \nat_0$, choose $\phi_{n + 1} \in \Sigma$ such that $\norm{x_n - \phi_{n + 1}} \le \gamma\norm{x_n}$, and take the remainder $x_{n + 1} = x_n - \phi_{n + 1}$.
>
> By the completeness of $X$, an absolutely convergent series converges, so it suffices to control the norms of the $\phi_n$. Since
> $
> \norm{\phi_{n + 1}} \le \norm{\phi_{n + 1} - x_n} + \norm{x_n} \le (1 + \gamma)\norm{x_n} < 2\norm{x_n}
> $
> and
> $
> \norm{\phi_{n + 1}} \le 2\norm{x_n} \le 2\gamma\norm{x_{n - 1}} \le \cdots \le 2\gamma^{n}\norm{x_0} \quad \forall n \in \nat_0
> $
> by induction, since $\norm{x_n} = \norm{x_{n - 1} - \phi_n} \le \gamma\norm{x_{n - 1}}$ at each step by construction. With $\gamma \in (0, 1)$, $\sum_{n \in \nat}\norm{\phi_n} < \infty$, so $\sum_{n = 1}^{\infty}\phi_n$ converges.
>
> Now to bound the remainder, since $\norm{x_n} \le \gamma^n\norm{x_0}$,
> $
> \norm{x_0 - \sum_{n = 1}^{\infty}\phi_n} = \limv{n}\norm{x_0 - \sum_{k = 1}^{n}\phi_k} = \limv{n}\norm{x_n} \le \limv{n}\gamma^n\norm{x_0} = 0
> $
> yielding $x = \sum_{n = 1}^{\infty}\phi_n$.
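
The construction above can be sketched numerically. The dictionary below (seven lines at $\pi/7$ spacing in $\mathbb{R}^2$) is an assumed example, not from the note; it satisfies the criterion with $\gamma = \sin(\pi/14)$.

```python
import math

# Assumed example: X = R^2 with the Euclidean norm, Sigma = all scalar
# multiples of the seven directions at angles k*pi/7, k = 0..6. Projecting
# x onto the nearest of these lines leaves an error of at most
# sin(pi/14)*||x||, so the reverse Lipschitz criterion holds with:
GAMMA = math.sin(math.pi / 14)  # gamma ≈ 0.2225 < 1

def norm(v):
    return math.hypot(v[0], v[1])

def best_phi(x):
    """Return phi in Sigma with ||x - phi|| <= GAMMA * ||x||."""
    best, best_err = (0.0, 0.0), norm(x)
    for k in range(7):
        d = (math.cos(k * math.pi / 7), math.sin(k * math.pi / 7))
        t = x[0] * d[0] + x[1] * d[1]  # orthogonal projection onto line k
        err = norm((x[0] - t * d[0], x[1] - t * d[1]))
        if err <= best_err:
            best, best_err = (t * d[0], t * d[1]), err
    return best

# The proof's iteration: choose phi_{n+1} for x_n, set x_{n+1} = x_n - phi_{n+1}.
x0 = (0.3, 0.7)
xn, partial = x0, (0.0, 0.0)
for _ in range(40):
    phi = best_phi(xn)
    partial = (partial[0] + phi[0], partial[1] + phi[1])
    xn = (xn[0] - phi[0], xn[1] - phi[1])
# norm(xn) <= GAMMA**40 * norm(x0), so the partial sums converge to x0.
```

The same skeleton works for any dictionary satisfying the criterion; only `best_phi` changes.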

> [!theorem] Method of Successive Approximations (Two Spaces)
>
> Let $X$ and $Y$ be Banach spaces and $T: Y \to X$ be a [[Continuity|continuous]] [[Linear Transformation|linear map]].
>
> Let $\Sigma \subseteq Y$. If there exist $C > 0$ and $\gamma \in (0, 1)$ such that for every $\varepsilon > 0$ and every $x \in M_\varepsilon \subset X$, there exists $\phi \in \Sigma$ such that
> $
> \norm{x - T(\phi)}_X \le \gamma\varepsilon \quad \norm{\phi}_Y \le C\varepsilon
> $
> Then for any $x \in X$ with $x \ne 0$, there exists a sequence $\seq{\phi_n} \subset \Sigma$ such that $x = T\paren{\sum_{n = 1}^{\infty}\phi_n}$.
>
> *Proof*. Using a similar inductive construction: let $x_0 = x$. For any $n \in \nat_0$, choose $\phi_{n + 1} \in \Sigma$ such that $\norm{\phi_{n + 1}} \le C\norm{x_n}$ and $\norm{x_n - T(\phi_{n + 1})} \le \gamma\norm{x_n}$, and take $x_{n + 1} = x_n - T(\phi_{n + 1})$.
>
> Since
> $
> \norm{\phi_{n + 1}} \le C\norm{x_n} \le C\gamma\norm{x_{n - 1}} \le \cdots \le C\gamma^{n}\norm{x_0} \quad \forall n \in \nat_0
> $
> with $\gamma \in (0, 1)$, $\sum_{n = 1}^{\infty}\norm{\phi_n} < \infty$, so $\sum_{n = 1}^{\infty}\phi_n$ converges by the completeness of $Y$.
>
> As $T$ is linear and continuous, with $\norm{x_n} \le \gamma^n\norm{x_0}$ we obtain
> $
> \begin{align*}
> \norm{x_0 - T\paren{\sum_{n = 1}^{\infty}\phi_n}} &= \limv{n}\norm{x_0 - T\paren{\sum_{k = 1}^{n}\phi_k}} \\
> &= \limv{n}\norm{x_0 - \sum_{k = 1}^{n}T(\phi_k)} \\
> &= \limv{n}\norm{x_n} \le \limv{n}\gamma^n\norm{x_0} \\
> &= 0
> \end{align*}
> $
> yielding $x = T\paren{\sum_{n = 1}^{\infty}\phi_n}$.
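
The two-space construction can likewise be sketched numerically. Everything below is an assumed example, not from the note: $Y = X = \mathbb{R}^2$, $T$ is an invertible matrix $A$, and $\Sigma$ is the pullback under $T$ of a seven-line dictionary in $X$.

```python
import math

# Assumed example: Y = X = R^2, T given by the invertible matrix A, and
# Sigma = A^{-1}(Sigma_X), where Sigma_X is the union of the seven lines
# at angles k*pi/7 in X. Then ||x - T(phi)||_X <= sin(pi/14)*||x||_X and
# ||phi||_Y <= ||A^{-1}||_op * ||T(phi)||_X <= C*||x||_X.
GAMMA = math.sin(math.pi / 14)
A = ((2.0, 1.0), (0.0, 1.0))
AINV = ((0.5, -0.5), (0.0, 1.0))  # inverse of A (det A = 2)

def apply(M, v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def norm(v):
    return math.hypot(v[0], v[1])

def best_phi(x):
    """Return phi in Sigma with ||x - T(phi)|| <= GAMMA * ||x||: project x
    onto the nearest line in X, then pull back through A^{-1}."""
    best_img, best_err = (0.0, 0.0), norm(x)
    for k in range(7):
        d = (math.cos(k * math.pi / 7), math.sin(k * math.pi / 7))
        t = x[0] * d[0] + x[1] * d[1]
        err = norm((x[0] - t * d[0], x[1] - t * d[1]))
        if err <= best_err:
            best_img, best_err = (t * d[0], t * d[1]), err
    return apply(AINV, best_img)

# The proof's iteration: x_{n+1} = x_n - T(phi_{n+1}).
x0 = (1.0, -0.4)
xn, total = x0, (0.0, 0.0)
for _ in range(40):
    phi = best_phi(xn)
    total = (total[0] + phi[0], total[1] + phi[1])
    Tphi = apply(A, phi)
    xn = (xn[0] - Tphi[0], xn[1] - Tphi[1])

# x0 - T(sum of phi_n) equals x_n, which decays geometrically to 0.
Ttotal = apply(A, total)
residual = norm((x0[0] - Ttotal[0], x0[1] - Ttotal[1]))
```

Note that the approximation happens entirely in $X$; the only role of $\norm{\phi}_Y \le C\varepsilon$ is to make the series converge in $Y$ before $T$ is applied.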