> [!definition]
>
> Let $\seqf{X_k}$ be a [[Sequence|sequence]] of [[Vector Space|vector spaces]], $X = \bigoplus_{k = 1}^{n}X_k$ be their [[Direct Sum|direct sum]], and $Y$ be another vector space. A map $\lambda: X \to Y$ is **multilinear** if it is [[Linear Transformation|linear]] in each of the coordinates of $X$.
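
As a quick sanity check, the following Python sketch verifies coordinate-wise linearity numerically; the bilinear map $x^{\top}Ay$ (so $n = 2$ with both coordinates in $\mathbb{R}^3$ and $Y = \mathbb{R}$) is an illustrative choice, not part of the note.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: the bilinear map lam(x, y) = x^T A y,
# i.e. n = 2 with X_1 = X_2 = R^3 and Y = R.
A = rng.standard_normal((3, 3))
lam = lambda x, y: x @ A @ y

x1, x2, y = rng.standard_normal((3, 3))
alpha = 2.5

# Linearity in the first coordinate, with the second held fixed;
# the check for the second coordinate is symmetric.
assert np.isclose(lam(alpha * x1 + x2, y), alpha * lam(x1, y) + lam(x2, y))
```
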
> [!definition]
>
> Let $X, Y$ be vector spaces and
> $
> \lambda: \bigoplus_{k = 1}^{n}X \to Y
> $
> be a multilinear map. $\lambda$ is **symmetric** if for any $\sigma \in S_n$, $\lambda \circ \sigma = \lambda$, where $\sigma$ acts by permuting the coordinates. In other words, [[Symmetric Group|permuting]] its arguments does not affect its value.
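
A minimal numerical illustration, assuming the (hypothetical) symmetric trilinear map $\lambda(x, y, z) = \sum_i x_i y_i z_i$:

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(1)

# Hypothetical symmetric trilinear map: lam(x, y, z) = sum_i x_i * y_i * z_i,
# which is linear in each slot and invariant under reordering its arguments.
lam = lambda *args: float(np.sum(np.prod(args, axis=0)))

x, y, z = rng.standard_normal((3, 2))
vals = [lam(*p) for p in permutations((x, y, z))]  # all 3! orderings
assert np.allclose(vals, vals[0])
```
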
> [!theorem]
>
> Let $R$ be a commutative [[Ring|ring]], $\seqf{E_j}$ and $F$ be $R$-modules, and $\lambda: \prod_jE_j \to F$ be a multilinear map. Let $x, y \in \prod_j E_j$, then
> $
> \lambda(x) - \lambda(y) = \sum_{j = 1}^n \lambda(x_1, \cdots, x_{j - 1}, x_j - y_j, y_{j + 1}, \cdots, y_n)
> $
> *Proof*. By induction on $n$. The base case $n = 1$ is immediate from linearity: $\lambda(x_1) - \lambda(y_1) = \lambda(x_1 - y_1)$. Suppose that the proposition holds for $n$, then
> $
> \begin{align*}
> \lambda(x) - \lambda(y) &= \lambda(x_{\ol{n}}, x_{n + 1}) - \lambda(y_{\ol{n}}, y_{n + 1}) \\
> &= \left[ \lambda(x_{\ol{n}}, x_{n + 1}) - \lambda(y_{\ol{n}}, x_{n + 1}) \right] + \left[ \lambda(y_{\ol{n}}, x_{n + 1}) - \lambda(y_{\ol{n}}, y_{n + 1}) \right] \\
> &= \left[ \lambda(x_{\ol{n}}, x_{n + 1}) - \lambda(y_{\ol{n}}, x_{n + 1}) \right] + \lambda(y_{\ol{n}}, x_{n + 1} - y_{n + 1})
> \end{align*}
> $
> For fixed $x_{n + 1}$, the map $x_{\ol{n}} \mapsto \lambda(x_{\ol{n}}, x_{n + 1})$ is $n$-linear, so the inductive hypothesis expands the first bracket into the first $n$ summands, while the remaining term $\lambda(y_{\ol{n}}, x_{n + 1} - y_{n + 1})$ is exactly the $(n + 1)$-th summand, yielding the desired result.
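
The identity is easy to test numerically. Below is a sketch for $n = 3$, using a hypothetical trilinear map encoded by a random tensor $T$ (an illustrative choice, not from the note):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4

# Hypothetical trilinear map lam(x, y, z) = sum_{ijk} T_ijk x_i y_j z_k
# over the reals, standing in for a general n-linear map with n = 3.
T = rng.standard_normal((d, d, d))
lam = lambda x, y, z: np.einsum('ijk,i,j,k->', T, x, y, z)

x = rng.standard_normal((3, d))  # x = (x_1, x_2, x_3)
y = rng.standard_normal((3, d))  # y = (y_1, y_2, y_3)

# RHS of the telescoping identity:
# sum_j lam(x_1, ..., x_{j-1}, x_j - y_j, y_{j+1}, ..., y_n)
rhs = sum(lam(*x[:j], x[j] - y[j], *y[j + 1:]) for j in range(3))
assert np.isclose(lam(*x) - lam(*y), rhs)
```
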
# Bounded Multilinear Maps
> [!definition]
>
> Let $\seqf{E_j}$ and $F$ be [[Banach Space|Banach spaces]]. Let $\lambda: \prod_j E_j \to F$ be a [[Multilinear Map|multilinear map]], then $\lambda$ is **bounded** if there exists $C \ge 0$ such that
> $
> \norm{\lambda(x)} \le C \prod_{j}\norm{x_j}
> $
> The infimum over all such $C$ is the **norm** of $\lambda$; equipped with this norm, the space $L^n(\seqf{E_j}, F)$ of bounded multilinear maps is itself a Banach space.
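
As an illustration, for the (hypothetical) bilinear map $\lambda(x, y) = x^{\top}Ay$ with Euclidean norms, the smallest admissible $C$ is the spectral norm of $A$, since $|x^{\top}Ay| \le \norm{A}_2 \norm{x} \norm{y}$ with equality at the top singular vectors. The sketch below checks the bound on random samples:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical bounded bilinear map lam(x, y) = x^T A y on R^3 x R^3;
# its norm is the spectral norm of A.
A = rng.standard_normal((3, 3))
lam = lambda x, y: x @ A @ y
C = np.linalg.norm(A, 2)

for _ in range(1000):
    x, y = rng.standard_normal((2, 3))
    assert abs(lam(x, y)) <= C * np.linalg.norm(x) * np.linalg.norm(y) + 1e-12
```
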
> [!theorem]
>
> Let $E$ and $F$ be Banach spaces, and let $L^r(E, F)$ be the space of all bounded $r$-linear maps from $\prod_{j = 1}^r E$ to $F$. Then, for fixed $F$,
> 1. $E \mapsto L^r(E, F)$ is a contravariant [[Functor|functor]].
> 2. The mapping $L(E_1, E_2) \to L(L^r(E_2, F), L^r(E_1, F))$, $T \mapsto L^r(T)$, is [[Space of Smooth Functions|smooth]].
>
> *Proof*. Let $T \in L(E_1, E_2)$, and define the mapping
> $
> L^r(T): L^r(E_2, F) \to L^r(E_1, F) \quad \lambda \mapsto \lambda \circ (T \times \cdots \times T)
> $
> that is, $L^r(T)\lambda(x_1, \cdots, x_r) = \lambda(Tx_1, \cdots, Tx_r)$, with $T$ composed at each argument. Since $L^r(S \circ T) = L^r(T) \circ L^r(S)$, this makes $L^r$ a contravariant functor. Note that $L^r$ admits an extension to $L(E_1, E_2)^r$ via $(T_1, \cdots, T_r) \mapsto [\lambda \mapsto \lambda \circ (T_1 \times \cdots \times T_r)]$, which is a bounded multilinear map, and hence smooth. The mapping $T \mapsto L^r(T)$ is the composition of this extension with the bounded linear, hence smooth, diagonal map $T \mapsto (T, \cdots, T)$, so it is also smooth.
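
A small numerical sketch of the contravariance identity $L^r(S \circ T) = L^r(T) \circ L^r(S)$ for $r = 2$, with a hypothetical bilinear map $x^{\top}Ay$ and matrices $S, T$ standing in for the linear maps:

```python
import numpy as np

rng = np.random.default_rng(4)
d = 3

# Hypothetical setup with r = 2: lam(x, y) = x^T A y, and S, T in L(R^d, R^d).
A = rng.standard_normal((d, d))
lam = lambda x, y: x @ A @ y
S, T = rng.standard_normal((2, d, d))

# L^2(M): precompose a bilinear map with M in each argument.
L2 = lambda M: (lambda mu: (lambda x, y: mu(M @ x, M @ y)))

x, y = rng.standard_normal((2, d))
# Contravariance: L^2(S T) = L^2(T) o L^2(S), with the order reversed.
assert np.isclose(L2(S @ T)(lam)(x, y), L2(T)(L2(S)(lam))(x, y))
```
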
> [!theorem]
>
> Let $\cx_1, \cx_2, \cy$ be [[Normed Vector Space|normed spaces]], $\lambda: \cx_1 \times \cx_2 \to \cy$ be a bilinear map, and $\psi: U \to \cy$ be a map defined on a [[Neighbourhood|neighbourhood]] $U \in \cn(0)$ of the origin in $\cx_1 \times \cx_2$ such that
> $
> \lim_{(x_1, x_2) \to 0}\psi(x_1, x_2) = 0
> $
> and
> $
> \norm{\lambda(x_1, x_2)} \le \norm{\psi(x_1, x_2)} \cdot \norm{x_1} \cdot \norm{x_2}
> $
> for all $(x_1, x_2) \in U$, then $\lambda = 0$.
>
> *Proof*. Let $(x_1, x_2) \in \cx_1 \times \cx_2$, and let $t > 0$ be small enough that $t(x_1, x_2) \in U$, then
> $
> \begin{align*}
> \norm{\lambda(tx_1, tx_2)} &\le \norm{\psi(t x_1, t x_2)} \cdot \norm{t x_1} \cdot \norm{t x_2} \\
> t^2\norm{\lambda(x_1, x_2)} &\le t^2 \cdot \norm{\psi(t x_1, t x_2)} \cdot \norm{x_1} \cdot \norm{x_2} \\
> \norm{\lambda(x_1, x_2)} &\le \norm{\psi(t x_1, t x_2)} \cdot \norm{x_1} \cdot \norm{x_2}
> \end{align*}
> $
> where sending $t \to 0^+$ gives $\psi(tx_1, tx_2) \to 0$, hence $\norm{\lambda(x_1, x_2)} = 0$. Since $(x_1, x_2)$ was arbitrary, $\lambda = 0$.
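
The proof turns on the $2$-homogeneity of bilinear maps along rays, which is what cancels the $t^2$ factors above. A quick numerical check of this homogeneity on a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(5)

# 2-homogeneity of a bilinear map along rays: lam(t x, t y) = t^2 lam(x, y).
# Checked on the hypothetical example lam(x, y) = x^T A y.
A = rng.standard_normal((3, 3))
lam = lambda x, y: x @ A @ y

x, y = rng.standard_normal((2, 3))
for t in (1.0, 0.5, 1e-2, 1e-4):
    assert np.isclose(lam(t * x, t * y), t**2 * lam(x, y))
```
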
> [!theorem]
>
> Let $\seqf{X_k}$ be a sequence of [[Normed Vector Space|normed spaces]], $X = \prod_{k = 1}^{n}X_k$ be their product, and $Y, Z$ be normed spaces. If $\omega: X \to Y$ is a continuous multilinear map and $\lambda \in L(Y, Z)$ is a [[Bounded Linear Map|bounded linear map]], then
> 1. $\lambda \circ \omega$ is also a continuous multilinear map.
> 2. The composition map $\lambda_*: \omega \mapsto \lambda \circ \omega$ is a bounded linear map.
>
> ### Composition
>
> Let $x \in \prod_{k = 1}^{n}X_k$, then
> $
> \begin{align*}
> \norm{(\lambda \circ \omega)(x)} &\le \norm{\lambda} \cdot \norm{\omega(x)} \\
> &\le \norm{\lambda} \cdot \norm{\omega} \cdot \prod_{k = 1}^{n}\norm{x_k}
> \end{align*}
> $
> Since $\lambda$ is linear, $\lambda \circ \omega$ remains multilinear, and the bound above shows that the composition is continuous.
>
> ### Composition Map
>
> Let $\omega_1, \omega_2$ be multilinear maps and $\alpha$ be a scalar, then
> $
> \begin{align*}
> \lambda_*(\alpha \omega_1 + \omega_2)(x) &= \lambda(\alpha \omega_1(x) + \omega_2(x)) \\
> &= \alpha \lambda(\omega_1(x)) + \lambda(\omega_2(x)) \\
> &= (\alpha \lambda_* \omega_1 + \lambda_* \omega_2)(x)
> \end{align*}
> $
> so the composition map $\lambda_*$ is linear.
>
> Let $\omega$ be a multilinear map, then
> $
> \norm{(\lambda_* \omega)(x)} = \norm{\lambda(\omega(x))} \le \norm{\lambda} \cdot \norm{\omega} \cdot \prod_{k = 1}^{n}\norm{x_k}
> $
> so $\norm{\lambda_* \omega} \le \norm{\lambda} \cdot \norm{\omega}$, hence $\lambda_*$ is bounded with $\norm{\lambda_*} \le \norm{\lambda}$. Therefore the composition map is continuous.
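
A numerical sketch of both parts of the proof, assuming hypothetical finite-dimensional data: $\omega$ a bilinear map encoded by a tensor $W$ and $\lambda$ given by a matrix $L$ (these choices are illustrative, not from the note):

```python
import numpy as np

rng = np.random.default_rng(6)
d = 3

# Hypothetical data: omega bilinear R^d x R^d -> R^d given by a tensor W,
# and lam in L(R^d, R^d) given by a matrix L, so (lam_* omega) = L @ omega(., .).
W = rng.standard_normal((d, d, d))
omega = lambda x, y: np.einsum('kij,i,j->k', W, x, y)
L = rng.standard_normal((d, d))

norm_L = np.linalg.norm(L, 2)
# A valid (not necessarily sharp) constant for omega:
# ||omega(x, y)||^2 = sum_k (x^T W_k y)^2 <= (sum_k ||W_k||_2^2) ||x||^2 ||y||^2.
norm_omega = np.sqrt(sum(np.linalg.norm(W[k], 2) ** 2 for k in range(d)))

# Composition bound: ||lam(omega(x))|| <= ||lam|| ||omega|| prod_k ||x_k||.
for _ in range(1000):
    x, y = rng.standard_normal((2, d))
    lhs = np.linalg.norm(L @ omega(x, y))
    assert lhs <= norm_L * norm_omega * np.linalg.norm(x) * np.linalg.norm(y) + 1e-12

# Linearity of the composition map omega -> lam o omega, checked pointwise.
W2 = rng.standard_normal((d, d, d))
omega2 = lambda x, y: np.einsum('kij,i,j->k', W2, x, y)
a = 1.7
x, y = rng.standard_normal((2, d))
lhs = L @ (a * omega(x, y) + omega2(x, y))       # lam((a w1 + w2)(x))
rhs = a * (L @ omega(x, y)) + L @ omega2(x, y)   # a lam(w1(x)) + lam(w2(x))
assert np.allclose(lhs, rhs)
```
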