> [!theorem]
>
> Let $(V, \inp)$ be an [[Inner Product|inner product space]]. Then for any $u, v \in V$ we have
> $
> \abs{\angles{u, v}} \le \norm{u} \cdot \norm{v}
> $
> under the induced norm, with equality if and only if $u$ and $v$ are linearly dependent.
>
> *Proof*. If $\norm{v} = 0$, then $v = 0$ and the inequality holds trivially. Otherwise, let $\alpha = \frac{\angles{u, v}}{\norm{v}^2}$; then
> $
> \begin{align*}
> 0 &\le \norm{u - \alpha v}^2 \\
> &= \angles{u - \alpha v, u - \alpha v} \\
> &= \angles{u, u - \alpha v} - \angles{\alpha v, u - \alpha v} \\
> &= \angles{u, u} - \angles{u, \alpha v} - \angles{\alpha v, u} + \angles{\alpha v, \alpha v} \\
> &= \norm{u}^2 - \ol\alpha\angles{u, v} - \alpha\angles{v, u} + \alpha\ol{\alpha}\norm{v}^2 \\
> &= \norm{u}^2 - \ol{\alpha}\angles{u, v} - \alpha\ol{\angles{u, v}} + \alpha\ol\alpha \norm{v}^2 \\
> &= \norm{u}^2 - 2\frac{\ol{\angles{u, v}} \cdot \angles{u, v}}{\norm{v}^2} + \frac{\ol{\angles{u, v}} \cdot \angles{u, v}}{\norm{v}^2} \\
> &= \norm{u}^2 - \frac{\ol{\angles{u, v}} \cdot \angles{u, v}}{\norm{v}^2}
> \end{align*}
> $
> Multiplying through by $\norm{v}^2 > 0$ and rearranging,
> $
> \begin{align*}
> \ol{\angles{u, v}} \cdot \angles{u, v} &\le \norm{u}^2 \cdot \norm{v}^2 \\
> \abs{\angles{u, v}}^2 &\le \norm{u}^2 \cdot \norm{v}^2
> \end{align*}
> $
> Taking square roots gives the inequality.
> If $u$ and $v$ are linearly dependent with $v = tu$ for some $t \neq 0$, then
> $
> \begin{align*}
> u - \alpha v &= u - \frac{\angles{u, v}}{\angles{v, v}}v \\
> &= u - \frac{\ol t \angles{u, u}}{t \ol{t}\angles{u, u}}v \\
> &= u - \frac{tu}{t} = 0
> \end{align*}
> $
> So $\norm{u - \alpha v}^2 = 0$, and every inequality above holds with equality. Conversely, if equality holds then $\norm{u - \alpha v}^2 = 0$, which forces $u = \alpha v$, so $u$ and $v$ are linearly dependent.
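
As a quick numerical sanity check (separate from the proof), the sketch below samples random complex vectors in $\mathbb{C}^4$ and verifies the inequality under the standard inner product; the dimension, distributions, and seed are arbitrary. Note that NumPy's `np.vdot` conjugates its *first* argument, so `np.vdot(v, u)` computes $\angles{u, v} = \sum_i u_i \ol{v_i}$, matching the convention above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sanity check, not a proof: random complex vectors should satisfy
# |<u, v>| <= ||u|| * ||v||, with equality when linearly dependent.
for _ in range(1000):
    u = rng.normal(size=4) + 1j * rng.normal(size=4)
    v = rng.normal(size=4) + 1j * rng.normal(size=4)
    inner = np.vdot(v, u)  # <u, v>; np.vdot conjugates its first argument
    assert abs(inner) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12

# Equality case: v is a scalar multiple of u.
u = rng.normal(size=4) + 1j * rng.normal(size=4)
v = (2 - 3j) * u
assert np.isclose(abs(np.vdot(v, u)), np.linalg.norm(u) * np.linalg.norm(v))
```
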
> [!theorem] Probability Version
>
> Let $X$ and $Y$ be [[Random Variable|random variables]]. Then their [[Expectation|expected values]] satisfy:
> $
> \ev(XY)^2 \le \ev(X^2) \ev(Y^2)
> $
>
> *Proof.* Consider the random variable $(X - tY)^2$ for $t \in \real$. Since it is non-negative, $\ev((X - tY)^2) \ge 0$ for every $t$; expanding,
> $
> \begin{align*}
> t^2 \ev(Y^2) - 2t \ev(XY) + \ev(X^2) \ge 0
> \end{align*}
> $
> Viewed as a quadratic in $t$, this expression is non-negative for all $t \in \real$, so it has at most one real root and its discriminant satisfies $b^2 - 4ac \le 0$. With $a = \ev(Y^2)$, $b = -2\ev(XY)$, and $c = \ev(X^2)$, this gives $4\ev(XY)^2 \le 4\ev(X^2)\ev(Y^2)$, hence $\ev(XY)^2 \le \ev(X^2)\ev(Y^2)$.
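
A minimal Monte Carlo sketch of this bound, estimating both sides from sample moments; the particular choice of $X$, $Y$, sample size, and seed is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E(XY)^2 and E(X^2) E(Y^2) from samples of two dependent
# random variables; the inequality should hold up to sampling error.
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)

lhs = np.mean(x * y) ** 2            # sample estimate of E(XY)^2
rhs = np.mean(x**2) * np.mean(y**2)  # sample estimate of E(X^2) E(Y^2)
print(lhs <= rhs)                    # True
```
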

> [!theorem] Corollary
>
> The inequality may be used to bound the [[Covariance|covariance]] and [[Pearson Correlation Coefficient|correlation coefficient]] of the two variables.
>
> $
> \begin{align*}
> |\cov{X, Y}| &\le \sigma_X \sigma_Y \\
> -1 \le \rho(X, Y) &\le 1
> \end{align*}
> $
> *Proof*.
>
> By the previous theorem, applied to the centered variables $X - \mu_X$ and $Y - \mu_Y$:
> $
> \begin{align*}
> \ev(XY)^2 &\le \ev(X^2) \ev(Y^2) \\
> \ev((X - \mu_X)(Y - \mu_Y))^2 & \le \ev((X - \mu_X)^2)\ev((Y - \mu_Y)^2) \\
> \cov{X, Y}^2 & \le \var{X}\var{Y} \\
> |\cov{X, Y}| & \le \sigma_X \sigma_Y \\
> |\rho(X, Y)| = \frac{|\cov{X, Y}|}{\sigma_X \sigma_Y} &\le 1
> \end{align*}
> $
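
The same check works for the corollary; in fact the sample versions of these bounds hold exactly, since the sample covariance and sample standard deviations satisfy Cauchy-Schwarz on the centered data vectors. The distributions below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Check |Cov(X, Y)| <= sigma_X * sigma_Y and -1 <= rho <= 1 on samples.
x = rng.exponential(size=100_000)
y = x**2 + rng.normal(size=100_000)

cov = np.mean((x - x.mean()) * (y - y.mean()))
rho = cov / (x.std() * y.std())
print(abs(cov) <= x.std() * y.std())  # |Cov(X, Y)| <= sigma_X * sigma_Y
print(-1.0 <= rho <= 1.0)             # correlation coefficient bound
```
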
> [!theorem]
>
> Let $X$ and $Y$ be random variables. *If*[^1] $Y = mX$ for some constant $m$:
> $
> \ev(XY)^2 = \ev(X^2)\ev(Y^2)
> $
> *If* $X$ and $Y$ are [[Correlation|linearly related]] ($Y = mX + c$):
> $
> \begin{align*}
> |\cov{X, Y}| &= \sigma_X \sigma_Y \\
> \rho(X, Y) &= \pm 1
> \end{align*}
> $
>
> *Proof*. If $Y = mX$,
> $
> \begin{align*}
> \ev(XY)^2 &= \ev(X(mX))^2 \\
> &= m^2\ev(X^2)^2 \\
> &= m^2 \ev(X^2) \ev(X^2) \\
> &= \ev(Y^2)\ev(X^2)
> \end{align*}
> $
> where the last step uses $\ev(Y^2) = \ev((mX)^2) = m^2\ev(X^2)$. If $(Y - \mu_Y) = m(X - \mu_X)$, applying the first part to the centered variables gives
> $
> \begin{align*}
> \ev((X - \mu_X)(Y - \mu_Y))^2 & = \ev((X - \mu_X)^2)\ev((Y - \mu_Y)^2) \\
> \cov{X, Y}^2 &= \var{X}\var{Y} \\
> |\cov{X, Y}| &= \sigma_X \sigma_Y \\
> |\rho(X, Y)| &= \frac{|\cov{X, Y}|}{\sigma_X \sigma_Y} = 1
> \end{align*}
> $
> So $\rho(X, Y) = \pm 1$, with the sign matching the sign of $m$. Expanding the centered equation, $Y = mX + \mu_Y - m\mu_X$, i.e. $Y = mX + c$ with $c = \mu_Y - m\mu_X$. Since there is no constraint on $\mu_Y$, $c$ can be any real number, so this covers every linear relationship $Y = mX + c$.
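
A sketch of the equality case on simulated data; the slopes, intercepts, and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Y = m*X + c should give rho(X, Y) = +/-1 (sign of m), up to
# floating-point error, regardless of the intercept c.
x = rng.normal(size=10_000)
for m, c in [(2.0, 5.0), (-0.7, 1.3)]:
    y = m * x + c
    rho = np.corrcoef(x, y)[0, 1]
    print(m, rho)  # approximately +1 for m > 0, -1 for m < 0
```
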
[^1]: The textbook theorem is *if and only if*; however, since I don't understand non-degenerate random variable pairs, the proof here covers only one direction.