> [!definition]
>
> Let $(\real^n, \cb(\real^n), \mu)$ be a [[Probability|probability]] space on the Euclidean space. The **characteristic function** of $\mu$ is the complex-valued function $\hat\mu: \real^n \to \complex$, where
> $
> \hat \mu (v) = \int e^{i\angles{x, v}}d\mu(x)
> $
> Since $\abs{e^{i\angles{x, v}}} = 1$, the integral always exists and $\abs{\hat\mu(v)} \le 1$ for every $v$; in particular, $\hat \mu \in L^\infty(\real^n)$.
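As a quick sanity check of the definition, the characteristic function can be estimated by averaging $e^{i\angles{x, v}}$ over samples. A minimal NumPy sketch (the Bernoulli(1/2) example here is illustrative, not from the note), using the closed form $\hat\mu(v) = (1 + e^{iv})/2$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=100_000)  # samples from mu = Bernoulli(1/2)

v = np.linspace(-5.0, 5.0, 11)
ecf = np.exp(1j * np.outer(v, x)).mean(axis=1)  # empirical characteristic function
cf = (1 + np.exp(1j * v)) / 2                   # closed form for Bernoulli(1/2)

print(np.max(np.abs(ecf - cf)))  # small Monte Carlo error; note |ecf| <= 1 throughout
```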
> [!theorem]
>
> Let $\mu$ and $\nu$ be probability measures on $\real^n$, then
> 1. $\hat \mu(0) = 1$.
> 2. $\hat \mu$ is [[Continuity|continuous]], by the [[Dominated Convergence Theorem]].
> 3. If $\mu$ is symmetric ($\mu(E) = \mu(-E)$ for all $E \in \cb(\real^n)$), then $\hat \mu$ is real-valued.
> 4. $\wh{\mu * \nu} = \hat \mu \cdot \hat \nu$, by linearity of the inner product and independence (proof below).
>
> *Proof*. Let $X$ and $Y$ be independent [[Random Variable|random variables]] with $\mu$ and $\nu$ as their distributions. Then for any $v \in \real^n$,
> $
> \begin{align*}
> \wh{\mu * \nu}(v) &= \ev\braks{ e^{i\angles{X + Y, v}}} = \ev\braks{e^{i\angles{X, v}} \cdot e^{i\angles{Y, v}}} \\
> &= \ev \braks{e^{i\angles{X, v}}} \cdot \ev\braks{e^{i\angles{Y, v}}} && \text{by independence} \\
> &= (\hat \mu \cdot \hat \nu)(v)
> \end{align*}
> $
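Property 4 can be checked numerically: the empirical characteristic function of $X + Y$ should match the product of those of $X$ and $Y$. A sketch with arbitrarily chosen distributions (not from the note):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(0, 1, n)      # X ~ mu = Uniform(0, 1)
y = rng.exponential(1.0, n)   # Y ~ nu = Exponential(1), independent of X

def ecf(samples, v):
    """Empirical characteristic function: average of exp(i x v) over samples."""
    return np.exp(1j * np.outer(v, samples)).mean(axis=1)

v = np.linspace(-4.0, 4.0, 9)
lhs = ecf(x + y, v)            # estimates the transform of mu * nu
rhs = ecf(x, v) * ecf(y, v)    # product of the individual transforms

print(np.max(np.abs(lhs - rhs)))  # agreement up to Monte Carlo error
```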
> [!theorem]
>
> Let $X \in L^p$ $(p \in \nat)$ be a [[Random Variable|random variable]] with distribution $\mu$, then $\hat \mu$ is of class [$C^p$](Space%20of%20Continuously%20Differentiable%20Functions)[^1]. Let $\alpha$ be a [[Multi-Index|multi-index]] with $\abs{\alpha} \le p$, then
> $
> \partial^\alpha \hat \mu(v) = \int_{\real^n}(ix)^{\alpha} \cdot e^{i\angles{x, v}}d\mu(x)
> $
> In particular,
> $
> \partial^\alpha \hat \mu(0) = i^{\abs{\alpha}} \ev\braks{X^\alpha}
> $
> where the quantities $\ev\braks{X^\alpha}$ are known as the **cross moments** of $X$.
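In one dimension the multi-index reduces to an ordinary derivative order, and the relation can be verified symbolically. A SymPy sketch recovering the moments of the standard normal from its characteristic function $e^{-v^2/2}$ (stated in the next theorem):

```python
import sympy as sp

v = sp.symbols('v', real=True)
cf = sp.exp(-v**2 / 2)  # characteristic function of the standard normal

for p in range(1, 5):
    deriv_at_0 = sp.diff(cf, v, p).subs(v, 0)
    moment = sp.simplify(deriv_at_0 / sp.I**p)  # E[X^p] = cf^(p)(0) / i^p
    print(p, moment)  # 0, 1, 0, 3: the first four moments of N(0, 1)
```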
> [!theorem]
>
> Let $\gamma_{m, \sigma^2}$ be a [[Normal Distribution|Gaussian distribution]] on $\real$, then
> $
> \hat\gamma_{m, \sigma^2}(v) = e^{ivm} \cdot e^{-\sigma^2v^2/2}
> $
> If $\gamma_{m, C}$ is a Gaussian distribution on $\real^n$ with mean $m$ and covariance matrix $C$, then
> $
> \hat \gamma_{m, C}(v) = e^{i\angles{v, m}} \cdot e^{-\angles{v, Cv}/2}
> $
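A Monte Carlo check of the one-dimensional formula (parameters chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
m, sigma = 1.5, 0.7
x = rng.normal(m, sigma, 500_000)  # samples from gamma_{m, sigma^2}

v = np.linspace(-3.0, 3.0, 7)
ecf = np.exp(1j * np.outer(v, x)).mean(axis=1)          # empirical estimate
cf = np.exp(1j * v * m) * np.exp(-sigma**2 * v**2 / 2)  # closed form

print(np.max(np.abs(ecf - cf)))  # small Monte Carlo error
```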
> [!theorem]
>
> Let $f \in L^1(\real^d)$, then the Fourier transform of $f$ is
> $
> \hat f(v) = \int_{\real^d}e^{i\angles{x, v}}f(x)dx
> $
> If $\nu$ is a probability measure with density $f$ with respect to the Lebesgue measure, then $\hat \nu = \hat f$.
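A sketch of the identity $\hat \nu = \hat f$ for the standard exponential density $f(x) = e^{-x}\one_{x > 0}$, whose transform has the closed form $1/(1 - iv)$ under this sign convention (the example is mine, chosen for its closed form):

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)  # density of nu: standard exponential on (0, inf)
v = 1.3

# hat f(v) = int e^{ixv} f(x) dx, computed via real and imaginary parts
re, _ = quad(lambda x: np.cos(x * v) * f(x), 0, np.inf)
im, _ = quad(lambda x: np.sin(x * v) * f(x), 0, np.inf)

rng = np.random.default_rng(3)
samples = rng.exponential(1.0, 500_000)
ecf = np.exp(1j * v * samples).mean()  # hat nu(v), estimated from samples of nu

print(re + 1j * im, ecf, 1 / (1 - 1j * v))  # all three agree
```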
> [!theorem]
>
> Let $\mu$ and $\nu$ be two probability measures on $\real^n$. Then $\mu = \nu$ if and only if $\hat \mu = \hat \nu$. Hence the characteristic function uniquely determines the probability measure.
>
> ### Open Sets Reduction
>
> $\mu = \nu$ if and only if $\mu|_\topo = \nu|_\topo$, where $\topo$ is the collection of [[Open Set|open sets]] on $\real^n$.
>
> *Proof*. The open sets form a $\pi$-system that generates $\cb(\real^n)$, so this follows from [[Dynkin's Uniqueness Theorem]].
>
> ### Continuous Approximation of Indicators
>
> Let $B \subset \real^n$ be open, then there exists $\seq{f_k} \subset BC(\real^n, [0, 1])$ such that $f_k \upto \one_{B}$. Hence, if $\mu(f) = \int f d\mu = \int f d\nu = \nu(f)$ for all $f \in BC(\real^n)$, then $\mu(B) = \nu(B)$ for every open $B$ by the [[Monotone Convergence Theorem]], so $\mu = \nu$ by the open sets reduction.
>
> *Proof*. For nonempty $A \subset \real^n$, let $d(x, A) = \inf_{y \in A}d(x, y)$ be the distance function. Let
> $
> f_k = \braks{\frac{d(x, B^c)}{1 + d(x, B^c)}}^{1/k}
> $
> then each $f_k$ is continuous with values in $[0, 1]$, and $f_k \upto \one_B$: since $B$ is open, $d(x, B^c) > 0$ if and only if $x \in B$.
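> A small numerical sketch of this construction for $B = (0, 1) \subset \real$ (illustration only):
>
> ```python
> import numpy as np
>
> def dist_to_complement(x, a=0.0, b=1.0):
>     """d(x, B^c) for the open interval B = (a, b)."""
>     inside = (x > a) & (x < b)
>     return np.where(inside, np.minimum(x - a, b - x), 0.0)
>
> x = np.linspace(-0.5, 1.5, 9)
> d = dist_to_complement(x)
> for k in [1, 5, 50]:
>     f_k = (d / (1 + d)) ** (1 / k)
>     print(k, np.round(f_k, 3))  # increases pointwise toward 1_B(x) as k grows
> ```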
>
> ### Smooth Approximation of Continuous Functions
>
> Let $f \in BC(\real^n)$, then there exists a sequence $\seq{f_k} \subset C_c^\infty(\real^n)$ such that $\norm{f_k}_u \le \norm{f}_u$ for all $k$, and $f_k \to f$ pointwise and [[Uniform Convergence on Compact Sets|uniformly on compact sets]].
>
> Hence if $\mu(f) = \nu(f)$ for all $f \in C_c^\infty(\real^n)$, then $\mu = \nu$.
>
> *Proof*. First suppose that $f$ is compactly supported and use the [[Method of Successive Approximations]] with [[Smooth Bump Function|bump functions]]; for general $f \in BC(\real^n)$, first multiply by smooth cutoffs that equal $1$ on large balls.
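> A grid-based numerical sketch of the bump-function step (mollification: convolving a compactly supported hat function with bump functions of shrinking support; illustration only):
>
> ```python
> import numpy as np
>
> x = np.linspace(-3, 3, 2001)
> dx = x[1] - x[0]
> f = np.maximum(0.0, 1 - np.abs(x))  # continuous with compact support
>
> def mollifier(eps):
>     """Standard bump supported on (-eps, eps), normalized to unit integral."""
>     phi = np.zeros_like(x)
>     inside = np.abs(x) < eps
>     phi[inside] = np.exp(-1.0 / (1.0 - (x[inside] / eps) ** 2))
>     return phi / (phi.sum() * dx)
>
> for eps in [0.5, 0.1, 0.02]:
>     f_eps = np.convolve(f, mollifier(eps), mode="same") * dx  # smooth f * phi_eps
>     print(eps, np.max(np.abs(f_eps - f)))  # uniform error shrinks as eps -> 0
> ```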
>
> ### Plancherel's Theorem
>
> Let $\psi \in C^\infty_c(\real^n)$ and $\mu$ be a probability measure on $\real^n$. Then
> $
> \begin{align*}
> \mu(\psi) &= \frac{1}{(2\pi)^n} \angles{\hat \psi, \hat \mu} \\
> &= \frac{1}{(2\pi)^n}\int \hat \psi(x) \cdot \overline{\hat \mu(x)}\,dx
> \end{align*}
> $
> Hence if $\hat \mu = \hat \nu$, then $\mu(\psi) = \nu(\psi)$ for all $\psi \in C_c^\infty(\real^n)$, so $\mu = \nu$ by the smooth approximation above.
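A one-dimensional numerical check of the identity, using a Schwartz test function rather than a compactly supported one for convenience (both transforms then have closed forms; this choice is mine, not the note's):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# mu = N(0, 1), psi(x) = exp(-x^2)
psi = lambda x: np.exp(-x**2)
psi_hat = lambda v: np.sqrt(np.pi) * np.exp(-v**2 / 4)  # transform of psi
mu_hat = lambda v: np.exp(-v**2 / 2)  # characteristic function of N(0, 1)

lhs, _ = quad(lambda x: psi(x) * norm.pdf(x), -np.inf, np.inf)  # mu(psi)
# conjugate of mu_hat omitted below: mu is symmetric, so mu_hat is real-valued
rhs, _ = quad(lambda v: psi_hat(v) * mu_hat(v), -np.inf, np.inf)
rhs /= 2 * np.pi

print(lhs, rhs)  # both equal 1/sqrt(3) ~ 0.57735
```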
> [!theorem]
>
> Let $\mu$ be a [[Probability|probability]] measure on $\real^d$, and $r, R > 0$. Then for every $x \in \real^d$ with $\norm{x} = 1$,
> $
> \abs{1 - \hat \mu(rx)} \le rR + 2\mu(\bracs{y \in \real^d: \abs{\angles{x, y}} > R})
> $
> and
> $
> \mu(\bracs{y \in \real^d: \abs{\angles{x, y}} > R}) \le \braks{\frac{1}{r}\int_0^r \abs{1 - \hat \mu(sx)}ds} \cdot \frac{1}{m(rR)}
> $
> where
> $
> m(t) = \inf_{\abs{\theta} > t}\braks{1 - \frac{\sin \theta }{\theta}}
> $
> *Proof*. Let $E = \bracs{y \in \real^d: \abs{\angles{x, y}} \le R}$, then
> $
> \begin{align*}
> \abs{1 - \hat \mu(rx)} &= \abs{\int \paren{1 - e^{i\angles{y, rx}}}d\mu(y)} \\
> &\le \abs{\int_E\paren{1 - e^{i\angles{y, rx}}}d\mu(y)} \\
> &+ \abs{\int_{E^c}\paren{1 - e^{i\angles{y, rx}}}d\mu(y)}
> \end{align*}
> $
> The second term is at most $2\mu(E^c)$, since its integrand has modulus at most $2$. For the first term, by the Cauchy-Schwarz inequality (using $\mu(E) \le 1$),
> $
> \abs{\int_E\paren{1 - e^{i\angles{y, rx}}}d\mu(y)}^{2} \le \int_E\abs{1 - e^{i\angles{y, rx}}}^2d\mu(y)
> $
> and since $\abs{1 - e^{i\theta}} \le \abs{\theta}$,
> $
> \abs{1 - e^{i\angles{y, rx}}}^2 \le \abs{\angles{y, rx}}^2 \le r^2R^2 \quad \text{on } E
> $
> so the first term is bounded by $rR$.
>
> Now for the second part,
> $
> \begin{align*}
> \frac{1}{r}\int_0^r \abs{1 - \hat \mu(sx)}ds &= \frac{1}{r}\int_0^r \abs{\int_{\real^d} \paren{1 - e^{i\angles{y, sx}}}d\mu(y)}ds \\
> &\ge \frac{1}{r}\int_0^r \abs{\int_{\real^d} (1 - \cos(\angles{y, sx}))d\mu(y)}ds \\
> &\ge \frac{1}{r} \int_{E^c}\int_0^r (1 - \cos(s\angles{y, x}))ds d\mu(y) \\
> &= \int_{E^c}\paren{1 - \frac{\sin(r\angles{y, x})}{r\angles{y, x}}}d\mu(y) \\
> &\ge m(rR) \cdot \mu(E^c)
> \end{align*}
> $
> Dividing by $m(rR)$ yields the second inequality.
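A numerical sanity check of both bounds for the standard normal on $\real$ (so $x = 1$; $r$ and $R$ chosen arbitrarily):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu_hat = lambda s: np.exp(-s**2 / 2)  # characteristic function of N(0, 1)
r, R = 0.5, 2.0

tail = 2 * norm.sf(R)  # mu({|y| > R}) for the standard normal

# first bound: |1 - mu_hat(r)| <= rR + 2 mu({|y| > R})
print(abs(1 - mu_hat(r)), r * R + 2 * tail)

# m(t) = inf over |theta| > t of (1 - sin(theta)/theta); the function is even
# and tends to 1 at infinity, so a grid over (t, 200] suffices
theta = np.linspace(r * R + 1e-9, 200.0, 2_000_000)
m = np.min(1 - np.sin(theta) / theta)

# second bound: mu({|y| > R}) <= [ (1/r) int_0^r |1 - mu_hat(s)| ds ] / m(rR)
avg, _ = quad(lambda s: abs(1 - mu_hat(s)), 0, r)
print(tail, (avg / r) / m)
```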
[^1]: The converse does not hold.