> [!definition]
>
> Let $\seqi{X}$ be a family of [[Random Variable|random variables]] defined on the same [[Probability|probability space]]; the family is then called a **stochastic process**.
> [!definition]
>
> A stochastic process $\seqi{X}$ is **integrable** if $X_i \in L^1$ for all $i \in I$.
> [!definition]
>
> Let $(\real^d)^{[0, \infty)}$ be the space of functions $[0, \infty) \to \real^d$. For $t \in [0, \infty)$, define
> $
> \pi_t = \mathcal{P}_t: (\real^d)^{[0, \infty)} \to \real^d, \quad f \mapsto f(t)
> $
> as the $t$-th projection. Let $\Sigma^{[0, \infty)}$ be the [[Product Sigma Algebra|product sigma algebra]] generated by the projection maps.
>
> Given a probability space $(\Omega, \cf, \bp)$, $X = \bracs{X_t}_{t \ge 0}$ is a **stochastic process** if $X$ is $(\cf, \Sigma^{[0, \infty)})$-measurable. That is, for each $t \ge 0$, $X_t = \mathcal P_t \circ X$ is a $(\cf, \cb_{\real^d})$ random variable.
>
> If $Y$ is another stochastic process, then $X$ and $Y$ are **identically distributed** if their distributions agree on finite intersections of cylinder sets, i.e. their finite-dimensional joint distributions coincide.
> [!definition]
>
> Let $X$ and $Y$ be stochastic processes; then
> - **Indistinguishable** if $X = Y$ almost surely (for almost every $\omega \in \Omega$, $X_t(\omega) = Y_t(\omega)$ for all $t \ge 0$).
> - $X$ is a **modification** of $Y$ if for every $t \ge 0$, $X_t = Y_t$ almost surely. If $X$ is a modification of $Y$, then they are identically distributed.
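> A standard example separating the two notions (a sketch, assuming $U$ is a uniform random variable on $[0, 1]$): let
> $
> X_t(\omega) = 0, \qquad Y_t(\omega) = \mathbf{1}_{\{t = U(\omega)\}}
> $
> For each fixed $t$, $\bp(X_t \ne Y_t) = \bp(U = t) = 0$, so $Y$ is a modification of $X$; yet every path of $Y$ differs from the zero path at $t = U(\omega)$, so $\bp(X_t = Y_t \text{ for all } t) = 0$ and the processes are not indistinguishable.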
> [!definition]
>
> A stochastic process is *Markov* if, in any prediction of the future, knowledge of the entire past provides no more information than knowledge of the present alone. In other words, the present state fully determines the distribution of the future.
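> In symbols, for a discrete-time process (a sketch in the document's notation):
> $
> \bp(X_{n + 1} \in A \mid X_0, \cdots, X_n) = \bp(X_{n + 1} \in A \mid X_n)
> $
> for every measurable set $A$.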
> [!definition]
>
> In the context of a stochastic process, calculating the individual [[Information|information]] content of each random variable in the family only gives snapshots of the [[Entropy|entropy]] at fixed times. A better approach is to take the [[Joint Entropy|joint entropy]] of all variables observed so far. By the [[Entropy Chain Rule|entropy chain rule]],
> $
> H(X_0, \cdots, X_{n+1}) = H(X_0, \cdots, X_n) + H(X_{n + 1}|X_0, \cdots, X_n)
> $
> the joint entropy is non-decreasing over time: each step adds the nonnegative conditional entropy $H(X_{n + 1}|X_0, \cdots, X_n)$ as more information is revealed.
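The chain-rule identity can be checked numerically. A minimal sketch, assuming a hypothetical two-step binary chain in which $X_0$ is a fair coin and $X_1$ flips $X_0$ with probability $0.1$ (this example chain is an assumption, not from the text):

```python
import math

def H(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical chain: X0 is a fair coin, X1 flips X0 with probability 0.1.
p0 = {0: 0.5, 1: 0.5}
flip = 0.1
joint = {}
for x0, px0 in p0.items():
    for x1 in (0, 1):
        joint[(x0, x1)] = px0 * (flip if x1 != x0 else 1 - flip)

# Chain rule: H(X0, X1) = H(X0) + H(X1 | X0), so the conditional term
# is recovered as the difference of the two entropies.
H_joint = H(joint)
H_x0 = H(p0)
H_cond = H_joint - H_x0
print(H_joint, H_x0, H_cond)  # joint entropy exceeds H(X0) by H(X1|X0) >= 0
```

The joint entropy comes out strictly larger than $H(X_0) = 1$ bit, with the excess equal to the binary entropy of the flip probability.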
> [!definition]
>
> The [[Entropy|entropy]] *rate* $h(X)$ of a stochastic process $X = (X_n, n \in \integer^+)$ is the average joint entropy increase per step, provided the [[Limit|limit]] exists:
> $
> h(X) = \limv{n}\frac{1}{n}H(X_0, \cdots, X_{n - 1})
> $
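As a sanity check on the definition, for an i.i.d. fair-coin process the joint distribution over $n$ steps is uniform on $2^n$ outcomes, so $H(X_0, \cdots, X_{n - 1}) = n$ bits and the rate is exactly $1$ bit per step. A minimal sketch (the coin process is an assumed example, not from the text):

```python
import math

def H(probs):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def coin_rate(n):
    """(1/n) * H(X_0, ..., X_{n-1}) for n i.i.d. fair coin flips."""
    joint = [2.0 ** -n] * (2 ** n)  # uniform joint distribution on {0,1}^n
    return H(joint) / n

for n in (1, 4, 8):
    print(n, coin_rate(n))  # the average per-step entropy is 1 bit for every n
```

Here the limit is reached immediately because the increments $H(X_{n}|X_0, \cdots, X_{n-1})$ are constant; for general processes the averaging in the limit is what makes the rate well defined.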