> [!quote] Introduction
>
> By the [[Principle of Symmetry|principle of symmetry]], whenever there is *no* information about the outcomes (maximum uncertainty), they should be assumed to be uniformly distributed. This is consistent with the fact that the [[Uniform Distribution|uniform distribution]] attains the maximum [[Entropy|entropy]]. The principle can therefore be rephrased as the *principle of maximum entropy*. While the change is purely semantic in the uniform case, the rephrased principle applies under any set of constraints.
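A quick numerical check of the claim above, as a minimal Python sketch (the distributions chosen are illustrative): over four outcomes, the uniform distribution attains the entropy maximum of $\log_2 4 = 2$ bits, while more concentrated distributions score lower.

```python
import math

def entropy(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum uncertainty
skewed  = [0.7, 0.1, 0.1, 0.1]       # partial information
certain = [1.0, 0.0, 0.0, 0.0]       # no uncertainty at all

print(entropy(uniform))  # 2.0 bits, the maximum for four outcomes
print(entropy(skewed))   # strictly between 0 and 2 bits
print(entropy(certain))  # 0.0 bits
```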

> [!definitionb] Principle
>
> Given a [[Random Variable|random variable]] $X$ with unknown [[Probability Distribution|probability distribution]] $\{p_1, p_2, \cdots, p_n\}$, always choose the distribution so as to [[Extrema|maximise]] the [[Entropy|entropy]] $H(X)$ subject to the known constraints.
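To illustrate the principle with a non-uniform constraint, a sketch of Jaynes' classic dice example: find the distribution over outcomes $1,\dots,6$ of maximum entropy given only that the mean is $4.5$. The constrained maximisation is known to yield the exponential (Gibbs) form $p_i \propto e^{\lambda x_i}$, so the code below only needs to find $\lambda$ matching the mean, here by bisection (the mean is monotone increasing in $\lambda$); the function name is illustrative.

```python
import math

def maxent_mean(target_mean, outcomes=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution over `outcomes` with a fixed mean,
    using the Gibbs form p_i ∝ exp(lam * x_i) and bisecting on lam."""
    def mean(lam):
        w = [math.exp(lam * x) for x in outcomes]
        return sum(x * wi for x, wi in zip(outcomes, w)) / sum(w)

    lo, hi = -10.0, 10.0  # bracket for lam; mean(lam) is increasing
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid

    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in outcomes]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_mean(4.5)  # probabilities increase toward the higher faces
```

Note that $\lambda = 0$ recovers the uniform distribution with mean $3.5$: with no constraint beyond normalisation, the principle reduces to the symmetric case above.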