> [!abstract] Introduction
>
> A communication channel is a [[Scientific Model|model]] for the transmission of [[Information|information]] from one point to another, in which a **source** transmits information across some **channel** that the **receiver** picks up. The channel is always subject to *noise*: uncontrollable random distortions that may lead to a loss of information at the receiver.
>
> The source is modelled by a [[Random Variable|random variable]] $S$ with [[Probability Distribution|probability distribution]] $\list{p}{n}$, whose outcomes $\list{a}{n}$ are the **source alphabet**. The uncertainty in $S$ represents the lack of knowledge of the information that the source will send.
>
> The receiver is modelled by another random variable $R$ with [[Probability Distribution|probability distribution]] $\list{q}{m}$, whose outcomes $\list{b}{m}$ are the **receiver alphabet**. Typically, the receiver alphabet has more letters than the source alphabet.
>
> The distorting effects of the channel are modelled by a set of [[Conditional Probability|conditional probabilities]] $\{P_{S = a_i}(R = b_j) : 1 \le i \le n, 1 \le j \le m\}$. For optimal transmission where the source and receiver alphabets are of the same size, $P_{S = a_i}(R = b_i)$ should be as close to 1 as possible, for each $1 \le i \le n$.

> [!note] Average Error Probability
>
> Let $E$ be the [[Event|event]] where an error occurs (where a [[Decision Rule|decision rule]] picks the wrong symbol). Denote by $P_{x_j}(E)$ the conditional probability that an error was made, given that $x_j$ was sent. The *average* error probability is then
> $
> P(E) = \sum_{j = 1}^{N} P_{x_j}(E) P(x_j)
> $
>
> In an optimal [[Code|code]], each code symbol should be equally likely, i.e. $P(x_j) = \frac{1}{N}$, which simplifies the formula to
> $
> P(E) = \frac{1}{N} \sum_{j = 1}^{N} P_{x_j}(E)
> $

> [!note] Channel Capacity
>
> The **capacity** $C$ of a communication channel is the *maximum* possible amount of information that it can transmit (the [[Mutual Information|mutual information]] between the source and receiver). Since the information transmitted depends on the probability distribution of the source, the channel capacity is calculated by considering the source distribution that transmits the greatest amount of information:
> $
> C = \max I(S, R) = \max \, (H(R) - H_S(R))
> $

> [!definition]
>
> The **transmission rate** $R$ is the average amount of [[Information|information]] transmitted across the channel per unit time, whose maximum value depends on the channel capacity.
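The formulas above can be sketched numerically. The example below is illustrative only: it assumes a binary symmetric channel with a made-up flip probability of 0.1, for which the capacity has the known closed form $C = 1 - H(p)$ (the maximum of $I(S, R)$ is attained at the uniform source).

```python
import math

def average_error_probability(p_error_given_sent, p_sent):
    """Average error probability: P(E) = sum_j P_{x_j}(E) * P(x_j)."""
    return sum(pe * ps for pe, ps in zip(p_error_given_sent, p_sent))

def binary_entropy(p):
    """Binary entropy H(p) = -p log2(p) - (1-p) log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity of a binary symmetric channel, C = 1 - H(p).

    This is max I(S, R) = max (H(R) - H_S(R)); for the BSC the
    maximum over source distributions occurs at the uniform source.
    """
    return 1.0 - binary_entropy(flip_prob)

# Hypothetical binary symmetric channel: each sent bit is
# flipped with probability 0.1, independently of the others.
flip = 0.1

# Uniform source, so P(x_j) = 1/N and P(E) = (1/N) sum_j P_{x_j}(E).
p_e = average_error_probability([flip, flip], [0.5, 0.5])
print(p_e)                 # 0.1
print(bsc_capacity(flip))  # about 0.531 bits per channel use
```

Note that as the flip probability approaches 0.5 the capacity falls to zero: the output becomes independent of the input, and no information gets through.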