Houjun Liu

SU-ENGR76 MAY302024

Consider a channel with input \(X \in \{0,1\}\) and output \(Y \in \{0,1\}\), where each transmitted bit is flipped with probability \(p\). So we have four outcomes:

  X   Y   P(Y|X)
  0   0   1-p
  1   1   1-p
  0   1   p
  1   0   p

Writing this out:

  • \(P(y=1|x=0) = p\)
  • \(P(y=0|x=1) = p\)
  • \(P(y=0|x=0) = 1-p\)
  • \(P(y=1|x=1) = 1-p\)
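These transition probabilities can be checked with a small simulation. This is a sketch, not part of the notes; the function name `bsc` and the parameter values are illustrative:

```python
import random

rng = random.Random(0)

def bsc(bits, p):
    """Binary symmetric channel: flip each input bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

p = 0.1
x = [rng.randrange(2) for _ in range(100_000)]
y = bsc(x, p)

# Empirical flip rate P(Y != X) should be close to p.
flip_rate = sum(a != b for a, b in zip(x, y)) / len(x)
```

With 100,000 transmitted bits, the observed flip rate lands within a fraction of a percent of \(p\).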

Recall chain rule:

\begin{equation} P(X,Y|A) = P(X|Y,A) P(Y|A) \end{equation}


Consider some input output pair:

\begin{equation} p(y=10 | x = 00) = p(y_1=1 | x_1=0, x_2=0)p(y_2=0| y_1=1, x_1=0, x_2=0) \end{equation}

For a memoryless channel, this becomes:

\begin{equation} p(y_1=1 | x_1=0, x_2=0)p(y_2=0| y_1=1, x_1=0, x_2=0) = p(y_1=1 | x_1=0)p(y_2=0|x_2=0) \end{equation}

In general, this means:

\begin{equation} p(y=r_1 \dots r_{n} | x = t_1 \dots t_{n}) = p(r_1|t_1) p(r_2|t_2) \dots p(r_{n}|t_{n}) \end{equation}
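This factorization is easy to compute directly. A minimal sketch, using a binary symmetric channel with flip probability \(p\) as the per-use model (the helper names `p_symbol` and `p_sequence` are illustrative):

```python
def p_symbol(y, x, p):
    """Per-use BSC transition probability: p if the bit flipped, 1-p otherwise."""
    return p if y != x else 1 - p

def p_sequence(ys, xs, p):
    """Memoryless channel: the block transition probability factorizes per symbol."""
    prob = 1.0
    for y, x in zip(ys, xs):
        prob *= p_symbol(y, x, p)
    return prob

# The worked example above: p(y=10 | x=00) = p * (1-p).
print(round(p_sequence([1, 0], [0, 0], 0.1), 6))  # 0.09
```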

Shannon’s Channel-Coding Theorem

For any communication channel (characterized by \(p(y|x)\), for a pair of input and output), there exists a way to construct an encoder/decoder pair that makes it possible to send information over this channel at some rate \(R\) (bits/channel use) while making the probability of error arbitrarily small, as long as \(R < C\), where \(C\) is the channel capacity.

Conversely, if we try to send information at a rate \(R > C\) bits/channel use, errors are inevitable.

\begin{equation} C = \max_{p(X)} I(X;Y) \end{equation}

The capacity of a channel is the maximum mutual information \(I(X;Y)\) between the input and output of the channel, taken over all choices of input distribution: to achieve capacity, you should choose the input distribution that induces the maximum mutual information.
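For the binary symmetric channel this maximization can be done numerically. A sketch (not from the notes) that sweeps over input distributions \(P(X{=}1) = a\) and evaluates \(I(X;Y) = H(Y) - H(Y|X)\); for the BSC the maximum lands at the uniform input, giving \(C = 1 - H(p)\):

```python
from math import log2

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def mutual_information(a, p):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with flip prob p and P(X=1) = a."""
    py1 = a * (1 - p) + (1 - a) * p   # P(Y=1)
    return h2(py1) - h2(p)            # H(Y|X) = h2(p) for every input distribution

p = 0.1
best_a = max((a / 100 for a in range(101)), key=lambda a: mutual_information(a, p))
capacity = mutual_information(best_a, p)
# best_a = 0.5 (uniform input); capacity = 1 - h2(0.1), about 0.531 bits/channel use
```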

Law of Large Numbers

The average value of many independent repeated trials is close to the true expectation.

Because of the law of large numbers (Wolf et al. 2020), the number of bit flips in a block of \(L\) bits sent over a binary symmetric channel with flip probability \(p\) concentrates around \(Lp\).

So: design a code with very long codewords (\(L \gg 0\)) that can correct up to \(L(p+\epsilon)\) bit flips for some small choice of \(\epsilon > 0\).
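The concentration that makes this work can be demonstrated empirically. A sketch (illustrative parameter values) that counts how often a block of length \(L\) suffers more than \(L(p+\epsilon)\) flips:

```python
import random

rng = random.Random(0)
p, L, eps = 0.1, 10_000, 0.01

# Over many blocks, count how often the number of flips exceeds L*(p + eps).
trials = 200
failures = 0
for _ in range(trials):
    flips = sum(rng.random() < p for _ in range(L))
    if flips > L * (p + eps):
        failures += 1
fail_rate = failures / trials
# fail_rate is (near) zero: flip counts concentrate around L*p,
# so a code correcting L*(p + eps) flips almost always succeeds.
```

The standard deviation of the flip count is \(\sqrt{Lp(1-p)} \approx 30\) here, while the margin \(L\epsilon = 100\) is over three standard deviations, so exceeding it is rare.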

