## Convolutional Code

Any two distinct output sequences of the convolutional code differ by at least 5 bits in Hamming distance (the code's free distance is 5), so it behaves like a code with minimum distance 5.

### Decoding

**Brute-force decoding**: for a source sequence of length \(k\), precompute all \(2^{k}\) possible input sequences and the codewords they correspond to in our target code, then pick the codeword closest to the received sequence. Of course, this is not computationally feasible for large \(k\).

**Viterbi algorithm**: time complexity on the order of \(4(k+2)\) state updates and \(8(k+2)\) Hamming-distance comparisons: at each of the \(k+2\) steps there are 4 possible states (blocks of 2 memory bits) and 8 branches to compare. The cost is linear in \(k\), with constants determined by the code parameters \(k_0\), \(n_0\), \(m_0\) defined below.
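The step counts above can be made concrete with a minimal hard-decision Viterbi decoder. The generator taps G1 = 111, G2 = 101 used here are an assumption (a common textbook choice for a rate-1/2, constraint-length-3 code); the notes do not specify them.

```python
# Minimal hard-decision Viterbi decoder for a rate-1/2, constraint-length-3
# convolutional code. Generator taps G1 = 111, G2 = 101 are an assumption
# (a common textbook choice), not taken from the notes.
# State = the 2 memory bits (s1, s2), so there are 4 states, and each step
# compares 8 branches (4 states x 2 inputs), matching the count above.

def viterbi_decode(received):
    INF = float("inf")
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    metric = {s: (0 if s == (0, 0) else INF) for s in states}  # start in 00
    path = {s: [] for s in states}
    for i in range(0, len(received), 2):
        r0, r1 = received[i], received[i + 1]
        new_metric = {s: INF for s in states}
        new_path = {}
        for (s1, s2), m in metric.items():
            if m == INF:
                continue
            for b in (0, 1):                  # hypothesised input bit
                o0, o1 = b ^ s1 ^ s2, b ^ s2  # encoder outputs for G1, G2
                d = (o0 != r0) + (o1 != r1)   # branch Hamming distance
                nxt = (b, s1)                 # shift-register update
                if m + d < new_metric[nxt]:   # keep the survivor path
                    new_metric[nxt] = m + d
                    new_path[nxt] = path[(s1, s2)] + [b]
        metric, path = new_metric, new_path
    return path[min(metric, key=metric.get)]  # path into the best final state

# The message [1,0,1,1] encodes to 11 10 00 01 under these taps; decoding
# with bit 2 flipped still recovers the message, as the closest path wins.
print(viterbi_decode([1, 1, 0, 0, 0, 0, 0, 1]))
```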

### In general

- \(k_0\): number of source symbols entering the encoder at each step
- \(n_0\): number of symbols produced by the encoder at each step
- constraint length \(m_0\): how many input bits are considered in producing each output

For the convolutional code setup we discussed, we have \(k_0=1\), \(n_0=2\), and \(m_0 = 3\) (each input bit produces 2 output bits, and 3 bits are considered per step).
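As a concrete sketch of an encoder with these parameters: the generator taps G1 = 111, G2 = 101 below are an assumption (a common textbook choice), not given in the notes.

```python
# Sketch of a rate-1/2, constraint-length-3 convolutional encoder
# (k0 = 1, n0 = 2, m0 = 3). Generator taps G1 = 111, G2 = 101 are an
# assumption -- a common textbook choice not specified in the notes.

def conv_encode(bits):
    """Encode a list of bits; each input bit yields 2 output bits."""
    s1 = s2 = 0  # two memory bits (m0 = 3 means input + 2 memory bits)
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # G1 = 111: XOR of all 3 considered bits
        out.append(b ^ s2)       # G2 = 101: XOR of input and oldest bit
        s1, s2 = b, s1           # shift the register
    return out

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 1, 0, 0, 0, 0, 1]
```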

## Comparison of Codes

| Code | Rate | \(M\) | \(L\) | \(d_{\min}\) |
|---|---|---|---|---|
| Repetition code | 1/3 | 2 | 3 | 3 |
| Hamming code | 4/7 | 16 | 7 | 3 |
| Convolutional code | 1/2 | | | ≈ 5 |

(\(M\) = number of codewords, \(L\) = block length; these are not fixed for the convolutional code, which encodes a stream.)
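The \(d_{\min}\) entries for the block codes can be checked by brute force: for a linear code, \(d_{\min}\) equals the minimum weight of a nonzero codeword. The generator matrix below is one common systematic form of the (7,4) Hamming code (an assumption; the notes do not give one).

```python
from itertools import product

# Brute-force check of d_min for the (7,4) Hamming code. This systematic
# generator matrix is an assumption (one common form); for a linear code,
# d_min equals the minimum nonzero codeword weight.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def dmin(G):
    n = len(G[0])
    weights = []
    for msg in product([0, 1], repeat=len(G)):  # all 2^4 = 16 messages
        cw = [sum(m * g[j] for m, g in zip(msg, G)) % 2 for j in range(n)]
        if any(cw):                             # skip the all-zero codeword
            weights.append(sum(cw))
    return min(weights)

print(dmin(G))  # 3, matching the table
```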

## Bit Error Rate

**KEY GOAL**: given the bit error rate of the uncoded transmission, what is the resulting bit error rate after applying a given error-correcting code?
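For example, with the 3-bit repetition code and majority-vote decoding over a binary symmetric channel with uncoded bit error rate \(p\), a decoded bit is wrong exactly when 2 or 3 of its copies are flipped, giving \(p_{\text{coded}} = 3p^2(1-p) + p^3\). A quick sketch:

```python
# Post-decoding bit error rate of the 3-bit repetition code on a binary
# symmetric channel with uncoded bit error rate p: majority vote fails
# when 2 or 3 of the 3 transmitted copies are flipped.
def repetition3_ber(p):
    return 3 * p**2 * (1 - p) + p**3

print(repetition3_ber(0.1))  # ~0.028, well below the uncoded 0.1
```

Coding helps whenever \(p < 1/2\), since then \(p_{\text{coded}} < p\).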