
SU-ENGR76 MAY022024

Last edited: August 8, 2025

Nyquist sampling theorem

Formally, the Nyquist limit is stated as:

For \(X(t)\) a continuous-time signal whose frequency representation is bounded by \([0, B]\) Hz: if \(X\) is sampled every \(T\) seconds with \(T < \frac{1}{2B}\) (the sampling interval is smaller than \(1/2B\)), or equivalently \(\frac{1}{T} > 2B\) (the sampling frequency is larger than \(2B\)), then \(X\) can be reconstructed exactly from its samples \(X(0), X(T), X(2T), \ldots\).

At any time, we can go back and forth between \(X\), its samples, and its sinusoid representation via sinc interpolation.
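The interpolation that carries the samples back to \(X\) is the Whittaker–Shannon formula, \(X(t) = \sum_{n} X(nT)\,\mathrm{sinc}\!\left(\frac{t-nT}{T}\right)\) with \(\mathrm{sinc}(x) = \frac{\sin(\pi x)}{\pi x}\). A minimal sketch of this reconstruction (the test signal and parameter choices are mine, not from the lecture):

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation: rebuild X(t) from samples X(nT)."""
    n = np.arange(len(samples))
    # np.sinc(x) = sin(pi*x)/(pi*x), so each term is X(nT) * sinc((t - nT)/T)
    return np.sum(samples * np.sinc((t[:, None] - n * T) / T), axis=1)

B = 10.0                       # signal bandlimited to [0, B] Hz
T = 1 / (2.5 * B)              # sampling interval: T < 1/(2B), so Nyquist holds
x = lambda t: np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

n = np.arange(2048)
samples = x(n * T)             # X(0), X(T), X(2T), ...

# evaluate well inside the sampling window; the truncated sum is least
# accurate near the edges
t_test = np.linspace(900 * T, 1100 * T, 101)
err = np.max(np.abs(sinc_reconstruct(samples, T, t_test) - x(t_test)))
print(err)                     # small: the samples determine X(t) everywhere
```

With finitely many samples the infinite sum is truncated, which is why the check stays away from the edges of the sampling window.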

SU-ENGR76 MAY072024


Welcome to Unit 2.

Fundamental Problem of Communication

“The fundamental task of communication is that of reproducing at one point either exactly or approximately a message selected at another point”

Now: all communication signals are subject to some noise, so we need to design systems that are robust to it.

Most designs center around changing the transmitter and receiver; sometimes you can change the channel, but often not.

communication

analog communication

  1. convert sound waves into continuous-time electrical signal \(s(t)\)
  2. apply \(s(t)\) directly to the voltage of my channel
  3. on the other end, receive \(s'(t)\), a noisy version of the original signal
  4. speak \(s'(t)\)

digital communication

  1. convert sound waves into continuous-time electrical signal \(s(t)\)
  2. sample \(s(t)\) at a known sampling rate
  3. quantize the results to a fixed number of levels, turning them into discrete symbols
  4. use Huffman Coding to turn them into bits
  5. generate a continuous-time signal of voltage using the bits
  6. communicate this resulting signal over the cable
  7. receive the noisy result signal and recover the bits from it
  8. decode the bits using interpolation + the codebook
  9. speak the reconstructed signal

This format allows us to generalize all communication as having the type (Bits => Bits), i.e. we only have to design the Tx and Rx. This is much more flexible than analog communication.
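Steps 2–4 of the digital pipeline (sample, quantize, Huffman-code) can be sketched as follows; the toy signal, the level count, and helper names like `quantize` are my own illustration, not the course's reference implementation:

```python
import heapq
import itertools
from collections import Counter
import numpy as np

def quantize(samples, levels):
    """Map each real sample to the nearest of `levels` evenly spaced values,
    returning the discrete symbols 0 .. levels-1."""
    lo, hi = samples.min(), samples.max()
    return np.round((samples - lo) / (hi - lo) * (levels - 1)).astype(int)

def huffman_code(symbols):
    """Build a Huffman codebook {symbol: bitstring} from symbol frequencies."""
    tiebreak = itertools.count()          # avoids comparing dicts in the heap
    heap = [(c, next(tiebreak), {s: ""}) for s, c in Counter(symbols).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        c1, _, book1 = heapq.heappop(heap)
        c2, _, book2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in book1.items()}
        merged.update({s: "1" + b for s, b in book2.items()})
        heapq.heappush(heap, (c1 + c2, next(tiebreak), merged))
    return heap[0][2]

# steps 1-4: signal -> samples -> discrete symbols -> bits
t = np.arange(0, 1, 1 / 100)              # 100 Hz sampling of a 1-second signal
s = np.sin(2 * np.pi * 5 * t)             # toy "speech" signal
symbols = quantize(s, levels=8)
book = huffman_code(symbols.tolist())
bits = "".join(book[sym] for sym in symbols)

# Huffman codes are prefix-free, so the bitstream decodes unambiguously
decode = {b: s for s, b in book.items()}
out, buf = [], ""
for bit in bits:
    buf += bit
    if buf in decode:
        out.append(decode[buf]); buf = ""
```

Because Huffman coding is optimal among prefix-free codes, the bitstream here is never longer than the 3-bits-per-symbol fixed-length alternative for 8 levels.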

SU-ENGR76 MAY092024


digital encoding

We allocate different systems in the same environment different frequency bands; by doing this, we can pack information more effectively and prevent interference.

“how do we take a sequence of bits 10100…. and map it to a continuous-time signal \(X(t)\) such that the spectrum of this signal is limited to \([0, B]\)”?

sinc digital encoding

IDEA: recall the sinc sampling theorem: even when the underlying signal is undersampled, sinc interpolation still passes exactly through the sample points. As such, we can write the transmitted signal as a train of sinc pulses, one per bit.
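One standard construction along these lines (my sketch of the idea, not necessarily the lecture's exact formula): send bit \(b_{k}\) on the pulse \(\mathrm{sinc}(2Bt - k)\), so that \(X(t) = \sum_{k} b_{k}\,\mathrm{sinc}(2Bt - k)\). Each pulse has a flat spectrum on \([0, B]\), and since \(\mathrm{sinc}(2Bt - k)\) is \(1\) at \(t = k/(2B)\) and \(0\) at every other multiple of \(1/(2B)\), sampling at rate \(2B\) returns the bits exactly:

```python
import numpy as np

B = 4.0                            # bandwidth budget: spectrum within [0, B] Hz
bits = np.array([1, 0, 1, 0, 0, 1, 1, 0])
K = len(bits)

def X(t):
    """Bandlimited pulse train: X(t) = sum_k b_k * sinc(2B t - k)."""
    k = np.arange(K)
    return np.sum(bits * np.sinc(2 * B * t[:, None] - k), axis=1)

# sampling X at t = 0, 1/(2B), 2/(2B), ... picks out one pulse peak at a time
recovered = X(np.arange(K) / (2 * B))
print(recovered.round().astype(int))   # [1 0 1 0 0 1 1 0]
```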

SU-ENGR76 MAY142024


High Frequency Signal

The frequency content of the signal will be symmetric around a target (carrier) frequency \(f_{c}\), and most of the energy will lie within \(f_{c} \pm \frac{1}{T}\). Strictly speaking, you may be leaking some energy outside this band.

energy of signal

\begin{equation} \varepsilon_{1} = \frac{1}{T} \int_{0}^{T} y^{2}(t) \, dt \end{equation}

If \(\varepsilon_{1} > \varepsilon_{T}\), decode a \(1\). Otherwise (\(\varepsilon_{1} < \varepsilon_{T}\)), decode a \(0\).
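A sketch of this energy detector, assuming on-off keying where a \(1\) is a carrier burst over \([0, T]\) and a \(0\) is silence; the carrier frequency, noise level, and threshold \(\varepsilon_{T}\) are illustrative choices of mine:

```python
import numpy as np

T = 1.0                          # symbol duration (seconds)
fc = 10.0                        # carrier frequency (Hz)
t = np.linspace(0, T, 1000, endpoint=False)
rng = np.random.default_rng(0)

def energy(y):
    # time-average of y^2 on a uniform grid ~ (1/T) * integral of y^2 over [0,T]
    return np.mean(y ** 2)

def decode(y, threshold=0.1):
    """Energy detector: burst energy ~0.5 sits well above noise energy ~0.01."""
    return 1 if energy(y) > threshold else 0

noise = 0.1 * rng.standard_normal(t.size)
y1 = np.sin(2 * np.pi * fc * t) + noise    # "1": carrier burst + noise
y0 = noise                                  # "0": noise only

print(decode(y1), decode(y0))               # 1 0
```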

Hamming Distance

the Hamming Distance between two sequences is the number of positions in which these two sequences differ from each other
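A direct implementation of this definition (the example strings are mine):

```python
def hamming_distance(a, b):
    """Number of positions where two equal-length sequences differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("10110", "10011"))   # differs in positions 2 and 4 -> 2
```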

SU-ENGR76 MAY162024


An error-correcting code is a collection of binary strings such that the minimum Hamming distance between any two codewords is some distance \(d_{c}\).

This code allows you to correct up to:

  • \(t = \left\lfloor \frac{d_{c}-1}{2}\right\rfloor\) bit errors
  • and can detect up to \(d_{c}-1\) errors (any fewer bit flips cannot turn one codeword into another valid codeword)
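The 3-repetition code is the smallest example: its two codewords \(000\) and \(111\) sit at distance \(d_{c} = 3\), so \(t = \lfloor (3-1)/2 \rfloor = 1\). A sketch with majority-vote decoding (my illustration):

```python
def encode(bit):
    return [bit] * 3                  # codewords 000 and 111: d_c = 3

def decode(word):
    return 1 if sum(word) >= 2 else 0 # majority vote corrects one flipped bit

# t = 1: any single-bit error is corrected
for i in range(3):
    word = encode(1)
    word[i] ^= 1                      # flip one position
    assert decode(word) == 1

# two flips exceed t, so majority voting mis-decodes (though a word like 001
# is still detectably not a valid codeword)
word = encode(1); word[0] ^= 1; word[1] ^= 1
print(decode(word))                   # 0
```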

minimum length

What is the largest \(M\) (number of codewords) we can have for a code with each codeword of length \(L\) and minimum inter-codeword distance \(d_{c} = 3\)?
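One standard way to bound \(M\) (assuming the sphere-packing argument is the intended tool here): with \(d_{c} = 3\), the radius-1 Hamming balls around codewords must be disjoint, and each ball contains the codeword plus its \(L\) one-bit neighbors, giving \(M \le \left\lfloor \frac{2^{L}}{1 + L} \right\rfloor\). A sketch:

```python
from math import comb

def hamming_bound(L, t=1):
    """Sphere-packing bound: M <= 2^L / sum_{i=0..t} C(L, i)."""
    return 2 ** L // sum(comb(L, i) for i in range(t + 1))

# for d_c = 3 (t = 1), each radius-1 ball holds 1 + L strings
for L in (3, 7, 15):
    print(L, hamming_bound(L))        # L = 7 gives 16, met by the Hamming(7,4) code
```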