SU-ENGR76 APR252024
Last edited: August 8, 2025

Every periodic function with period \(T\) can be written as a linear combination:
\begin{equation} f(t) = b_{0} + \sum_{j=1}^{\infty}\qty[ a_{j} \sin \qty( 2\pi \frac{j}{T} t) + b_{j} \cos \qty(2\pi \frac{j}{T} t) ] \end{equation}
Finite-Bandwidth Signal
If the summation here is finite, we call this a finite-bandwidth representation. You can draw two separate stem plots: one for the \(\sin\)-term frequencies and one for the \(\cos\)-term frequencies.
Bandwidth
For a particular signal, identify the largest and smallest frequencies corresponding to non-zero coefficients; the bandwidth is then the difference between them:
\begin{equation} \text{bandwidth} = f_{\max} - f_{\min} \end{equation}
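As a sketch of this definition (the example signal, tolerance, and numerical-integration approach are my own illustrative choices, not from the notes), we can estimate the coefficients of a known signal and read off its bandwidth:

```python
import numpy as np

# Hypothetical example signal: f(t) = 2 sin(2π·3t) + 0.5 cos(2π·7t), period T = 1.
T = 1.0
f = lambda t: 2 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

# Estimate a_j, b_j by numerical integration over one period.
t = np.linspace(0, T, 10_000, endpoint=False)
dt = T / len(t)
coeffs = {}
for j in range(1, 11):
    a_j = (2 / T) * np.sum(f(t) * np.sin(2 * np.pi * j / T * t)) * dt
    b_j = (2 / T) * np.sum(f(t) * np.cos(2 * np.pi * j / T * t)) * dt
    coeffs[j / T] = (a_j, b_j)

# Frequencies with a non-zero coefficient (up to numerical tolerance).
active = [freq for freq, (a, b) in coeffs.items() if max(abs(a), abs(b)) > 1e-6]
bandwidth = max(active) - min(active)
print(active)     # [3.0, 7.0]
print(bandwidth)  # 4.0
```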
SU-ENGR76 APR302024
Discrete Fourier Transform
Computing the DFT as a matrix–vector product is expensive: it scales as \(O(N^{2})\). The Fast Fourier Transform (FFT) reduces this to \(O(N \log N)\) time.
We can compute the Fourier representation forwards and backwards by inverting the Fourier matrix.
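A minimal sketch of the matrix DFT versus the FFT, and of inverting the Fourier matrix (the test vector and size are arbitrary choices, not from the notes):

```python
import numpy as np

N = 8
x = np.random.default_rng(0).standard_normal(N)

# Build the N×N Fourier matrix F with F[k, n] = exp(-2πi·kn/N).
k = np.arange(N).reshape(-1, 1)
n = np.arange(N).reshape(1, -1)
F = np.exp(-2j * np.pi * k * n / N)

X_matrix = F @ x        # O(N^2) matrix-vector product
X_fft = np.fft.fft(x)   # O(N log N) FFT, same result

# Inverting the Fourier matrix recovers the signal: F^{-1} = conj(F) / N.
x_back = (np.conj(F) @ X_matrix) / N

print(np.allclose(X_matrix, X_fft))  # True
print(np.allclose(x_back, x))        # True
```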
Source Coding Review
Basic Source
We can just do Huffman Coding directly.
Continuous Real Source
We can quantize the continuous source, and then do Huffman Coding.
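The quantize-then-Huffman pipeline can be sketched as follows (a minimal Huffman implementation; the sine-wave source and 4-level quantizer are illustrative choices, not from the notes):

```python
import heapq
from collections import Counter
import numpy as np

def huffman_code(freqs):
    """Build a prefix-free code (symbol -> bitstring) from symbol counts."""
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    uid = len(heap)  # tiebreaker so the heap never compares dicts
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Merge the two least-frequent subtrees, prepending 0/1 to their codewords.
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, uid, merged))
        uid += 1
    return heap[0][2]

# Continuous source: samples of a sine wave, quantized to 4 levels.
samples = np.sin(2 * np.pi * np.linspace(0, 1, 100, endpoint=False))
levels = np.round((samples + 1) / 2 * 3).astype(int)  # integers in {0, 1, 2, 3}

code = huffman_code(Counter(levels.tolist()))
bits = "".join(code[s] for s in levels.tolist())
```

Because the code is prefix-free, the bitstream can be decoded greedily, one codeword at a time.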
Continuous-Time Source
There are a few strategies for obtaining discrete symbols (e.g. sampling followed by quantization).
SU-ENGR76 JUN042024
Recall that we care about three things: \(M, L, d_{\min}\). In a repetition code, as code-word size increases, our error probability decreases:
| \(L\) | error probability |
|---|---|
| 3 | \(\sim p^{2}\) |
| 5 | \(\sim p^{3}\) |
Recall Shannon’s Channel-Coding Theorem.
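The orders of magnitude in the table above can be checked exactly. A sketch, assuming a binary symmetric channel with flip probability \(p\): majority decoding of an \(L\)-repetition code fails when more than half of the \(L\) transmitted copies flip.

```python
from math import comb

def repetition_error(L, p):
    """P(majority vote fails) for an L-repetition code over a BSC with flip prob p."""
    t = L // 2
    return sum(comb(L, k) * p**k * (1 - p)**(L - k) for k in range(t + 1, L + 1))

p = 0.01
print(repetition_error(3, p))  # on the order of p^2
print(repetition_error(5, p))  # on the order of p^3
```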
SU-ENGR76 MAY022024
Nyquist Sampling Theorem
Formally, the Nyquist limit is stated as:
For \(X(t)\) a continuous-time signal whose frequency representation is bounded within \([0, B]\) Hz: if \(X\) is sampled every \(T\) seconds with \(T < \frac{1}{2B}\) (the sampling interval is smaller than \(1/2B\)), or equivalently \(\frac{1}{T} > 2B\) (the sampling frequency is larger than \(2B\)), then \(X\) can be reconstructed exactly from its samples \(X(0), X(T), X(2T), \ldots\).
At every time, we can go back and forth between the samples of \(X\) and sinusoids via sinc interpolation:
\begin{equation} X(t) = \sum_{n=0}^{\infty} X(nT)\, \mathrm{sinc}\qty(\frac{t - nT}{T}) \end{equation}
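A sketch of reconstruction by sinc interpolation, truncated to finitely many samples (the test signal, bandwidth, and sampling rate are illustrative choices, not from the notes):

```python
import numpy as np

B = 5.0            # X(t) is bandlimited to [0, B] Hz
T = 1 / (2.5 * B)  # sampling interval; satisfies T < 1/(2B)

x = lambda t: np.sin(2 * np.pi * 3 * t) + 0.3 * np.cos(2 * np.pi * 5 * t)

# Finitely many samples stand in for the infinite sum.
n = np.arange(-2000, 2001)
samples = x(n * T)

def reconstruct(t):
    # Whittaker–Shannon interpolation: X(t) = Σ_n X(nT) sinc((t - nT) / T)
    # np.sinc is the normalized sinc, sin(πu)/(πu), which is what we want here.
    return np.sum(samples * np.sinc((t - n * T) / T))

t0 = 0.123
print(abs(reconstruct(t0) - x(t0)))  # small: only truncation error remains
```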
SU-ENGR76 MAY072024
Welcome to Unit 2.
Fundamental Problem of Communication
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point”
Now: all communication signals are subject to some noise, so we need to design systems that are robust to it.
Most designs center on changing the transmitter and receiver; sometimes you can change the channel, but often not.
communication
analog communication
- convert sound waves into continuous-time electrical signal \(s(t)\)
- apply \(s(t)\) directly to the voltage of my channel
- on the other end, receive \(s'(t)\), a noisy version of the original signal
- play back \(s'(t)\)
digital communication
- convert sound waves into continuous-time electrical signal \(s(t)\)
- sample \(s(t)\) at a known sampling rate
- quantize the results to a fixed number of levels, turning them into discrete symbols
- use Huffman Coding to turn them into bits
- generate a continuous-time signal of voltage using the bits
- communicate this resulting signal over the cable
- receive the noisy result signal and recover the bits from it
- decode the bits using interpolation + the codebook
- play back the data
this format allows us to generalize all communication as having one type, (Bits => Bits), i.e. we only have to design the Tx and Rx. This is much more flexible than analog communication.
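The digital pipeline above can be sketched end to end. This is a toy version under assumptions of my own: fixed-length 3-bit codes instead of Huffman, Gaussian channel noise, and simple thresholding at the receiver; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1. Sample s(t): 64 samples of a 2 Hz sine over one second.
s = np.sin(2 * np.pi * 2 * np.linspace(0, 1, 64, endpoint=False))

# 2. Quantize to 8 levels, 3 bits per sample (fixed-length code for simplicity).
levels = np.clip(np.round((s + 1) / 2 * 7), 0, 7).astype(int)
bits = np.array([(v >> i) & 1 for v in levels for i in (2, 1, 0)])

# 3. Map bits to +/-1 volts and send over a noisy channel.
voltage = 2.0 * bits - 1.0
received = voltage + 0.15 * rng.standard_normal(voltage.size)

# 4. Receiver: threshold back to bits, regroup into levels, de-quantize.
bits_hat = (received > 0).astype(int)
levels_hat = bits_hat.reshape(-1, 3) @ np.array([4, 2, 1])
s_hat = levels_hat / 7 * 2 - 1
```

With this noise level every bit survives thresholding, so the only remaining error is the quantization error of at most half a step.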
