Posts

SU-ENGR76 APR232024

Fourier Series components form a basis

Recall the definition of a basis, and in particular of an orthonormal basis. The key property is that writing a vector as a linear combination of an orthonormal basis is easy: each coefficient is just the inner product with the corresponding basis vector.
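Concretely (standard linear algebra, stated here for reference): for an orthonormal basis \(\{e_1, \ldots, e_n\}\), any vector \(v\) expands as

\begin{equation} v = \sum_{i=1}^{n} \langle v, e_i \rangle e_i \end{equation}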

Recall that the Fourier Series is defined as:

\begin{equation} f(x) = a_0 + \sum_{k=1}^{\infty} \qty( a_{k} \cos(k \omega x) + b_{k} \sin(k \omega x)) \end{equation}

where \(\omega = \frac{2\pi}{L}\), and

\begin{equation} a_0 = \frac{\langle f, 1 \rangle}{ \langle 1,1 \rangle} = \frac{1}{L} \int_{0}^{L} f(x) \dd{x} \end{equation}
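The remaining coefficients follow the same projection pattern (standard results, using the same inner product and normalization as above):

\begin{equation} a_{k} = \frac{\langle f, \cos(k\omega x) \rangle}{\langle \cos(k\omega x), \cos(k\omega x) \rangle} = \frac{2}{L} \int_{0}^{L} f(x) \cos(k\omega x) \dd{x}, \qquad b_{k} = \frac{2}{L} \int_{0}^{L} f(x) \sin(k\omega x) \dd{x} \end{equation}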

SU-ENGR76 APR252024

Every periodic function with period \(T\) can be written as a linear combination of sinusoids:

\begin{equation} f(t) = b_{0} + \sum_{j=1}^{\infty}a_{j} \sin \qty( 2\pi \frac{j}{T} t) + b_{j} \cos \qty(2\pi \frac{j}{T} t) \end{equation}

Finite-Bandwidth Signal

If the summation here is finite, we call the representation finite-bandwidth. You can draw two separate stem plots, one for the \(\sin\)-term coefficients and one for the \(\cos\)-term coefficients, plotted against their frequencies \(j/T\); a sketch follows.
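A minimal sketch of those stem plots (my own illustration, assuming numpy and matplotlib; the square wave is an arbitrary example):

```python
import numpy as np
import matplotlib.pyplot as plt

T = 1.0                                   # period of the signal
t = np.linspace(0, T, 2000, endpoint=False)
f = np.sign(np.sin(2 * np.pi * t / T))    # example signal: a square wave

K = 10                                    # number of harmonics to show
a = np.zeros(K + 1)                       # sin coefficients a_j
b = np.zeros(K + 1)                       # cos coefficients b_j
b[0] = f.mean()                           # DC term b_0
for j in range(1, K + 1):
    a[j] = 2 / T * np.trapz(f * np.sin(2 * np.pi * j * t / T), t)
    b[j] = 2 / T * np.trapz(f * np.cos(2 * np.pi * j * t / T), t)

freqs = np.arange(K + 1) / T              # harmonic frequencies j/T in Hz
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.stem(freqs, a)
ax1.set_title("sin coefficients $a_j$")
ax2.stem(freqs, b)
ax2.set_title("cos coefficients $b_j$")
for ax in (ax1, ax2):
    ax.set_xlabel("frequency (Hz)")
plt.tight_layout()
plt.show()
```

For the square wave, only odd \(\sin\) harmonics survive (\(a_j = \frac{4}{\pi j}\) for odd \(j\)), so the sin plot shows decaying stems at \(1/T, 3/T, 5/T, \ldots\) while the cos plot is empty.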

Bandwidth

For a particular signal, identify the largest and smallest frequencies, \(f_{\max}\) and \(f_{\min}\), corresponding to non-zero coefficients; the bandwidth is then defined by:

\begin{equation} \text{BW} = f_{\max} - f_{\min} \end{equation}

For instance, a signal with non-zero coefficients only at \(3\) Hz and \(7\) Hz has a bandwidth of \(4\) Hz.

SU-ENGR76 APR302024

Discrete Fourier Transform

The matrix formulation is computationally expensive, since it scales as \(O(N^{2})\) in the number of samples \(N\). The complexity can be reduced to \(O(N \log N)\) via the Fast Fourier Transform (FFT).

We can compute the Fourier representation forwards and backwards by inverting the Fourier matrix; a sketch follows.
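A quick numerical illustration (a minimal sketch assuming numpy, which uses the DFT convention \(F[k, n] = e^{-2\pi i k n / N}\)): build the Fourier matrix explicitly, compare the \(O(N^{2})\) product against the FFT, then invert the matrix to go backwards.

```python
import numpy as np

N = 256
x = np.random.randn(N)            # an arbitrary test signal

# O(N^2) approach: explicit N x N DFT matrix F[k, n] = exp(-2*pi*i*k*n/N)
n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)
X_matrix = F @ x                  # matrix-vector product costs O(N^2)

# O(N log N) approach: the FFT computes exactly the same transform
X_fft = np.fft.fft(x)
assert np.allclose(X_matrix, X_fft)

# Backwards: the Fourier matrix inverts cheaply, F^{-1} = conj(F) / N
x_back = np.conj(F) @ X_matrix / N
assert np.allclose(x_back, x)
```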

Source Coding Review

Basic Source

For a discrete source with known symbol probabilities, we can just do Huffman coding directly.
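A minimal Huffman sketch (my own illustration; the four-symbol source at the bottom is made up):

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a {symbol: probability} source."""
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
print(code)  # a gets 1 bit, b gets 2 bits, c and d get 3 bits each
```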

Continuous Real Source

We can quantize the continuous source to a finite set of levels, and then do Huffman coding on the level indices.
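A sketch of the quantization step (assuming numpy; the Gaussian source and 3-bit budget are arbitrary choices): map each real sample to the nearest of \(2^{b}\) uniformly spaced levels, then Huffman-code the resulting indices as above.

```python
import numpy as np

x = np.random.randn(10_000)          # continuous real source (arbitrary)
b = 3                                # bits per sample -> 2**b levels
levels = np.linspace(x.min(), x.max(), 2**b)

# Nearest-level quantization: index of the closest level per sample
idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
x_hat = levels[idx]                  # quantized reconstruction

# idx is now a discrete source over 2**b symbols, ready for Huffman coding
print("MSE:", np.mean((x - x_hat) ** 2))
```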

Continuous-Time Source

There are a few strategies to get discrete symbols; the standard one is to sample the signal in time (per the Nyquist theorem) and then quantize each sample, reducing this to the previous case.

SU-ENGR76 JUN042024

Recall that we care about three things: \(M\) (the number of codewords), \(L\) (the codeword length), and \(d_{\min}\) (the minimum distance). For a repetition code over a binary symmetric channel with flip probability \(p\), the error probability decreases as the codeword length grows:

\(L = 1\): \(P_{e} = p\)
\(L = 3\): \(P_{e} \approx 3p^{2}\)
\(L = 5\): \(P_{e} \approx 10p^{3}\)

(leading-order majority-decoding error probabilities; decoding fails when more than half of the \(L\) copies are flipped)
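A quick simulation of this trend (a sketch assuming numpy; \(p = 0.1\) is arbitrary): repeat one bit \(L\) times through a binary symmetric channel and majority-decode.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.1                                  # BSC crossover probability
trials = 200_000
approx = {1: p, 3: 3 * p**2, 5: 10 * p**3}   # leading-order predictions

for L in (1, 3, 5):
    flips = rng.random((trials, L)) < p  # which of the L copies get flipped
    # Majority decoding fails when more than half of the copies flip
    err = (flips.sum(axis=1) > L // 2).mean()
    print(f"L={L}: simulated {err:.4f}, predicted {approx[L]:.4f}")
```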

Recall Shannon’s Channel-Coding Theorem: reliable communication is possible at any rate below the channel capacity, and impossible at any rate above it.

SU-ENGR76 MAY022024

Nyquist Sampling Theorem

Formally, the Nyquist limit is stated as:

Let \(X(t)\) be a continuous-time signal whose frequency representation is bounded by \([0, B]\) Hz. If \(X\) is sampled every \(T\) seconds with \(T < \frac{1}{2B}\) (the sampling interval is smaller than \(\frac{1}{2B}\)), or equivalently \(\frac{1}{T} > 2B\) (the sampling frequency is larger than \(2B\)), then \(X\) can be reconstructed from its samples \(X(0), X(T), X(2T), \ldots\). For example, audio band-limited to \(20\) kHz must be sampled above \(40\) kHz, which is why CD audio uses \(44.1\) kHz.

At every time \(t\), we can go back and forth between \(X\) and its samples via:
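The intended formula here is presumably the standard Whittaker-Shannon interpolation (each sample weights a shifted sinc, i.e., a superposition of sinusoids):

\begin{equation} X(t) = \sum_{n=0}^{\infty} X(nT)\, \mathrm{sinc}\qty(\frac{t - nT}{T}), \qquad \mathrm{sinc}(u) = \frac{\sin(\pi u)}{\pi u} \end{equation}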