
SU-ENGR76 APR182024

Last edited: August 8, 2025

Fourier Series as a shifted sum of sinusoids

Key idea: every periodic function with period \(T\) can be represented as a sum of sinusoids

\begin{equation} f(t) = A_0 + \sum_{j=1}^{\infty} B_{j} \sin \qty(j \omega t + \phi_{j}) \end{equation}

where \(\omega = \frac{2\pi}{T}\). Notice that without the \(A_0\) term, the sum would integrate to \(0\) over every period; the constant \(A_0\) is what biases the mean away from zero.

Now we would like to get rid of the shift term \(\phi_{j}\). Applying the sine angle-sum formula:

\begin{equation} B_{j} \sin \qty(j \omega t + \phi_{j}) = B_{j} \cos \qty(\phi_{j}) \sin \qty(j \omega t) + B_{j} \sin \qty(\phi_{j}) \cos \qty(j \omega t) \end{equation}

so each shifted sinusoid splits into a pure sine term and a pure cosine term with constant coefficients.
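This expansion can be checked numerically; a minimal sketch, where the values of \(B\), \(\omega\), and \(\phi\) are illustrative assumptions:

```python
import numpy as np

# Check that B*sin(w*t + phi) == B*cos(phi)*sin(w*t) + B*sin(phi)*cos(w*t)
# for illustrative (assumed) values of B, w, and phi.
B, w, phi = 2.0, 3.0, 0.7
t = np.linspace(0, 10, 1000)

shifted = B * np.sin(w * t + phi)
a, b = B * np.cos(phi), B * np.sin(phi)          # sine and cosine coefficients
expanded = a * np.sin(w * t) + b * np.cos(w * t)

assert np.allclose(shifted, expanded)
```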

SU-ENGR76 APR232024


Fourier Series components form a basis

Recall the definition of a basis, and in particular of an orthonormal basis. Writing a vector as a linear combination of an orthonormal basis is especially easy: each coefficient is just the inner product of the vector with the corresponding basis element.
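A small numeric sketch of why this is easy (the 2-D basis and vector here are illustrative assumptions):

```python
import numpy as np

# With an orthonormal basis, each coefficient is just an inner product:
# c_k = <v, e_k>.  Illustrative basis: the standard basis rotated 45 degrees.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0]) / np.sqrt(2)

v = np.array([3.0, 5.0])
c1, c2 = v @ e1, v @ e2                  # coefficients via inner products
reconstructed = c1 * e1 + c2 * e2        # the linear combination recovers v

assert np.allclose(reconstructed, v)
```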

Recall that the Fourier Series is defined as:

\begin{equation} f(x) = a_0 + \sum_{k=1}^{\infty} \qty( a_{k} \cos(k \omega x) + b_{k} \sin(k \omega x)) \end{equation}

where \(\omega = \frac{2\pi}{L}\), and

\begin{equation} a_0 = \frac{\langle f, 1 \rangle}{ \langle 1,1 \rangle} = \frac{1}{L} \int_{0}^{L} f(x) \dd{x} \end{equation}
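Since \(a_0\) is just the mean of \(f\) over one period, it can be sanity-checked numerically; a sketch using an assumed test function whose mean is known by construction:

```python
import numpy as np

# a_0 = (1/L) * integral of f over one period, i.e. the mean of f.
# Illustrative f: the constant 2.5 plus zero-mean sinusoids.
L = 2 * np.pi
x = np.linspace(0, L, 100_000)
f = 2.5 + np.sin(x) + 0.5 * np.cos(3 * x)

a0 = f[:-1].mean()    # uniform Riemann-sum approximation of (1/L) * integral
assert abs(a0 - 2.5) < 1e-6
```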

SU-ENGR76 APR252024


Every periodic function with period \(T\) can be written as a linear combination:

\begin{equation} f(t) = b_{0} + \sum_{j=1}^{\infty} \qty( a_{j} \sin \qty( 2\pi \frac{j}{T} t) + b_{j} \cos \qty(2\pi \frac{j}{T} t) ) \end{equation}

Finite-Bandwidth Signal

If the summation here is finite, we call the signal finite-bandwidth. You can draw two separate stem plots over frequency, one for the \(\sin\)-term coefficients and one for the \(\cos\)-term coefficients.

Bandwidth

For a particular signal, identify the largest frequency \(f_{\max}\) and smallest frequency \(f_{\min}\) corresponding to non-zero coefficients; the bandwidth is then defined as \(f_{\max} - f_{\min}\).
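One way to sketch this computationally (the example signal, with components at 3 Hz and 7 Hz, is an assumption):

```python
import numpy as np

# Find the bandwidth of a sampled signal as the spread between the
# smallest and largest frequencies with non-negligible coefficients.
T, N = 1.0, 1000                        # one period, N samples (assumed)
t = np.arange(N) * (T / N)
f = np.sin(2*np.pi*3*t) + 0.5*np.cos(2*np.pi*7*t)   # 3 Hz and 7 Hz

coeffs = np.fft.rfft(f) / N
freqs = np.fft.rfftfreq(N, d=T/N)       # frequency axis in Hz
active = freqs[np.abs(coeffs) > 1e-6]   # frequencies with nonzero weight

bandwidth = active.max() - active.min()
assert np.isclose(bandwidth, 4.0)       # 7 Hz - 3 Hz
```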

SU-ENGR76 APR302024


Discrete Fourier Transform

The direct matrix computation of the DFT scales as \(O(N^{2})\), which becomes expensive for large \(N\). The Fast Fourier Transform reduces this to \(O(N \log N)\) time.

We can move between a signal and its Fourier representation in both directions by inverting the Fourier matrix; up to a scaling by \(N\), its inverse is its conjugate transpose.

Source Coding Review

Basic Source

For a discrete source, we can apply Huffman coding directly.
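A minimal Huffman-coding sketch (the `huffman_code` helper and its merging strategy are an illustrative implementation, not the course's reference code):

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code (symbol -> bitstring) from a sequence of symbols."""
    freq = Counter(symbols)
    # Heap entries: (weight, unique tiebreak, {symbol: partial codeword}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol source
        return {s: "0" for _, _, t in heap for s in t}
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)      # merge two least-probable groups
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code("aaaabbc")
assert len(code["a"]) == 1                   # most frequent symbol is shortest
```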

Continuous Real Source

We first quantize the continuous source into a finite set of levels, and then apply Huffman coding to the level indices.
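A sketch of the quantization step (a uniform quantizer; the Gaussian source and the bit depth are assumptions):

```python
import numpy as np

# Uniformly quantize a continuous source into 2^b levels; the resulting
# level indices are discrete symbols, ready for Huffman coding.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)                 # continuous samples (assumed Gaussian)

b = 3                                       # bits per sample -> 8 levels
lo, hi = x.min(), x.max()
step = (hi - lo) / 2**b
idx = np.clip(((x - lo) / step).astype(int), 0, 2**b - 1)   # discrete symbols
x_hat = lo + (idx + 0.5) * step             # reconstruct at bin centers

# Quantization error is bounded by half a step.
assert np.max(np.abs(x - x_hat)) <= step / 2 + 1e-12
```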

Continuous-Time Source

There are a few strategies for obtaining discrete symbols from a continuous-time source; the standard one is to sample the signal in time (at or above the Nyquist rate for band-limited signals), then quantize the samples.

SU-ENGR76 JUN042024


Recall that we care about three things: \(M\) (the number of codewords), \(L\) (the codeword length), and \(d_{\min}\) (the minimum distance). For a repetition code over a binary symmetric channel with flip probability \(p\), the error probability of majority decoding decreases as the codeword length increases; to leading order:

\(L = 1\): error probability \(p\)
\(L = 3\): error probability \(\approx 3p^{2}\)
\(L = 5\): error probability \(\approx 10p^{3}\)
Recall Shannon's Channel-Coding Theorem: for any rate below the channel capacity, there exist codes achieving arbitrarily small error probability, while no such codes exist at rates above capacity.