Posts

SU-CS205L JAN212025

Last edited: August 8, 2025

Insights to SVD: “every matrix is a diagonal matrix, when viewed in the right space”

We can solve a linear system by moving it around:

\begin{align} & Ac = b \\ \Rightarrow\ & U \Sigma V^{T} c = b \\ \Rightarrow\ & U \qty(\Sigma V^{T} c) = b \\ \Rightarrow\ & \Sigma V^{T} c = U^{T} b \end{align}

(since \(U\) is orthonormal, its inverse is just its transpose)

Call \(U^{T} b = \hat{b}\), call \(V^{T} c = \hat{c}\). We now have:

\begin{equation} \Sigma \hat{c} = \hat{b} \end{equation}

which is a diagonal system we can solve componentwise, then recover \(c = V \hat{c}\).
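A minimal numpy sketch of this pipeline (the matrix and right-hand side here are illustrative, not from the notes):

```python
import numpy as np

# Solve Ac = b by changing into the SVD's bases, where A is diagonal.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
b = rng.standard_normal(3)

U, sigma, Vt = np.linalg.svd(A)   # A = U diag(sigma) V^T
b_hat = U.T @ b                   # b-hat = U^T b (U orthonormal, so U^{-1} = U^T)
c_hat = b_hat / sigma             # diagonal system: Sigma c-hat = b-hat
c = Vt.T @ c_hat                  # undo the change of basis: c = V c-hat

assert np.allclose(A @ c, b)
```

Each step is a basis change or a trivial diagonal solve; all the “hard” work lives in computing the SVD itself.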

SU-CS205L JAN232025

Issues with Direct Methods

  • direct solvers can suffer from numerical stability issues: the quadratic formula has a numerically stable variant, but for cubics there is no such fix, so the errors may be unacceptable
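To make the quadratic case concrete, here is a sketch of the numerically stable quadratic formula (the function name is illustrative): it avoids the catastrophic cancellation in \(-b + \sqrt{b^2 - 4ac}\) when \(b > 0\) by computing the well-conditioned root first and recovering the other from the product of roots, \(c/a\).

```python
import math

def stable_quadratic_roots(a, b, c):
    """Roots of ax^2 + bx + c = 0, avoiding cancellation (assumes real roots)."""
    disc = math.sqrt(b * b - 4 * a * c)
    # q has the same sign as -b, so b and the square root never nearly cancel
    q = -0.5 * (b + math.copysign(disc, b))
    return q / a, c / q   # root pair via x1 = q/a and x1*x2 = c/a

# When b^2 >> 4ac, the naive formula loses nearly all digits of the small root:
r1, r2 = stable_quadratic_roots(1.0, 1e8, 1.0)
```

For cubics there is no analogous rearrangement that fixes all the cancellation cases, which is the point being made above.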

Continuous Collision Detection

Implementing collision detection: three points generally give three vectors \(v_1, v_2, v_3\) in \(\mathbb{R}^{3}\); if they become linearly dependent, we know a collision happened. In particular, if the rank of \(\mqty(v_1, v_2, v_3)\) is less than 3, we have collided.

Problem! Solving this (setting the determinant of our matrix to zero to figure out when the collision happened) results in a cubic polynomial, and finding its roots is numerically quite unstable.
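A small sketch of the rank/determinant test itself (the helper name and tolerance are illustrative); the cubic arises once each \(v_i\) depends linearly on time \(t\) and we ask when this determinant crosses zero:

```python
import numpy as np

def has_collided(v1, v2, v3, tol=1e-12):
    """Linear dependence test: rank < 3 is equivalent to det = 0."""
    M = np.column_stack([v1, v2, v3])
    return abs(np.linalg.det(M)) < tol

# Independent vectors: no collision; coplanar vectors: collision.
assert not has_collided([1, 0, 0], [0, 1, 0], [0, 0, 1])
assert has_collided([1, 0, 0], [0, 1, 0], [1, 1, 0])
```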

SU-CS205L JAN282025

Line Search and Steepest Descent

Gram-Schmidt For Matrix Orthogonality

You can use Gram-Schmidt to build \(A\)-orthogonal (conjugate) directions. In particular, for a series of vectors \(s^{(j)}\), orthogonalize under the \(A\)-inner product \(\langle x, y \rangle_{A} = x^{T} A y\):

\begin{equation} s^{(q)} = s^{(q)} - \sum_{q'=1}^{q-1} \frac{\langle s^{(q)}, s^{(q')} \rangle_{A}}{\langle s^{(q')}, s^{(q')} \rangle_{A}} s^{(q')} \end{equation}

For Conjugate Gradient, it works out such that only one such dot product is non-zero, so we can write:

\begin{equation} s^{(q)} = r^{(q)} + \frac{r^{(q)}\cdot r^{(q)}}{r^{(q-1)}\cdot r^{(q-1)}} s^{(q-1)} \end{equation}

for residual \(r^{(q)}\).
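The general Gram-Schmidt step above can be sketched directly (the function names are illustrative, and \(A\) is assumed symmetric positive definite so that \(\langle \cdot, \cdot \rangle_{A}\) is a genuine inner product):

```python
import numpy as np

def a_inner(A, x, y):
    """The A-inner product <x, y>_A = x^T A y."""
    return x @ A @ y

def a_orthogonalize(A, vectors):
    """Gram-Schmidt under <.,.>_A: returns mutually A-orthogonal directions."""
    directions = []
    for s in vectors:
        s = np.asarray(s, dtype=float).copy()
        for d in directions:
            # subtract the A-projection of s onto each earlier direction
            s -= (a_inner(A, s, d) / a_inner(A, d, d)) * d
        directions.append(s)
    return directions

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # symmetric positive definite (assumed)
d1, d2 = a_orthogonalize(A, [np.array([1.0, 0.0]), np.array([0.0, 1.0])])
assert abs(a_inner(A, d1, d2)) < 1e-12   # conjugate: d1^T A d2 = 0
```

CG's one-term update quoted above is exactly this recurrence after the other projection coefficients vanish.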

SU-CS205L Quiz 2/10

It’s a bad idea when

SU-CS205L Quiz 3/3
