NUS-MATH530 Geometric Interpretations
Last edited: August 8, 2025
Dot product
Calculations
Let’s calculate some dot products!
\begin{equation} \begin{pmatrix} 1 \\ 0 \end{pmatrix} \cdot \begin{pmatrix} 0 \\ 1 \end{pmatrix} = 0 \end{equation}
\begin{equation} \begin{pmatrix} 1 \\ 2 \end{pmatrix} \cdot \begin{pmatrix} 2 \\ 1 \end{pmatrix} = 4 \end{equation}
\begin{equation} \begin{pmatrix} 1 \\ 1 \end{pmatrix} \cdot \begin{pmatrix} -1 \\ 1 \end{pmatrix} = 0 \end{equation}
\begin{equation} \begin{pmatrix} 1 \\ 1 \end{pmatrix} \cdot \begin{pmatrix} 2 \\ 2 \end{pmatrix} = 4 \end{equation}
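Each of these follows from the componentwise formula \(a \cdot b = a_{1}b_{1} + a_{2}b_{2}\); for instance:
\begin{equation} \begin{pmatrix} 1 \\ 2 \end{pmatrix} \cdot \begin{pmatrix} 2 \\ 1 \end{pmatrix} = 1 \cdot 2 + 2 \cdot 1 = 4 \end{equation}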
Interpretation
Geometrically, the interpretation of the dot product is the signed length of one vector's projection onto the other, scaled by the other vector's length: \(a \cdot b = \|a\| \|b\| \cos\theta\). This is essentially multiplying the component of one vector that's parallel to the other by the other's length.
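For instance, with \(a = (1,2)\) and \(b = (2,1)\) from above, the projection of \(a\) onto \(b\) has signed length \(\|a\| \cos\theta = 4/\sqrt{5}\), and scaling by \(\|b\| = \sqrt{5}\) recovers the value computed earlier:
\begin{equation} a \cdot b = \left( \|a\| \cos\theta \right) \|b\| = \frac{4}{\sqrt{5}} \cdot \sqrt{5} = 4 \end{equation}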
NUS-MATH530 Geometric Multiplicity
Last edited: August 8, 2025
Let \(\lambda_{m}\) be an eigenvalue of \(T\), an operator on a complex finite-dimensional vector space \(V\). Let \(m\) be the geometric multiplicity of \(\lambda_{m}\). We desire that the algebraic multiplicity is at least \(m\). Let \(\dim V = n\).
We have that \(m\) is the geometric multiplicity of \(\lambda_{m}\), meaning:
\begin{equation} \dim E(\lambda_{m}, T) = m \end{equation}
This means we can take \(m\) linearly independent eigenvectors of \(\lambda_{m}\) from \(V\). Extend this list now to a basis \(v_{1}, \dots, v_{m}, u_{1}, \dots, u_{n-m}\) of \(V\).
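With respect to this basis, \(T v_{j} = \lambda_{m} v_{j}\) for each \(j \leq m\), so the matrix of \(T\) takes the block form below (here \(A\) and \(B\) denote the blocks determined by the images of the \(u_{j}\), names introduced only for this sketch):
\begin{equation} \mathcal{M}(T) = \begin{pmatrix} \lambda_{m} I_{m} & A \\ 0 & B \end{pmatrix} \end{equation}
The characteristic polynomial \(\det(zI - \mathcal{M}(T)) = (z - \lambda_{m})^{m} \det(z I_{n-m} - B)\) is therefore divisible by \((z - \lambda_{m})^{m}\), so the algebraic multiplicity of \(\lambda_{m}\) is at least \(m\), as desired.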
NUS-MATH530 Homework Index
Last edited: August 8, 2025
NUS-MATH530 Linear Vehicles
Last edited: August 8, 2025
Infinite Plane
Two Vehicles
Yes. Though the travel of the two vehicles is not entirely independent, the second vehicle can diagonally traverse the plane while the first vehicle cuts across it. Practically, the question asks whether a combination of:
\begin{equation} \alpha \begin{pmatrix} 0 \\ 1 \end{pmatrix} + \beta \begin{pmatrix} 1 \\ 1 \end{pmatrix} \end{equation}
can form all vectors in \(\mathbb{R}^2\). Expanding that expression out, we have, given some point \((a,b)\), that:
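\begin{equation} \begin{pmatrix} \beta \\ \alpha + \beta \end{pmatrix} = \begin{pmatrix} a \\ b \end{pmatrix} \end{equation}
Taking \(\beta = a\) and \(\alpha = b - a\) solves this system for any \((a,b)\), so every point of the plane is reachable.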
NUS-MATH530 Matrix Adjectives
Last edited: August 8, 2025
Factoids:
- \((AB)^{*} = B^{*} A^{*}\), \((A+B)^{*} = A^{*} + B^{*}\)
- A unitary operator is invertible, and the inverse of its matrix representation is its conjugate transpose \(M^{*}\)
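To see why (writing \(m_{1}, \dots, m_{n}\) for the columns of \(M\), names introduced just for this check): entry \((j,k)\) of \(M^{*}M\) is the inner product of the \(k\)-th column with the \(j\)-th, so orthonormality of the columns gives
\begin{equation} (M^{*}M)_{j,k} = \langle m_{k}, m_{j} \rangle = \delta_{j,k}, \qquad \text{so } M^{*}M = I \text{ and } M^{-1} = M^{*} \end{equation}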
Take \(M\) a unitary square matrix, with orthonormal columns. Note that this matrix, by construction, sends each basis vector \(v_{j}\) to \(e_{j}\), the \(j\)-th column of \(M\). This yields a set of \(\dim V\) (as there are \(\dim V\) columns to \(M\)) linearly independent (as the \(e_{j}\), through orthonormality, are linearly independent) vectors. As we have \(\dim V\) linearly independent vectors, the \(e_{j}\) form a basis. As each \(v_{j}\) is sent to \(e_{j}\), and both lists are bases of \(V\), we note that the finite-dimensional operator corresponding to \(M\) is surjective and hence invertible.