
NUS-MATH530 Geometric Multiplicity

Last edited: August 8, 2025

Let \(\lambda_{m}\) be an eigenvalue of \(T\), an operator on a complex finite-dimensional vector space \(V\), and let \(m\) be the geometric multiplicity of \(\lambda_{m}\). We wish to show that the algebraic multiplicity of \(\lambda_{m}\) is at least \(m\). Let \(\dim V = n\).

We have that \(m\) is the geometric multiplicity of \(\lambda_{m}\), meaning:

\begin{equation} \dim E(\lambda_{m}, T) = m \end{equation}

This means we can take \(m\) linearly independent eigenvectors of \(\lambda_{m}\) from \(V\). Extend this list now to a basis of \(V\): \(v_{1}, \dots, v_{m}, u_{1}, \dots, u_{n-m}\).
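
One standard way to finish from here (a sketch, not spelled out in the original notes): with respect to this extended basis, since \(T v_{j} = \lambda_{m} v_{j}\) for \(j = 1, \dots, m\), the matrix of \(T\) is block upper-triangular:

\begin{equation} \mathcal{M}(T) = \begin{pmatrix} \lambda_{m} I_{m} & B \\ 0 & C \end{pmatrix} \end{equation}

where \(B\) is \(m \times (n-m)\) and \(C\) is \((n-m) \times (n-m)\). The characteristic polynomial then factors as \((z - \lambda_{m})^{m} \det(z I_{n-m} - C)\), so \(\lambda_{m}\) is a root of multiplicity at least \(m\), i.e. the algebraic multiplicity is at least \(m\).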

NUS-MATH530 Linear Vehicles


Infinite Plane

Two Vehicles

Yes. Though the motion of the two vehicles is not entirely independent, the second vehicle can traverse the plane diagonally while the first cuts across it. Practically, the question asks whether a combination of:

\begin{equation} \alpha \begin{pmatrix} 0 \\ 1 \end{pmatrix} + \beta \begin{pmatrix} 1 \\ 1 \end{pmatrix} \end{equation}

can form all vectors in \(\mathbb{R}^2\). Expanding that expression out, we have, given some point \((a,b)\), that:
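
Expanding gives \(\alpha (0,1)^{T} + \beta (1,1)^{T} = (\beta,\ \alpha + \beta)^{T}\), so \(\beta = a\) and \(\alpha = b - a\) solve the system for any \((a,b)\). A minimal numerical sketch of this check, assuming NumPy (the specific target point \((3,5)\) is an illustrative choice, not from the notes):

```python
import numpy as np

# Columns are the two direction vectors: (0,1) for the first
# vehicle and (1,1) for the second.
M = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# Nonzero determinant: the two directions span R^2.
assert np.linalg.det(M) != 0

# Reach an arbitrary point (a, b) = (3, 5):
# solve M @ [alpha, beta] = [a, b].
alpha, beta = np.linalg.solve(M, np.array([3.0, 5.0]))
print(alpha, beta)  # alpha = b - a = 2, beta = a = 3
```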

NUS-MATH530 Matrix Adjectives


Factoids:

  • \((AB)^{*} = B^{*} A^{*}\), \((A+B)^{*} = A^{*} + B^{*}\)

A unitary operator is invertible, and the inverse of its matrix representation is its conjugate transpose

Take \(M\) a unitary square matrix with orthonormal columns. Note that this matrix, by construction, sends each basis vector \(v_{j}\) to \(e_{j}\)—a set of \(\dim V\) (as there are \(\dim V\) columns to \(M\)) linearly independent (as the \(e_{j}\), through orthonormality, are linearly independent) vectors. As we have \(\dim V\) linearly independent vectors, the \(e_{j}\) form a basis. As each \(v_{j}\) is sent to \(e_{j}\)—both lists being bases of \(V\)—we note that the finite-dimensional operator corresponding to \(M\) is surjective and hence invertible.
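
These two factoids can be checked numerically. A sketch assuming NumPy, using one small illustrative unitary matrix (this specific matrix is my choice, not from the notes):

```python
import numpy as np

# A small unitary matrix: columns are orthonormal in C^2.
M = (1 / np.sqrt(2)) * np.array([[1.0, 1.0j],
                                 [1.0j, 1.0]])

# Orthonormal columns means M* M = I ...
assert np.allclose(M.conj().T @ M, np.eye(2))

# ... so M is invertible, and its inverse is the
# conjugate transpose (adjoint): M^{-1} = M*.
assert np.allclose(np.linalg.inv(M), M.conj().T)
```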

NUS-MATH530 Plane and 1.B


Equation of a Plane

We want to determine all points on the plane formed by two vectors.

Let’s take two vectors \(\vec{u} \in V\) and \(\vec{v} \in V\). The vector orthogonal to both of them (i.e. the normal direction of the plane) is:

\begin{equation} \vec{u}\times \vec{v} \end{equation}

by the definition of the cross product.

The points on the plane, therefore, have to themselves be orthogonal to this normal vector. This means that the dot product of a candidate vector with the normal vector should be \(0\):
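
A small numerical sketch of this, assuming NumPy (the two spanning vectors \(\vec{u}, \vec{v}\) below are illustrative choices for a plane through the origin, not from the notes):

```python
import numpy as np

# Two vectors spanning a plane through the origin.
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, -1.0])

# Normal direction of the plane, by the cross product.
n = np.cross(u, v)

# Any point in the plane, x = alpha*u + beta*v, has
# zero dot product with the normal vector n.
x = 3.0 * u - 2.0 * v
assert np.isclose(np.dot(n, x), 0.0)

print(n)  # components (n1, n2, n3) of the plane equation n . (x,y,z) = 0
```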