
NUS-MATH530 Linear Vehicles

Last edited: August 8, 2025

Infinite Plane

Two Vehicles

Yes. Though the travel of the two vehicles is not entirely independent, the second vehicle can diagonally traverse the plane while the first vehicle cuts across it. Practically, the question asks whether a combination of:

\begin{equation} \alpha \begin{pmatrix} 0 \\ 1 \end{pmatrix} + \beta \begin{pmatrix} 1 \\ 1 \end{pmatrix} \end{equation}

can form all vectors in \(\mathbb{R}^2\). Expanding that expression out, given some target point \((a,b)\), we have:
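\begin{equation} \begin{pmatrix} \beta \\ \alpha + \beta \end{pmatrix} = \begin{pmatrix} a \\ b \end{pmatrix} \end{equation}

Matching components gives \(\beta = a\) and then \(\alpha = b - a\): every target \((a,b)\) is reachable, so the two directions span \(\mathbb{R}^2\).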

NUS-MATH530 Matrix Adjectives

Last edited: August 8, 2025

Factoids:

  • \((AB)^{*} = B^{*} A^{*}\), \((A+B)^{*} = A^{*} + B^{*}\)

A unitary operator is invertible, and the inverse of its matrix representation is its conjugate transpose.

Take \(M\) a unitary square matrix, with orthonormal columns. Note that this matrix, by construction, sends each basis vector \(v_{j}\) to \(e_{j}\), giving a set of \(\dim V\) vectors (as there are \(\dim V\) columns to \(M\)) that are linearly independent (as the \(e_{j}\), through orthonormality, are linearly independent). As we have \(\dim V\) linearly independent vectors, the \(e_{j}\) form a basis. As each \(v_{j}\) is sent to \(e_{j}\), and both sets are bases of \(V\), the finite-dimensional operator corresponding to \(M\) is surjective and hence invertible.
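As a concrete check, the planar rotation by \(\theta\) is unitary, and its inverse is its conjugate transpose (here just the transpose, as the entries are real):

\begin{equation} M = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad M^{*}M = \begin{pmatrix} \cos^{2}\theta + \sin^{2}\theta & 0 \\ 0 & \sin^{2}\theta + \cos^{2}\theta \end{pmatrix} = I \end{equation}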

NUS-MATH530 Plane and 1.B

Last edited: August 8, 2025

Equation of a Plane

We want to determine all points on the plane formed by two vectors.

Let’s take two vectors \(\vec{u} \in V\) and \(\vec{v} \in V\). The vector orthogonal to both of them (i.e. the normal direction of the plane) is:

\begin{equation} \vec{u}\times \vec{v} \end{equation}

by the definition of the cross product.
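For reference, in coordinates this normal works out to:

\begin{equation} \vec{u}\times \vec{v} = \begin{pmatrix} u_{2}v_{3} - u_{3}v_{2} \\ u_{3}v_{1} - u_{1}v_{3} \\ u_{1}v_{2} - u_{2}v_{1} \end{pmatrix} \end{equation}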

The points on the plane, therefore, have to themselves be orthogonal to this normal vector. This means that the dot product of any candidate vector with this normal should be \(0\):
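\begin{equation} (\vec{u}\times \vec{v}) \cdot \vec{x} = 0 \end{equation}

for every point \(\vec{x}\) on the plane through the origin; expanding the dot product recovers the familiar form \(n_{1}x_{1} + n_{2}x_{2} + n_{3}x_{3} = 0\), where \(\vec{n} = \vec{u}\times \vec{v}\).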

NUS-MATH530 Similar to Diagonal

Last edited: August 8, 2025

Prove that \(T\) is diagonalizable if and only if the matrix of \(T\) is similar to a diagonal matrix.

Try 2.


Given similarity:

So we have that:

\begin{equation} D = S^{-1} A S \end{equation}

where \(D\) is diagonal. We multiply both sides on the left by \(S\) to yield:

\begin{equation} SD = AS \end{equation}

Now, note that \(S\) is invertible. This means that its columns are linearly independent: as an invertible operator it is injective, and hence has a zero null space; the dimension of its range is therefore that of the whole space, so its column vectors are spanning; and since there are \(\dim V\) such columns, they form a basis and are hence linearly independent.
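Reading \(SD = AS\) column by column, with \(s_{j}\) the \(j\)-th column of \(S\) and \(\lambda_{j}\) the \(j\)-th diagonal entry of \(D\), we get:

\begin{equation} A s_{j} = \lambda_{j} s_{j} \end{equation}

so the columns of \(S\) are \(\dim V\) linearly independent eigenvectors of \(A\), which is exactly what diagonalizability of \(T\) requires.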