Linear Algebra Errors
Last edited: August 8, 2025
Gaussian Elimination Quiz
- Demonstrate that matrix multiplication is not commutative (error: didn’t consider \(m\times m\) matrices); see the worked counterexample after this list
- Which \(2\times 2\) matrices under multiplication form a group? (error: closure needs to be proved for the invertible matrices under multiplication, not just for all \(2\times 2\) matrices)
- Deriving Rotation matrices (error: clockwise vs counter-clockwise)
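A quick worked counterexample for the commutativity question (these particular matrices are just an illustrative choice):
\begin{equation} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} \neq \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \end{equation}
Padding both matrices with an identity block in the lower-right corner turns this into an \(m\times m\) counterexample for any \(m \geq 2\).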
Linear Independence Quiz
- Connection between linear independence and systems of equations (error: beat around the bush): an \(n\times n\) system of equations has a unique solution exactly when the matrix’s column vectors are linearly independent
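A minimal sketch of the dependent-columns case (the numbers are just for illustration):
\begin{equation} \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \end{pmatrix} \end{equation}
The second column is twice the first, so the columns are linearly dependent; this system has no solution, and swapping the right-hand side to \((1,2)\) gives infinitely many solutions, never exactly one.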
Basis and Dimension Quiz
- Put \(0\) into a basis: not linearly independent! Also, figure out what the basis for polynomials with a certain root is: it has dimension \(m\) (instead of \(m+1\)), because the scalars (constant polynomials) don’t work in the case \(p(3)=0\), so they drop out of the basis; see the sketch after this list
- Missing some inequality about bases? It’s just that linearly independent lists are no longer than a basis, and spanning lists are no shorter than a basis
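A sketch of that root-constrained basis, assuming the space in question is \(\{p \in \mathcal{P}_{m}(\mathbb{F}) : p(3) = 0\}\):
\begin{equation} (x-3),\ (x-3)x,\ (x-3)x^{2},\ \dots,\ (x-3)x^{m-1} \end{equation}
Any \(p\) with \(p(3)=0\) factors as \((x-3)q(x)\) with \(\deg q \leq m-1\), so this list spans; its entries have distinct degrees, so it is linearly independent. That gives dimension \(m\), one less than \(\dim \mathcal{P}_{m}(\mathbb{F}) = m+1\).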
Final, part 1
- definition of vector space: scalar multiplication is not an operation on \(V\) alone (it is a map \(\mathbb{F}\times V \to V\))
- straight forgot \(\dim(U+V) = \dim U + \dim V - \dim(U\cap V)\); a worked instance follows this list
- plane containing \((1,0,2)\) and \((3,-1,1)\): math mistake
- proof: \(\det A \det B = \det(AB)\)
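A worked instance of the dimension formula, taking \(U\) and \(V\) to be two distinct planes through the origin in \(\mathbb{R}^3\):
\begin{equation} \dim(U+V) = \dim U + \dim V - \dim(U\cap V) = 2 + 2 - 1 = 3 \end{equation}
Two distinct planes through the origin intersect in a line (dimension \(1\)), and together they span all of \(\mathbb{R}^3\).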
Final, part 2
- Disprove: if \(v_1, \dots, v_4\) is a basis of \(V\), and \(U\) is a subspace of \(V\) with \(v_1, v_2 \in U\) and \(v_3, v_4 \notin U\), then \(v_1, v_2\) is a basis of \(U\)
- Disprove: if \(T \in \mathcal{L}(V,V)\) and \(T^{2}=0\), then \(T=0\) (see the sketch after this list)
- Disprove: if \(S,T \in \mathcal{L}(V,V)\) and \(ST=0\), then \(null\ S\) is contained in \(range\ T\)
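A minimal counterexample sketch for the \(T^{2}=0\) claim (this particular map on \(\mathbb{F}^{2}\) is just an illustrative choice):
\begin{equation} T(x,y) = (y, 0), \qquad T \neq 0, \qquad T^{2}(x,y) = T(y,0) = (0,0) \end{equation}
For the last claim, \(S = T = 0\) already works: \(ST = 0\), yet \(null\ S = V\) is not contained in \(range\ T = \{0\}\) whenever \(V \neq \{0\}\).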
Product Spaces Quiz
- Need a more specific description: explain why the words “product” and “quotient” are used to describe product and quotient spaces
- Prove that \(\mathcal{L}(V_1 \times V_2 \times \dots \times V_{m}, W)\) and \(\mathcal{L}(V_1, W) \times \dots \times \mathcal{L}(V_{m}, W)\) are isomorphic (error: didn’t do it correctly for the infinite-dimensional case); a sketch of the natural map follows
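A sketch of the natural map, which needs no dimension assumption (the names \(\Phi\) and \(\iota_{k}\) are just for this note):
\begin{equation} \Phi(T) = (T\circ \iota_{1}, \dots, T\circ \iota_{m}), \qquad \iota_{k}(v_{k}) = (0, \dots, 0, v_{k}, 0, \dots, 0) \end{equation}
\(\Phi\) is linear; it is injective because every \((v_1, \dots, v_{m})\) is the sum of the \(\iota_{k}(v_{k})\), so \(T\circ\iota_{k} = 0\) for all \(k\) forces \(T = 0\); and it is surjective because \((T_1, \dots, T_{m})\) is hit by \(T(v_1, \dots, v_{m}) = T_1v_1 + \dots + T_{m}v_{m}\).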
Quotient Spaces Quiz

- Couldn’t prove that the list is linearly independent: in \(V/U\), the linear combination becomes \(c_1v_1 + \dots + c_{m}v_{m} + U\); as \(v_1 + U, \dots, v_{m} + U\) is a basis of \(V / U\), \(c_1 = \dots = c_{m} = 0\); the second part of the list is a basis of \(U\), so its coefficients are \(0\) too.
- The spanning proof: write \(v + U\) in terms of the basis, etc.
- She graded this wrong: what’s the importance of \(\widetilde{T}\)?
- Give two statements equivalent to \(v+U = w+U\), and prove the equivalence between this statement and the others (the standard equivalences are sketched after this list)
- didn’t prove both directions!
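A sketch of the standard equivalences (which two were asked on the quiz isn’t recorded here):
\begin{equation} v + U = w + U \iff v - w \in U \iff (v+U) \cap (w+U) \neq \emptyset \end{equation}
Each arrow needs both directions; for instance, \(v - w \in U\) gives \(v + U \subseteq w + U\) and \(w + U \subseteq v + U\) as two separate containments.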
Polynomials Quiz
- State the fundamental theorem of algebra; error: \(\mathcal{P}_{m}(\mathbb{F})\) is the vector space of polynomials with degree at most \(m\), yet the FToA is about a polynomial of degree exactly \(m\)
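A sketch of the statement in factored form, over \(\mathbb{C}\):
\begin{equation} p(z) = c\,(z-\lambda_1)\cdots(z-\lambda_{m}), \qquad c, \lambda_1, \dots, \lambda_{m} \in \mathbb{C} \end{equation}
Every nonconstant polynomial over \(\mathbb{C}\) has a root; equivalently, a polynomial of degree exactly \(m \geq 1\) factors into \(m\) linear factors as above.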
Upper Triangular Quiz
- An upper-triangular representation is findable when the space is 1) over the complexes and 2) finite-dimensional; BOTH conditions are needed
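A quick example of why the complex condition matters (the rotation is just an illustrative choice):
\begin{equation} T = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \end{equation}
Over \(\mathbb{R}\), this rotation by \(90^{\circ}\) has no eigenvalues, and the diagonal entries of an upper-triangular matrix would have to be eigenvalues, so no basis of \(\mathbb{R}^{2}\) makes it upper triangular; over \(\mathbb{C}\) it has eigenvalues \(\pm i\) and an upper-triangular form exists.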
Upper Triangular Quiz
- UNCLEAR: geometric multiplicity is bounded by algebraic multiplicity. Algebraic multiplicity (“real estate” taken on the upper-triangular diagonal) vs. geometric multiplicity (the number of linearly independent eigenvectors that come with that eigenvalue); so if geometric multiplicity < algebraic multiplicity, the map is not diagonalizable, because the eigenvalue isn’t bringing enough linearly independent eigenvectors
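A minimal example of the gap (the matrix is just an illustrative choice):
\begin{equation} T = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix} \end{equation}
The eigenvalue \(2\) takes up both diagonal slots (algebraic multiplicity \(2\)), but \(T - 2I\) has a one-dimensional null space, so the geometric multiplicity is \(1\): there is no basis of eigenvectors, and \(T\) is not diagonalizable.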
Diagonalization Quiz
- Having enough eigenvalues (\(\dim V\) distinct ones) goes in only one direction: if they exist, the operator is diagonalizable, but the converse isn’t true (example after this list)
- The proof that \(T\) is diagonalizable IFF the matrix of \(T\) is similar to a diagonal matrix: see NUS-MATH530 Similar to Diagonal
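A quick example of the failed converse:
\begin{equation} I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \end{equation}
The identity on \(\mathbb{F}^{2}\) is already diagonal, so it is certainly diagonalizable, yet it has only one distinct eigenvalue rather than \(\dim V = 2\) of them.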
Final, part 1
- State the complex spectral theorem (error: the condition of normality is a PARALLEL result)
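A sketch of the statement, for a finite-dimensional complex inner product space \(V\) and \(T \in \mathcal{L}(V,V)\):
\begin{equation} TT^{*} = T^{*}T \iff V \text{ has an orthonormal basis of eigenvectors of } T \end{equation}
In other words, \(T\) is normal exactly when it is diagonalizable with respect to an orthonormal basis.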
Final, part 2
- Said this was true, but it’s not: \(null\ T \oplus range\ T = V\) implies \(T\) is diagonalizable (a single \(2\times 2\) Jordan block with eigenvalue \(1\) has \(null\ T = \{0\}\) and \(range\ T = V\), yet it isn’t diagonalizable)
- Said this was true, but it’s false: \(T^{2}= 0\) IFF \(null\ T = range\ T\). Suppose \(T=0\): then \(T^{2} = 0\), but \(null\ T = V\) while \(range\ T = \{0\}\).
- The spectral theorem doesn’t define diagonalizability; it describes diagonalizability with respect to an ORTHONORMAL basis
- missing derivation of the pseudoinverse
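A sketch of one standard derivation, the least-squares route in the matrix case with linearly independent columns (this may not be the exact derivation from class):
\begin{equation} \min_{x} \| Ax - b \|^{2} \;\Rightarrow\; A^{*}Ax = A^{*}b \;\Rightarrow\; x = (A^{*}A)^{-1}A^{*}b, \qquad A^{+} = (A^{*}A)^{-1}A^{*} \end{equation}
The middle step (the normal equations) comes from requiring \(Ax - b\) to be orthogonal to the range of \(A\); linear independence of the columns makes \(A^{*}A\) invertible.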
Linear Algebra Index
Last edited: August 8, 2025
The bible stays the same: (Axler 1997)
We will be less exploratory; Axler will pretty much tell us. However, we should try to say something in class every single class period.
There is a ban on numbers over 4 in this class.
Best Practices
- Ask questions
- Talk to each other
- Make mistakes
- From Riley: know the Proof Design Patterns
Non-Axler but Important
Things we are explicitly told to know, but that are not immediately in Axler. You bet determinants are going to be here.
linear combination
Last edited: August 8, 2025
A Linear Combination of vectors is… guess what? Any vector formed by combining vectors at arbitrary scales.
constituents
- A list of vectors \(v_1, \dots,v_{m}\)
- Scalars \(a_1, \dots, a_{m} \in \mathbb{F}\)
requirements
A Linear Combination is defined formally by:
\begin{equation} v = a_1v_1+\dots+a_{m}v_{m} \end{equation}
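A tiny numeric instance (the vectors are just an illustrative choice):
\begin{equation} (7, 2) = 2\,(1,1) + 5\,(1,0) \end{equation}
Here the constituents are \(v_1 = (1,1)\), \(v_2 = (1,0)\) with scalars \(a_1 = 2\), \(a_2 = 5\).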
Linear Constant-Coefficient Equation
Last edited: August 8, 2025
Here it is:
\begin{equation} a\frac{dy}{dx} + by = c \end{equation}
For some constants \(a,b,c\). The name is pretty obvious, because the coefficients are constants and the highest power on everything is \(1\). It’s first-order because the only derivative that appears is the first derivative.
linear (diffeq)
We technically call it “linear” because: if \(y_1(x)\) and \(y_2(x)\) are two solutions of the homogeneous equation (\(c = 0\)), then a linear combination \(Ay_1(x)+By_2(x)\) is also a solution. It’s “linear” because linear combinations work.
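A one-line check of that superposition claim for the homogeneous case:
\begin{equation} a(Ay_1+By_2)' + b(Ay_1+By_2) = A(ay_1'+by_1) + B(ay_2'+by_2) = A\cdot 0 + B\cdot 0 = 0 \end{equation}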