
Linear Algebra Errors

Last edited: August 8, 2025

Gaussian Elimination Quiz

  • Demonstrate that matrix multiplication is not commutative (error: didn’t consider \(m\times m\) matrices)
  • Which \(2\times 2\) matrices under multiplication form a group? (error: closure needs to be proved for the invertible matrices under multiplication, not all \(2\times 2\) matrices)
  • Deriving rotation matrices (error: clockwise vs. counter-clockwise); see the sketch after this list
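
A quick numerical sketch of the first and last items (the counter-clockwise convention and the random \(3\times 3\) matrices are my choices for illustration):

#+begin_src python
import numpy as np

# Counter-clockwise rotation by theta (positive theta sends (1,0) toward (0,1)).
def rot_ccw(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# The clockwise rotation is the transpose, i.e. rotation by -theta.
theta = np.pi / 6
print(np.allclose(rot_ccw(-theta), rot_ccw(theta).T))   # True

# Non-commutativity: two random 3x3 matrices almost never commute.
rng = np.random.default_rng(0)
A, B = rng.random((3, 3)), rng.random((3, 3))
print(np.allclose(A @ B, B @ A))                         # False
#+end_src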

Linear Independence Quiz

  • Connection between linear independence and systems of equations (error: beat around the bush) — an \(n\times n\) system of equations has a unique solution for every right-hand side if and only if the matrix’s column vectors are linearly independent (sketch below)
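
A minimal numerical check of that statement (the matrices here are arbitrary examples):

#+begin_src python
import numpy as np

# Independent columns: A is invertible, so Ax = b has exactly one solution.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, 0.0])
print(np.linalg.matrix_rank(A))      # 2: full rank, columns independent
print(np.linalg.solve(A, b))         # the unique solution

# Dependent columns: the matrix is singular and solve() raises LinAlgError.
A_dep = np.array([[1.0, 2.0], [2.0, 4.0]])
print(np.linalg.matrix_rank(A_dep))  # 1
#+end_src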

Basis and Dimension Quiz

  • put \(0\) into a basis (error: any list containing \(0\) is automatically not linearly independent); figure out what a basis for the polynomials with a certain root is: the subspace \(\{p \in \mathcal{P}_{m}(\mathbb{F}) : p(3)=0\}\) has dimension \(m\) (instead of \(m+1\)), because the nonzero constant polynomials can’t satisfy \(p(3)=0\); the lost dimension is exactly the scalars (sketch after this list)
  • missing some inequality about bases? — it’s just that a linearly independent list is no longer than a basis, and a spanning list is no shorter than a basis
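
A small numerical sketch of both points (representing \(\mathcal{P}_{3}(\mathbb{R})\) by coefficient vectors; the choices \(m=3\) and root \(3\) are just for illustration):

#+begin_src python
import numpy as np
from scipy.linalg import null_space

# The constraint p(3) = 0 is one linear equation on the coefficients (a_0, ..., a_3),
# so the solution space inside the 4-dimensional P_3 has dimension 3, not 4.
m = 3
constraint = np.array([[3.0 ** k for k in range(m + 1)]])  # p(3) = sum a_k 3^k
basis = null_space(constraint)       # columns form a basis of {p : p(3) = 0}
print(basis.shape[1])                # 3 = m, one less than dim P_3 = m + 1

# And any list containing the zero vector is automatically dependent:
with_zero = np.column_stack([basis[:, 0], np.zeros(m + 1)])
print(np.linalg.matrix_rank(with_zero))  # 1 < 2, so this 2-vector list is dependent
#+end_src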

Final, part 1

  • definition of vector space: scalar multiplication is not a binary operation on \(V\) (it is a map \(\mathbb{F} \times V \to V\))
  • straight forgot \(dim(U+V) = dim U + dim V - dim (U\cap V)\)
  • plane containing \((1,0,2)\) and \((3,-1,1)\): math mistake (see the sketch after this list)
  • proof: \(\det A \det B = \det(AB)\) (numerical check in the same sketch)
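
A quick numerical sketch of the last two items (assuming the plane meant is the one through the origin spanned by the two vectors, which is my reading of the problem):

#+begin_src python
import numpy as np

# Plane through the origin spanned by u and v: its normal is the cross product.
u = np.array([1.0, 0.0, 2.0])
v = np.array([3.0, -1.0, 1.0])
n = np.cross(u, v)
print(n)              # [ 2.  5. -1.]; the plane is {x : n . x = 0}
print(n @ u, n @ v)   # 0.0 0.0, so u and v both lie in the plane

# Sanity check of det(A) det(B) = det(AB) on random matrices.
rng = np.random.default_rng(1)
A, B = rng.random((3, 3)), rng.random((3, 3))
print(np.isclose(np.linalg.det(A) * np.linalg.det(B), np.linalg.det(A @ B)))  # True
#+end_src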

Final, part 2

  • Counterproof: if \(v_1, \dots, v_4\) is a basis of \(V\), and \(U\) is a subspace of \(V\) with \(v_1, v_2 \in U\) and \(v_3, v_4\) not in \(U\), then \(v_1, v_2\) is a basis of \(U\)
  • Counterproof: if \(T \in \mathcal{L}(V,V)\) and \(T^{2}=0\), then \(T=0\) (concrete counterexample after this list)
  • Counterproof: if \(S,T \in \mathcal{L}(V,V)\) and \(ST=0\), then \(null\ S\) is contained in \(range\ T\)
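
A concrete instance for the \(T^{2}=0\) item (a \(2\times 2\) nilpotent block; my choice of example, not the graded solution):

#+begin_src python
import numpy as np

# A nonzero operator whose square is zero.
T = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(T @ T, 0))   # True: T^2 = 0
print(np.allclose(T, 0))       # False: yet T != 0
#+end_src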

Product Spaces Quiz

  • Need a more specific description: explain why we use “product” and “quotient” to describe product and quotient spaces
  • Prove that \(\mathcal{L}(V_1 \times V_2 \times \dots \times V_{m}, W)\) and \(\mathcal{L}(V_1, W) \times \dots \times \mathcal{L}(V_{m}, W)\) are isomorphic. Error: didn’t do it correctly for the infinite-dimensional case (sketch below)
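
A sketch of an explicit map that avoids dimension counting (the inclusion notation \(\iota_{k}\) is mine: \(\iota_{k} : V_{k} \to V_1 \times \dots \times V_{m}\) places \(v_{k}\) in slot \(k\) and \(0\) elsewhere):

\begin{equation} \Phi : \mathcal{L}(V_1 \times \dots \times V_{m}, W) \to \mathcal{L}(V_1, W) \times \dots \times \mathcal{L}(V_{m}, W), \qquad \Phi(T) = (T \circ \iota_1, \dots, T \circ \iota_{m}) \end{equation}

\(\Phi\) is linear, and it is invertible because \(T(v_1, \dots, v_{m}) = \sum_{k} (T \circ \iota_{k})(v_{k})\) recovers \(T\) from its components; nothing here needs finite-dimensionality.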

Quotient Spaces Quiz

  • Couldn’t prove that the list is linearly independent: pushed down to \(V/U\), a linear combination becomes \(c_1v_1 + \dots + c_{m}v_{m} + U\); as \(v_1 + U, \dots, v_{m} + U\) is a basis of \(V / U\), \(c_1 = \dots = c_{m} = 0\); what’s left is a combination of the other part of the list, which is also a basis (of \(U\)), so those coefficients are \(0\) too.
    • The spanning proof: start from \(v + U\), rewrite it in terms of the basis, etc.
  • she graded wrong: what’s the importance of \(\widetilde{T}\)?
  • Give two statements equivalent to \(v+U = w+U\), prove equivalence between this statement and the others (standard equivalences sketched after this list)
    • didn’t prove both directions!
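
For reference, two standard candidates (Axler’s result on parallel affine subsets; possibly not the exact pair asked for on the quiz):

\begin{equation} v + U = w + U \iff v - w \in U \iff (v+U) \cap (w+U) \neq \varnothing \end{equation}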

Polynomials Quiz

  • state the Fundamental Theorem of Algebra; error: \(\mathcal{P}_{m}(\mathbb{F})\) is the vector space of polynomials with degree at most \(m\), and yet the FToA is about a polynomial of degree exactly \(m\) (statement sketched below)
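
One phrasing of the statement, for reference (my wording, with the degree-exactly-\(m\) hypothesis that caused the slip):

\begin{equation} p \in \mathcal{P}(\mathbb{C}),\ \deg p = m \geq 1 \implies p(z) = c(z - \lambda_1)\cdots(z - \lambda_{m}) \text{ for some } c, \lambda_1, \dots, \lambda_{m} \in \mathbb{C} \end{equation}

Equivalently: every nonconstant polynomial with complex coefficients has a root in \(\mathbb{C}\).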

Upper Triangular Quiz

  • an upper-triangular matrix representation is findable when 1) the vector space is over the complex numbers and 2) it is finite-dimensional; need BOTH conditions (numerical sketch below)
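
A numerical companion to this fact (a sketch via the Schur decomposition, which additionally makes the basis orthonormal; not the quiz’s proof):

#+begin_src python
import numpy as np
from scipy.linalg import schur

# Every complex square matrix is unitarily similar to an upper-triangular one.
rng = np.random.default_rng(2)
A = rng.random((4, 4)) + 1j * rng.random((4, 4))
T, Z = schur(A, output='complex')           # A = Z T Z*, with T upper triangular
print(np.allclose(A, Z @ T @ Z.conj().T))   # True
print(np.allclose(T, np.triu(T)))           # True: T is upper triangular
#+end_src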

Upper Triangular Quiz

  • UNCLEAR: geometric multiplicity is bounded by algebraic multiplicity; algebraic multiplicity (“real estate” taken on the upper-triangular diagonal) vs. geometric multiplicity (the number of linearly independent eigenvectors that come with that eigenvalue); so if geometric multiplicity < algebraic multiplicity, the map is not diagonalizable, because that eigenvalue isn’t bringing enough linearly independent eigenvectors (example below)
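
A standard example of the gap (my choice, not from the quiz): the eigenvalue \(2\) has algebraic multiplicity \(2\) but geometric multiplicity \(1\), so the matrix is not diagonalizable.

#+begin_src python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
print(np.linalg.eigvals(A))                          # [2. 2.]: algebraic multiplicity 2
geo = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))   # dim null(A - 2I)
print(geo)                                           # 1: geometric multiplicity 1
#+end_src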

Diagonalization Quiz

  • “enough eigenvalues” goes in only one direction: having \(\dim V\) distinct eigenvalues means the operator is diagonalizable, but the opposite isn’t true (example below)
  • the proof that \(T\) is diagonalizable IFF the matrix of \(T\) is similar to a diagonal matrix: NUS-MATH530 Similar to Diagonal
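
A tiny example of the failing converse (my example): the identity on \(\mathbb{R}^{2}\) is already diagonal, yet it has only one distinct eigenvalue rather than \(\dim V = 2\) of them.

#+begin_src python
import numpy as np

I = np.eye(2)                             # diagonalizable (it is already diagonal)
print(np.unique(np.linalg.eigvals(I)))    # [1.]: a single distinct eigenvalue
#+end_src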

Final, part 1

  • State the complex spectral theorem (error: the condition of normality is a PARALLEL result); statement sketched below
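
For reference, a sketch of the statement (assuming \(V\) is a finite-dimensional complex inner product space and \(T \in \mathcal{L}(V)\)):

\begin{equation} TT^{*} = T^{*}T \iff V \text{ has an orthonormal basis consisting of eigenvectors of } T \end{equation}

Equivalently, \(T\) is normal iff \(T\) has a diagonal matrix with respect to some orthonormal basis of \(V\).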

Final, Part 2

  • Said this was true, but it’s not: \(null\ T \oplus range\ T = V\) implies \(T\) is diagonalizable (e.g. any invertible operator that isn’t diagonalizable already satisfies the direct-sum condition)
  • Said this was true, but it’s false: \(T^{2}= 0\) IFF \(null\ T = range\ T\); counterexample: take \(T=0\), then \(T^{2} = 0\) but \(null\ T = V\) and \(range\ T = \{0\}\)
  • Spectral theorem doesn’t characterize diagonalizability in general; it characterizes diagonalizability with respect to an ORTHONORMAL basis
  • missing derivation of the pseudoinverse (sketch below)
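
One standard derivation, as a sketch (assuming \(A\) has linearly independent columns so that \(A^{*}A\) is invertible): minimize the squared residual of \(Ax = b\) and set the gradient to zero.

\begin{equation} \min_{x} \lVert Ax - b \rVert^{2} \implies A^{*}(Ax - b) = 0 \implies x = (A^{*}A)^{-1}A^{*}b, \qquad A^{+} = (A^{*}A)^{-1}A^{*} \end{equation}

In the general rank-deficient case the pseudoinverse is instead computed from the SVD, which is what numpy.linalg.pinv does.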

Linear Algebra Index

Last edited: August 8, 2025

The bible stays the same: (Axler 1997)

We will be less exploratory; Axler will pretty much tell us. However, we should try to say something in class every single class period.

There is a ban on numbers over 4 in this class.

Best Practices

Non-Axler but Important

Things we are explicitly told to know but that are not immediately in Axler. You bet determinants are going to be here.

linear combination

Last edited: August 8, 2025

A Linear Combination of vectors is a… guess what? Any vector formed by a combination of vectors at arbitrary scales.

constituents

  • A list of vectors \(v_1, \dots,v_{m}\)
  • Scalars \(a_1, \dots, a_{m} \in \mathbb{F}\)

requirements

A Linear Combination is defined formally by:

\begin{equation} v = a_1v_1+\dots+a_{m}v_{m} \end{equation}
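
A tiny numerical illustration (the vectors and scalars are arbitrary):

#+begin_src python
import numpy as np

# v = a_1 v_1 + a_2 v_2 for arbitrary scalars a_1, a_2
v1, v2 = np.array([1.0, 0.0, 2.0]), np.array([3.0, -1.0, 1.0])
a1, a2 = 2.0, -1.0
print(a1 * v1 + a2 * v2)   # [-1.  1.  3.]
#+end_src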

Linear Constant-Coefficient Equation

Last edited: August 8, 2025

Here it is:

\begin{equation} a\frac{dy}{dx} + by = c \end{equation}

For some constants \(a,b,c\). The name is pretty obvious, because we have constant coefficients and the highest power on everything is \(1\). It’s first-order because only the first derivative appears.

linear (diffeq)

We technically call it “linear” because: if \(y_1(x)\) and \(y_2(x)\) are two solutions of the homogeneous equation (\(c = 0\)), then a linear combination \(Ay_1(x)+By_2(x)\) is also a solution. It’s “linear” because linear combinations work.
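
A one-line check of that superposition claim (homogeneous case, \(c = 0\)):

\begin{equation} a\frac{d}{dx}\left(Ay_1+By_2\right) + b\left(Ay_1+By_2\right) = A\left(ay_1' + by_1\right) + B\left(ay_2' + by_2\right) = A\cdot 0 + B\cdot 0 = 0 \end{equation}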

Linear Constraint Optimization

Last edited: August 8, 2025

\begin{align} \min_{x}\ &c^{\top} x \\ s.t.\ &Ax \leq b \\ & x \geq 0 \end{align}

  • linear objective function
  • linear constraints

A single inequality forms a half-space; the entire feasible set is the intersection of a series of these linear inequalities, and each half-space is CONVEX. The resulting feasible set, then, is ALSO convex, meaning any line segment between two points of the set remains within the set. So, any local minimum is a global minimum (a worked sketch with scipy appears after the list below).

3 cases of design points

  1. points in the interior of the feasible set are never optimal, because we can always decrease the objective by moving against its gradient \(c\)
  2. points on a face can be optimal only if the face is perpendicular to \(c\), the gradient of our objective function; when a face is optimal you can slide along it, giving infinitely many solutions (because \(c^{\top}x\) doesn’t change along that face)
  3. points at a vertex can be optimal
  4. an unclosed (unbounded) feasible set: the solution may be unbounded
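
A minimal worked sketch (the particular \(c\), \(A\), \(b\) are arbitrary placeholders, and scipy’s linprog is just one solver choice):

#+begin_src python
import numpy as np
from scipy.optimize import linprog

# min c^T x  subject to  A x <= b,  x >= 0
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 6.0])

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # [3. 1.] -5.0: the optimum lands on a vertex of the feasible set
#+end_src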

linear program equality form

\begin{align} \min_{x}\ &c^{\top} x \\ s.t.\ &Ax = b \\ & x \geq 0 \end{align}
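
The usual route from the inequality form to this equality form (a sketch) is to introduce slack variables \(s \geq 0\), one per inequality:

\begin{equation} Ax \leq b \iff \exists\, s \geq 0:\ Ax + s = b \end{equation}

Stacking \(x\) and \(s\) into one nonnegative variable and padding \(c\) with zeros then gives the equality form above.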