Axler 5.C
Last edited: August 8, 2025

Key Sequence
- we defined an eigenspace, which is the space of all eigenvectors corresponding to a given eigenvalue (together with \(0\)), and showed that
- they form a direct sum to the whole space
- and, as a corollary to how direct sums are kind of like disjoint sets, we have the perhaps expected result on dimension: the sum of the eigenspaces’ dimensions must be less than or equal to that of \(V\)
- we defined a Diagonal Matrix, which by its structure + calculation can be shown to require that it is formed by a basis of eigenvectors
- and from there, and the properties of eigenspaces above, we deduce some conditions equivalent to diagonalizability
- a direct corollary of the last point (perhaps more straightforwardly intuited by just lining eigenvalues up diagonally in a matrix) is that enough eigenvalues implies diagonalizability
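The “enough eigenvalues implies diagonalizability” result can be checked numerically: a minimal numpy sketch (the matrix here is a made-up example, not from Axler) with \(\dim V = 2\) distinct eigenvalues, whose eigenvectors therefore form a basis that diagonalizes it.

```python
# Sketch: an operator with dim V distinct eigenvalues is diagonalizable.
# The matrix A is a made-up example with eigenvalues 2 and 3.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # upper-triangular, so eigenvalues sit on the diagonal

eigenvalues, eigenvectors = np.linalg.eig(A)

# Two distinct eigenvalues in a 2-dimensional space, so the eigenvectors
# (columns of P) form a basis, and P^{-1} A P is diagonal.
P = eigenvectors
D = np.linalg.inv(P) @ A @ P

print(np.round(D, 10))  # diagonal, with the eigenvalues on the diagonal
```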
New Definitions
Results and Their Proofs
- eigenspaces are a direct sum
- dimension of sum of eigenspaces is smaller than or equal to the dimension of the whole space
- conditions equivalent to diagonalizability
- enough eigenvalues implies diagonalizability
Questions for Jana
- for diagonalizability, shouldn’t \(n\) be \(m\) on item 3?
Interesting Factoids
Short!
Axler 6.A
Last edited: August 8, 2025

Hear ye, hear ye! Length and angles are a thing now!!
Key Sequence
- We remembered how dot products work, then proceeded to generalize them into inner products; this is needed because complex numbers don’t behave well when squared (squares can come out negative or non-real), so we need to add in conjugation as a special guardrail
- We then learned that the dot product is just an instance of the Euclidean Inner Product, which itself is simply one of many inner products. A vector space that has a well-defined inner product is now called an Inner Product Space
- Along with revisiting our definition of dot products to include complexes, we changed our definition of norm to be in terms of inner products, \(\sqrt{\langle v,v \rangle}\), to help support complex vector spaces better; we then also redefined orthogonality and showed a few results regarding it
- Then, we did a bunch of analysis-y work to understand some properties of norms and inner products: the Pythagorean Theorem, the Cauchy-Schwarz Inequality, the triangle inequality, and the parallelogram equality.
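These definitions and two of the analysis-y results can be checked numerically. A minimal sketch, assuming numpy and Axler’s convention of conjugating the second slot; the vectors are made-up examples.

```python
# Sketch: the Euclidean inner product on C^n with conjugation on the second
# slot, <u, v> = sum u_j * conj(v_j), plus the norm sqrt(<v, v>) and two
# results from 6.A. The vectors u, v are made-up examples.
import numpy as np

def inner(u, v):
    return np.sum(u * np.conj(v))

def norm(v):
    return np.sqrt(inner(v, v).real)  # <v, v> is always real and nonnegative

u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
assert abs(inner(u, v)) <= norm(u) * norm(v)

# parallelogram equality: ||u+v||^2 + ||u-v||^2 = 2(||u||^2 + ||v||^2)
lhs = norm(u + v) ** 2 + norm(u - v) ** 2
rhs = 2 * (norm(u) ** 2 + norm(v) ** 2)
assert np.isclose(lhs, rhs)
```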
New Definitions
- dot product and the more generally important…
- inner product!, Euclidean Inner Product, Inner Product Space
- norm
- orthogonal
- and now, a cornucopia of analysis
Results and Their Proofs
- properties of the dot product
- properties of inner product
- properties of the norm
- orthogonality and \(0\)
Questions for Jana
- How much of the analysis-y proof work do we have to remember for the analysis-y results?
Interesting Factoids
Axler 6.B
Last edited: August 8, 2025

OMG it’s Gram-Schmidting
Key Sequence
- we defined lists of vectors that all have norm 1 and are pairwise orthogonal as orthonormal; we showed an orthonormal list is linearly independent by hijacking the Pythagorean Theorem
- of course, once we have a finitely long linearly independent thing we must be able to build a basis. The nice thing about such an orthonormal basis is that for every vector we know precisely what its coefficients have to be! Specifically, \(a_{j} = \langle v, e_{j} \rangle\). That’s cool.
- What we really want, though, is to be able to get an orthonormal basis from a regular basis, which we can do via Gram-Schmidt. In fact, this gives us some useful corollaries regarding the existence of orthonormal bases (just Gram-Schmidt a normal one), extending an orthonormal list to a basis, etc. There are also important implications (still along the veins of “just Gram-Schmidt it!”) for upper-triangular matrices as well
- We also learned that, as a result of orthonormal bases, any linear functional (a Linear Map to scalars) on a finite-dimensional inner product space can be represented as an inner product via the Riesz Representation Theorem, which is honestly kinda epic.
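The procedure above can be sketched in numpy: a minimal Gram-Schmidt that also checks the coefficient formula \(a_{j} = \langle v, e_{j} \rangle\). The starting basis and test vector are made-up examples.

```python
# Sketch of Gram-Schmidt orthonormalization, plus a check that every vector
# is the sum of <v, e_j> e_j over the resulting orthonormal basis.
import numpy as np

def inner(u, v):
    return np.sum(u * np.conj(v))

def gram_schmidt(vectors):
    """Turn a linearly independent list into an orthonormal list."""
    es = []
    for v in vectors:
        # subtract off the components along the e_j found so far...
        for e in es:
            v = v - inner(v, e) * e
        # ...then normalize what remains
        es.append(v / np.sqrt(inner(v, v).real))
    return es

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]  # a made-up (non-orthonormal) basis of R^3
e = gram_schmidt(basis)

# orthonormality: <e_i, e_j> = 1 if i == j, else 0
for i in range(3):
    for j in range(3):
        assert np.isclose(inner(e[i], e[j]), 1.0 if i == j else 0.0)

# the coefficient formula: v = sum_j <v, e_j> e_j
v = np.array([2.0, -1.0, 3.0])
reconstructed = sum(inner(v, ej) * ej for ej in e)
assert np.allclose(v, reconstructed)
```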
New Definitions
- orthonormal + orthonormal basis
- Gram-Schmidt (i.e. orthonormalization)
- linear functional and Riesz Representation Theorem
Results and Their Proofs
- Norm of an Orthogonal Linear Combination
- An orthonormal list is linearly independent
- An orthonormal list of the right length is a basis
- Writing a vector as a linear combination of orthonormal basis
- Corollaries of Gram-Schmidt
- Riesz Representation Theorem
Questions for Jana
Interesting Factoids
Axler 7.A
Last edited: August 8, 2025

This is not actually like a proper review of a chapter; instead, it is an opinionated review of what I think Jana thinks Axler thinks is important about 7.A.
Note that all of the “proofy things” in this section are poofy (i.e. omitted) because of problems with trips scheduled prior to the end of the year.
Here’s an outline:
- We defined the adjoint
- We learned some properties of the adjoint; importantly, that \((A+B)^{*} = A^{*} + B^{*}\), \((AB)^{*} = B^{*} A^{*}\), and \((\lambda T)^{*} = \bar{\lambda}T^{*}\); a corollary is that \(M^{*}M\) is self-adjoint
- We defined normal, self-adjoint, and unitary
- With those definitions, we showed that eigenvalues of self-adjoint matrices are real
- Then, we created two mildly interesting intermediate results
- Over \(\mathbb{C}\), \(Tv\) is orthogonal to \(v\) for all \(v\) IFF \(T\) is the zero matrix
- Over \(\mathbb{R}\), if \(Tv\) is orthogonal to \(v\) for all \(v\) and \(T\) is self-adjoint, then \(T\) is the zero matrix
- These feed into the fact that eigenvectors of \(T\) corresponding to distinct eigenvalues are orthogonal if \(T \in \mathcal{L}(V)\) is normal
- All of this builds up to the result of the Complex Spectral Theorem, which you should know
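Several of these facts can be checked numerically, since (with respect to an orthonormal basis) the adjoint is the conjugate transpose. A minimal numpy sketch; the matrices are made-up examples.

```python
# Sketch: adjoint properties, real eigenvalues of self-adjoint matrices,
# and the Complex Spectral Theorem, checked on made-up matrices.
import numpy as np

adj = lambda M: M.conj().T  # the adjoint, in matrix form

A = np.array([[1 + 1j, 2j], [0, 3 - 1j]])
B = np.array([[2, 1 - 1j], [1j, 1]])

# (AB)* = B* A*, and M* M is self-adjoint
assert np.allclose(adj(A @ B), adj(B) @ adj(A))
assert np.allclose(adj(adj(A) @ A), adj(A) @ A)

# self-adjoint => real eigenvalues
H = np.array([[2, 1 - 1j], [1 + 1j, 3]])  # H = H*
assert np.allclose(np.linalg.eigvals(H).imag, 0)

# Complex Spectral Theorem: a normal matrix has an orthonormal eigenbasis.
# H is self-adjoint (hence normal); eigh returns that basis as columns of U.
eigenvalues, U = np.linalg.eigh(H)
assert np.allclose(U @ np.diag(eigenvalues) @ adj(U), H)
```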
adjoint
Suppose \(T \in \mathcal{L}(V,W)\); we define the adjoint as the \(T^{*} \in \mathcal{L}(W,V)\) that satisfies \(\langle Tv, w \rangle = \langle v, T^{*}w \rangle\) for all \(v \in V\) and \(w \in W\).
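The defining property of the adjoint, \(\langle Tv, w \rangle = \langle v, T^{*}w \rangle\), can be verified numerically when \(T^{*}\) is taken as the conjugate transpose of \(T\)'s matrix (valid with respect to orthonormal bases). A minimal sketch; \(T\), \(v\), \(w\) are made-up examples.

```python
# Sketch: checking <Tv, w> = <v, T* w>, with T* the conjugate transpose.
import numpy as np

def inner(u, v):
    return np.sum(u * np.conj(v))

T = np.array([[1 + 2j, 0], [3, 1 - 1j], [2j, 4]])  # maps C^2 -> C^3
T_star = T.conj().T                                 # maps C^3 -> C^2

v = np.array([1 - 1j, 2])
w = np.array([1j, 3, 1 + 1j])

assert np.isclose(inner(T @ v, w), inner(v, T_star @ w))
```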
Backpacks
Last edited: August 8, 2025

AAAAA I want a good backpack.
requirements
- explicit laptop compartment (whether intentional or not; a water bladder compartment that fits a laptop is fine)
- earbags (those fannypack things on the side of the bottom belt); needs to be large (i.e. enough to fit an iphone 5)
- raincover
- at least 3 compartments, ideally one with a pen holder and key ring, and the outermost being very accessible (think mesh bag)
basically I want an exact replica of the Columbia Silver Ridge 30L from 2012, which they don’t sell anymore; the new one breaks requirement 4 slightly and is also $150, and I got mine for like $60-70 (it was like 300-350 RMB) max in 2012
