Axler 3.D
Last edited: August 8, 2025
Isomorphisms. Somebody’s new favourite word since last year.
Key Sequence
- we showed that a linear map’s inverse is unique, and so named the inverse \(T^{-1}\)
- we then showed an important result: injectivity and surjectivity together imply invertibility
- this property allowed us to use invertible maps to define isomorphic spaces, naming the invertible map between them the isomorphism
- we saw that having the same dimension is enough to show invertibility (in fact it’s an IFF), because we can define a map sending a basis of one space onto a basis of the other
- we then used that property to establish that matrices and linear maps have an isomorphism between them: namely, the matrixify operator \(\mathcal{M}\).
- this isomorphism allows us to show that the dimension of a space of Linear Maps is the product of the dimensions of its domain and codomain (that \(\dim \mathcal{L}(V,W) = (\dim V)(\dim W)\))
- We then, for some unknown reason, decided that right this second we gotta define the matrix of a vector, and that linear map application is like matrix multiplication because of it. Not sure how this relates
- finally, we defined a Linear Map from a space to itself as an operator
- we finally showed an important result: although it fails for infinite-dimensional vector spaces, injectivity is equivalent to surjectivity for operators on finite-dimensional spaces
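The last point lends itself to a quick numerical sanity check. This is my own toy example, not Axler’s: for an operator on a finite-dimensional space, injectivity and surjectivity both reduce to “full rank” via rank-nullity, so they always come together.

```python
import numpy as np

# An operator on R^2 (any square matrix works for this check)
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

rank = np.linalg.matrix_rank(A)
injective = rank == A.shape[1]      # trivial null space
surjective = rank == A.shape[0]     # range is all of R^2
print(injective, surjective)        # always both True or both False
```

Swap in a singular matrix (say, zero out a row) and both flags flip to `False` together.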
New Definitions
Results and Their Proofs
- linear map inverse is unique
- injectivity and surjectivity imply invertibility
- two vector spaces are isomorphic IFF they have the same dimension
- matrices and Linear Maps of the right dimensions are isomorphic
- \(\dim \mathcal{L}(V,W) = (\dim V)(\dim W)\)
- \(\mathcal{M}(T)_{.,k} = \mathcal{M}(Tv_{k})\), a result of how everything is defined (see matrix of a vector)
- linear maps are like matrix multiplication
- injectivity is surjectivity in finite-dimensional operators
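The “linear maps are like matrix multiplication” result can be sketched numerically. A toy example of my own, using the standard bases: applying \(T\) to \(v\) and then taking coordinates is the same as multiplying \(\mathcal{M}(T)\) by \(\mathcal{M}(v)\).

```python
import numpy as np

M_T = np.array([[1.0, 2.0],
                [3.0, 4.0]])     # M(T) w.r.t. the standard bases
v = np.array([5.0, 6.0])         # M(v): the coordinate vector of v

Tv = M_T @ v                     # matrix product computes M(Tv)
print(Tv)                        # [17. 39.]
```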
Questions for Jana
- why doesn’t Axler just say the “basis of domain” directly (i.e. he did a lin comb instead) for the second direction of the “two vector spaces are isomorphic IFF they have the same dimension” proof?
  - because the next steps for spanning (surjectivity) and linear independence (injectivity) are made more obvious
- clarify the “matrices and Linear Maps of the right dimensions are isomorphic” proof
- what is the “multiplication by \(x^{2}\)” operator?
  - literally multiplying by \(x^{2}\)
- how does the matrix of a vector detour relate to the content before and after? I suppose an isomorphism exists but it isn’t explicitly used in the “linear maps are like matrix multiplication” proof, which is the whole point
  - because we needed to close the loop of being able to do linear algebra with matrices completely, which we didn’t know without the isomorphism between matrices and maps
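To make the “multiplication by \(x^{2}\)” answer concrete, here’s a sketch of my own (the encoding is an assumption): on coefficient vectors with the constant term first, multiplying a polynomial by \(x^2\) just shifts its coefficients up by two slots.

```python
import numpy as np

def mul_x2(coeffs):
    """Coefficients of x^2 * p(x), given the coefficients of p(x)."""
    return np.concatenate(([0.0, 0.0], coeffs))

p = np.array([1.0, 2.0])        # p(x) = 1 + 2x
print(mul_x2(p))                # [0. 0. 1. 2.]  i.e. x^2 + 2x^3
```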
Interesting Factoids
Axler 3.E
Last edited: August 8, 2025
No idea why this is so long!!!
Key Sequence
Firehose of a chapter.
- We first began an unrelated exploration in Product of Vector Spaces (“tuples”):
- we show that the Product of Vector Spaces is a vector space
- because you can build a basis of the product by taking each basis vector of each factor and zeroing out every other slot of the tuple, we learned that the dimension of the Product of Vector Spaces is the sum of the factors’ dimensions.
- we defined the product-to-sum map \(\Gamma\)
- We then tackled the fun part of this chapter, which is affine subsets, parallel structures, quotient spaces, quotient map (affine subsetification maps)
- we learned an important and useful result that two affine subsets parallel to \(U\) are either equal or disjoint (the proof is a cycle: \(v-w \in U\) implies \(v+U = w+U\), which implies \((v+U) \cap (w+U) \neq \emptyset\), which implies the first thing again)
- we defined the operations on quotient space, and showed that they are well-defined (they behave uniformly on equivalent affine subsets). This, plus the usual closure checks, demonstrates that a quotient space is a vector space
- with the help of the affine subsetification map (the quotient map \(\pi\)), we show that the dimension of a quotient space is the difference between dimensions of its constituents essentially by invoking rank-nullity theorem after knowing the fact that \(null\ \pi = U\) (because \(u+U\) is an affine subset that has not been shifted (think about a line moving along itself… it doesn’t move))
- Then, and I’m not quite sure why, we defined \(\widetilde{T}: V / null\ T \to W\), for some \(T: V\to W\), defined as \(\widetilde{T}(v+null\ T) = Tv\).
- We show that the map is Linear, injective, its range is \(range\ T\), and so it forms an isomorphism between \(V / null\ T\) and \(range\ T\).
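The dimension bookkeeping behind that isomorphism can be checked numerically. My own example (not from the book): by rank-nullity, \(\dim(V / null\ T) = \dim V - \dim null\ T = \dim range\ T\).

```python
import numpy as np

# A map T: R^3 -> R^2 with a 1-dimensional range (second row = 2x first)
T = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(T)     # dim range T
dim_V = T.shape[1]                  # dim of the domain
dim_null = dim_V - rank             # rank-nullity
# dim(V / null T) = dim V - dim null T, which matches dim range T:
print(dim_V - dim_null == rank)     # True
```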
Here’s something: products and quotients, the intuition
Axler 3.F
Last edited: August 8, 2025
Because duality is fun and I’m bored and houjun-being-obtuse.
Key Sequence
New Definitions
Results and Their Proofs
Questions for Jana
Interesting Factoids
Hello from onboard NH107! Or perhaps my next connecting flight, or from China.
Axler 5.A
Last edited: August 8, 2025
EIGENSTUFF and OPERATORS! Invariant subspaces are nice.
Sometimes, if we can break the domain of a linear map down into eigenspaces, we can understand what it’s doing on a component-wise level.
Key Sequence
- we defined an invariant subspace, and gave a name to 1-D invariant subspaces: the span of eigenvectors
- we showed some properties of eigenvalues and showed that a list of eigenvectors corresponding to distinct eigenvalues is linearly independent
- a corollary of this is that an operator on finite-dimensional \(V\) has at most \(\dim V\) distinct eigenvalues
- finally, we defined map restriction operator and quotient operator, and showed that they were well-defined
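The linear-independence result is easy to check numerically. A toy example of my own: an operator with two distinct eigenvalues, whose eigenvectors stack into a full-rank (hence linearly independent) set.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])           # eigenvalues 2 and 3 (distinct)

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
# Distinct eigenvalues => the eigenvectors are linearly independent,
# i.e. the matrix of eigenvectors has full rank:
print(np.linalg.matrix_rank(eigvecs))  # 2
```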
New Definitions
- invariant subspace
- conditions for nontrivial invariant subspace
- eigenvalues + eigenvectors + eigenspace
- two new operators: map restriction operator and quotient operator
Results and Their Proofs
- properties of eigenvalues
- a list of eigenvectors (for distinct eigenvalues) is linearly independent
- quotient operator is well-defined
Questions for Jana
Interesting Factoids
“eigenvalue” is sometimes called the “characteristic value” of a map
Axler 5.B
Last edited: August 8, 2025
Key Sequence
- we began the chapter defining \(T^m\) (reminding ourselves the usual rules of \(T^{m+n} = T^{m}T^{n}\), \((T^{m})^{n} = T^{mn}\), and, for invertible maps, \(T^{-m} = (T^{-1})^{m}\)) and \(p(T)\), wrapping copies of \(T\) into coefficients of a polynomial, and from those definitions showed that polynomial of operator is commutative
- we then used those results + fundamental theorem of algebra to show that operators on complex vector spaces have an eigenvalue
- that previous, important result in hand, we then dove into upper-triangular matrices
- specifically, we learned the properties of upper-triangular matrix: if \(v_1, \dots, v_{n}\) is a basis of \(V\), then \(\mathcal{M}(T)\) is upper-triangular IFF \(Tv_{j} \in span(v_1, \dots, v_{j})\) for all \(j \leq n\); equivalently, \(span(v_1, \dots, v_{j})\) is invariant under \(T\)
- using that result, we show that every complex operator has an upper-triangular matrix
- using some neat tricks of algebra, we then establish that an operator is invertible IFF the diagonal of its upper-triangular matrix has no zeros, which seems awfully unmotivated until you learn that…
- eigenvalues of a map are the entries of the diagonal of its upper-triangular matrix, which is basically a direct corollary from the upper-triangular matrix of \(T-\lambda I\)
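That diagonal result can be sanity-checked numerically. My own example: build an upper-triangular matrix and confirm its eigenvalues are exactly the diagonal entries.

```python
import numpy as np

T = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 7.0]])      # upper-triangular, diagonal 2, 3, 7

eigs = np.sort(np.linalg.eigvals(T).real)
print(eigs)                          # [2. 3. 7.] -- the diagonal of T
```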
New Definitions
- \(T^m\)
- \(p(T)\)
- technically also product of polynomials
- matrix of an operator
- diagonal of a matrix
- upper-triangular matrix
Results and Their Proofs
- \(p(z) \to p(T)\) is a linear function
- polynomial of operator is commutative
- operators on complex vector spaces have an eigenvalue
- properties of upper-triangular matrix
- every complex operator has an upper-triangular matrix
- an operator is invertible IFF the diagonal of its upper-triangular matrix has no zeros
- eigenvalues of a map are the entries of the diagonal of its upper-triangular matrix
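The commutativity result is also easy to demonstrate. A sketch of my own: since powers of a fixed \(T\) commute with each other, any two polynomials in \(T\) commute.

```python
import numpy as np

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)

p_T = T @ T + 2 * T + I      # p(z) = z^2 + 2z + 1 applied to T
q_T = 3 * T - 5 * I          # q(z) = 3z - 5 applied to T
print(np.allclose(p_T @ q_T, q_T @ p_T))  # True
```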
Questions for Jana
- why define the matrix of an operator again??
  - just to stress that it’s square
- for the second flavor of the proof that every complex operator has an upper-triangular matrix, why is \(v_1 … v_{j}\) a basis of \(V\)?
Interesting Factoids
It’s 12:18 AM and I read this chapter for 5 hours. I also just got jumpscared by my phone notification. What’s happening?
