A result so important it gets a page.

Every operator on a finite-dimensional, non-zero, complex vector space has an eigenvalue.

Proof:

Suppose \(V\) is a complex vector space with dimension \(n > 0\), and \(T \in \mathcal{L}(V)\). Choose \(v \in V\) with \(v \neq 0\) (possible since \(V\) is non-zero).

Construct a list of \(n+1\) vectors:

\begin{equation} v, Tv, \dots, T^{n} v \end{equation}

Because we managed to cram \(n+1\) vectors into a list for a vector space of dimension \(n\), that list is linearly dependent.
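This construction can be checked numerically; here is a sketch assuming NumPy, with an arbitrary \(3 \times 3\) operator \(T\) and vector \(v\) chosen purely for illustration:

```python
import numpy as np

# An arbitrary operator T on C^3 (so n = 3) and a nonzero v,
# both chosen purely for illustration.
T = np.array([[1, 2, 0],
              [0, 3, 1],
              [1, 0, 2]], dtype=complex)
v = np.array([1, 0, 1], dtype=complex)

# Stack v, Tv, T^2 v, T^3 v as columns: n + 1 = 4 vectors in C^3.
cols = [v]
for _ in range(3):
    cols.append(T @ cols[-1])
K = np.column_stack(cols)

# The rank is at most 3 < 4, so the columns are linearly dependent.
print(K.shape, np.linalg.matrix_rank(K))
```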

And thus, by the definition of linear dependence, there exist \(a_0, \dots, a_{n} \in \mathbb{C}\), not all \(0\), such that:

\begin{equation} 0 = a_0 v + a_1 T v + \dots + a_{n} T^{n} v \end{equation}

Note that, because \(v \neq 0\), the coefficients \(a_{1}, \dots, a_{n}\) can't all be \(0\): otherwise the relation would reduce to \(0 = a_0 v\), forcing \(a_0 = 0\) and hence all \(a_{j} = 0\), contradicting linear dependence.
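Such a dependence relation can be computed explicitly; a sketch assuming NumPy, with an arbitrary \(3 \times 3\) operator \(T\) and vector \(v\) chosen for illustration:

```python
import numpy as np

# An arbitrary operator T on C^3 (n = 3) and a nonzero v,
# chosen purely for illustration.
T = np.array([[1, 2, 0],
              [0, 3, 1],
              [1, 0, 2]], dtype=complex)
v = np.array([1, 0, 1], dtype=complex)

# K has columns v, Tv, T^2 v, T^3 v.
cols = [v]
for _ in range(3):
    cols.append(T @ cols[-1])
K = np.column_stack(cols)

# K is 3x4, so it has a nontrivial null space; any null vector a gives
# a_0 v + a_1 Tv + a_2 T^2 v + a_3 T^3 v = 0. The last right-singular
# vector of K spans that null space.
_, _, Vh = np.linalg.svd(K)
a = Vh[-1].conj()
print(np.allclose(K @ a, 0))
```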

Now let \(m\) be the largest index with \(a_{m} \neq 0\); by the previous remark, \(m \geq 1\). The fundamental theorem of algebra lets us factor this polynomial completely into linear factors over \(\mathbb{C}\): \(a_{0} + a_{1}z + \dots + a_{m}z^{m} = c(z-\lambda_{1}) \dots (z- \lambda_{m})\). We have to state the factorization for a complex variable \(z\) rather than for \(T\) directly, because the fundamental theorem of algebra is a statement about polynomials over \(\mathbb{C}\); we haven't shown any analogous factorization result for polynomial operators yet.
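For a concrete instance of such a factoring, take a degree-\(2\) polynomial with \(a_0 = 2\), \(a_1 = -3\), \(a_2 = 1\):

\begin{equation} 2 - 3z + z^{2} = (z - 1)(z - 2) \end{equation}

so here \(c = 1\), \(\lambda_{1} = 1\), and \(\lambda_{2} = 2\).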

However, such a complete factoring is an identity between the coefficients of two polynomials, so it still holds after substituting the operator \(T\) for the variable \(z\). With the (possibly complex) values \(\lambda_{j}\):

\begin{align} 0 &= a_{0} v + a_{1} Tv + \dots + a_{n} T^{n} v \\ &= (a_{0} I + a_{1} T + \dots + a_{n} T^{n}) v \\ &= c(T - \lambda_{1} I) \dots (T- \lambda_{m} I)v \end{align}

Note that \(m\) is the degree of the polynomial, so it need not equal \(n\): the top coefficients may vanish. Repeated roots \(\lambda_{j}\) appear once per multiplicity, so there are exactly \(m\) linear factors.

Now, \(c\) cannot be \(0\): expanding \(c(z-\lambda_{1}) \dots (z-\lambda_{m})\), the coefficient of the highest power of \(z\) is \(c\), and it must match the leading coefficient of the polynomial on the left, which is nonzero.

Given \(c \neq 0\) and \(v \neq 0\), the composition \((T - \lambda_{1} I) \dots (T - \lambda_{m} I)\) sends the nonzero vector \(v\) to \(0\), so it is not injective. A composition of injective maps is injective, so at least one factor \((T- \lambda_{j} I)\) must be non-injective. And on a finite-dimensional vector space, \(T - \lambda_{j} I\) failing to be injective means exactly that \(\lambda_{j}\) is an eigenvalue of \(T\). \(\blacksquare\)
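The whole argument can be run end to end numerically; a sketch assuming NumPy, with an arbitrary \(3 \times 3\) operator \(T\) and vector \(v\) chosen for illustration (for this particular choice the leading coefficient \(a_3\) is nonzero, so `np.roots` can be applied to the full coefficient list):

```python
import numpy as np

# An arbitrary operator T on C^3 and a nonzero v, chosen for illustration.
T = np.array([[1, 2, 0],
              [0, 3, 1],
              [1, 0, 2]], dtype=complex)
v = np.array([1, 0, 1], dtype=complex)

# Columns v, Tv, T^2 v, T^3 v: 4 vectors in C^3, hence dependent.
cols = [v]
for _ in range(3):
    cols.append(T @ cols[-1])
K = np.column_stack(cols)

# A dependence relation a_0 v + a_1 Tv + a_2 T^2 v + a_3 T^3 v = 0
# from the null space of K (last right-singular vector of the SVD).
_, _, Vh = np.linalg.svd(K)
a = Vh[-1].conj()

# Roots lambda_j of a_0 + a_1 z + a_2 z^2 + a_3 z^3
# (np.roots wants the highest-degree coefficient first).
lambdas = np.roots(a[::-1])

# At least one root must be an actual eigenvalue of T.
eigs = np.linalg.eigvals(T)
found = any(np.min(np.abs(eigs - lam)) < 1e-6 for lam in lambdas)
print(found)
```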