A basis is a list of vectors in \(V\) that spans \(V\) and is linearly independent.

## constituents

- a LIST! of vectors in vector space \(V\)

## requirements

- the list is…
- linearly independent
- spans \(V\)

## additional information

### criteria for basis

A list \(v_1, \dots, v_{n}\) of vectors in \(V\) is a basis of \(V\) IFF every \(v \in V\) can be written uniquely as:

\begin{equation} v = a_1v_1+ \dots + a_{n}v_{n} \end{equation}

where \(a_1, \dots, a_{n} \in \mathbb{F}\).
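
As a concrete sketch of this criterion (a hypothetical example, not from the text): take \(V = \mathbb{F}^2\) with basis \(v_1 = (1, 0)\), \(v_2 = (1, 1)\). Solving \(v = a_1v_1 + a_2v_2\) componentwise gives the unique coefficients in closed form:

```python
from fractions import Fraction

# Hypothetical example: basis v1 = (1, 0), v2 = (1, 1) of F^2.
# Componentwise, v = (x, y) = (a1 + a2, a2), so a1 = x - y and a2 = y.
def coordinates(v):
    """Return the unique (a1, a2) with v = a1*(1, 0) + a2*(1, 1)."""
    x, y = map(Fraction, v)
    return (x - y, y)

a1, a2 = coordinates((3, 5))
# Recombine to confirm the representation reproduces v.
assert (a1 * 1 + a2 * 1, a1 * 0 + a2 * 1) == (3, 5)
```

That the formula is forced (no other \((a_1, a_2)\) works) is exactly the uniqueness half of the criterion.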

#### forward direction

Suppose \(v_1, \dots, v_{n}\) is a basis of \(V\). We want to show that \(v_1, \dots, v_{n}\) uniquely constructs each \(v \in V\).

By definition, they span \(V\) and are linearly independent in \(V\).

Because of the spanning quality, for each \(v \in V\) there exists *at least* one set of \(a_1, \dots, a_{n} \in \mathbb{F}\) such that we can write:

\begin{equation} v = a_1v_1+ \dots + a_{n}v_{n} \end{equation}

Suppose now that we have another representation of \(v\) via scalars \(c_1, \dots, c_{n}\) and our same list of vectors:

\begin{equation} v = c_1v_1+ \dots + c_{n}v_{n} \end{equation}

Subtracting the two expressions, we have that:

\begin{equation} 0 = (a_1-c_1)v_1 + \dots +(a_{n}-c_{n}) v_{n} \end{equation}

Because \(v_1, \dots, v_{n}\) is linearly independent, each coefficient must vanish: \(a_j-c_j=0\), so \(a_{j}=c_{j}\) for every \(j\). Therefore, there is only one representation of \(v\) as a linear combination of the vectors \(v_1, \dots, v_{n}\).

(To be honest, we could have just invoked the definition of linear independence directly — that the scalars in a linear combination of a linearly independent list are unique — but this is the more careful argument.)

#### backward direction

Suppose we have a list \(v_1, \dots, v_{n}\) which uniquely constructs each \(v \in V\). We want to show that \(v_1, \dots, v_{n}\) is a basis of \(V\). Since a linear combination of the list constructs every \(v \in V\), we can say that \(v_1, \dots, v_{n}\) spans \(V\).

As \(V\) is a vector space, we have \(0 \in V\). Therefore, there exist scalars \(a_1, \dots, a_{n}\) for which:

\begin{equation} 0 = a_1v_1 + \dots +a_{n}v_{n} \end{equation}

(as we already established \(v_1, \dots, v_{n}\) spans \(V\) and \(0 \in V\))

Of course, we are given that \(v_1, \dots, v_{n}\) uniquely constructs each \(v \in V\). Since the trivial solution \(a_1 = \dots = a_{n} = 0\) *does* work, it is the only solution.

By the definition of linear independence, then, \(v_1, \dots, v_{n}\) is linearly independent. Having shown that \(v_1, \dots, v_{n}\) both spans \(V\) and is linearly independent, we conclude it is a basis of \(V\). \(\blacksquare\)
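
The backward direction can be checked mechanically in a toy setting. The sketch below (a brute-force check over \(GF(2)\), my own illustrative choice, not from the text) confirms that for a concrete list the zero vector has exactly one representation — the trivial one — which is the linear-independence condition:

```python
from itertools import product

# Toy check over the field GF(2): for the list v1 = (1, 0), v2 = (0, 1)
# in F^2, the zero vector has exactly one representation.
vectors = [(1, 0), (0, 1)]

def combo(coeffs):
    # Linear combination of `vectors` with the given coefficients, mod 2.
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors)) % 2
                 for i in range(2))

# Enumerate every coefficient tuple and keep those that hit zero.
zero_reps = [c for c in product((0, 1), repeat=2) if combo(c) == (0, 0)]
assert zero_reps == [(0, 0)]  # only the trivial solution: linearly independent
```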

### Dueling Basis Constructions

These are two results that say: “you can build a linearly independent list up to a basis, or you can pluck a spanning list down to a basis”.

#### every spanning list contains a basis of the space it spans

Every spanning list in \(V\) contains a basis of \(V\) (and possibly some extra vectors).

Read: “apply Linear Dependence Lemma your way to success”.

Begin with a spanning list \(v_1, \dots, v_{m}\) of \(V\). We run a for loop over the list.

Step 0:

If \(v_1=0\) (i.e. \(v_1 \in span(\{\})\)), delete \(v_1\). Otherwise, do nothing.

Step \(j\):

If \(v_{j} \in span(v_1, \dots, v_{j-1})\), then \(v_{j}\) satisfies the Linear Dependence Lemma’s first condition, and therefore also its second condition (removing \(v_{j}\) keeps the same span, because \(v_{j}\) can be rewritten in terms of \(v_1, \dots, v_{j-1}\)).

So we remove \(v_{j}\) if it is indeed in the span of the previous vectors. By the Linear Dependence Lemma, the new list spans the same space as the old list.

Conclusion

By the end of this process, no vector left in the list satisfies the Linear Dependence Lemma (read: we got rid of all of them). Therefore, the list is linearly independent. Moreover, at every step the Linear Dependence Lemma ensures the new list spans the same space; therefore, the final list still spans \(V\). Having constructed a linearly independent list that spans \(V\), we declare the new list a basis of \(V\).

As all we did was pluck vectors out of the old list, the new list is a sublist of the old list. This means that the spanning list (old list) contains the new list, which is a basis. \(\blacksquare\)
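
The pluck-away procedure above is literally an algorithm. Here is a minimal Python sketch, assuming exact rational arithmetic and a hand-rolled Gaussian-elimination span test (`in_span` and `prune_to_basis` are my own hypothetical names, not from the text):

```python
from fractions import Fraction

def in_span(vs, w):
    """Check whether w is a linear combination of the vectors in vs,
    via Gaussian elimination over the rationals."""
    if not vs:
        return all(x == 0 for x in w)  # span of the empty list is {0}
    # Reduce each row against earlier pivot rows, collecting pivots.
    pivots = []
    for row in vs:
        r = list(map(Fraction, row))
        for j, pr in pivots:
            if r[j] != 0:
                r = [a - r[j] / pr[j] * b for a, b in zip(r, pr)]
        lead = next((j for j, a in enumerate(r) if a != 0), None)
        if lead is not None:
            pivots.append((lead, r))
    # w is in span(vs) iff reducing it against the pivots leaves zero.
    w = list(map(Fraction, w))
    for j, pr in pivots:
        if w[j] != 0:
            w = [a - w[j] / pr[j] * b for a, b in zip(w, pr)]
    return all(a == 0 for a in w)

def prune_to_basis(spanning):
    """Step j of the loop: drop v_j if it lies in the span of the kept
    predecessors; what survives is linearly independent with the same span."""
    kept = []
    for v in spanning:
        if not in_span(kept, v):
            kept.append(v)
    return kept

basis = prune_to_basis([(1, 0, 0), (2, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 1)])
```

Running the sketch on the redundant list above plucks out \((2,0,0)\) and \((1,1,0)\), leaving the standard basis of \(\mathbb{F}^3\) — and the result is a sublist of the input, as the proof requires.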

#### a linearly independent list extends to a basis

Every linearly independent list of vectors in a finite-dimensional vector space can be extended to a basis.

Recall first that every finite-dimensional vector space has a basis.

Let’s begin with a linearly independent list \(u_1, \dots, u_{m}\) in \(V\). Let’s recruit also a basis of \(V\): \(w_{1}, \dots, w_{n}\).

Naturally, \(u_1, \dots, u_{m}, w_1, \dots, w_{n}\) spans \(V\) (as the \(w\) vectors already span \(V\)). We now apply the fact that every spanning list contains a basis of the space it spans: putting the \(u\) vectors first and the \(w\) vectors second means the deletion process only ever removes \(w\) vectors (since the \(u\) vectors are linearly independent, no \(u\) is in the span of its predecessors). We get back a basis of \(V\) consisting of all the \(u\) vectors and some of the \(w\) vectors. \(\blacksquare\)
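
The extension argument can be sketched as code: concatenate the \(u\) list with a basis, then prune left to right. The toy below works over \(GF(2)\) with a brute-force span test (all names and the specific vectors are my own illustrative choices, not from the text):

```python
from itertools import product

def in_span_gf2(vs, w):
    """Brute-force span test mod 2: try every coefficient tuple."""
    for coeffs in product((0, 1), repeat=len(vs)):
        if all(sum(c * v[i] for c, v in zip(coeffs, vs)) % 2 == w[i]
               for i in range(len(w))):
            return True
    return False

def extend_to_basis(independent, basis):
    """Append each basis vector not already in the span of what we have;
    the independent vectors come first, so none of them is ever dropped."""
    out = list(independent)
    for w in basis:
        if not in_span_gf2(out, w):
            out.append(w)
    return out

# Extend the independent list u1 = (1, 1, 0) in GF(2)^3 using the
# standard basis as the recruited spanning tail.
u = [(1, 1, 0)]
std = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
extended = extend_to_basis(u, std)
```

Here \((0,1,0)\) is skipped because it already lies in \(span((1,1,0), (1,0,0))\) mod 2, so the result keeps all of \(u\) plus just enough of the \(w\) vectors to reach a basis.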