Section 2.4 Linear independence
In the previous section, questions about the existence of solutions of a linear system led to the concept of the span of a set of vectors. In particular, the span of a set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is the set of vectors \(\bvec\) for which a solution to the linear system \(\left[\begin{array}{rrrr} \vvec_1\amp\vvec_2\amp\ldots\amp\vvec_n \end{array}\right] ~\xvec = \bvec \) exists.
In this section, we turn to the uniqueness of solutions of a linear system, the second of our two fundamental questions. This will lead us to the concept of linear independence.
Preview Activity 2.4.1.
Let's begin by looking at some sets of vectors in \(\real^3\text{.}\) As we saw in the previous section, the span of a set of vectors in \(\real^3\) will be either a line, a plane, or \(\real^3\) itself.

Consider the following vectors in \(\real^3\text{:}\)
\begin{equation*} \vvec_1=\threevec{0}{1}{2}, \vvec_2=\threevec{3}{1}{1}, \vvec_3=\threevec{2}{0}{1}\text{.} \end{equation*}Describe the span of these vectors, \(\laspan{\vvec_1,\vvec_2,\vvec_3}\text{,}\) as a line, a plane, or \(\real^3\text{.}\)

Now consider the set of vectors:
\begin{equation*} \wvec_1=\threevec{0}{1}{2}, \wvec_2=\threevec{3}{1}{1}, \wvec_3=\threevec{3}{0}{1}\text{.} \end{equation*}Describe the span of these vectors, \(\laspan{\wvec_1,\wvec_2,\wvec_3}\text{,}\) as a line, a plane, or \(\real^3\text{.}\)

Show that the vector \(\wvec_3\) is a linear combination of \(\wvec_1\) and \(\wvec_2\) by finding weights such that
\begin{equation*} \wvec_3 = c\wvec_1 + d\wvec_2\text{.} \end{equation*} 
Explain why any linear combination of \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\text{,}\)
\begin{equation*} c_1\wvec_1 + c_2\wvec_2 + c_3\wvec_3 \end{equation*}can be written as a linear combination of \(\wvec_1\) and \(\wvec_2\text{.}\)

Explain why
\begin{equation*} \laspan{\wvec_1,\wvec_2,\wvec_3} = \laspan{\wvec_1,\wvec_2}\text{.} \end{equation*}
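Conclusions like these can be checked with a computer algebra system. The sketch below uses SymPy's `rref` to count pivot positions; the vectors are illustrative stand-ins (the third column is the sum of the first two), not necessarily the activity's vectors, since the same method applies either way.

```python
from sympy import Matrix

# Columns of A are the vectors whose span we want to classify.
# These columns are illustrative: the third is the sum of the
# first two, so we expect the span to be a plane, not R^3.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [2, 3, 5]])

R, pivot_cols = A.rref()

# The number of pivot positions classifies the span of three
# vectors in R^3: 1 pivot -> line, 2 -> plane, 3 -> all of R^3.
print(len(pivot_cols))  # 2, so the span is a plane
```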
Subsection 2.4.1 Linear dependence
We have seen examples where the span of a set of three vectors in \(\real^3\) is \(\real^3\) and other examples where the span of three vectors is a plane. We would like to understand the difference between these two situations.
Example 2.4.1.
Let's consider the set of three vectors in \(\real^3\text{:}\)
Forming the associated matrix gives
Because there is a pivot position in every row, Proposition 2.3.14 tells us that \(\laspan{\vvec_1, \vvec_2, \vvec_3} = \real^3\text{.}\)
Example 2.4.2.
Now let's consider the set of three vectors:
Forming the associated matrix gives
Since the last row does not have a pivot position, we know that the span of these vectors is not \(\real^3\) but is instead a plane.
In fact, we can say more if we shift our perspective slightly and view this as an augmented matrix:
In this way, we see that \(\wvec_3 = 2\wvec_1 - \wvec_2\text{,}\) which enables us to rewrite any linear combination of these three vectors:
In other words, any linear combination of \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\) may be written as a linear combination using only the vectors \(\wvec_1\) and \(\wvec_2\text{.}\) Since the span of a set of vectors is simply the set of their linear combinations, this shows that
As a result, adding the vector \(\wvec_3\) to the set of vectors \(\wvec_1\) and \(\wvec_2\) does not change the span.
Before exploring this type of behavior more generally, let's think about it from a geometric point of view. Suppose that we begin with the two vectors \(\vvec_1\) and \(\vvec_2\) in Example 2.4.1. The span of these two vectors is a plane in \(\real^3\text{,}\) as seen on the left of Figure 2.4.3.
Because the vector \(\vvec_3\) is not a linear combination of \(\vvec_1\) and \(\vvec_2\text{,}\) it provides a direction to move that is independent of \(\vvec_1\) and \(\vvec_2\text{.}\) Adding this third vector \(\vvec_3\) therefore forms a set whose span is \(\real^3\text{,}\) as seen on the right of Figure 2.4.3.
Similarly, the span of the vectors \(\wvec_1\) and \(\wvec_2\) in Example 2.4.2 is also a plane. However, the third vector \(\wvec_3\) is a linear combination of \(\wvec_1\) and \(\wvec_2\text{,}\) which means that it already lies in the plane formed by \(\wvec_1\) and \(\wvec_2\text{,}\) as seen in Figure 2.4.4. Since we can already move in this direction using just \(\wvec_1\) and \(\wvec_2\text{,}\) adding \(\wvec_3\) to the set does not change the span. As a result, it remains a plane.
What distinguishes these two examples is whether one of the vectors is a linear combination of the others, an observation that leads to the following definition.
Definition 2.4.5.
A set of vectors is called linearly dependent if one of the vectors is a linear combination of the others. Otherwise, the set of vectors is called linearly independent.
For the sake of completeness, we say that a set of vectors containing only one nonzero vector is linearly independent.
Subsection 2.4.2 How to recognize linear dependence
Activity 2.4.2.
We would like to develop a means to detect when a set of vectors is linearly dependent. This activity will point the way.

Suppose we have five vectors in \(\real^4\) that form the columns of a matrix having reduced row echelon form
\begin{equation*} \left[\begin{array}{rrrrr} \vvec_1 \amp \vvec_2 \amp \vvec_3 \amp \vvec_4 \amp \vvec_5 \end{array}\right] \sim \left[\begin{array}{rrrrr} 1 \amp 0 \amp 1 \amp 0 \amp 2 \\ 0 \amp 1 \amp 2 \amp 0 \amp 3 \\ 0 \amp 0 \amp 0 \amp 1 \amp 1 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \\ \end{array}\right]\text{.} \end{equation*}Is it possible to write one of the vectors \(\vvec_1,\vvec_2,\ldots,\vvec_5\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?

Suppose we have another set of three vectors in \(\real^4\) that form the columns of a matrix having reduced row echelon form
\begin{equation*} \left[\begin{array}{rrr} \wvec_1 \amp \wvec_2 \amp \wvec_3 \\ \end{array}\right] \sim \left[\begin{array}{rrr} 1 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 1 \\ 0 \amp 0 \amp 0 \\ \end{array}\right]\text{.} \end{equation*}Is it possible to write one of these vectors \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) \(\wvec_3\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?
By looking at the pivot positions, how can you determine whether the columns of a matrix are linearly dependent or independent?
If one vector in a set is the zero vector \(\zerovec\text{,}\) can the set of vectors be linearly independent?
Suppose a set of vectors in \(\real^{10}\) has twelve vectors. Is it possible for this set to be linearly independent?
By now, we should expect that the pivot positions play an important role in determining whether the columns of a matrix are linearly dependent. For instance, suppose we have four vectors and their associated matrix
Since the third column does not contain a pivot position, let's just focus on the first three columns and view them as an augmented matrix:
This says that
which tells us that the set of vectors \(\vvec_1,\vvec_2,\vvec_3,\vvec_4\) is linearly dependent. Moreover, we see that
More generally, the same reasoning implies that a set of vectors is linearly dependent if the associated matrix has a column without a pivot position. Indeed, as illustrated here, a vector corresponding to a column without a pivot position can be expressed as a linear combination of the vectors whose columns do contain pivot positions.
Suppose instead that the matrix associated to a set of vectors has a pivot position in every column.
Viewing this as an augmented matrix again, we see that the linear system is inconsistent since there is a pivot in the rightmost column, which means that \(\wvec_4\) cannot be expressed as a linear combination of the other vectors. Similarly, \(\wvec_3\) cannot be expressed as a linear combination of \(\wvec_1\) and \(\wvec_2\text{.}\) In fact, none of the vectors can be written as a linear combination of the others, so this set of vectors is linearly independent.
The following proposition summarizes these findings.
Proposition 2.4.6.
The columns of a matrix are linearly independent if and only if every column contains a pivot position.
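Proposition 2.4.6 translates directly into a short computation. As a sketch (assuming SymPy is available), `rref` reports which columns contain pivot positions, so checking independence amounts to comparing that count with the number of columns:

```python
from sympy import Matrix

def columns_independent(A):
    """True exactly when every column of A contains a pivot position."""
    _, pivot_cols = A.rref()
    return len(pivot_cols) == A.cols

# A pivot lands in both columns here, so they are independent.
ind = Matrix([[1, 0],
              [0, 1],
              [2, 3]])

# The third column is twice the first, so it cannot hold a pivot.
dep = Matrix([[1, 0, 2],
              [0, 1, 0],
              [2, 3, 4]])

print(columns_independent(ind))  # True
print(columns_independent(dep))  # False
```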
This condition imposes a constraint on how many vectors we can have in a linearly independent set. Here is an example of the reduced row echelon form of a matrix whose columns form a set of three linearly independent vectors in \(\real^5\text{:}\)
Notice that there are at least as many rows as columns, which must be the case if every column is to have a pivot position.
More generally, if \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors in \(\real^m\text{,}\) the associated matrix must have a pivot position in every column. Since every row contains at most one pivot position, the number of columns can be no greater than the number of rows. This means that the number of vectors in a linearly independent set can be no greater than the number of dimensions.
Proposition 2.4.7.
A linearly independent set of vectors in \(\real^m\) contains at most \(m\) vectors.
This says, for instance, that any linearly independent set of vectors in \(\real^3\) can contain no more than three vectors. We usually imagine three independent directions, such as up/down, front/back, left/right, in our three-dimensional world. This proposition tells us that there can be no more independent directions.
The proposition above says that a set of vectors in \(\real^m\) that is linearly independent has at most \(m\) vectors. By comparison, Proposition 2.3.15 says that a set of vectors whose span is \(\real^m\) has at least \(m\) vectors.
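Proposition 2.4.7 can also be illustrated numerically: a matrix with more columns than rows can never have a pivot in every column, no matter what its entries are. A sketch, assuming SymPy:

```python
from sympy import randMatrix

# Four columns in R^3: the matrix has only three rows, so there
# are at most three pivot positions, one per row.
A = randMatrix(3, 4)  # a random 3x4 integer matrix
_, pivot_cols = A.rref()

# Some column must therefore lack a pivot: the columns are dependent.
print(len(pivot_cols) < A.cols)  # True, for every choice of entries
```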
Subsection 2.4.3 Homogeneous equations
If \(A\) is a matrix, we call the equation \(A\xvec = \zerovec\) a homogeneous equation. As we'll see, the uniqueness of solutions to this equation reflects on the linear independence of the columns of \(A\text{.}\)
Activity 2.4.3. Linear independence and homogeneous equations.
Explain why the homogeneous equation \(A\xvec = \zerovec\) is consistent no matter the matrix \(A\text{.}\)

Consider the matrix
\begin{equation*} A = \left[\begin{array}{rrr} 3 \amp 2 \amp 0 \\ 1 \amp 0 \amp 2 \\ 2 \amp 1 \amp 1 \end{array}\right] \end{equation*}whose columns we denote by \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{.}\) Describe the solution space of the homogeneous equation \(A\xvec = \zerovec\) using a parametric description, if appropriate.

Find a nonzero solution to the homogeneous equation and use it to find weights \(c_1\text{,}\) \(c_2\text{,}\) and \(c_3\) such that
\begin{equation*} c_1\vvec_1 + c_2\vvec_2 + c_3\vvec_3 = \zerovec\text{.} \end{equation*} Use the equation you found in the previous part to write one of the vectors as a linear combination of the others.
Are the vectors \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\) linearly dependent or independent?
This activity shows how the solution space of the homogeneous equation \(A\xvec = \zerovec\) indicates whether the columns of \(A\) are linearly dependent or independent. First, we know that the equation \(A\xvec = \zerovec\) always has at least one solution, the vector \(\xvec = \zerovec\text{.}\) Any other solution is a nonzero solution.
Example 2.4.8.
Let's consider the vectors
and their associated matrix \(A = \begin{bmatrix} \vvec_1 \amp \vvec_2 \amp \vvec_3 \end{bmatrix} \text{.}\)
The homogeneous equation \(A\xvec = \zerovec\) has the associated augmented matrix
Therefore, \(A\) has a column without a pivot position, which tells us that the vectors \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\) are linearly dependent. However, we can also see this fact in another way.
The reduced row echelon matrix tells us that the homogeneous equation has a free variable, so there must be infinitely many solutions. In particular, we have
so the solutions have the form
If we choose \(x_3=1\text{,}\) then we obtain the nonzero solution to the homogeneous equation \(\xvec = \threevec{1}{1}{1}\text{,}\) which implies that
In other words,
Because \(\vvec_3\) is a linear combination of \(\vvec_1\) and \(\vvec_2\text{,}\) we know that this set of vectors is linearly dependent.
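This route through the homogeneous equation is easy to automate. In the sketch below (with illustrative vectors satisfying \(\vvec_3 = \vvec_1 + \vvec_2\text{,}\) not necessarily the example's vectors), SymPy's `nullspace` produces a nonzero solution of \(A\xvec = \zerovec\text{,}\) whose entries are the weights of a dependence relation:

```python
from sympy import Matrix

# Illustrative vectors with v3 = v1 + v2, so a dependence exists.
v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 3])
v3 = Matrix([1, 1, 5])
A = Matrix.hstack(v1, v2, v3)

# A basis vector of the solution space of Ax = 0 supplies weights.
c = A.nullspace()[0]
print(list(c))  # [-1, -1, 1]

# Check the dependence relation c1*v1 + c2*v2 + c3*v3 = 0.
print(c[0]*v1 + c[1]*v2 + c[2]*v3 == Matrix([0, 0, 0]))  # True
```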
As this example demonstrates, there are many ways we can view the question of linear independence, some of which are recorded in the following proposition.
Proposition 2.4.9.
For a matrix \(A = \left[\begin{array}{rrrr} \vvec_1\amp\vvec_2\amp\ldots\amp\vvec_n \end{array}\right] \text{,}\) the following statements are equivalent:
The columns of \(A\) are linearly dependent.
One of the vectors in the set \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linear combination of the others.
The matrix \(A\) has a column without a pivot position.
The homogeneous equation \(A\xvec = \zerovec\) has infinitely many solutions and hence a nonzero solution.

There are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\vvec_1 + c_2\vvec_2 + \ldots + c_n\vvec_n = \zerovec\text{.} \end{equation*}
Subsection 2.4.4 Summary
This section developed the concept of linear dependence of a set of vectors. More specifically, we saw that:
A set of vectors is linearly dependent if one of the vectors is a linear combination of the others.
A set of vectors is linearly independent if and only if the vectors form a matrix that has a pivot position in every column.
A set of linearly independent vectors in \(\real^m\) contains no more than \(m\) vectors.
The columns of the matrix \(A\) are linearly dependent if the homogeneous equation \(A\xvec = \zerovec\) has a nonzero solution.

A set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is linearly dependent if there are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\vvec_1 + c_2\vvec_2 + \ldots + c_n\vvec_n = \zerovec\text{.} \end{equation*}
At the beginning of the section, we said that this concept addressed the second of our two fundamental questions concerning the uniqueness of solutions to a linear system. It is worth comparing the results of this section with those of the previous one so that the parallels between them become clear.
As usual, we will write a matrix as a collection of vectors,
Span: A vector \(\bvec\) is in the span of a set of vectors if it is a linear combination of those vectors.
Linear independence: A set of vectors is linearly dependent if one of the vectors is a linear combination of the others.

Span: A vector \(\bvec\) is in the span of \(\vvec_1, \vvec_2, \ldots, \vvec_n\) if there exists a solution to \(A\xvec = \bvec\text{.}\)
Linear independence: The vectors \(\vvec_1, \vvec_2, \ldots, \vvec_n\) are linearly independent if \(\xvec=\zerovec\) is the unique solution to \(A\xvec = \zerovec\text{.}\)

Span: The columns of an \(m\times n\) matrix span \(\real^m\) if the matrix has a pivot position in every row.
Linear independence: The columns of a matrix are linearly independent if the matrix has a pivot position in every column.

Span: A set of vectors that spans \(\real^m\) has at least \(m\) vectors.
Linear independence: A set of linearly independent vectors in \(\real^m\) has at most \(m\) vectors.
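The parallel drawn in this comparison can be bundled into one computation, since a single reduced row echelon form answers both questions: pivots in every row detect spanning, and pivots in every column detect independence. A sketch with SymPy:

```python
from sympy import Matrix

def span_and_independence(A):
    """Return (spans R^m, columns independent) from one rref of A."""
    _, pivot_cols = A.rref()
    return (len(pivot_cols) == A.rows,   # pivot in every row
            len(pivot_cols) == A.cols)   # pivot in every column

# Three independent columns in R^3: both properties hold.
print(span_and_independence(Matrix.eye(3)))          # (True, True)

# Two independent columns in R^3: too few vectors to span R^3.
print(span_and_independence(Matrix([[1, 0],
                                    [0, 1],
                                    [2, 3]])))       # (False, True)
```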
Exercises 2.4.5 Exercises
1.
Consider the set of vectors
Explain why this set of vectors is linearly dependent.
Write one of the vectors as a linear combination of the others.

Find weights \(c_1\text{,}\) \(c_2\text{,}\) \(c_3\text{,}\) and \(c_4\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\vvec_1 + c_2 \vvec_2 + c_3 \vvec_3 + c_4 \vvec_4 = \zerovec\text{.} \end{equation*} Find a nonzero solution to the homogeneous equation \(A\xvec = \zerovec\) where \(A=\left[\begin{array}{rrrr} \vvec_1\amp\vvec_2\amp\vvec_3\amp\vvec_4 \end{array}\right]\text{.}\)
2.
Consider the vectors
Are these vectors linearly independent or linearly dependent?
Describe \(\laspan{\vvec_1,\vvec_2,\vvec_3}\text{.}\)
Suppose that \(\bvec\) is a vector in \(\real^3\text{.}\) Explain why we can guarantee that \(\bvec\) may be written as a linear combination of \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{.}\)
Suppose that \(\bvec\) is a vector in \(\real^3\text{.}\) In how many ways can \(\bvec\) be written as a linear combination of \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{?}\)
3.
Respond to the following questions and provide a justification for your responses.
If the vectors \(\vvec_1\) and \(\vvec_2\) form a linearly dependent set, must one vector be a scalar multiple of the other?
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors. What can you say about the linear independence or dependence of a subset of these vectors?
Suppose \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors that form the columns of a matrix \(A\text{.}\) If the equation \(A\xvec = \bvec\) is inconsistent, what can you say about the linear independence or dependence of the set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n,\bvec\text{?}\)
4.
Determine whether the following statements are true or false and provide a justification for your response.
If \(\vvec_1,\vvec_2,\ldots,\vvec_n\) are linearly dependent, then one vector is a scalar multiple of one of the others.
If \(\vvec_1, \vvec_2, \ldots, \vvec_{10}\) are vectors in \(\real^5\text{,}\) then the set of vectors is linearly dependent.
If \(\vvec_1, \vvec_2, \ldots, \vvec_{5}\) are vectors in \(\real^{10}\text{,}\) then the set of vectors is linearly independent.
Suppose we have a set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) and that \(\vvec_2\) is a scalar multiple of \(\vvec_1\text{.}\) Then the set is linearly dependent.
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) are linearly independent and form the columns of a matrix \(A\text{.}\) If \(A\xvec = \bvec\) is consistent, then there is exactly one solution.
5.
Suppose we have a set of vectors \(\vvec_1,\vvec_2,\vvec_3,\vvec_4\) in \(\real^5\) that satisfy the relationship:
and suppose that \(A\) is the matrix \(A=\left[\begin{array}{rrrr} \vvec_1\amp\vvec_2\amp\vvec_3\amp\vvec_4 \end{array}\right] \text{.}\)
Find a nonzero solution to the equation \(A\xvec = \zerovec\text{.}\)
Explain why the matrix \(A\) has a column without a pivot position.
Write one of the vectors as a linear combination of the others.
Explain why the set of vectors is linearly dependent.
6.
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a set of vectors in \(\real^{27}\) that form the columns of a matrix \(A\text{.}\)
Suppose that the vectors span \(\real^{27}\text{.}\) What can you say about the number of vectors \(n\) in this set?
Suppose instead that the vectors are linearly independent. What can you say about the number of vectors \(n\) in this set?
Suppose that the vectors are both linearly independent and span \(\real^{27}\text{.}\) What can you say about the number of vectors in the set?
Assume that the vectors are both linearly independent and span \(\real^{27}\text{.}\) Given a vector \(\bvec\) in \(\real^{27}\text{,}\) what can you say about the solution space to the equation \(A\xvec = \bvec\text{?}\)
7.
Given below are some descriptions of sets of vectors that form the columns of a matrix \(A\text{.}\) For each description, give a possible reduced row echelon form for \(A\) or indicate why there is no set of vectors satisfying the description by stating why the required reduced row echelon matrix cannot exist.
A set of 4 linearly independent vectors in \(\real^5\text{.}\)
A set of 4 linearly independent vectors in \(\real^4\text{.}\)
A set of 3 vectors whose span is \(\real^4\text{.}\)
A set of 5 linearly independent vectors in \(\real^3\text{.}\)
A set of 5 vectors whose span is \(\real^4\text{.}\)
8.
When we explored matrix multiplication in Section 2.2, we saw that some properties that are true for real numbers are not true for matrices. This exercise will investigate that in some more depth.
Suppose that \(A\) and \(B\) are two matrices and that \(AB = 0\text{.}\) If \(B \neq 0\text{,}\) what can you say about the linear independence of the columns of \(A\text{?}\)
Suppose that we have matrices \(A\text{,}\) \(B\) and \(C\) such that \(AB = AC\text{.}\) We have seen that we cannot generally conclude that \(B=C\text{.}\) If we assume additionally that \(A\) is a matrix whose columns are linearly independent, explain why \(B = C\text{.}\) You may wish to begin by rewriting the equation \(AB = AC\) as \(AB - AC = A(B-C) = 0\text{.}\)
9.
Suppose that \(k\) is an unknown parameter and consider the set of vectors
For what values of \(k\) is the set of vectors linearly dependent?
For what values of \(k\) does the set of vectors span \(\real^3\text{?}\)
10.
Given a set of linearly dependent vectors, we can eliminate some of the vectors to create a smaller, linearly independent set of vectors.
Suppose that \(\wvec\) is a linear combination of the vectors \(\vvec_1\) and \(\vvec_2\text{.}\) Explain why \(\laspan{\vvec_1,\vvec_2, \wvec} = \laspan{\vvec_1,\vvec_2}\text{.}\)

Consider the vectors
\begin{equation*} \vvec_1 = \threevec{2}{1}{0}, \vvec_2 = \threevec{1}{2}{1}, \vvec_3 = \threevec{2}{6}{2}, \vvec_4 = \threevec{7}{1}{1}\text{.} \end{equation*}Write one of the vectors as a linear combination of the others. Find a set of three vectors whose span is the same as \(\laspan{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\text{.}\)
Are the three vectors you are left with linearly independent? If not, express one of the vectors as a linear combination of the others and find a set of two vectors whose span is the same as \(\laspan{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\text{.}\)
Give a geometric description of \(\laspan{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\) in \(\real^3\) as we did in Section 2.3.