
Linear independence

Linear independence is one of the central concepts of linear algebra. Two or more vectors are said to be linearly independent if none of them can be written as a linear combination of the others. On the contrary, if at least one of them can be written as a linear combination of the others, then they are said to be linearly dependent.

In the remainder of this lecture we will give a formal definition of linear independence, we will explain its meaning and we will provide some examples.


Linearly dependent vectors

Let us start with a formal definition of linear dependence.

Definition Let $x_1, \ldots, x_n$ be $n$ vectors. They are said to be linearly dependent if and only if there exist $n$ scalars $\alpha_1, \ldots, \alpha_n$ such that
$$\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n = 0$$
and at least one of the $n$ scalars $\alpha_1, \ldots, \alpha_n$ is different from zero.

The requirement that at least one scalar be different from zero is fundamental. First of all, without this requirement the definition would be trivial: we could always choose
$$\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$$
and obtain as a result
$$\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n = 0$$
for any set of $n$ vectors. Moreover, if one of the coefficients of the linear combination is different from zero (suppose, without loss of generality, it is $\alpha_1$), then
$$\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n = 0$$
can be written as
$$x_1 = -\frac{\alpha_2}{\alpha_1} x_2 - \cdots - \frac{\alpha_n}{\alpha_1} x_n.$$
This means that $x_1$ is a linear combination of the vectors $x_2, \ldots, x_n$ with coefficients $-\alpha_2/\alpha_1, \ldots, -\alpha_n/\alpha_1$. This motivates the informal definition of linear dependence given in the introduction above: two or more vectors are linearly dependent if at least one of them can be written as a linear combination of the others.
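The rearrangement above can be sketched numerically. The following is a minimal example with two hypothetical $2\times 1$ vectors (the values are assumptions for illustration, not taken from the lecture): since $\alpha_1 x_1 + \alpha_2 x_2 = 0$ with $\alpha_1 \neq 0$, we can recover $x_1$ as $-(\alpha_2/\alpha_1)\, x_2$.

```python
# Hypothetical example: if a1*x1 + a2*x2 = 0 with a1 != 0,
# then x1 = -(a2/a1) * x2, i.e. x1 is a combination of the others.
x2 = [1.0, 2.0]
a1, a2 = 2.0, -4.0                       # assumed coefficients, a1 nonzero
x1 = [-(a2 / a1) * c for c in x2]        # solve for x1 from the other vector

# Check that the original combination really returns the zero vector.
combo = [a1 * u + a2 * v for u, v in zip(x1, x2)]
assert combo == [0.0, 0.0]
```

With these assumed values, $x_1 = 2 x_2$, which is exactly the kind of dependence relation the definition describes.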

Note that we have assumed $\alpha_1 \neq 0$. This is without loss of generality, because you can always change the order of the vectors and assign the first position to a vector corresponding to a non-zero coefficient (by assumption there exists at least one such vector).

Example Let $x_1$ and $x_2$ be $2\times 1$ column vectors defined as follows.[eq11]The linear combination[eq12]gives as a result the zero vector, because[eq13]As a consequence, $x_1$ and $x_2$ are linearly dependent.

It is now simple to give a definition of linear independence.

Linearly independent vectors

Definition Let $x_1, \ldots, x_n$ be $n$ vectors. They are said to be linearly independent if and only if they are not linearly dependent.

It follows from this definition that, in the case of linear independence,
$$\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n = 0$$
implies
$$\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0.$$
In other words, when the vectors are linearly independent, their only linear combination that gives the zero vector as a result has all coefficients equal to zero.
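This criterion suggests a simple computational test: stack the vectors as rows of a matrix and check whether the rank equals the number of vectors, since the homogeneous system has only the trivial solution exactly in that case. A pure-Python sketch (the helper names are hypothetical, and the floating-point tolerance is an assumption):

```python
def rank(rows, tol=1e-12):
    """Rank via Gaussian elimination on a copy of the rows."""
    m = [row[:] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        # find a pivot row with a non-negligible entry in this column
        pivot = next((i for i in range(r, len(m)) if abs(m[i][col]) > tol), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # eliminate the column from every other row
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > tol:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def linearly_independent(vectors):
    # Independent iff no vector is redundant, i.e. rank == number of vectors.
    return rank(vectors) == len(vectors)
```

For example, `linearly_independent([[1.0, 0.0], [0.0, 1.0]])` is `True`, while `linearly_independent([[1.0, 2.0], [2.0, 4.0]])` is `False`, since the second vector is twice the first.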

Example Let $x_1$ and $x_2$ be $2\times 1$ column vectors defined as follows.[eq17]Consider a linear combination of these two vectors with coefficients $\alpha_1$ and $\alpha_2$:[eq18]This is equal to[eq19]Therefore, we have that[eq20]if and only if[eq21]that is, if and only if $\alpha_1 = \alpha_2 = 0$. As a consequence, the two vectors are linearly independent.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define the following $2\times 1$ vectors:[eq23]Are $A_1$ and $A_2$ linearly independent?

Solution

Consider a linear combination with coefficients $\alpha_1$ and $\alpha_2$:[eq24]Such a linear combination gives as a result the zero vector if and only if[eq25]that is, if and only if the two coefficients $\alpha_1$ and $\alpha_2$ solve the system of linear equations[eq26]This system can be solved as follows. From the second equation, we obtain[eq27]which, substituted into the first equation, gives[eq28]Thus, $\alpha_2 = 0$ and $\alpha_1 = 0$: the only linear combination of $A_1$ and $A_2$ giving the zero vector as a result has all coefficients equal to zero. This means that $A_1$ and $A_2$ are linearly independent.
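The substitution method used in this solution can be sketched in code. Since the exercise's actual entries are not reproduced here, the vectors below are hypothetical stand-ins; exact rational arithmetic avoids rounding issues.

```python
from fractions import Fraction as F

# Hypothetical 2x1 vectors (the exercise's actual entries are not shown here).
A1 = [F(1), F(3)]
A2 = [F(2), F(1)]

# The combination a1*A1 + a2*A2 = 0 is the homogeneous system
#   A1[0]*a1 + A2[0]*a2 = 0
#   A1[1]*a1 + A2[1]*a2 = 0
# From the second equation (assuming A2[1] != 0): a2 = -(A1[1]/A2[1]) * a1.
# Substituting into the first leaves (A1[0] - A2[0]*A1[1]/A2[1]) * a1 = 0.
coeff = A1[0] - A2[0] * A1[1] / A2[1]

# A nonzero coefficient forces a1 = 0, and then a2 = 0 as well.
only_trivial = coeff != 0
```

With these assumed vectors, `coeff` is nonzero, so `only_trivial` is `True` and the vectors are linearly independent; the same substitution steps apply to any concrete $2\times 1$ pair with a nonzero second entry in $A_2$.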

Exercise 2

Let $A_1$, $A_2$ and $A_3$ be $3\times 1$ vectors defined as follows:[eq29]Why are these vectors linearly dependent?

Solution

Notice that the vector $A_3$ is a scalar multiple of $A_2$:[eq30]or[eq31]As a consequence, a linear combination of $A_1$, $A_2$ and $A_3$ with coefficients $\alpha_1 = 0$, $\alpha_2 = 2$ and $\alpha_3 = -1$ gives as a result[eq32]Thus, there exists a linear combination of the three vectors whose coefficients are not all equal to zero, but whose result is the zero vector. This means that the three vectors are linearly dependent.
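This reasoning is easy to verify numerically. The vectors below are hypothetical stand-ins chosen so that $A_3 = 2 A_2$, matching the structure of the solution (the exercise's actual entries are not reproduced here):

```python
# Hypothetical 3x1 vectors with A3 = 2 * A2.
A1 = [1.0, 0.0, 1.0]
A2 = [1.0, 2.0, 3.0]
A3 = [2.0, 4.0, 6.0]

# The combination 0*A1 + 2*A2 + (-1)*A3 has nonzero coefficients
# yet returns the zero vector, so the three vectors are dependent.
combo = [0.0 * a + 2.0 * b - 1.0 * c for a, b, c in zip(A1, A2, A3)]
assert combo == [0.0, 0.0, 0.0]
```

Note that $A_1$ plays no role: once two of the vectors are proportional, the whole collection is dependent regardless of the remaining vectors.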

Exercise 3

Let $x$ be a real number. Define the following $2\times 1$ vectors:[eq33]Are $A_1$ and $A_2$ linearly independent?

Solution

Take a linear combination with coefficients $\alpha_1$ and $\alpha_2$:[eq34]This linear combination is equal to the zero vector if and only if[eq35]that is, if and only if the two coefficients $\alpha_1$ and $\alpha_2$ solve the system of linear equations[eq36]A solution to this system can be found as follows. We subtract the second equation from the first and obtain[eq37]or[eq38]By substitution into the second equation, we get[eq39]or[eq40]Now, there are two possible cases. If $x \neq 0$ (first case), then $\alpha_2 = 0$ and, as a consequence, $\alpha_1 = 0$. Thus, in this case the only linear combination of $A_1$ and $A_2$ giving the zero vector as a result has all coefficients equal to zero. This means that $A_1$ and $A_2$ are linearly independent. If instead $x = 0$ (second case), then any value of $\alpha_2$ will satisfy the equation[eq40]Choose a number different from zero and denote it by $s$. Then the system of linear equations is solved by $\alpha_2 = s$ and $\alpha_1 = -2s$. Thus, in this case there are infinitely many linear combinations with at least one coefficient different from zero that give the zero vector as a result (a different combination for each choice of $s$). This means that $A_1$ and $A_2$ are linearly dependent.
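The two cases can be checked with a short function. The vectors below are hypothetical choices that mirror the structure of this solution (for $x = 0$ they admit exactly the solutions $\alpha_1 = -2s$, $\alpha_2 = s$); for a $2\times 2$ system, independence is equivalent to a nonzero determinant.

```python
def independent(x, tol=1e-12):
    """Hypothetical vectors chosen to mirror the exercise's structure:
    A1 = [1, 1], A2 = [2, 2 + x]. Subtracting the two equations of the
    homogeneous system leaves x * a2 = 0, so independence holds iff x != 0."""
    A1 = [1.0, 1.0]
    A2 = [2.0, 2.0 + x]
    # Determinant of the 2x2 matrix with columns A1, A2: zero iff dependent.
    det = A1[0] * A2[1] - A2[0] * A1[1]
    return abs(det) > tol
```

Here the determinant works out to exactly $x$, so `independent(3.0)` is `True` while `independent(0.0)` is `False`, matching the two cases of the solution.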
