Linear Dependency

Yankel

Active member
Jan 27, 2012
398
Hello all,

I have this set of vectors:

(1,1) , (-2,-2) , (0,0) , (3,-2)

I need to say whether it is linearly dependent, and I need to find the maximum size of a subset of this set which is linearly independent.

What I think is that as long as (0,0) is there, the set must be dependent. In addition, (1,1) and (-2,-2) are dependent on each other. So if I had to guess, I would say the maximum size is 2 vectors: (3,-2) together with either (1,1) or (-2,-2). My question is: am I right, or am I missing something?

Thanks!

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Hi Yankel, :)

You are correct. $(0,\,0)$ makes any set containing it linearly dependent, since it can be obtained by multiplying any other vector by $0$; for example, $(0,\,0)=0(1,\,1)$. Also, $(1,\,1)$ and $(-2,\,-2)$ are linearly dependent, since $(-2,\,-2)=-2(1,\,1)$. It can then be shown that the sets $\{(1,\,1),\,(3,\,-2)\}$ and $\{(-2,\,-2),\,(3,\,-2)\}$ are linearly independent. In the case of $\{(1,\,1),\,(3,\,-2)\}$, let

\[\alpha(1,\,1)+\beta(3,\,-2)=(0,\,0)\]

and show that both $\alpha$ and $\beta$ must equal zero.
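To spell that last step out: comparing components gives the linear system

\[\alpha+3\beta=0,\qquad \alpha-2\beta=0.\]

Subtracting the second equation from the first gives $5\beta=0$, so $\beta=0$, and substituting back gives $\alpha=0$. Hence the set is linearly independent.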
 

Deveno

Well-known member
MHB Math Scholar
Feb 15, 2012
1,967
In a sense, linear dependency is a measure of "spanning redundancy". For example, adding the 0-vector never gives us any new vectors in the span, and we can already realize the 0-vector as a linear combination of any other set $\{v_1,\dots,v_k\}$ as:

$0v_1 + 0v_2 + \cdots + 0v_k$.

In the same way, if:

$v_2 = cv_1$ for a non-zero $c$, then in any linear combination we can replace $v_2$ with $cv_1$.

For example, the linear combination:

$a_1v_1 + a_2v_2 + a_3v_3 + \cdots +a_kv_k$

is equal to:

$(a_1 + ca_2)v_1 + a_3v_3 + \cdots + a_kv_k$

We might just as well have eliminated $v_1$ in this case, replacing it with $\dfrac{1}{c}v_2$ in any linear combination.

This situation is a bit more complicated if we have something like:

$v_3 = b_1v_1 + b_2v_2$

as we might decide to keep $\{v_1,v_2\},\{v_1,v_3\}$ or $\{v_2,v_3\}$.
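For a concrete instance (my own illustration, not part of the original problem): in $\mathbb{R}^2$, take $v_1=(1,\,0)$, $v_2=(0,\,1)$, and $v_3=(1,\,1)=v_1+v_2$. Any two of these three vectors are linearly independent, so all three pairs are equally valid choices of a maximal independent subset.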

Generally speaking, the more dimensions our space has, the more chances there are for the linear dependency relations to be "complicated". For $\text{dim}(V) > 4$ I wouldn't trust "elimination by inspection"; I would instead form a matrix from the vector set and compute its rank.
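If you want to carry out that rank computation by machine, here is a minimal sketch in Python (using numpy; the choice of library is mine, not something from this thread):

```python
import numpy as np

# Rows of the matrix are the vectors from the problem above.
A = np.array([
    [1, 1],
    [-2, -2],
    [0, 0],
    [3, -2],
])

# The rank of the matrix equals the maximum number of linearly
# independent vectors in the set.
print(np.linalg.matrix_rank(A))  # prints 2
```

So the largest linearly independent subset has exactly 2 vectors, matching the reasoning below.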

***********

In this particular problem, you have some information right off the bat:

The dimension of your vector space is two, because each vector has only two coordinates (and your set does contain vectors with a non-zero coordinate in each position).

So at most, two of your vectors can be linearly independent.

Since (0,0) ALWAYS makes any set you add it to linearly dependent, get rid of it.

Pick any one of the 3 remaining non-zero vectors. Now we have a linearly independent set of one vector (which spans a one-dimensional subspace of our two-dimensional vector space).

Now pick a 2nd vector... is it a scalar multiple of the first vector (that is, does it lie in the subspace generated by the first vector)? If so, get rid of it; you don't need it.

Otherwise, it is linearly independent from the first vector and you are done.

Repeat this procedure until you have exhausted the set, or obtained two linearly independent vectors (which is the maximum possible).

(If we had MORE dimensions, we would have to check for a third vector, and we would have to check that our 3rd choice was not in the subspace spanned by our first two choices.)
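This procedure is easy to mechanize. Here is a minimal Python sketch of the same greedy scan (the function name greedy_independent is my own; numpy's matrix_rank stands in for the "is it in the span?" test):

```python
import numpy as np

def greedy_independent(vectors):
    """Return a maximal linearly independent subset, scanning in order."""
    kept = []
    for v in vectors:
        candidate = kept + [v]
        # Keep v only if it raises the rank, i.e. only if it is NOT a
        # linear combination of the vectors we have already kept.
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            kept.append(v)
    return kept

print(greedy_independent([(1, 1), (-2, -2), (0, 0), (3, -2)]))
# [(1, 1), (3, -2)]
```

Note that (0, 0) is rejected automatically: adding it never raises the rank, which matches the observation above that the zero vector always creates dependence.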