# Linear independence

#### Petrus

##### Well-known member
Hello MHB,
I have a question. Given the vectors $$\displaystyle V=(3,a,1)$$, $$\displaystyle U=(a,3,2)$$, and $$\displaystyle W=(4,a,2)$$, why are they linearly independent if the determinant is not equal to zero? (I am not interested in solving the problem; I just want to know why it is so.)

Regards,
$$\displaystyle |\pi\rangle$$

#### Klaas van Aarsen

##### MHB Seeker
Staff member
> Hello MHB,
> I have a question. Given the vectors $$\displaystyle V=(3,a,1)$$, $$\displaystyle U=(a,3,2)$$, and $$\displaystyle W=(4,a,2)$$, why are they linearly independent if the determinant is not equal to zero? (I am not interested in solving the problem; I just want to know why it is so.)
>
> Regards,
> $$\displaystyle |\pi\rangle$$
If you put your vectors as the columns of a matrix, that matrix represents a linear map.
If this map can "reach" all of $\mathbb R^3$, your vectors are linearly independent.
The determinant tells you whether this is possible: a determinant of zero means it is not.
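As a concrete check, here is a minimal sketch (not from the thread) of the determinant test applied to the three vectors from the question, as a function of the parameter $a$; the 3×3 determinant is computed by the standard cofactor expansion:

```python
def det3(m):
    """Determinant of a 3x3 matrix (given as a list of three rows),
    by cofactor expansion along the first row."""
    (a11, a12, a13), (a21, a22, a23), (a31, a32, a33) = m
    return (a11 * (a22 * a33 - a23 * a32)
            - a12 * (a21 * a33 - a23 * a31)
            + a13 * (a21 * a32 - a22 * a31))

def det_for(a):
    """Determinant of the matrix with rows V, U, W from the question."""
    V = [3, a, 1]
    U = [a, 3, 2]
    W = [4, a, 2]
    return det3([V, U, W])

# The vectors are linearly independent exactly for those a where
# det_for(a) is nonzero.
for a in [0, 1, 2]:
    print(a, det_for(a))
```

Expanding symbolically gives $\det = -a^2 + 2a + 6$, which vanishes only at $a = 1 \pm \sqrt 7$; for every other value of $a$ the three vectors are linearly independent.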

#### Petrus

##### Well-known member
> If you put your vectors as the columns of a matrix, that matrix represents a linear map.
> If this map can "reach" all of $\mathbb R^3$, your vectors are linearly independent.
> The determinant tells you whether this is possible: a determinant of zero means it is not.
Thanks! I am starting to understand now!

Regards,
$$\displaystyle |\pi\rangle$$

#### Petrus

##### Well-known member
Are these statements true?
1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
2. For a linearly independent matrix there exists an inverse.
3. I found this theorem on the internet which my book does not state: "If a set of vectors $$\displaystyle v_1,v_2,v_3,\ldots,v_p$$ lies in $$\displaystyle \mathbb R^n$$, then it is linearly dependent if $$\displaystyle p>n$$."

Regards,
$$\displaystyle |\pi\rangle$$


#### Klaas van Aarsen

##### MHB Seeker
Staff member
> Are these statements true?
> 1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
> 2. For a linearly independent matrix there exists an inverse.
> 3. I found this theorem on the internet which my book does not state: "If a set of vectors $$\displaystyle v_1,v_2,v_3,\ldots,v_p$$ lies in $$\displaystyle \mathbb R^n$$, then it is linearly dependent if $$\displaystyle p>n$$."
>
> Regards,
> $$\displaystyle |\pi\rangle$$
You can only calculate the determinant of a square matrix.
That means you can only use a determinant to check independence of n vectors, each of dimension n.

An inverse can only exist for a square matrix.
If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.

If you have $n$ linearly independent vectors, they span an $n$-dimensional space, such as $\mathbb R^n$.
One more vector must already lie in that span, so it is a linear combination of the others.
Hence a set of $n+1$ vectors in $\mathbb R^n$ must be dependent.
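These statements can be sanity-checked numerically. Below is a minimal plain-Python sketch (not from the thread): the rank is computed by ordinary Gaussian elimination, and a set of vectors is independent exactly when its rank equals the number of vectors, so any 4 vectors in $\mathbb R^3$ must come out dependent:

```python
def rank(rows, eps=1e-12):
    """Rank of a matrix (list of rows) via Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    rk = 0
    n_cols = len(m[0]) if m else 0
    for col in range(n_cols):
        # Find a pivot row for this column among the unprocessed rows.
        pivot = next((r for r in range(rk, len(m)) if abs(m[r][col]) > eps), None)
        if pivot is None:
            continue
        m[rk], m[pivot] = m[pivot], m[rk]
        # Eliminate this column from all rows below the pivot.
        for r in range(rk + 1, len(m)):
            f = m[r][col] / m[rk][col]
            m[r] = [x - f * y for x, y in zip(m[r], m[rk])]
        rk += 1
    return rk

basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # 3 independent vectors in R^3
extra = basis + [[1, 2, 3]]                # a 4th vector in R^3

# Independent exactly when rank equals the number of vectors:
print(rank(basis) == len(basis))  # the 3 basis vectors are independent
print(rank(extra) == len(extra))  # 4 vectors in R^3: rank is at most 3
```

Running this confirms the third statement for this example: the rank of the four-row matrix is still 3, so the four vectors cannot be independent.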

#### Petrus

##### Well-known member
> You can only calculate the determinant of a square matrix.
> That means you can only use a determinant to check independence of n vectors, each of dimension n.
>
> An inverse can only exist for a square matrix.
> If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.
>
> If you have $n$ linearly independent vectors, they span an $n$-dimensional space, such as $\mathbb R^n$.
> One more vector must already lie in that span, so it is a linear combination of the others.
> Hence a set of $n+1$ vectors in $\mathbb R^n$ must be dependent.
Thanks, I did mean a square matrix.

Regards,
$$\displaystyle |\pi\rangle$$