Row reducing a matrix to determine linear dependence?

In summary, the student is trying to determine whether a set of vectors is linearly dependent or independent. They have been solving similar problems in their textbook by forming a matrix and row reducing it: if there is a free variable they conclude the vectors are dependent, and if there is no free variable they conclude the vectors are independent. On an exam, however, they found a free variable and their answer was marked wrong, so they ask whether they made a mistake or whether their understanding of linear dependence is flawed. The student is also unsure whether two rows of zeroes are required for dependence, since their textbook mentions only one row of zeroes. Another user points out that the student's example in the original post gives a 5 x 4 matrix, which must row reduce to at least one row of zeroes, so a single zero row there indicates independence, whereas the book's 3 x 3 example is dependent with just one zero row.
  • #1
lonewolf219

Homework Statement



Determine whether the vectors

<1,-1,0,2,3>
<1,0,-1,3,3>
<1,-1,0,3,0>
<0,1,-1,2,-2>

are linearly dependent or independent

Homework Equations



I have been solving these questions in the book by setting up a matrix and row reducing it. If I wound up with a free variable, I concluded the vectors were linearly dependent; if there was no free variable after row reducing, I concluded they were linearly independent. For the examples in the book this method worked (unless that was just by chance?)

The Attempt at a Solution



This is the problem that appeared on the first exam. I solved it as described and found a free variable, but the answer was marked wrong. I thought the existence of a free parameter in the solution set meant that at least one of the vectors lies in the span of the others, and therefore that the set is linearly dependent. Am I mistaken?
 
  • #2

Did you write your vectors as columns in the matrix you row reduced? If so, you will get at least one row of zeroes in the reduced matrix. If there is only one row of zeroes, the vectors are linearly independent. If there are two or more rows of zeroes, the vectors are linearly dependent.

If we let v1, v2, v3, and v4 be the columns of a matrix A, then row reducing A amounts to solving the equation c1v1 + c2v2 + c3v3 + c4v4 = 0. If the row-reduced matrix has only a single row of zeroes, every one of the four columns contains a pivot, so the only solution is c1 = c2 = c3 = c4 = 0 and the vectors are linearly independent. If it has two or more rows of zeroes, some column lacks a pivot, so there is a solution with at least one constant nonzero and the vectors are linearly dependent.
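For anyone who wants to check this concretely, here is a minimal sketch (Python with SymPy; the matrix setup and the rref() call are just one way of carrying out the row reduction described above) applied to the four vectors from post #1, written as columns:

```python
from sympy import Matrix

# The four vectors from post #1, placed as the columns of a 5 x 4 matrix.
A = Matrix([
    [ 1,  1,  1,  0],
    [-1,  0, -1,  1],
    [ 0, -1,  0, -1],
    [ 2,  3,  3,  2],
    [ 3,  3,  0, -2],
])

rref_A, pivot_cols = A.rref()   # reduced row echelon form and pivot column indices
zero_rows = sum(1 for i in range(A.rows) if all(e == 0 for e in rref_A.row(i)))

print(pivot_cols)                  # (0, 1, 2, 3): every column has a pivot
print(zero_rows)                   # 1: the single zero row a 5 x 4 matrix must have
print(len(pivot_cols) == A.cols)   # True: no free variable, so the vectors are independent
```

With exactly one zero row and a pivot in every column, the only solution of c1v1 + c2v2 + c3v3 + c4v4 = 0 is the trivial one.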
 
  • #3
Thanks Mark44 for the reply. I did express the vectors as columns. But nowhere does the book say you must have two rows of zeroes for dependence.

Example: (assume these are columns)

<1,2,3>
<4,5,6>
<2,1,0>

Determine if the set is linearly dependent.

Solution:

"Clearly, x1 and x2 are basic variables, and x3 is free. Each nonzero value of x3 determines a nontrivial solution of (1). Hence, vector set is linearly dependent (and not linearly independent)."

Am I reading this wrong?
 
  • #4
If you set your vectors as rows, as you wrote them in the opening post, then Gaussian elimination will show whether they are dependent or independent in the expected way: row operations preserve the row space, so a row of zeroes appears in the echelon form exactly when one of the vectors is a linear combination of the others.
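A sketch of that row-oriented check (again Python/SymPy; comparing the rank with the number of rows is just one way to say "no row reduces to all zeroes"):

```python
from sympy import Matrix

# The same four vectors, this time as the rows of a 4 x 5 matrix.
R = Matrix([
    [1, -1,  0, 2,  3],
    [1,  0, -1, 3,  3],
    [1, -1,  0, 3,  0],
    [0,  1, -1, 2, -2],
])

# Row operations preserve the row space, so a zero row appears in the echelon
# form exactly when one vector is a linear combination of the others.
print(R.rank())              # 4
print(R.rank() == R.rows)    # True: no zero row, so the vectors are independent
```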
 
  • #5
Thanks Joffan, haven't heard that before (undergrad)...

Mark44, appreciate you specifically saying two rows are needed...you are of course correct... and I think I need a different book
 
  • #6
lonewolf219,
In your example in post #3, you have a 3 x 3 matrix. If you row reduce and find a row of zeroes, the vectors are linearly dependent, for reasons I gave earlier.

In your original example in post #1, the matrix whose columns are those vectors is a 5 x 4 matrix (i.e., 5 rows and 4 columns). Since it has only 4 columns, it can have at most 4 pivots, so its reduced form must have at least one row of zeroes. If there is just one zero row, the vectors are linearly independent; if there are two or more zero rows, the vectors are linearly dependent.
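To make the contrast concrete, here is a small sketch (the helper dependent() is hypothetical, written just for this illustration) that applies the same pivot test to the 3 x 3 book example from post #3 and the 5 x 4 exam matrix from post #1:

```python
from sympy import Matrix

def dependent(columns):
    """Vectors (given as columns) are dependent iff some column lacks a pivot."""
    A = Matrix.hstack(*[Matrix(c) for c in columns])
    _, pivots = A.rref()
    return len(pivots) < A.cols

# Book example (post #3): 3 vectors in R^3, so a 3 x 3 matrix.
print(dependent([[1, 2, 3], [4, 5, 6], [2, 1, 0]]))
# True -- here a single zero row does signal dependence.

# Exam problem (post #1): 4 vectors in R^5, so a 5 x 4 matrix.
print(dependent([[1, -1, 0, 2, 3], [1, 0, -1, 3, 3],
                 [1, -1, 0, 3, 0], [0, 1, -1, 2, -2]]))
# False -- the single zero row is unavoidable and does not signal dependence.
```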
 
  • #7
Yes, I see the difference... indeed, the example the book gives is correct. Thanks for pointing that out.

Thanks for your help, Mark44!
 

Related to Row reducing a matrix to determine linear dependence?

1. What is a matrix?

A matrix is a rectangular array of numbers or symbols arranged in rows and columns. It is commonly used in mathematics, physics, and engineering to represent data and perform operations.

2. What is row reduction?

Row reduction is a method for simplifying a matrix by performing a sequence of elementary row operations: swapping two rows, multiplying a row by a nonzero constant, or adding a multiple of one row to another. The goal is to transform the matrix into a simpler (echelon) form, making it easier to solve or analyze.

3. How is linear dependence determined through row reduction?

Form the matrix whose columns are the vectors and bring it to row echelon form. If every column contains a pivot (there are no free variables), the vectors are linearly independent; if some column lacks a pivot (there is a free variable), the vectors are linearly dependent. Counting rows of zeroes alone is not enough: as the thread above shows, a 5 x 4 matrix always has at least one zero row, whether or not its columns are independent.
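Equivalently, the vectors are dependent exactly when the homogeneous system Ac = 0 has a nontrivial solution. A minimal sketch using SymPy's nullspace(), which returns a basis for the solutions of Ac = 0 (the matrix here is the book example from post #3):

```python
from sympy import Matrix

# Columns of A are the three vectors from the post #3 example.
A = Matrix([[1, 4, 2],
            [2, 5, 1],
            [3, 6, 0]])

null_basis = A.nullspace()     # basis for the solutions of A*c = 0
print(len(null_basis) > 0)     # True: a free variable gives a nontrivial solution
if null_basis:
    # One explicit dependence relation: 2*v1 - 1*v2 + 1*v3 = 0.
    print(null_basis[0].T)     # Matrix([[2, -1, 1]])
```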

4. What is the significance of determining linear dependence in a matrix?

Determining linear dependence is important because it identifies relationships between the rows or columns of a matrix. In a system of equations, linearly dependent rows mean that some of the equations are redundant and can be eliminated, simplifying the problem and making it easier to solve.

5. Are there any limitations to row reduction for determining linear dependence?

Yes. Row reduction applies to finite sets of vectors in finite-dimensional vector spaces; it cannot be applied directly to infinite-dimensional spaces, such as function spaces. Also, while a matrix's row echelon form is not unique, its number of pivots (the rank) is, so the conclusion about dependence does not depend on which sequence of row operations you use.
