Proving that Columns are Linearly Dependent

In summary, the conversation discusses how to prove that the columns of an m x n matrix A with m < n are linearly dependent. The suggested solution shows that the reduced row echelon form of the matrix has at most m pivots, and that the remaining columns without pivots can be expressed as linear combinations of the pivot columns, which proves that the columns are linearly dependent. The conversation also touches on the concept of pivots and how they relate to linear dependence.
  • #1
B18

Homework Statement


Let A be an m x n matrix with m<n. Prove that the columns of A are linearly dependent.

Homework Equations


It's obvious that for the columns to be linearly dependent they must form a determinant that is equal to 0, or one of the column vectors must be representable as a linear combination of the other vectors.

The Attempt at a Solution


It seems like more needs to be shown to prove this statement, but this is what I have right now:
Let A be an m x n matrix, and let m < n.
Then the set of n column vectors of A is in R^m and must be linearly dependent.

Is this it? Or do I need to state a theorem in here somewhere?
 
  • #2
B18 said:

Let A be an m x n matrix, and let m < n.
Then the set of n column vectors of A is in R^m and must be linearly dependent.

Is this it? Or do I need to state a theorem in here somewhere?

Hint: what is the maximum possible dimension of the space spanned by the columns (regarded as column vectors)?
 
  • #3
The maximum dimension of the space spanned would have to be m+1, correct? For example, if the vectors were from R^3, we would need 4 column vectors for them to be linearly dependent.
 
  • #4
Show that the reduced row echelon form of the m x n matrix has at most m pivots. Since m < n, there are at least n-m >= 1 columns without pivots, and each of these can be expressed as a linear combination of the pivot columns. A column that is a linear combination of the other columns gives a nontrivial dependence relation, so the columns are linearly dependent.
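To see the pivot argument concretely, here is a minimal sketch using SymPy (my choice of tool; the thread itself uses no code). Matrix.rref() returns the reduced row echelon form together with the indices of the pivot columns:

```python
from sympy import Matrix

# A hypothetical 2 x 3 example (m = 2 < n = 3), chosen only for illustration.
A = Matrix([[1, 2, 3],
            [4, 5, 6]])

# rref() returns (reduced row echelon form, tuple of pivot column indices).
R, pivots = A.rref()
print(R)        # Matrix([[1, 0, -1], [0, 1, 2]])
print(pivots)   # (0, 1) -- at most m = 2 pivots

# The non-pivot column (index 2) is a combination of the pivot columns;
# the coefficients -1 and 2 are read off from the last column of R.
assert A[:, 2] == -1 * A[:, 0] + 2 * A[:, 1]
```

Because m = 2 < n = 3, at least one column has no pivot, and that column supplies the dependence relation.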
 
  • #5
It would have been helpful if our professor had explained what pivots are. Thanks though, Izzy, I'll make sense of what you explained and go from there.
 
  • #6
B18 said:
The maximum dimension of the space spanned would have to be m+1, correct? For example, if the vectors were from R^3, we would need 4 column vectors for them to be linearly dependent.

Are you telling me that you think 4 or more 3-component vectors can possibly span a space of dimension 4?
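For the record, a minimal LaTeX sketch of the dimension argument behind this hint, assuming the standard fact that more than dim W vectors in a subspace W must be linearly dependent:

```latex
% The columns a_1, ..., a_n of A live in R^m, and their span is a
% subspace W of R^m, so its dimension is at most m, which is less than n.
W = \operatorname{span}(a_1,\dots,a_n) \subseteq \mathbb{R}^m
\quad\Longrightarrow\quad
\dim W \le m < n .
```

Since the n columns sit inside a space of dimension at most m < n, they cannot be linearly independent.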
 

Related to Proving that Columns are Linearly Dependent

1. How do you prove that columns are linearly dependent?

To prove that columns are linearly dependent, you can row-reduce the matrix and check whether any column can be written as a linear combination of the other columns, or equivalently whether the homogeneous system Ax = 0 has a nontrivial solution. If so, the columns are linearly dependent.
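For reference, the definition in play here, written out as a formula (a standard statement, added for clarity rather than taken from the original answer): the columns a_1, ..., a_n of A are linearly dependent exactly when

```latex
\exists\, c = (c_1,\dots,c_n)^{\mathsf{T}} \neq 0
\quad\text{such that}\quad
c_1 a_1 + c_2 a_2 + \cdots + c_n a_n = 0,
\qquad\text{i.e., } Ac = 0 \text{ has a nontrivial solution.}
```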

2. What is the significance of proving that columns are linearly dependent?

Proving that columns are linearly dependent is important in linear algebra because it reveals the relationships between the columns of a matrix. It also allows us to determine whether a system of equations has a unique solution.

3. Can you give an example of proving columns are linearly dependent?

Sure, let's say we have a matrix with three columns: [1 2 3], [2 4 6], and [3 6 9]. By row reduction (or by inspection), the second column is twice the first and the third column is three times the first. Therefore, these columns are linearly dependent.
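A quick machine check of this example, again as a SymPy sketch (the original answer does not use code):

```python
from sympy import Matrix

# Columns [1 2 3], [2 4 6], [3 6 9] from the example above.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [3, 6, 9]])

R, pivots = A.rref()
print(pivots)       # (0,) -- a single pivot, so columns 2 and 3 are non-pivot
print(A.rank())     # 1
assert A[:, 1] == 2 * A[:, 0]   # second column = 2 * first column
assert A[:, 2] == 3 * A[:, 0]   # third column  = 3 * first column
```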

4. What are some common methods for proving columns are linearly dependent?

Some common methods for proving columns are linearly dependent include performing row operations, calculating the determinant of the matrix (when the matrix is square), or using the rank-nullity theorem.
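As a worked instance of the rank-nullity route (my sketch, assuming the standard theorem rank(A) + nullity(A) = n for an m x n matrix A):

```latex
\operatorname{rank}(A) + \operatorname{nullity}(A) = n,
\qquad
\operatorname{rank}(A) \le m < n
\;\Longrightarrow\;
\operatorname{nullity}(A) \ge n - m > 0 .
```

A positive nullity means Ax = 0 has a nonzero solution, and the entries of that solution are the coefficients of a dependence relation among the columns.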

5. Is it possible for a matrix to have both linearly dependent and independent columns?

Yes. A matrix can contain a linearly independent subset of columns even though its full set of columns is linearly dependent. For example, two columns may be linearly independent while a third column, being a combination of the first two, makes the whole set dependent. (The reverse is impossible: if any subset of the columns is linearly dependent, the entire set is linearly dependent.)
