Verifying some Concepts in Matrix/Linear Algebra

  • #1
war485

Homework Statement



I'd just like to verify whether these statements are always true or false, since I've been told subspaces are the most important concept in matrix/linear algebra.

a. the set of vectors in R3 that satisfy x = |y| forms a subspace of R3

b. if S is a subspace of Rn and the dimension of S = n, then S = Rn

c. dimensions of Col A and Nul A add up to the number of columns of A

d. if a set of p vectors spans an x-dimensional subspace C of Rn, then these vectors form a basis for C

e. The dimension of Nul A is the number of variables in the equation Ax = 0

f. if C is an x-dimensional subspace of Rn, then a linearly independent set of x vectors in C is a basis for C

The Attempt at a Solution



a. this is not always true because the set might not contain the zero vector (not going through the origin), which is required for a vector subspace.

b. always true since S defines the basis so dimension of each vector in subspace is n

c. false, I know that rank of A + nullity of A = # columns of A,
so number of columns = dimension of Col(A) - dimension of Nul(A)

d. I think this is false, because a spanning set is linearly dependent, but a basis needs to have linearly independent vectors.

e. I think this one is always true by definition.

f. This one seems tricky, so I looked up some definitions. Since there's a subspace, I can pick out any linearly independent vectors from that and so it should always be a basis for C. So I think this is true.

I desperately want to understand this stuff fully, for the right reasons.
Any help is greatly appreciated.
 
  • #2
war485 said:

a. this is not always true because the set might not contain the zero vector (not going through the origin), which is required for a vector subspace.
It is correct that this is not a subspace, but not for that reason: |0| = 0, so this set does in fact contain the zero vector (0, 0, 0). Think instead about additive inverses.

b. always true since S defines the basis so dimension of each vector in subspace is n
Yes, this is true, but your reason doesn't make sense to me. A vector space or subspace has a dimension, but a single vector does not, so I don't know what you mean by "dimension of each vector". I think what you mean is that saying S has dimension n means it has a basis containing n vectors. But since Rn also has dimension n, a set of n linearly independent vectors in Rn must span all of Rn, so S = Rn.
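
If you want to see that numerically, here is a quick sketch with NumPy (the three column vectors below are just an arbitrary independent choice, nothing special about them):

import numpy as np

# Columns are three linearly independent vectors in R^3 (an arbitrary example).
S_basis = np.array([[1.0, 0.0, 0.0],
                    [0.0, 2.0, 0.0],
                    [1.0, 1.0, 3.0]])

# Their rank is 3 = dim(R^3), so the columns span all of R^3.
print(np.linalg.matrix_rank(S_basis))   # 3

# Hence any vector of R^3, e.g. (4, 5, 6), is a combination of them.
coeffs = np.linalg.solve(S_basis, np.array([4.0, 5.0, 6.0]))
print(S_basis @ coeffs)                 # [4. 5. 6.]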

c. false, I know that rank of A + nullity of A = # columns of A,
so number of columns = dimension of Col(A) - dimension of Nul(A)
But isn't the dimension of the column space of A the "rank"?

d. I think this is false, because a spanning set is linearly dependent, but a basis needs to have linearly independent vectors.
No, a spanning set is not necessarily linearly dependent, but it is also not necessarily linearly independent. It is POSSIBLE that a spanning set is a basis, but not necessarily so. A basis for a vector space satisfies three properties, any two of which imply the third. What are they?

e. I think this one is always true by definition.
Did you understand the question correctly? If Av = x + y + z = 0, how many variables are there in the equation? What is its nullity?
The nullity of a linear transformation is the dimension of its kernel, the dimension of the space of all solutions to that equation.

f. This one seems tricky, so I looked up some definitions.
Always a good idea!
Since there's a subspace, I can pick out any linearly independent vectors from that and so it should always be a basis for C. So I think this is true.
"Pick out any linearly independent vectors" from what? It is not true that any linearly independent set of vectors from C is a basis for C, and there is no point in picking out linearly independent vectors from the given set, because you are already told that they are all independent.
Again, a basis for a vector space satisfies three properties, any two of which imply the third.

I desperately want to understand this stuff fully, for the right reasons.
Any help is greatly appreciated.
 
  • #3
Thanks for the early reply; I'm going to try this again:

a.
HallsofIvy said:
It is correct that this is not a subspace, but not for that reason: |0| = 0, so this set does in fact contain the zero vector (0, 0, 0). Think instead about additive inverses.
Ok additive inverses could make it go through the origin. So it's not a subspace of R3 because it's only defined for x and y and not z, meaning that there are infinite ways to write a vector as a linear combination of the basis vectors. Right?

b.
HallsofIvy said:
Yes, this is true, but your reason doesn't make sense to me. A vector space or subspace has a dimension, but a single vector does not, so I don't know what you mean by "dimension of each vector". I think what you mean is that saying S has dimension n means it has a basis containing n vectors. But since Rn also has dimension n, a set of n linearly independent vectors in Rn must span all of Rn, so S = Rn.
Yes that's what I meant, I said it the wrong way. Thanks for correcting me.

c.
HallsofIvy said:
But isn't the dimension of the column space of A the "rank"?
Wow, I didn't see that coming. Yes, dimension of column space = rank. So it should definitely be true. Thanks for catching me there.
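
Just to convince myself numerically (a quick sketch with NumPy and SciPy; the matrix is only an arbitrary example I made up):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])

rank = np.linalg.matrix_rank(A)      # dim Col A
nullity = null_space(A).shape[1]     # dim Nul A, counted from an explicit basis
print(rank, nullity, A.shape[1])     # 2 2 4, and 2 + 2 = 4 = number of columns

So rank + nullity really does come out to the number of columns for this example.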

d.
HallsofIvy said:
No, a spanning set is not necessarily linearly dependent, but it is also not necessarily linearly independent. It is POSSIBLE that a spanning set is a basis, but not necessarily so. A basis for a vector space satisfies three properties, any two of which imply the third. What are they?
A basis for a subspace C is a set of vectors that spans C and is linearly independent. From here, I think I'm still lost on this one.

e.
HallsofIvy said:
Did you understand the question correctly? If Av = x + y + z = 0, how many variables are there in the equation? What is its nullity? The nullity of a linear transformation is the dimension of its kernel, the dimension of the space of all solutions to that equation.
I'm not sure what you mean by its kernel. So the nullity should be the dimension of the null space, which has nothing to do with the number of variables.

f.
HallsofIvy said:
"Pick out any linearly independent vectors" from what? It is not true that any linearly independent set of vectors from C is a basis for C, and there is no point in picking out linearly independent vectors from the given set, because you are already told that they are all independent.
Again, a basis for a vector space satisfies three properties, any two of which imply the third.
Ah... so the reasoning for this one is similar to statement d. So it should be true, since a basis of a subspace is a set of vectors that spans that subspace and is linearly independent! I think I understand this one.

Trying to drill these deep in my head.
 
  • #4
war485 said:
Thanks for the early reply; I'm going to try this again:

a.
Ok additive inverses could make it go through the origin. So it's not a subspace of R3 because it's only defined for x and y and not z, meaning that there are infinite ways to write a vector as a linear combination of the basis vectors. Right?
That doesn't make any sense at all! My point was that, since |1| = 1, the vector (1, 1, 0) is in this subset, but its additive inverse, (-1, -1, 0), does not satisfy x = |y| and so is not in the set.
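
You can check that failure of closure directly (a minimal sketch in plain Python; in_set is just my name for a test that encodes the defining condition x = |y|):

def in_set(v):
    # Membership test for { (x, y, z) in R^3 : x = |y| }
    x, y, z = v
    return x == abs(y)

v = (1, 1, 0)
neg_v = tuple(-c for c in v)
print(in_set(v))      # True:  1 == |1|
print(in_set(neg_v))  # False: -1 != |-1|, so the set is not closed under negation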

d.
A basis for a subspace C is a set of vectors that spans C and is linearly independent. From here, I think I'm still lost on this one.
The original question was "if a set of p vectors spans an x-dimensional subspace C of Rn, then these vectors form a basis for C"
A "basis" for a vector space of dimension n satisfies three properties: the vectors span the space, they are linearly independent, and there are n of them. Knowing that a set of p vectors spans an x-dimensional subspace tells you that they are independent only if p = x. A set of fewer than x vectors cannot span an x-dimensional space, and a set of more than x vectors cannot be independent.
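
Here is a concrete illustration of the p > x case (a sketch with NumPy; the three vectors are an arbitrary choice lying in the 2-dimensional plane z = 0 inside R^3):

import numpy as np

# Three vectors that together span the 2-dimensional subspace z = 0 of R^3.
vectors = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [1.0, 1.0, 0.0]])

# They span a 2-dimensional subspace (rank 2), yet there are 3 of them,
# so they must be linearly dependent and cannot form a basis for it.
print(np.linalg.matrix_rank(vectors))   # 2, not 3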

e. I'm not sure what you mean by its kernel. So the nullity should be the dimension of the null space, which has nothing to do with the number of variables.
"Kernel" is another name for "null space": the set of vectors v such that Av = 0. The nullity is the dimension of the null space. I won't say that it has "nothing" to do with the number of variables, but it is not equal to that. For example, if A maps (x, y, z) to (x + z, y - z, y - z), then (x, y, z) is mapped to (0, 0, 0) if x + z = 0, y - z = 0, and y - z = 0. That means x = -z and y = z, where z can be any number: a basis is (-1, 1, 1), since any such vector is of the form (-z, z, z) = z(-1, 1, 1). The null space has dimension 1 and the nullity is 1, even though the equation Av = 0 has three variables.

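You can verify that example numerically as well (a sketch with NumPy and SciPy; A below is just the matrix of the map written out):

import numpy as np
from scipy.linalg import null_space

# Matrix of the map (x, y, z) -> (x + z, y - z, y - z)
A = np.array([[1.0, 0.0,  1.0],
              [0.0, 1.0, -1.0],
              [0.0, 1.0, -1.0]])

ns = null_space(A)    # orthonormal basis for the null space (kernel)
print(ns.shape[1])    # 1: the nullity, even though the equation has 3 variables
print(ns[:, 0])       # a multiple of (-1, 1, 1)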
 
  • #5
HallsofIvy said:
That doesn't make any sense at all! My point was that, since |1| = 1, the vector (1, 1, 0) is in this subset, but its additive inverse, (-1, -1, 0), does not satisfy x = |y| and so is not in the set.
Oops...

Thanks for your input, insight, and help! I'll keep practicing this stuff over the weekend.
 
  • #6
Recapping just to make sure...

d. if a set of p vectors spans an x-dimensional subspace C of Rn, then these vectors form a basis for C

True only if p = x; otherwise the vectors must be linearly dependent, and a dependent set cannot form a basis.


f. if C is an x-dimensional subspace of Rn, then a linearly independent set of x vectors in C is a basis for C

You said it was a false statement, and I'm still thinking it is true. If the set of vectors is independent and there are x of them (the same as the dimension), then they span the space, so shouldn't it be true?

*edit* Never mind this last post, I figured it out. Dick helped too :) Thanks HallsofIvy and Dick.
 

