Linear Algebra -- Is this a basis?

In summary, to determine whether a set of functions is a basis for a space, the set must be both linearly independent and span the space. For the sets given in the conversation, with [itex]P_2(R)[/itex] taken to be the polynomials of degree ≤ 2, both sets b and d turn out to be bases. Along the way, the thread sorts out how the coefficient matrix should be set up (polynomials as columns rather than rows), why the augmented column of zeros is unnecessary, and how a determinant gives a quick check of linear independence.
  • #1
RJLiberator
Gold Member
1,095
63

Homework Statement



Determine if the following sets are bases for [itex]P_2(R)[/itex]

b) [itex](1+2x+x^2, 3+x^2,x+x^2)[/itex]
d) [itex](-1+2x+4x^2, 3-4x-10x^2,-2-5x-6x^2)[/itex]

Homework Equations


Basis IFF Linearly Independent AND span(Set)=[itex]P_2(R)[/itex]
RREF = Reduced Row Echelon Form

The Attempt at a Solution



My first question here regards an understanding of notation.

So for both b) and d) I worked it out to Row Echelon Form and found linear independence. Provided I did the calculations correctly, is it safe to say that these are bases for [itex]P_2(R)[/itex], as the number of terms in each set is 3 and since 3-1 = 2, P_2 is safe and these sets span [itex]P_2(R)[/itex]?

Second question: For b) I was able to get the RREF rather easily from the matrix:
\begin{pmatrix}
1 & 2 & 1 & 0\\
1 & 0 & 3 & 0\\
1 & 1 & 0 & 0
\end{pmatrix}

This should be linearly independent in RREF, and so it is a bases. An answer key says "no" this is not a bases. However, the answer key can be wrong. Is there anything I did wrong here in setting this problem up for b) or perhaps my understanding of span?

Third question: For d) I was checking a solution and the solution had set up the matrix differently than what I had expected. They set up the matrix as such:
\begin{pmatrix}
-1 & 3 & -2 & 0\\
2 & -4 & -5 & 0\\
4 & -10 & -6 & 0
\end{pmatrix}
As you can see, since d) [itex](-1+2x+4x^2, 3-4x-10x^2,-2-5x-6x^2)[/itex], they switched the columns and rows from what I'm traditionally used to. Is this OK to do? The normal way I do it, like my example in b), is difficult to get into RREF, but this way is rather easy.
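For what it's worth, here's a quick sanity check in Python (my own sketch, not from the book; the helper matrix_rank is mine). It row-reduces in exact rational arithmetic and shows that the 3x3 matrix from b) and its transpose both have rank 3, so the row-vs-column choice doesn't change the independence conclusion.

```python
from fractions import Fraction

def matrix_rank(mat):
    """Rank via Gauss-Jordan elimination in exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in mat]
    rank = 0
    for col in range(len(m[0])):
        # find a pivot at or below row `rank`
        piv = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if piv is None:
            continue
        m[rank], m[piv] = m[piv], m[rank]
        # clear the rest of the column
        for r in range(len(m)):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# The 3x3 part of my matrix for set b (one polynomial per row),
# and the same data transposed (one polynomial per column):
B = [[1, 2, 1], [1, 0, 3], [1, 1, 0]]
B_t = [list(col) for col in zip(*B)]
print(matrix_rank(B), matrix_rank(B_t))  # 3 3 -> linearly independent either way
```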
 
  • #2
In response to your final question: "Is this a basis?" is correct. Basis - singular, bases - plural.
Also, the word is "grammar."
RJLiberator said:

Homework Statement



Determine if the following sets are bases for [itex]P_2(R)[/itex]

b) [itex](1+2x+x^2, 3+x^2,x+x^2)[/itex]
d) [itex](-1+2x+4x^2, 3-4x-10x^2,-2-5x-6x^2)[/itex]

Homework Equations


Basis IFF Linearly Independent AND span(Set)=[itex]P_2(R)[/itex]
RREF = Reduced Row Echelon Form

The Attempt at a Solution



My first question here regards an understanding of notation.

So for both b) and d) I worked it out to Row Echelon Form and found linear independence. Provided I did the calculations correctly, is it safe to say that these are bases for [itex]P_2(R)[/itex], as the number of terms in each set is 3 and since 3-1 = 2, P_2 is safe and these sets span [itex]P_2(R)[/itex]?
It's not clear what you're asking. From the context of the problem, I take it that P2 is the space of polynomials of degree ##\le## 2. Some textbooks define this as polynomials of degree < 2.
For a set of functions to be a basis for some space, the set (1) has to be linearly independent and (2) has to span the space. You haven't said what P2 means in your book, so I can't say whether the sets are linearly independent.
RJLiberator said:
Second question: For b) I was able to get the RREF rather easily from the matrix:
\begin{pmatrix}
1 & 2 & 1 & 0\\
1 & 0 & 3 & 0\\
1 & 1 & 0 & 0
\end{pmatrix}
This isn't the way I would do it, for two reasons.
1) Each function in the set, treated as a vector, should appear as a column in the matrix, not a row. For what you did, it doesn't make much difference, as you are essentially working with a 3 x 3 matrix. If this matrix is row-reducible to the identity matrix, so is its transpose.
2) There is no reason to have that fourth column of 0s. None of the row operations will cause it to change.
RJLiberator said:
This should be linearly independent in RREF, and so it is a bases.
It is a basis.
RJLiberator said:
An answer key says "no" this is not a bases. However, the answer key can be wrong. Is there anything I did wrong here in setting this problem up for b) or perhaps my understanding of span?
Again, how is P2 defined? If it's the space of polynomials of degree < 2, then you have too many functions in your set for the set to be linearly independent.
RJLiberator said:
Third question: For d) I was checking a solution and the solution had set up the matrix differently than what I had expected. They set up the matrix as such:
\begin{pmatrix}
-1 & 3 & -2 & 0\\
2 & -4 & -5 & 0\\
4 & -10 & -6 & 0
\end{pmatrix}
This is how I would set it up, with the coefficients of the polynomials as columns, and with the elements in a column being the coefficients of 1, x, x2, in that order. I would not have the fourth column, though.
RJLiberator said:
As you can see, since d) [itex](-1+2x+4x^2, 3-4x-10x^2,-2-5x-6x^2)[/itex], they switched the columns and rows from what I'm traditionally used to. Is this OK to do?
Not only is the way they did it OK -- it's better. When you set up an equation for determining whether a set of vectors/functions is linearly independent, the equation looks like
$$c_1\vec{v_1} + c_2\vec{v_2} + \dots + c_n\vec{v_n} = \vec{0}$$
The vectors in this equation are column vectors. Expanding the above as a matrix product, you get the following
$$\begin{bmatrix} v_{11} & v_{21} & \dots & v_{n1} \\
v_{12} & v_{22} & \dots & v_{n2} \\
\dots & \dots & \dots & \dots \\
v_{1m} & v_{2m} & \dots & v_{nm} \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ \dots \\ c_n \end{bmatrix} = \vec{0}$$

The first column contains the coordinates of the first vector/polynomial, the second column those of the second, and so on.
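To make that layout concrete, here's a small Python sketch (my own illustration, not from any text): stacking the coefficient triples of set d as columns reproduces the matrix quoted above, without the redundant zero column.

```python
# Polynomials of set d as coefficient triples, ordered (1, x, x^2):
polys = [(-1, 2, 4), (3, -4, -10), (-2, -5, -6)]

# Entry A[i][j] is the coefficient of x^i in polynomial j,
# i.e. each polynomial becomes a COLUMN of A.
A = [[p[i] for p in polys] for i in range(3)]

for row in A:
    print(row)
# [-1, 3, -2]
# [2, -4, -5]
# [4, -10, -6]
```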
RJLiberator said:
The normal way I do it, like my example in b), is difficult to get into RREF, but this way is rather easy.

EDIT: I know my title is improper grammer. Is this a bases* should be the title.
 
  • #3
*bow* *clap*
You may have just solved every mystery I had with these two problems for the past few days.

To summarize.
1) Grammar* :p. It is bases for plural, basis for singular.
2) I need to be able to define P_2 (R). This should mean the space of polynomials of degree [itex]≤2[/itex].
3) Each function in a set is treated like a vector. This means that it should be represented as a column. While what I did in part b worked out, it isn't exactly the 'correct' way of operating with matrices and I should fix this understanding before I move forward.
4) Eliminate the 0 column, saves time.

So based on my definition of [itex]P_2(R)[/itex] being [itex]≤2[/itex] then I can say that both b and d are bases for [itex]P_2(R)[/itex].
If the definition was actually <2, then both would not be bases for [itex]P_2(R)[/itex].
 
  • #4
RJLiberator said:
2) I need to be able to define P_2 (R). This should mean the space of polynomials of degree ≤2.
You don't need to define it -- it should be defined in your book. That said, since the polynomials include quadratics, I'm guessing that your book is defining P2 as the space of polynomials of degree ≤ 2. Since the three functions are linearly independent, and there are three of them (the minimum number for a spanning set), the set must be a basis for P2.
 
  • #5
Thank you for all of your clarification here.
 
  • #6
Mark44 said:
In response to your final question: "Is this a basis?" is correct. Basis - singular, bases - plural.
Also, the word is "grammar."
It's not clear what you're asking. From the context of the problem, I take it that P2 is the space of polynomials of degree ##\le## 2. Some textbooks define this as polynomials of degree < 2.
For a set of functions to be a basis for some space, the set (1) has to be linearly independent and (2) has to span the space. You haven't said what P2 means in your book, so I can't say whether the sets are linearly independent.
This isn't the way I would do it, for two reasons.
1) Each function in the set, treated as a vector, should appear as a column in the matrix, not a row. For what you did, it doesn't make much difference, as you are essentially working with a 3 x 3 matrix. If this matrix is row-reducible to the identity matrix, so is its transpose.
2) There is no reason to have that fourth column of 0s. None of the row operations will cause it to change.

There is nothing wrong with having a row for the coefficients of a polynomial, with separate rows for different polynomials. In fact, I have seen that done in some books, and it is the way I myself would do it. That would, for example, make row operations on the matrix (to get new rows) correspond to taking linear combinations of the polynomials (to get new polynomials). It depends on whether you prefer row or column operations, but should make absolutely no theoretical or practical difference.

As to that pesky fourth column: I see it as a hindrance rather than a help; for example, it prevents taking a determinant, which is one quick way of checking linear independence.
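For these particular sets, the determinant check settles things quickly; here is a pure-Python sketch (the helper det3 is my own, not from the thread):

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

B = [[1, 2, 1], [1, 0, 3], [1, 1, 0]]         # set b, from post #1
D = [[-1, 3, -2], [2, -4, -5], [4, -10, -6]]  # set d, polynomials as columns
print(det3(B), det3(D))  # 4 10 -> both nonzero, so both sets are linearly independent
```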
 
  • #7
Personally, I don't like changing to matrix form for problems like this. That smacks too much of "memorizing formulas" rather than actually understanding what you are doing. A set of vectors is a basis for a vector space if and only if every vector in the space can be written, in a unique way, as a linear combination of the vectors. Here, the vector space is the set of all polynomials of degree at most two with real coefficients, so any "vector" can be written in the form [itex]ax^2+ bx+ c[/itex] for real numbers a, b, and c, with the usual addition of polynomials and multiplication by real numbers as the operations. So the question becomes: can we find real numbers [itex]\alpha[/itex], [itex]\beta[/itex], and [itex]\gamma[/itex] such that [itex]\alpha(1+ 2x+ x^2)+ \beta(3+ x^2)+ \gamma(x+ x^2)= ax^2+ bx+ c[/itex] for any a, b, c (does the set span the space) and, if so, is that solution unique (is the set linearly independent)?

Multiplying the left side out, [itex]\alpha+ 2\alpha x+ \alpha x^2+ 3\beta+ \beta x^2+ \gamma x+ \gamma x^2= ax^2+ bx+ c[/itex].
Combining like terms, [itex](\alpha + \beta+ \gamma)x^2+ (2\alpha+ \gamma)x+ (\alpha+ 3\beta)= ax^2+ bx+ c[/itex].

Since those have to be equal for all x, corresponding coefficients must be equal:
[itex]\alpha+ \beta+ \gamma= a[/itex]
[itex]2\alpha+ \gamma= b[/itex] and
[itex]\alpha+ 3\beta= c[/itex]
(That corresponding coefficients must be equal follows by, for example, letting x= 0 so the constant terms must be equal, canceling those terms, dividing both sides by x, then letting x= 0 again to show that the coefficients of x must be equal, etc. This also uses the fact that polynomials are continuous functions since "dividing by 0 and then letting x= 0" really involves a limit. This is equivalent to [itex]1[/itex], [itex]x[/itex], [itex]x^2[/itex] being a basis for this space.)

From [itex]\alpha+ \gamma= b[/itex] we get [itex]\gamma= b- \alpha[/itex]. From [itex]\alpha+ 3\beta= c[/itex] we get [itex]\beta= (c- \alpha)/3[/itex]. Putting those into [itex]\alpha+ \beta+ \gamma= a[/itex], we have [itex]\alpha+ (c- \alpha)/3+ (b- \alpha)= (-1/3)\alpha+ c/3+ b= a[/itex] so that [itex]\alpha= -3(a- b- c/3)[/itex]. That is a unique value for all a, b, c. Now go back to [itex]\beta= (c- \alpha)/3[/itex] and [itex]\gamma= b- \alpha[/itex] to determine the unique values of [itex]\beta[/itex] and [itex]\gamma[/itex].
 
  • #8
Interesting perspective here.

I think you might have made one mistake.

From α+γ=b we get γ=b−α.

Shouldn't this be: From 2α+γ=b we get γ=b−2α.

so then:
α+(c−α)/3+(b−2α) = a
so −(4/3)α + c/3 + b = a, i.e. α = (3/4)(b + c/3 − a)
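The corrected back-substitution can be checked in exact arithmetic; a sketch (the helper name coords is mine, and the sample values are illustrative):

```python
from fractions import Fraction as F

def coords(a, b, c):
    """Coordinates (alpha, beta, gamma) of a*x^2 + b*x + c in the basis of set b,
    using the corrected substitution gamma = b - 2*alpha."""
    # alpha + (c - alpha)/3 + (b - 2*alpha) = a  =>  -(4/3)*alpha + b + c/3 = a
    alpha = F(3, 4) * (F(b) + F(c, 3) - F(a))
    beta = (F(c) - alpha) / 3
    gamma = F(b) - 2 * alpha
    return alpha, beta, gamma

# Spot check: the three coefficient equations from the thread must hold.
a, b, c = 5, -7, 2
al, be, ga = coords(a, b, c)
assert (al + be + ga, 2 * al + ga, al + 3 * be) == (a, b, c)
print(al, be, ga)  # -17/2 7/2 10
```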
 
  • #9
HallsOfIvy said:
Personally, I don't like changing to matrix form for problems like this. That smacks too much of "memorizing formulas" rather than actually understanding what you are doing.
That's a good point, and is the reason I mentioned setting up an equation with a linear combination of vectors in a previous post. Understanding why you are row-reducing a matrix is very important.
HallsOfIvy said:
A set of vectors is a basis for a vector space if and only if every vector in the space can be written, in a unique way, as a linear combination of the vectors. Here, the vector space is the set of all polynomials of degree at most two with real coefficients, so any "vector" can be written in the form [itex]ax^2+ bx+ c[/itex] for real numbers a, b, and c, with the usual addition of polynomials and multiplication by real numbers as the operations. So the question becomes: can we find real numbers [itex]\alpha[/itex], [itex]\beta[/itex], and [itex]\gamma[/itex] such that [itex]\alpha(1+ 2x+ x^2)+ \beta(3+ x^2)+ \gamma(x+ x^2)= ax^2+ bx+ c[/itex] for any a, b, c (does the set span the space) and, if so, is that solution unique (is the set linearly independent)?
I don't have any problems with someone working with vectors in ##\mathbb{R}^3## (in this case) instead of quadratic functions, provided that they understand that these spaces are isomorphic.
 
  • #10
RJLiberator said:
Interesting perspective here.

I think you might have made one mistake.
Shouldn't this be: From 2α+γ=b we get γ=b−2α.

so then:
α+(c−α)/3+(b−2α) = a
so −(4/3)α + c/3 + b = a, i.e. α = (3/4)(b + c/3 − a)
It was bad enough when I couldn't read my own handwriting -- now I can't read my own typing!
 
  • #11
Edit: Ugh, never mind, looked at it wrong.

Not sure how deep a proof you need to do on the said matter.
The system of vectors forms a basis if it's linearly independent and spans ##P_2##.

If you show that only the trivial linear combination produces ##k_1b_1 + k_2b_2 + k_3b_3 = 0## (all the ##k##'s have to be 0 ##\Leftrightarrow## trivial linear combination), then the vectors are linearly independent. If we form a 3x3 matrix from these vectors, then the matrix is regular (invertible) and can therefore be transformed into the identity matrix via elementary row operations.

The columns of any regular 3x3 matrix span the space.
 

Related to Linear Algebra -- Is this a basis?

1. What is a basis in linear algebra?

A basis in linear algebra is a set of linearly independent vectors that span a vector space. This means that any vector in the space can be written as a unique linear combination of the basis vectors.

2. How do you determine if a set of vectors is a basis?

To determine if a set of vectors is a basis, check that the vectors are linearly independent and that they span the vector space. In practice, put the vectors into a matrix as columns and row-reduce: if every column has a pivot, the vectors are linearly independent, and if the rank equals the dimension of the space, they also span it.
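As an illustrative sketch (the helper solve3 is hypothetical, and the sample matrix is set d from the thread above): if A c = v has a solution for every v, the columns span the space, and uniqueness of that solution gives linear independence.

```python
from fractions import Fraction

def solve3(A, v):
    """Solve the 3x3 system A c = v by Gauss-Jordan; returns None if A is singular."""
    M = [[Fraction(x) for x in row] + [Fraction(v[i])] for i, row in enumerate(A)]
    for col in range(3):
        piv = next((r for r in range(col, 3) if M[r][col] != 0), None)
        if piv is None:
            return None  # singular: the columns do not span the space
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]  # normalize the pivot row
        for r in range(3):
            if r != col:
                M[r] = [a - M[r][col] * b for a, b in zip(M[r], M[col])]
    return [M[r][3] for r in range(3)]

# Columns of A are the polynomials of set d (coefficients of 1, x, x^2).
A = [[-1, 3, -2], [2, -4, -5], [4, -10, -6]]
c = solve3(A, [1, 0, 0])  # coordinates of the constant polynomial 1
print(c)  # [Fraction(-13, 5), Fraction(-4, 5), Fraction(-2, 5)]
```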

3. Can a vector space have more than one basis?

Yes, a vector space can have more than one basis. In fact, any nonzero vector space over the reals has infinitely many different bases; any set of vectors that is linearly independent and spans the space qualifies as one.

4. What happens if a set of vectors is not a basis?

If a set of vectors is not a basis, the vectors are either linearly dependent or do not span the vector space. In the first case, some vector in the set is a linear combination of the others; in the second, some vectors in the space cannot be represented at all.

5. How is a basis useful in linear algebra?

A basis is useful in linear algebra because it provides a way to represent any vector in a vector space using a linear combination of basis vectors. This makes it easier to perform operations on vectors and solve problems involving linear algebra.
