Finding Linear Transformations in Polynomial Vector Spaces

In summary, the original poster needs to find the matrix representing a linear transformation between polynomial vector spaces with respect to given bases: the differentiation map D on P_2, using the bases B and B'. They used 3x3 matrices and row reduction to show that B and B' are both bases, but are unsure how to find the matrices of D with respect to each pair of bases and ask for clarification.
  • #1
dim&dimmer
1. Homework Statement
Let P_2 be the set of all real polynomials of degree no greater than 2.
Show that both B:={1, t, t^2} and B':= {1, 1-t, 1-t-t^2} are bases for P_2.

If we regard a polynomial p as defining a function R --> R, x |--> p(x), then p is differentiable, and
D: P_2 --> P_2, p |--> p' = dp/dx, defines a linear transformation.
Find the matrix of D with respect to the bases
(i) B in both the domain and co-domain
(ii) B in the domain and B' in the codomain.
(iii) B' in the domain and B in the codomain
(iv) B' in both the domain and codomain.

2. The attempt at a solution
I used 3x3 matrices and row reduction for B' to show that they are both bases, i.e. 3 pivot variables in reduced row echelon form. B was just the identity matrix.
It's the second part that I don't understand. To me, the domains of B and B' are all real numbers. Or have I already found the matrices of D in both domains by showing, with row echelon matrices, that B and B' are bases?
This is very confusing and my mind is now doing many laps around the same circuit.
Any clarification would be great.
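A minimal sketch of this pivot-count check, assuming sympy; the coordinate matrix below writes each B' polynomial in the monomial basis B:

[code]
import sympy as sp

# Columns: the B' polynomials 1, 1-t, 1-t-t^2 in coordinates w.r.t. {1, t, t^2}
Bprime = sp.Matrix([[1,  1,  1],
                    [0, -1, -1],
                    [0,  0, -1]])

rref, pivots = Bprime.rref()
print(rref)          # reduced row echelon form: the 3x3 identity
print(len(pivots))   # 3 pivot columns, so the three polynomials are independent and span P_2
[/code]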
 
  • #2
Suppose T is a linear transformation from a vector space U with basis Bu to a vector space V with basis Bv. To find the matrix representing T using basis Bu in the domain and basis Bv in the codomain, do the following:

1) Apply T to the first vector in basis Bu. The result, of course, will be in V and can be written as a linear combination of the vectors in basis Bv. The coefficients in that linear combination are the numbers in the first column of the matrix.

2) Apply T to the second vector in basis Bu. The result, of course, will be in V and can be written as a linear combination of the vectors in basis Bv. The coefficients in that linear combination are the numbers in the second column of the matrix.

Continue applying T to each basis vector in turn to get each column of the matrix.

For example, the first vector in B is "1". Differentiating that gives 0. That, written in the B basis, is (0)1 + (0)x + (0)x^2, so the first column consists entirely of 0's. The second vector in B is "x". Differentiating that gives 1. That, written in the B basis, is (1)(1) + (0)x + (0)x^2, so the second column is <1 0 0>. The third vector in B is "x^2". Differentiating that gives 2x. That, written in the B basis, is (0)1 + (2)x + (0)x^2, so the third column is <0 2 0>. The matrix for (i) is
[tex]\left[\begin{array}{ccc}0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0\end{array}\right][/tex]

Notice that matrix has determinant 0 and is not invertible. That is because differentiation is not one-to-one and so is not invertible.
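A minimal sketch of this column-by-column recipe, assuming sympy; the helper function name matrix_of_D is illustrative, not from the thread:

[code]
import sympy as sp

t = sp.symbols('t')

def matrix_of_D(domain_basis, codomain_basis):
    # Differentiate each domain basis polynomial and read off its coordinates
    # in the codomain basis; those coordinate vectors become the columns.
    c = sp.symbols('c0:3')
    cols = []
    for p in domain_basis:
        dp = sp.diff(p, t)                                # apply D = d/dt
        residual = dp - sum(ci*qi for ci, qi in zip(c, codomain_basis))
        coeff_eqs = sp.Poly(residual, t).all_coeffs()     # every coefficient must vanish
        sol = sp.solve(coeff_eqs, c, dict=True)[0]
        cols.append([sol.get(ci, 0) for ci in c])
    return sp.Matrix(cols).T

B  = [sp.Integer(1), t, t**2]                # B  = {1, t, t^2}
Bp = [sp.Integer(1), 1 - t, 1 - t - t**2]    # B' = {1, 1-t, 1-t-t^2}

print(matrix_of_D(B, B))      # part (i): [0 1 0; 0 0 2; 0 0 0]
print(matrix_of_D(B, Bp))     # part (ii)
print(matrix_of_D(Bp, B))     # part (iii)
print(matrix_of_D(Bp, Bp))    # part (iv)
[/code]

The first call reproduces the matrix above; the other three simply swap which basis is used in the domain and codomain.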
 
  • #3
so...
B matrix is [tex]
\left[\begin{array}{ccc}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{array}\right]
[/tex]
is the domain?
codomain of B is
[tex]
\left[\begin{array}{ccc}0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0\end{array}\right]
[/tex]
as you said
B' domain is
[tex]
\left[\begin{array}{ccc}1 & 1 & 1 \\ 0 & -1 & -1 \\ 0 & 0 & -1\end{array}\right]
[/tex]
codomain of B'
[tex]
\left[\begin{array}{ccc}0 & -1 & -1 \\ 0 & 0 & -2 \\ 0 & 0 & 0\end{array}\right]
[/tex]
Are these ok??

To find the matrices of D for the 4-part question, is it just a matter of solving the augmented matrices formed in each part, so that part i) would be the codomain matrix of B and part ii) the codomain matrix of B', given that the identity matrix is the domain of B?

What makes this scary is the magnitude of how wrong I could be, lol.
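One way to sanity-check matrices like these, assuming sympy and the column convention from post #2: let P be the change-of-basis matrix whose columns write the B' polynomials in the basis B (the third matrix above). If A is the answer to part (i), the matrices for parts (ii), (iii), and (iv) should come out as P^(-1)A, AP, and P^(-1)AP respectively:

[code]
import sympy as sp

A = sp.Matrix([[0, 1, 0],
               [0, 0, 2],
               [0, 0, 0]])     # part (i): B in both domain and codomain

P = sp.Matrix([[1,  1,  1],
               [0, -1, -1],
               [0,  0, -1]])   # columns: 1, 1-t, 1-t-t^2 in the basis {1, t, t^2}

print(P.inv() * A)       # part (ii):  B in the domain, B' in the codomain
print(A * P)             # part (iii): B' in the domain, B in the codomain
print(P.inv() * A * P)   # part (iv):  B' in both domain and codomain
[/code]

The product A*P matches the last matrix posted above, which suggests that matrix belongs to part (iii) rather than being a "codomain matrix of B'".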
 

Related to Finding Linear Transformations in Polynomial Vector Spaces

1. What is a polynomial vector space?

A polynomial vector space is a set of polynomials that can be added, subtracted, and multiplied by scalar values, and that satisfies the usual vector space properties such as closure, associativity, and distributivity. The space P_2 in this thread, the real polynomials of degree no greater than 2, is one example.

2. What are the basic operations in a polynomial vector space?

The basic operations in a polynomial vector space are addition, subtraction, and scalar multiplication. Addition involves combining two polynomials by adding their like terms, while subtraction involves subtracting the coefficients of like terms. Scalar multiplication involves multiplying each term in a polynomial by a constant value.
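A small sketch of these operations, assuming sympy:

[code]
import sympy as sp

t = sp.symbols('t')
p = 1 + 2*t + t**2
q = 3 - t

print(sp.expand(p + q))    # addition:              t**2 + t + 4
print(sp.expand(p - q))    # subtraction:           t**2 + 3*t - 2
print(sp.expand(3 * p))    # scalar multiplication: 3*t**2 + 6*t + 3
[/code]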

3. How is the dimension of a polynomial vector space determined?

The dimension of a polynomial vector space is the number of polynomials in any basis for the space. For the space of polynomials of degree no greater than n, the monomials 1, t, t^2, ..., t^n form a basis, so the dimension is n + 1, one more than the highest allowed degree. For example, P_2 has the basis {1, t, t^2} and dimension 3.

4. Can a polynomial vector space have an infinite dimension?

Yes, a polynomial vector space can have infinite dimension. This happens when no finite set of polynomials spans the space, as in the space of all real polynomials of arbitrary degree, for which {1, t, t^2, t^3, ...} is a basis.

5. How is a polynomial vector space used in real-world applications?

A polynomial vector space has various real-world applications in fields such as physics, engineering, and computer science. It is used to model and solve problems involving polynomial equations, such as motion, optimization, and data analysis. It is also used in computer graphics and image processing to represent and manipulate curves and surfaces.
