Eigenvalues and Eigenvectors over a Polynomial Ring

In summary: the thread discusses proving that an eigenvector $v$ of a linear transformation $f:V\rightarrow V$ over a field $F$, with eigenvalue $\lambda$, is also an eigenvector of $P(f)$ for any polynomial $P(t)\in F[t]$. The original solution uses a matrix representation of $f$; the replies note that the same computation can be carried out with $f$ itself, clarify how $P(f)$ is defined, and conclude that the corresponding eigenvalue is $P(\lambda)$.
  • #1
Sudharaka
Hi everyone, :)

Here's another question that I solved. Let me know if you see any mistakes or if you have any other comments. Thanks very much. :)

Problem:

Prove that the eigenvector \(v\) of \(f:V\rightarrow V\) over a field \(F\), with eigenvalue \(\lambda\), is an eigenvector of \(P(f)\) where \(P(t)\in F[t]\). What is the eigenvalue of \(v\) with respect to \(P(f)\)?

My Solution:

Let \(A\) be the matrix representation of the linear transformation \(f\). Then we can write, \(Av=\lambda v\). Now let, \(P(t)=a_0+a_1 t+a_2 t^2+\cdots+a_n t^n\) where \(a_i\in F\,\forall\,i\). Then,

\[P(A)=a_0+a_1 A+a_2 A^2+\cdots+a_n A^n\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 (Av)+a_2 (A^2 v)+\cdots+a_n (A^n v)\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 \lambda v+a_2 \lambda^2 v+\cdots+a_n \lambda^n v\]

\[\therefore P(A)(v)=(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n )(v)\]

Hence \(v\) is an eigenvector for \(P(f)\) and the corresponding eigenvalue is, \(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n\).
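
As a quick numerical sanity check of this conclusion (a sketch only, not part of the proof), here is a short Python/NumPy snippet; the matrix $A$ and the polynomial $P$ are arbitrary illustrative choices:

```python
# Check that P(A) v = P(lambda) v for an eigenpair of A.
# The matrix A and the coefficients of P are arbitrary illustrative choices.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
coeffs = [1.0, -4.0, 2.0]                 # P(t) = 1 - 4t + 2t^2, i.e. (a_0, a_1, a_2)

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]        # one eigenpair of A

# P(A) = a_0 I + a_1 A + a_2 A^2, built term by term
P_A = sum(a * np.linalg.matrix_power(A, k) for k, a in enumerate(coeffs))
P_lam = sum(a * lam**k for k, a in enumerate(coeffs))

print(np.allclose(P_A @ v, P_lam * v))    # True: v is an eigenvector of P(A)
```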
 
  • #2
Sudharaka said:
Hi everyone, :)

Here's another question that I solved. Let me know if you see any mistakes or if you have any other comments. Thanks very much. :)

Problem:

Prove that the eigenvector \(v\) of \(f:V\rightarrow V\) over a field \(F\), with eigenvalue \(\lambda\), is an eigenvector of \(P(f)\) where \(P(t)\in F[t]\). What is the eigenvalue of \(v\) with respect to \(P(f)\)?

My Solution:

Let \(A\) be the matrix representation of the linear transformation \(f\). Then we can write, \(Av=\lambda v\). Now let, \(P(t)=a_0+a_1 t+a_2 t^2+\cdots+a_n t^n\) where \(a_i\in F\,\forall\,i\). Then,

Here's a conceptual error. You have written $Av=\lambda v$. Here $A$ is a matrix and $v$ is a vector. Here you should have used the column vector representation of $v$ with respect to the same basis you used to represent $f$ as the matrix $A$.

Sudharaka said:
\[P(A)=a_0+a_1 A+a_2 A^2+\cdots+a_n A^n\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 (Av)+a_2 (A^2 v)+\cdots+a_n (A^n v)\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 \lambda v+a_2 \lambda^2 v+\cdots+a_n \lambda^n v\]

\[\therefore P(A)(v)=(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n )(v)\]

Hence \(v\) is an eigenvector for \(P(f)\) and the corresponding eigenvalue is, \(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n\).
This is fine (keeping in mind the above comment) although you didn't need to use the matrix representation of $f$. You could have done what you have done with $f$ itself. The solution would be shorter too.

:)
 
  • #3
caffeinemachine said:
Here's a conceptual error. You have written $Av=\lambda v$. Here $A$ is a matrix and $v$ is a vector. Here you should have used the column vector representation of $v$ with respect to the same basis you used to represent $f$ as the matrix $A$.

This is fine (keeping in mind the above comment) although you didn't need to use the matrix representation of $f$. You could have done what you have done with $f$ itself. The solution would be shorter too.

:)

Thank you very much for the ideas. I was too lazy to write down "let \(v\) be the column vector representation of ..." and thought it was rather implied when I used the matrix representation of \(f\).

Your other idea sounds good, but for some weird reason, I always like to deal with matrices rather than the functional form of linear transformations. :)

Thanks again for your generous help. :)
 
  • #4
Sudharaka said:
... for some weird reason, I always like to deal with matrices rather than the functional form of linear transformations. :)
There is no problem if $V$ has finite dimension $n$, because we can use the well-known isomorphism of algebras $\phi:\operatorname{End}(V)\to \mathbb{K}^{n\times n},$ $\phi (f)=A=[f]_B,$ where $B$ is a fixed basis of $V.$
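
As a small illustration of this isomorphism (a sketch only; the map $f$ and the basis $B$ below are made-up examples), one can compute $[f]_B$ column by column and check the defining property $[f(v)]_B = [f]_B\,[v]_B$:

```python
# Illustration of phi(f) = [f]_B for V = R^2 with a non-standard basis B.
# The linear map f and the basis B are made-up examples.
import numpy as np

def f(v):
    x, y = v
    return np.array([2.0 * x + y, x - y])          # a fixed linear map R^2 -> R^2

B = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]  # a basis of R^2
B_mat = np.column_stack(B)                         # columns are the basis vectors

def coords(w):
    """Coordinate vector [w]_B, i.e. the solution of B_mat @ c = w."""
    return np.linalg.solve(B_mat, w)

# j-th column of [f]_B is the coordinate vector of f(b_j)
A = np.column_stack([coords(f(b)) for b in B])

# Check the defining property of [f]_B:  [f(v)]_B = [f]_B @ [v]_B
v = np.array([3.0, 5.0])
print(np.allclose(coords(f(v)), A @ coords(v)))    # True
```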
 
  • #5
I could be wrong, but I think what caffeinemachine was getting at is this:

Suppose $P(t) = a_0 + a_1t +\cdots + a_nt^n$.

By definition, $P(f)$ is the linear transformation given by:

$P(f)(v) = (a_0I + a_1f + \cdots + a_nf^n)(v) = a_0v + a_1f(v) + \cdots + a_nf^n(v)$

(in other words, we use the "point-wise" sum for linear transformations, and composition for multiplication, as is usual for a ring of endomorphisms of an abelian group. The scalar multiplication is also "pointwise": $(af)(v) = a(f(v))$).

Presumably, you have already proved that if $\lambda$ is an eigenvalue for $f$ with eigenvector $v$, then (for natural numbers $k$):

$f^k(v) = \lambda^kv$

(note the exponent on the left refers to k-fold composition, and the exponent on the right refers to exponentiation in the field). If you have not done so, it's easy to prove using induction (you may wish to use the common convention that $f^0 = I = \text{id}_V$, the identity function on $V$).
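
For completeness, the induction step is a single line, using the linearity of $f$ and the induction hypothesis:

\[f^{k+1}(v) = f(f^k(v)) = f(\lambda^k v) = \lambda^k f(v) = \lambda^{k+1} v.\]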

Thus, for an eigenvector $v$ with eigenvalue $\lambda$, applying $P(f)$ term by term gives $P(f)(v) = a_0v + a_1\lambda v + \cdots + a_n\lambda^n v = (a_0 + a_1\lambda + \cdots + a_n\lambda^n)v$, so $v$ is likewise an eigenvector for $P(f)$ with eigenvalue $P(\lambda)$.

The advantage to this is that no mention is made of the dimensionality of the vector space $V$, and no assumptions are made about any basis.
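
To make this basis-free point concrete, here is a short Python sketch (illustrative only; the operator $f$, the eigenpair, and the coefficients are arbitrary choices) that evaluates $P(f)$ purely by repeated application of $f$, i.e. by composition and pointwise sums, and checks that an eigenvector of $f$ is sent to $P(\lambda)$ times itself:

```python
# Evaluate P(f) at v using only applications of f (composition) and
# pointwise sums -- no matrix of P(f) and no choice of basis are needed.
import numpy as np

def apply_poly(coeffs, f, v):
    """Return P(f)(v) = a_0 v + a_1 f(v) + ... + a_n f^n(v)."""
    result = np.zeros_like(v, dtype=float)
    w = np.array(v, dtype=float)          # w runs through f^0(v), f^1(v), ...
    for a in coeffs:
        result = result + a * w
        w = f(w)                          # advance to the next power of f
    return result

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
f = lambda u: A @ u                       # f is only ever used as a function
lam, v = 3.0, np.array([1.0, 1.0])        # f(v) = 3v, so (lam, v) is an eigenpair

coeffs = [1.0, -4.0, 2.0]                 # P(t) = 1 - 4t + 2t^2
P_lam = sum(a * lam**k for k, a in enumerate(coeffs))

print(np.allclose(apply_poly(coeffs, f, v), P_lam * v))   # True
```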
 

1. What are eigenvalues and eigenvectors over a polynomial ring?

In this thread, the phrase refers to evaluating a polynomial $P(t)\in F[t]$ at a linear transformation $f:V\rightarrow V$ (or at a matrix representing it), which produces a new linear transformation $P(f)$. The question is how the eigenvalues and eigenvectors of $f$ relate to those of $P(f)$: every eigenvector $v$ of $f$ with eigenvalue $\lambda$ is also an eigenvector of $P(f)$, and its eigenvalue becomes $P(\lambda)$.

2. How are eigenvalues and eigenvectors over a polynomial ring calculated?

For $f$ itself, the standard approach is to take a matrix representation $A$, solve the characteristic equation $\det(A-\lambda I)=0$ for the eigenvalues $\lambda$, and then solve the linear system $(A-\lambda I)v=0$ for the corresponding eigenvectors. For $P(f)$ no additional computation is needed on those eigenvectors: if $fv=\lambda v$, then $P(f)v=P(\lambda)v$, so each eigenvalue $\lambda$ of $f$ simply becomes $P(\lambda)$.
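
For instance, here is a hedged NumPy sketch of this computation (the matrix $A$ and polynomial $P$ are arbitrary choices): the eigenvalues of $P(A)$ agree with $P(\lambda)$ for the eigenvalues $\lambda$ of $A$.

```python
# The eigenvalues of P(A) are P(lambda) for each eigenvalue lambda of A.
# A and P below are arbitrary illustrative choices.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
coeffs = [1.0, 0.0, 1.0]                  # P(t) = 1 + t^2

P_A = sum(a * np.linalg.matrix_power(A, k) for k, a in enumerate(coeffs))

eig_A = np.linalg.eigvals(A)              # eigenvalues of A (here 1 and 3)
eig_PA = np.sort(np.linalg.eigvals(P_A))  # eigenvalues of P(A)
P_of_eig = np.sort(np.array([sum(a * lam**k for k, a in enumerate(coeffs))
                             for lam in eig_A]))

print(np.allclose(eig_PA, P_of_eig))      # True: they match, as P(lambda)
```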

3. What is the significance of eigenvalues and eigenvectors over a polynomial ring?

Eigenvalues and eigenvectors over a polynomial ring have many applications in mathematics and engineering. They are used to analyze the behavior of systems, such as in quantum mechanics, signal processing, and optimization problems. They also provide a way to simplify complex systems by breaking them down into smaller, more manageable parts.

4. Can we have complex eigenvalues and eigenvectors over a polynomial ring?

Yes. If the base field is not algebraically closed, the characteristic polynomial need not have all of its roots in the field; for example, a matrix with real entries can have complex (non-real) eigenvalues and eigenvectors, and these still carry useful information about the behaviour of the system.
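
For example (an illustrative sketch, not from the thread), a real rotation matrix has no real eigenvalues but does have complex ones:

```python
# A real rotation matrix has no real eigenvalues, but it does have complex ones.
# The angle below is an arbitrary illustrative choice.
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.linalg.eigvals(R))   # approx. [0.5 + 0.866j, 0.5 - 0.866j], i.e. e^{±i·pi/3}
```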

5. How do eigenvalues and eigenvectors over a polynomial ring differ from those over a field?

The phrase "over a polynomial ring" here does not mean that the scalars are polynomials; the vector space $V$ is still over the field $F$. The polynomial ring $F[t]$ enters through the evaluation map $P(t)\mapsto P(f)$, which turns each polynomial into a linear transformation of $V$. The eigenvalue theory of $f$ is the usual theory over the field; what the thread establishes is how eigenvalues behave under this evaluation, namely $\lambda\mapsto P(\lambda)$.
