Homework Statement
Prove true or false.
If A^2 + A = 0, then λ = 1 may not be an eigenvalue.
Homework Equations
To find the eigenvalues of A, I solve det(λI - A) = 0.
The definition of an eigenvalue, as I understand it: AX = λX for some nonzero vector X.
A(A+I) = 0
The Attempt at a Solution
I'm unable to find the connection between the restriction A^2 + A = 0 and its effect on the eigenvalues of A.
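One direction I've sketched (just a sketch, assuming X is a nonzero eigenvector of A; I'm not sure this is the intended route):
AX = λX
A^2 X = A(λX) = λ(AX) = λ^2 X
(A^2 + A)X = (λ^2 + λ)X
Since A^2 + A = 0, the left-hand side is the zero vector, and X ≠ 0.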
Since A^2 + A = 0, I've also tried writing down some matrices for which this holds (a quick numerical check of these follows the list below).
Scenarios:
A = 0
A = \begin{pmatrix} -1 \end{pmatrix}
A = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}
A = \begin{pmatrix} -1 & -1 \\ 0 & 0 \end{pmatrix}
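Here is the quick numerical sanity check I mentioned (my own sketch using numpy; the tooling is not part of the problem, just a way to verify the examples):
Code:
import numpy as np

# Candidate matrices that should satisfy A^2 + A = 0 (just a sanity check, not a proof)
candidates = [
    np.array([[0.0]]),                     # A = 0
    np.array([[-1.0]]),                    # 1x1 case
    np.array([[-1.0, 0.0], [0.0, -1.0]]),  # A = -I
    np.array([[-1.0, -1.0], [0.0, 0.0]]),  # non-diagonal example
]

for A in candidates:
    constraint = A @ A + A                 # should be the zero matrix
    eigenvalues = np.linalg.eigvals(A)
    print(np.allclose(constraint, 0), np.round(eigenvalues, 10))
Every candidate prints True for the constraint, and the eigenvalues that come out are only 0 and -1.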
It seems that whenever there is a -1 on the diagonal, the equation A^2 + A = 0 holds and the eigenvalues are -1 or 0. With that said, the statement appears to be true, but I have no idea how to look for counterexamples, nor do I have a formal proof that the statement is indeed true.
Clarification 1: Since I've only been able to find matrices with -1 (after reduction) on the diagonal that satisfy A^2 + A = 0, it appears the eigenvalues are never positive: every characteristic polynomial I compute factors into powers of λ and (λ + 1), which only ever gives me eigenvalues of 0 or -1.
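To back up that observation about the characteristic polynomial, here is a small symbolic check (again my own sketch, using sympy as the tool; not part of the assignment):
Code:
import sympy as sp

lam = sp.symbols('lambda')

# The same example matrices as above
examples = [
    sp.Matrix([[0]]),
    sp.Matrix([[-1]]),
    sp.Matrix([[-1, 0], [0, -1]]),
    sp.Matrix([[-1, -1], [0, 0]]),
]

for A in examples:
    p = A.charpoly(lam).as_expr()  # characteristic polynomial det(lambda*I - A)
    print(sp.factor(p))
For these examples the factored polynomials come out as λ, λ + 1, (λ + 1)^2, and λ(λ + 1), so the only roots that show up are 0 and -1.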
Any help would be appreciated.