
Prove that this matrix equation has no solutions

Fernando Revilla

Well-known member
MHB Math Helper
Jan 29, 2012
661
I quote an unsolved problem from another forum.

Could you explain to me how to solve "more sophisticated" matrix equations such as this one?

Prove that [TEX]2X^2 + X = \begin{bmatrix} -1&5&3\\-2&1&2\\0&-4&-3\end{bmatrix}[/TEX] has no solutions in [TEX]M(3,3;\mathbb{R})[/TEX], where [TEX]M(3,3;\mathbb{R})[/TEX] is the space of all $3\times 3$ matrices with real entries.
The characteristic polynomial of the given matrix $M$ is $\chi (\lambda)=-\lambda^3-3\lambda^2-17\lambda-11$. The derivative $\chi'(\lambda)=-3\lambda^2-6\lambda-17$ has no real roots (its discriminant $36-204$ is negative) and $\chi'(0)=-17<0$, so $\chi'(\lambda)<0$ for all $\lambda\in\mathbb{R}$, which means that $\chi$ is strictly decreasing on $\mathbb{R}$.

On the other hand, $\chi(-1)=4>0$ and $\chi(-1/2)=-25/8<0$. According to Bolzano's theorem, $\chi$ has a root $\beta\in (-1,-1/2)$. We conclude that $\beta$ is the only real eigenvalue of $M$.
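These computations can be double-checked numerically. The sketch below (assuming NumPy; the name `M` is just for illustration) verifies the coefficients of the characteristic polynomial, the sign change on $(-1,-1/2)$, and that $M$ has exactly one real eigenvalue:

```python
import numpy as np

M = np.array([[-1, 5, 3],
              [-2, 1, 2],
              [0, -4, -3]], dtype=float)

# np.poly gives the coefficients of det(lambda*I - M) = -chi(lambda),
# expected: lambda^3 + 3*lambda^2 + 17*lambda + 11, i.e. 1, 3, 17, 11.
coeffs = np.poly(M)
print(coeffs)

# chi changes sign on (-1, -1/2): chi(-1) = 4 > 0, chi(-1/2) = -25/8 < 0.
chi = lambda t: -(t**3 + 3*t**2 + 17*t + 11)
print(chi(-1.0), chi(-0.5))

# M has exactly one real eigenvalue beta, lying in (-1, -1/2).
eigs = np.linalg.eigvals(M)
real_eigs = eigs[np.abs(eigs.imag) < 1e-9].real
print(real_eigs)
```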

Suppose that there exists $X\in\mathbb{R}^{3\times 3}$ such that $2X^2+X=M$. Let $\alpha$ be a real eigenvalue of $X$ (there is at least one, because the characteristic polynomial of $X$ has odd degree $3$). Then $2\alpha^2+\alpha$ is a real eigenvalue of $2X^2+X=M$, and since $\beta$ is the only real eigenvalue of $M$, necessarily $2\alpha^2+\alpha=\beta$.

But $f(\alpha)=2\alpha^2+\alpha-\beta$ attains its absolute minimum at $\alpha=-1/4$, and $f(-1/4)=-1/8-\beta>0$ because $\beta<-1/2<-1/8$. Hence $f(\alpha)>0$ for all real $\alpha$, which is a contradiction. So the given equation has no solution.
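As a quick numerical sanity check (again a sketch assuming NumPy; `beta` is recomputed from $M$), the minimum of $2\alpha^2+\alpha$ over the reals is $-1/8$, which lies strictly above $\beta$:

```python
import numpy as np

M = np.array([[-1, 5, 3], [-2, 1, 2], [0, -4, -3]], dtype=float)
eigs = np.linalg.eigvals(M)
beta = eigs[np.abs(eigs.imag) < 1e-9].real[0]  # the unique real eigenvalue

# g(a) = 2a^2 + a is an upward parabola with vertex at a = -1/4,
# where it takes its minimum value g(-1/4) = -1/8.
g = lambda a: 2 * a**2 + a
print(g(-0.25))  # minimum value -1/8
print(beta)      # beta < -1/2 < -1/8, so g(a) = beta has no real solution
```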
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,793
Nice!

then $2\alpha^2+\alpha$ is a real eigenvalue of $2X^2+X$ and one of those $\alpha$ must verify $2\alpha^2+\alpha=\beta$.
Suppose $z$ is a non-real (complex) eigenvalue of $X$.
Then $2z^2+z$ is an eigenvalue of $2X^2+X$.
This eigenvalue could be real, couldn't it?
 

Fernando Revilla

Well-known member
MHB Math Helper
Jan 29, 2012
661
Nice! Suppose $z$ is a non-real (complex) eigenvalue of $X$.
Then $2z^2+z$ is an eigenvalue of $2X^2+X$.
This eigenvalue could be real, couldn't it?
Yes, a priori, but in that case there would be two linearly independent vectors $v_1,v_2\in\mathbb{C}^3$ (eigenvectors of $X$ for the distinct eigenvalues $\alpha$ and $z$) such that:

$(2X^2+X)v_1=(2\alpha^2+\alpha)v_1=\beta v_1$
$(2X^2+X)v_2=(2z^2+z)v_2=\beta v_2$

Then $\beta$ would be an eigenvalue of $M$ of multiplicity at least two, contradicting that $\chi$ is strictly decreasing, which forces $\beta$ to be a simple root of $\chi$.
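This can also be checked numerically (a sketch assuming NumPy): the three roots of $\chi$ are pairwise distinct, so $\beta$ is indeed a simple eigenvalue of $M$:

```python
import numpy as np

# Roots of lambda^3 + 3*lambda^2 + 17*lambda + 11, i.e. of -chi(lambda).
roots = np.roots([1, 3, 17, 11])

# All pairwise distances are bounded away from zero: the three roots are
# distinct, so the unique real root beta has multiplicity one.
dists = [abs(roots[i] - roots[j]) for i in range(3) for j in range(i + 1, 3)]
print(min(dists))
```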
 