# [SOLVED] Zero-Trace Symmetric Matrix is Orthogonally Similar to a Zero-Diagonal Matrix

#### caffeinemachine

##### Well-known member
MHB Math Scholar
Hello MHB.

During my Mechanics of Solids course in my Mechanical Engineering curriculum I came across a certain fact about $3\times 3$ matrices.

It said that any symmetric $3\times 3$ matrix $A$ (with real entries) whose trace is zero is orthogonally similar to a matrix $B$ which has only zeroes on the diagonal.

In other words, given a symmetric matrix $A$ with $\text{trace}(A)=0$, there exists an orthogonal matrix $Q$ such that $QAQ^{-1}$ has only zeroes on the diagonal.

I think the above fact should be true not only for $3\times 3$ matrices but for matrices with any dimension.

So what I am trying to prove is that:

Problem: Given a symmetric $n\times n$ matrix $A$ with real entries and $\text{trace}(A)=0$, there exists an orthogonal matrix $Q$ such that $QAQ^{-1}$ has only zeroes on the diagonal.

I have tried to attack the problem using the spectral theorem.
Since $A$ is symmetric, we know that there exists an orthogonal matrix $S$ such that $D=SAS^{-1}$ is a diagonal matrix.
We need to show that $D$ is orthogonally similar to a matrix with only zeroes on the diagonal.
Thus we have to find an orthogonal matrix $Q$ such that $QDQ^{-1}$ has only zeroes on the diagonal.
Since $Q^{-1}=Q^T$, this is equivalent to showing that $\sum_{k=1}^n q^2_{ik}d_k=0$ for all $i\in \{1,\ldots,n \}$, where $q_{ij}$ is the $(i,j)$-th entry of $Q$ and $d_k$ is the $k$-th diagonal entry of $D$.
We also know that $d_1+\ldots+d_n=0$.
Here I am stuck.
From the above it can be seen that the proposition is true for $n=2$. For $n=3$ I have taken the fact from the book but cannot easily prove it. Can anybody help?
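For the $n=2$ case, here is a quick numerical illustration (my own sketch using NumPy; the value of $d$ is arbitrary). Trace zero forces $D=\operatorname{diag}(d,-d)$, and a rotation by $45^\circ$ zeroes the diagonal:

```python
import numpy as np

# n = 2: trace zero forces D = diag(d, -d); rotating by 45 degrees
# makes both diagonal entries of Q D Q^T vanish.
d = 3.0
D = np.diag([d, -d])
c = 1 / np.sqrt(2)
Q = np.array([[c, -c],
              [c,  c]])   # rotation by 45 degrees, orthogonal
B = Q @ D @ Q.T           # Q^{-1} = Q^T since Q is orthogonal
print(np.diag(B))         # both diagonal entries are 0
```

Indeed $B_{11} = \tfrac12 d - \tfrac12 d = 0$ and likewise $B_{22}=0$, matching the condition $\sum_k q_{ik}^2 d_k = 0$ above.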

#### Klaas van Aarsen

##### MHB Seeker
Staff member
It's not true.
Counterexample:
$$A = \begin{bmatrix} -1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & +1 \end{bmatrix}$$

Additionally, you need for instance that the matrix is positive semi-definite.

#### caffeinemachine

##### Well-known member
MHB Math Scholar
Thank you ILS for participating.

Put $$Q= \begin{bmatrix} 1/\sqrt{2} & 0 & 1/\sqrt{2}\\ 0& 1& 0\\ -1/\sqrt{2}& 0 &1/\sqrt{2} \end{bmatrix}$$

Then $$QAQ^{-1}=\begin{bmatrix} 0 & 0 & 1\\ 0 & 0 & 0\\ 1 & 0 & 0 \end{bmatrix}$$
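This is easy to confirm numerically (a quick NumPy check of the computation above, my own sketch):

```python
import numpy as np

# The proposed counterexample A and the rotation Q from the post.
A = np.diag([-1.0, 0.0, 1.0])
s = 1 / np.sqrt(2)
Q = np.array([[ s, 0, s],
              [ 0, 1, 0],
              [-s, 0, s]])
B = Q @ A @ np.linalg.inv(Q)   # Q is orthogonal, so inv(Q) = Q.T
print(np.round(B, 10))         # zero diagonal, as claimed
```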

So this doesn't serve as a counterexample. Maybe there is another one.

Anyway, can you provide a proof (or hints) for the additional hypothesis of positive semi-definiteness? Does this result have a name?

#### Opalg

##### MHB Oldtimer
Staff member
Let $A = (a_{ij})$ be an $n\times n$ symmetric matrix with trace zero. The first step is to find an orthogonal transformation that converts $A$ to a matrix with a zero in the $(1,1)$-position on the diagonal. If $a_{11}=0$ there is nothing to prove. Otherwise, choose $j>1$ such that $a_{jj}$ has the opposite sign to $a_{11}$. Such a $j$ must exist because the diagonal entries sum to zero.

Now let $P_\theta$ be the orthogonal matrix given by rotating the $1$ and $j$ coordinates through an angle $\theta$ and leaving all the other coordinates alone. Specifically, the $2\times2$ submatrix of $P_\theta$ consisting of rows and columns $1$ and $j$ looks like $\begin{bmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$, $P_\theta$ has $1$s in all the other diagonal places, and zeros everywhere else. You can check that the $(1,1)$-element of $P_\theta AP_\theta^{-1}$ is $a_{11}\cos^2\theta - 2a_{1j}\cos\theta\sin\theta + a_{jj}\sin^2\theta$. When $\theta=0$ this is $a_{11}$; when $\theta=\pi/2$ it is $a_{jj}$, which has the opposite sign to $a_{11}$. By the intermediate value theorem there must be some value of $\theta$ for which this element is $0$.

For that value of $\theta$, $$P_\theta AP_\theta^{-1} = \begin{bmatrix} 0& v \\ w & B \end{bmatrix},$$ where $v$ is a row vector, $w$ is a column vector (each with $n-1$ elements, and $w=v^{\mathsf T}$ by symmetry), and $B$ is a symmetric $(n-1)\times(n-1)$ matrix with trace $0$ (because $P_\theta AP_\theta^{-1}$ has the same trace as $A$).

Now proceed inductively. By the same process as above, you can successively find orthogonal transformations that convert $A$ to a matrix with increasingly many zeros down the diagonal. At the end, you will find that the final two diagonal elements $a_{(n-1)(n-1)}$ and $a_{nn}$ are negatives of each other and you can find an orthogonal transformation converting both of them to $0$.
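The inductive argument above can be turned into an algorithm. Here is a sketch in NumPy (the function name `zero_diagonal` and the closed-form choice of $\theta$ are my own; the post's intermediate value theorem step becomes solving the quadratic $a_{jj}u^2 - 2a_{1j}u + a_{11} = 0$ in $u=\tan\theta$, which has a real root because $a_{11}a_{jj}<0$):

```python
import numpy as np

def zero_diagonal(A, tol=1e-12):
    """Return an orthogonal Q such that Q A Q^T has zero diagonal,
    for a real symmetric A with trace 0."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    Q = np.eye(n)
    for i in range(n - 1):
        if abs(A[i, i]) < tol:
            continue
        # Some later diagonal entry has the opposite sign, since the
        # remaining diagonal entries still sum to (approximately) zero.
        j = next(k for k in range(i + 1, n) if A[k, k] * A[i, i] < 0)
        # After rotating by theta in the (i, j) plane, the (i, i) entry
        # is a_ii cos^2 t - 2 a_ij cos t sin t + a_jj sin^2 t.
        # Dividing by cos^2 t and setting u = tan t gives
        # a_jj u^2 - 2 a_ij u + a_ii = 0, solvable since a_ii a_jj < 0.
        aii, aij, ajj = A[i, i], A[i, j], A[j, j]
        u = (aij + np.sqrt(aij**2 - aii * ajj)) / ajj
        t = np.arctan(u)
        P = np.eye(n)
        P[i, i] = P[j, j] = np.cos(t)
        P[i, j] = -np.sin(t)
        P[j, i] = np.sin(t)
        A = P @ A @ P.T   # diagonal entries before index i are untouched
        Q = P @ Q
    return Q

# Example: the 3x3 traceless matrix discussed earlier in the thread.
A0 = np.diag([-1.0, 0.0, 1.0])
Q0 = zero_diagonal(A0)
print(np.round(np.diag(Q0 @ A0 @ Q0.T), 12))  # all-zero diagonal
```

Note that zeroing the $(n-1)$-th diagonal entry automatically zeroes the last one, since the trace is preserved throughout; this is exactly the "final two diagonal elements are negatives of each other" step above.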


#### Klaas van Aarsen

##### MHB Seeker
Staff member
> So this doesn't serve as a counterexample. Maybe there is another one.
Good point.
I was mixing it up with diagonalizability, which is not the point here.

Anyway, Opalg has already given a proof.

> Anyway, can you provide a proof (or hints) for the additional hypothesis of positive semi-definiteness? Does this result have a name?
The additional condition of positive semi-definiteness means that all eigenvalues are non-negative.
Since the trace is the sum of the eigenvalues, it follows that a trace of zero implies that all eigenvalues are $0$; for a symmetric matrix that forces $A$ to be the zero matrix, which trivially has a zero diagonal.
Still, this turns out to be irrelevant for your problem.

#### caffeinemachine

##### Well-known member
MHB Math Scholar
Thank You!