# Diagonalizable transformation - Existence of basis

#### mathmari

##### Well-known member
MHB Site Helper
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$.
Ok, but shouldn't these vectors be then the first and second column of $D$? How do we get then the cosine and the sine?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation, don't we?

#### mathmari

##### Well-known member
MHB Site Helper
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation, don't we?
Ah so we consider the rotation matrix, right?

So the composition of $\sigma_v$ and $\sigma_w$ is a composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Ah so we consider the rotation matrix, right?

So the composition of $\sigma_v$ and $\sigma_w$ is a composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors?
Yes.

Generally, the composition of 2 reflections is a rotation of double the angle between the normals of the planes of reflection.
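As a quick numeric sanity check of this fact (a sketch of mine, not part of the original argument; the helper names are hypothetical): in 2 dimensions, the reflection across the line through the origin at angle $t$ has matrix $\begin{pmatrix}\cos 2t & \sin 2t\\ \sin 2t & -\cos 2t\end{pmatrix}$, and composing the reflections at angles $t_1$ and $t_2$ gives the rotation by $2(t_2-t_1)$:

```python
import math

def reflection(theta):
    """2x2 matrix of the reflection across the line through the
    origin at angle theta with the x-axis."""
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return [[c, s], [s, -c]]

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Compose the reflections at angles t1 and t2; the result should be
# the rotation by double the angle between the mirror lines.
t1, t2 = 0.3, 1.1
R = matmul(reflection(t2), reflection(t1))
alpha = 2 * (t2 - t1)
expected = [[math.cos(alpha), -math.sin(alpha)],
            [math.sin(alpha),  math.cos(alpha)]]
for i in range(2):
    for j in range(2):
        assert abs(R[i][j] - expected[i][j]) < 1e-12
```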

#### mathmari

##### Well-known member
MHB Site Helper
Yes.

Generally, the composition of 2 reflections is a rotation of double the angle between the normals of the planes of reflection.
So do we have to consider that $v$ and $w$ are independent, or do we have to check also the case that they are not independent?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
So do we have to consider that $v$ and $w$ are independent, or do we have to check also the case that they are not independent?
That depends on how we set up the proof.
Perhaps we can start with the assumption that v and w are independent.
When the proof is complete, perhaps we won't have to make the distinction any more.

#### mathmari

##### Well-known member
MHB Site Helper
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$.
But here we don't have the rotation matrix, do we?

#### mathmari

##### Well-known member
MHB Site Helper
I got stuck right now. What do we do next? How do we get the rotation matrix?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
I got stuck right now. What do we do next? How do we get the rotation matrix?
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$?

#### mathmari

##### Well-known member
MHB Site Helper
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$?
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=\sigma_w(b_3)=\sigma_w(b_4)=0$, right?
So we get the matrix $\begin{pmatrix}0 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
The product of those matrices is the zero matrix, or not?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
Nope.
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get?

#### mathmari

##### Well-known member
MHB Site Helper
Nope.
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get?
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
The product of those matrices is $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$, or not?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
Better.
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$.
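To make this special case concrete (my own numeric sketch, assuming $v$, $w$, $b_3$, $b_4$ are orthonormal and using the reflection formula $\sigma_v(x)=x-2(x\cdot v)v$; the helper name reflect is mine): with $w$ orthogonal to $v$, the matrix of $\sigma_w\circ\sigma_v$ with respect to $(v,w,b_3,b_4)$ is $\operatorname{diag}(-1,-1,1,1)$, whose top-left block is the rotation by $\alpha=\pi$:

```python
def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v for a unit vector v."""
    d = sum(xi * vi for xi, vi in zip(x, v))
    return [xi - 2 * d * vi for xi, vi in zip(x, v)]

# w orthogonal to v; b3, b4 orthogonal to both.
v  = [1.0, 0.0, 0.0, 0.0]
w  = [0.0, 1.0, 0.0, 0.0]
b3 = [0.0, 0.0, 1.0, 0.0]
b4 = [0.0, 0.0, 0.0, 1.0]

# Columns of the matrix of sigma_w o sigma_v w.r.t. (v, w, b3, b4).
cols = [reflect(reflect(b, v), w) for b in (v, w, b3, b4)]

# diag(-1, -1, 1, 1): the top-left 2x2 block is the rotation by
# alpha = pi, since cos(pi) = -1 and sin(pi) = 0.
assert cols == [[-1.0, 0.0, 0.0, 0.0],
                [0.0, -1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]]
```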

#### mathmari

##### Well-known member
MHB Site Helper
Better.
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.
Yes, in this case we suppose that $w$, $b_3$, $b_4$ are orthogonal to $v$, right?

So is this the resulting matrix, and do we then find an $\alpha$ that gives it?

#### mathmari

##### Well-known member
MHB Site Helper
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$.
So do we have to do the same in the case that $v$ and $w$ are not orthogonal?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not.
And of course we can take a look at the special case that $v$ and $w$ are orthogonal and see that it works out.

#### mathmari

##### Well-known member
MHB Site Helper
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not.
If $v$ and $w$ are not independent.
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w-(w\cdot v)w=(1-w\cdot v)w$, $\sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & (1-w\cdot v) & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v-(v\cdot w)v=(1-v\cdot w)v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}(1-v\cdot w) & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.

Is that correct?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
If $v$ and $w$ are not independent.
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
And we have $\sigma_v(w)=w-2(w\cdot v)v$, don't we?
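A quick numeric check of this formula (my own sketch; reflect is a hypothetical helper implementing $\sigma_v(x)=x-2(x\cdot v)v$, and I assume $v$ and $w$ are unit vectors):

```python
def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v for a unit vector v:
    the reflection in the hyperplane orthogonal to v."""
    d = sum(xi * vi for xi, vi in zip(x, v))
    return [xi - 2 * d * vi for xi, vi in zip(x, v)]

v  = [1.0, 0.0, 0.0, 0.0]
w  = [0.6, 0.8, 0.0, 0.0]   # unit vector, not orthogonal to v
b3 = [0.0, 0.0, 1.0, 0.0]   # orthogonal to v

assert reflect(v, v) == [-1.0, 0.0, 0.0, 0.0]  # sigma_v(v) = -v
assert reflect(b3, v) == b3                     # b3 . v = 0, so b3 is fixed
# sigma_v(w) = w - 2 (w . v) v = (0.6 - 1.2, 0.8, 0, 0):
assert reflect(w, v) == [-0.6, 0.8, 0.0, 0.0]
```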

#### mathmari

##### Well-known member
MHB Site Helper
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$.
So $v$ and $w$ must be independent? Or do we do something else in that case?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
So $v$ and $w$ must be independent? Or do we do something else in that case?
Not necessarily. It's just a different case.
If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ if we want, with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.
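As a quick numeric check of this edge case (my own sketch, using the reflection formula $\sigma_v(x)=x-2(x\cdot v)v$; the helper name reflect is mine): if $w=-v$, then $\sigma_w=\sigma_v$, so $\sigma_w\circ\sigma_v$ fixes everything, i.e. it is the rotation by $\alpha=0$:

```python
def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v for a unit vector v."""
    d = sum(xi * vi for xi, vi in zip(x, v))
    return [xi - 2 * d * vi for xi, vi in zip(x, v)]

v = [0.6, 0.8, 0.0, 0.0]
w = [-0.6, -0.8, 0.0, 0.0]   # w = -v, so v and w are dependent

# sigma_w o sigma_v fixes every vector: a rotation by alpha = 0.
for x in ([1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]):
    y = reflect(reflect(x, v), w)
    assert all(abs(a - b) < 1e-12 for a, b in zip(y, x))
```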

#### mathmari

##### Well-known member
MHB Site Helper
Not necessarily. It's just a different case.
If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ if we want, with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.
So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$?

#### Klaas van Aarsen

##### MHB Seeker
Staff member
So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$?
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.
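The general case can also be checked numerically (my own sketch, assuming unit vectors and the formula $\sigma_v(x)=x-2(x\cdot v)v$; the helpers reflect and dot are mine): if the angle between $v$ and $w$ is $\theta$ with $0<\theta<\pi$, the matrix of $\sigma_w\circ\sigma_v$ in a suitable orthonormal basis is the block matrix $D$ with $\alpha=2\theta$:

```python
import math

def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v for a unit vector v."""
    d = sum(xi * vi for xi, vi in zip(x, v))
    return [xi - 2 * d * vi for xi, vi in zip(x, v)]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

theta = 0.7  # angle between the unit vectors v and w, 0 < theta < pi
v = [1.0, 0.0, 0.0, 0.0]
w = [math.cos(theta), math.sin(theta), 0.0, 0.0]

# Standard basis; its first two vectors span the plane of v and w.
basis = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]

# M[i][j] = i-th coordinate of (sigma_w o sigma_v)(basis[j]).
M = [[dot(reflect(reflect(b, v), w), e) for b in basis] for e in basis]

# Expect the block-diagonal D: rotation by alpha = 2*theta, then identity.
alpha = 2 * theta
assert abs(M[0][0] - math.cos(alpha)) < 1e-12
assert abs(M[1][0] - math.sin(alpha)) < 1e-12
assert abs(M[0][1] + math.sin(alpha)) < 1e-12
assert abs(M[1][1] - math.cos(alpha)) < 1e-12
assert M[2][2] == 1.0 and M[3][3] == 1.0
```

The edge cases above fit the same pattern: $\theta=\pi/2$ gives $\alpha=\pi$, and $\theta=0$ gives $\alpha=0$.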

#### mathmari

##### Well-known member
MHB Site Helper
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.
But how do we get that more general case, so that the matrix depends on $\alpha$ ?