Diagonalizable transformation - Existence of basis

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Hey!! :giggle:

Let $1\leq n\in \mathbb{N}$, and for $x=\begin{pmatrix}x_1\\ x_2\\ \vdots \\ x_n\end{pmatrix}, \ y=\begin{pmatrix}y_1\\ y_2\\ \vdots \\ y_n\end{pmatrix}\in \mathbb{R}^n$ let $x\cdot y=\sum_{i=1}^nx_iy_i$ be the dot product of $x$ and $y$.

Let $S=\{v\in \mathbb{R}^n\mid v\cdot v=1\}$ and for $v\in S$ let $\sigma_v$ be a map defined by $\sigma_v:\mathbb{R}^n\rightarrow \mathbb{R}^n, \ x\mapsto x-2(x\cdot v)v$.

I have shown that it holds for $v\in S$ and $x,y\in \mathbb{R}^n$ that $\sigma_v(x)\cdot \sigma_v(y)=x\cdot y$.

Let $v\in S$. I have shown that $\sigma_v^2=\text{id}_{\mathbb{R}^n}$. To show that $\sigma_v$ is diagonalizable do we have to calculate the matrix of that transformation?
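As a quick numerical sanity check of these two properties, here is a small NumPy sketch (the dimension $n$ and the unit vector $v$ are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                   # arbitrary dimension, just for illustration

v = rng.standard_normal(n)
v /= np.linalg.norm(v)                  # v is a unit vector, i.e. v is in S

def sigma(v, x):
    """The map x -> x - 2(x.v)v."""
    return x - 2 * (x @ v) * v

x = rng.standard_normal(n)
y = rng.standard_normal(n)

print(np.isclose(sigma(v, x) @ sigma(v, y), x @ y))  # True: sigma_v preserves the dot product
print(np.allclose(sigma(v, sigma(v, x)), x))         # True: sigma_v^2 = id
```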



Let $n=4$ and $v,w\in S$. I want to show that there exist $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to calculate the images of the elements of the basis $B$ under $\sigma_v\circ\sigma_w$, write the results as linear combinations of the elements of $B$, and check that this gives the matrix $D$? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
Hey mathmari !!

A matrix is diagonalizable iff there is a basis consisting of its eigenvectors.
This is actually the definition of diagonalizable as it applies to transformations in general.
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔

$v$ is an eigenvector if $\sigma_v(v)=\lambda v$ for some $\lambda\in \mathbb{R}$, right?
We have that $ \sigma_v(v)=v-2(v\cdot v)v =v-2v=-v=(-1)v $. That means that $v$ is an eigenvector for the eigenvalue $\lambda=-1$.
Is that correct? :unsure:

Let $w$ be a vector perpendicular to $v$, then $w\cdot v=0$.
We have that $ \sigma_v(w)=w-2(w\cdot v)v =w$. That means that $w$ is an eigenvector for the eigenvalue $\lambda=1$.
Is that correct? :unsure:
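A quick numerical check of both computations (a NumPy sketch; $n$, $v$ and $w$ are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                                   # arbitrary dimension, for illustration
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                  # unit vector v in S

def sigma(v, x):
    return x - 2 * (x @ v) * v          # the map x -> x - 2(x.v)v

w = rng.standard_normal(n)
w -= (w @ v) * v                        # remove the v-component, so w is perpendicular to v

print(np.allclose(sigma(v, v), -v))     # True: v is an eigenvector for lambda = -1
print(np.allclose(sigma(v, w), w))      # True: w is an eigenvector for lambda = +1
```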

Now we have found two independent eigenvectors, but we need $n$. :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔
Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$? :unsure:
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔
Ahh ok!

If we want to determine the eigenvalues of $\sigma_v$ then do we do the following?
$$\sigma_v(x)=\lambda x \Rightarrow \sigma_v\left (\sigma_v(x)\right )=\sigma_v\left (\lambda x \right ) \Rightarrow \sigma_v^2(x)=\lambda \sigma_v(x) \Rightarrow x=\lambda \sigma_v(x)\Rightarrow x=\lambda \cdot \lambda x\Rightarrow x=\lambda^2 x\Rightarrow (\lambda^2-1) x=0\Rightarrow \lambda=\pm 1$$ (since an eigenvector $x$ is nonzero). So the eigenvalues are $-1$ and $1$. Is that correct? :unsure:

To find the dimension of the respective eigenspace do we calculate the geometric multiplicity, which has to be equal to the algebraic multiplicity since $\sigma_v$ is diagonalizable? Or how do we calculate the dimension in this case? :unsure:
 
Last edited:

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
Yes. That works in both cases. (Nod)

Note that we've already found $1$ eigenvector $v$ for the eigenvalue $\lambda=-1$, and $n-1$ independent eigenvectors for the eigenvalue $\lambda=1$.
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.
That is, we don't need to use the argument of diagonalizability to conclude that. 🧐
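As an illustration: with respect to the standard basis the matrix of $\sigma_v$ is $I-2vv^T$, since $\sigma_v(x)=x-2(x\cdot v)v=(I-2vv^T)x$, so the multiplicities can also be checked numerically (a NumPy sketch with an arbitrary $n$ and unit vector $v$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5                                       # arbitrary dimension, for illustration
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                      # unit vector v in S

A = np.eye(n) - 2 * np.outer(v, v)          # matrix of sigma_v: x - 2(x.v)v = (I - 2 v v^T) x

eigvals = np.linalg.eigvalsh(A)             # A is symmetric, so its eigenvalues are real
print(np.round(eigvals, 10))                # -1 appears once, +1 appears exactly n-1 times
```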
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.
How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ?
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity. 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity. 🤔
Ah, because there are $n-1$ independent vectors like $w$, i.e. there are $n-1$ independent eigenvectors for $\lambda=1$, which means that the geometric multiplicity is $n-1$? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
Ah, because there are $n-1$ independent vectors like $w$, i.e. there are $n-1$ independent eigenvectors for $\lambda=1$, which means that the geometric multiplicity is $n-1$?
Yep. (Nod)
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Ok!! :geek:


Let $n=4$ and $v,w\in S$. I want to show that there exist $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to calculate the images of the elements of the basis $B$ under $\sigma_v\circ\sigma_w$, write the results as linear combinations of the elements of $B$, and check that this gives the matrix $D$? :unsure:
Could you give me a hint also for that? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
Could you give me a hint also for that?
Can we find a basis of $\mathbb R^4$ that has 2 vectors in it that are orthogonal to both $v$ and $w$? 🤔
 
Last edited:

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Can we find a basis of $\mathbb R^n$ that has 2 more vectors in it that are orthogonal to both $v$ and $w$? 🤔
Their dot product is one vector that is orthogonal to both, right? How can we find some more? I got stuck right now. :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
The dot product is a scalar and not a vector. Furthermore, the cross product is not defined in 4 dimensions. :oops:

Can't we just state that such a basis must exist?
We don't have to actually find such vectors. 🤔

Either way, we can find them by starting with v and w, and by adding each of the unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both v and w. 🤔
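A small sketch of that procedure in NumPy (the unit vectors v and w are arbitrary and assumed independent, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# arbitrary unit vectors v, w in R^4 (assumed independent, for illustration)
v = rng.standard_normal(4); v /= np.linalg.norm(4 * [0] and v) if False else np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# start with v and w, append the standard unit vectors, and run Gram-Schmidt,
# keeping a candidate only if it is independent of what we already have
basis = []
for c in [v, w, *np.eye(4)]:
    u = c - sum((c @ b) * b for b in basis)     # subtract components along earlier basis vectors
    if np.linalg.norm(u) > 1e-10:
        basis.append(u / np.linalg.norm(u))
    if len(basis) == 4:
        break

b1, b2, b3, b4 = basis
# the last two basis vectors are orthogonal to both v and w
print(np.allclose([b3 @ v, b3 @ w, b4 @ v, b4 @ w], 0))   # True
```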
 

topsquark

Well-known member
MHB Math Helper
Aug 30, 2012
1,273
I don't mean to butt in but I would just like to say how much I am enjoying these threads. I know the material in general but I'm getting some extra details that I have missed. Both of you keep up the good work!

-Dan
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Either way, we can find them by starting with v and w, and by adding each of the unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both v and w. 🤔
So that means that $B$ will then be the set of $v$, $w$ and the two vectors that we get from the Gram-Schmidt orthogonalization process? :unsure:

But if we just state that such a basis exists, how can we find the form of the matrix $D$? I got stuck right now. :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains v, w, and vectors orthogonal to both v and w? 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains v, w, and vectors orthogonal to both v and w? 🤔
They are invariant, aren't they? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
They are invariant, aren't they?
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)
That's why we get the last two columns of the matrix $D$, right? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
That's why we get the last two columns of the matrix $D$, right?
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔
 

mathmari

Well-known member
MHB Site Helper
Apr 14, 2013
4,713
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔
So the first two columns of $D$ correspond to the vectors $v$ and $w$ ? :unsure:
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
9,591
So the first two columns of $D$ correspond to the vectors $v$ and $w$ ?
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔
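To make the target form concrete, here is a numerical sketch (arbitrary independent unit vectors $v,w$; the orthonormal basis is built with Gram-Schmidt as before): the matrix of $\sigma_v\circ\sigma_w$ with respect to that basis has a rotation block in the upper left and the identity in the lower right, which is exactly the form of $D$.

```python
import numpy as np

rng = np.random.default_rng(4)

# arbitrary unit vectors v, w in R^4 (assumed independent, for illustration)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

Sv = np.eye(4) - 2 * np.outer(v, v)                  # matrix of sigma_v (standard basis)
Sw = np.eye(4) - 2 * np.outer(w, w)                  # matrix of sigma_w (standard basis)

# orthonormal basis (b1, b2, b3, b4) with span(b1, b2) = span(v, w), via Gram-Schmidt
basis = []
for c in [v, w, *np.eye(4)]:
    u = c - sum((c @ b) * b for b in basis)
    if np.linalg.norm(u) > 1e-10:
        basis.append(u / np.linalg.norm(u))
    if len(basis) == 4:
        break
P = np.column_stack(basis)                           # change-of-basis matrix; P is orthogonal

D = P.T @ Sv @ Sw @ P                                # matrix of sigma_v o sigma_w w.r.t. the new basis
print(np.round(D, 6))
# upper-left 2x2 block: [[cos a, -sin a], [sin a, cos a]] for some a; lower-right 2x2 block: identity
```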
 
Last edited: