[SOLVED] Finding an Orthonormal Basis

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Hi everyone, :)

Here's a question with my answer. It's pretty simple but I just want to check whether everything is perfect. Thanks in advance. :)

Question:

Let \(f:\,\mathbb{C}^2\rightarrow\mathbb{C}^2\) be a linear transformation, \(B=\{(1,0),\, (0,1)\}\) the standard basis of \(\mathbb{C}^2\), and \(A_{f,\,B}=\begin{pmatrix}3&-i\\i&3\end{pmatrix}\). Find an orthonormal basis \(C\) of eigenvectors for \(f\), and find \(A_{f,\,C}\).

Answer:

The eigenvectors of \(A_{f,\,B}\) in terms of the standard basis are \(v_1=(1,\, 1)\mbox{ and }v_2=(1,\,-1)\). To make the basis \(\{v_1,\,v_2\}\) orthonormal we divide each eigenvector by its magnitude. Hence,

\[C=\left\{\left( \frac{1}{\sqrt{2}}, \, \frac{1}{\sqrt{2}} \right), \, \left(\frac{1}{\sqrt{2}} ,\, -\frac{1}{\sqrt{2}} \right) \right\}\]

Now the transformation matrix from basis \(C\) to \(B\) is \(\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{2}}\end{pmatrix}\). It is easily seen that this matrix is its own inverse, so the transformation matrix from basis \(B\) to \(C\) is the same matrix. Therefore,

\[A_{f,\,C}=\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{2}}\end{pmatrix}\begin{pmatrix}3&-i\\i&3\end{pmatrix}\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{2}}\end{pmatrix}=\begin{pmatrix}3&i\\-i&3\end{pmatrix}\]

Even if this is correct, I have the feeling that there should be an easier method. After all, the answer is just \(A_{f,\,B}\) with its anti-diagonal entries multiplied by \(-1\). :)
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
Re: Finding an Orthonormal Basis

Hmm, it seems to me that $A_{f,B} v_1 \ne \lambda v_1$...
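That check is easy to run numerically; a minimal numpy sketch (the matrix and proposed eigenvector are taken from the post above):

```python
import numpy as np

A = np.array([[3, -1j], [1j, 3]])   # A_{f,B}
v1 = np.array([1, 1])               # proposed eigenvector

Av1 = A @ v1                   # gives [3-1j, 3+1j]
ratios = Av1 / v1              # for a true eigenvector these would all be equal
print(np.isclose(ratios[0], ratios[1]))  # False: v1 is not an eigenvector
```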
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Re: Finding an Orthonormal Basis

Hmm, it seems to me that $A_{f,B} v_1 \ne \lambda v_1$...
Sorry, as you say, the eigenvectors aren't correct. Here's the corrected answer:

The eigenvectors of \(A_{f,\,B}\) in terms of the standard basis are \(v_1=(1,\, i)\mbox{ and }v_2=(1,\,-i)\). It's clear that \(v_1\) and \(v_2\) are orthogonal under the complex dot product. To make the basis \(\{v_1,\,v_2\}\) orthonormal we divide each eigenvector by its magnitude. Hence,

\[C=\left\{\left( \frac{1}{\sqrt{2}}, \, \frac{i}{\sqrt{2}} \right), \, \left(\frac{1}{\sqrt{2}} ,\, -\frac{i}{\sqrt{2}} \right) \right\}\]

Now the transformation matrix from basis \(C\) to \(B\) is \(\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}\end{pmatrix}\). Therefore,

\[A_{f,\,C}=\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}\end{pmatrix}\begin{pmatrix}3&-i\\i&3\end{pmatrix}\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}\end{pmatrix}^{-1}=\begin{pmatrix}3&1\\1&3\end{pmatrix}\]
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
Hmm, the matrix of A with respect to an orthonormal basis of eigenvectors should be diagonal...
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Hmm, the matrix of A with respect to an orthonormal basis of eigenvectors should be diagonal...
Thanks very much for the reply, but I still can't find the mistake. Is my whole approach wrong? :)
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
Thanks very much for the reply, but I still can't find the mistake. Is my whole approach wrong? :)
Neh, your approach is completely right. ;)
Just a small mistake with big consequences.

Suppose S is the transformation matrix from C to B.
Then
$$A_{f,B} = S\ A_{f,C}\ S^{-1}$$

It follows that:
$$A_{f,C} = S^{-1}\ A_{f,B}\ S$$

But I'm afraid that is not what you have...
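The relation is easy to sanity-check numerically; a minimal numpy sketch (the matrices here are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A_C = np.diag([4.0, 2.0])            # some matrix expressed in basis C
S = np.eye(2) + rng.random((2, 2))   # a change-of-basis matrix (invertible here)

A_B = S @ A_C @ np.linalg.inv(S)     # A_{f,B} = S A_{f,C} S^{-1}
recovered = np.linalg.inv(S) @ A_B @ S
print(np.allclose(recovered, A_C))   # True: A_{f,C} = S^{-1} A_{f,B} S
```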
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Neh, your approach is completely right. ;)
Just a small mistake with big consequences.

Suppose S is the transformation matrix from C to B.
Then
$$A_{f,B} = S\ A_{f,C}\ S^{-1}$$

It follows that:
$$A_{f,C} = S^{-1}\ A_{f,B}\ S$$

But I'm afraid that is not what you have...
Ah... another careless mistake trying to do everything in one step... :p

\[A_{f,\,C}=T^{-1}_{C,\,B}A_{f,\,B}T_{C,\,B}\]

\[A_{f,\,C}=\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}\end{pmatrix}^{-1}\begin{pmatrix}3&-i\\i&3\end{pmatrix}\begin{pmatrix}\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\\frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}\end{pmatrix}=\begin{pmatrix}4&0\\0&2\end{pmatrix}\]
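A quick numerical cross-check of this computation (a numpy sketch, with \(T_{C,\,B}\) as above):

```python
import numpy as np

A_B = np.array([[3, -1j], [1j, 3]])
# Columns of T are the normalized eigenvectors (1, i)/sqrt(2) and (1, -i)/sqrt(2)
T = np.array([[1, 1], [1j, -1j]]) / np.sqrt(2)

A_C = np.linalg.inv(T) @ A_B @ T
print(np.allclose(A_C, np.diag([4.0, 2.0])))  # True: diagonal, eigenvalues 4 and 2
```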

So I pretty much get it now. Is it always the case that the transformation matrix with respect to an orthonormal basis is a diagonal matrix with the eigenvalues on its diagonal? :)
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
Good!

Is it always the case that the transformation matrix with respect to an orthonormal basis is a diagonal matrix with the eigenvalues on its diagonal?
Yep.
That is assuming that you have a transformation matrix consisting of eigenvectors.

Edit: you may want to polish that statement a little bit though, since it's not the transformation matrix that has the eigenvalues on its diagonal.
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Good!

Is it always the case that the transformation matrix with respect to an orthonormal basis is a diagonal matrix with the eigenvalues on its diagonal?

Yep.
That is assuming that you have a transformation matrix consisting of eigenvectors.

Edit: you may want to polish that statement a little bit though, since it's not the transformation matrix that has the eigenvalues on its diagonal.
It's the matrix of the linear transformation with respect to the orthonormal basis, isn't it? And isn't that called the transformation matrix? Although I believe the term "transformation matrix" could be a little ambiguous, since it could also mean the basis-transformation matrix \(T\). Am I correct?
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
It's the matrix of the linear transformation with respect to the orthonormal basis, isn't it? And isn't that called the transformation matrix? Although I believe the term "transformation matrix" could be a little ambiguous, since it could also mean the basis-transformation matrix \(T\). Am I correct?
Correct.
Note that the matrix consisting of the eigenvectors does not have to be orthonormal.
Any set of independent eigenvectors will do.

If the matrix contains orthonormal eigenvectors that merely means that its inverse is the same as its conjugate transpose. Such a matrix is called unitary.

There is also another theorem called the spectral theorem.
If we have a matrix that is equal to its conjugate transpose (called hermitian), then it has an orthonormal basis of eigenvectors and all its eigenvalues are real.
This is applicable to your current problem.
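Both facts can be verified numerically for this problem; a numpy sketch:

```python
import numpy as np

A = np.array([[3, -1j], [1j, 3]])
print(np.allclose(A, A.conj().T))    # True: A is Hermitian

# Matrix with the orthonormal eigenvectors (1, i)/sqrt(2), (1, -i)/sqrt(2) as columns
S = np.array([[1, 1], [1j, -1j]]) / np.sqrt(2)
print(np.allclose(np.linalg.inv(S), S.conj().T))  # True: S is unitary

w = np.linalg.eigvalsh(A)   # eigvalsh assumes a Hermitian matrix
print(w)                    # real eigenvalues, in ascending order: 2 and 4
```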
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Correct.
Note that the matrix consisting of the eigenvectors does not have to be orthonormal.
Any set of independent eigenvectors will do.

If the matrix contains orthonormal eigenvectors that merely means that its inverse is the same as its conjugate transpose. Such a matrix is called unitary.

There is also another theorem called the spectral theorem.
If we have a matrix that is equal to its conjugate transpose (called hermitian), then it has an orthonormal basis of eigenvectors and all its eigenvalues are real.
This is applicable to your current problem.
Thanks so much for all your help. I truly appreciate every bit of it. :)

Yes, indeed. So the matrix of the linear transformation, \(A_{f,\,B}\), is Hermitian, and hence there is an orthonormal basis of its eigenvectors. I will look into the spectral theorem later; it's not covered in our class, so not something I need to know immediately. :)
 

Deveno

Well-known member
MHB Math Scholar
Feb 15, 2012
1,967
Suppose $A$ is an $n \times n$ matrix with an eigenbasis $v_1,\dots,v_n$. Let $D$ be the diagonal matrix $D = \text{diag}(\lambda_1,\dots,\lambda_n)$ where $\lambda_i$ is the eigenvalue corresponding to $v_i$.

If $P$ is the matrix whose columns are the eigenvectors $v_1,\dots,v_n$ (written as column vectors), it is easy to see that:

$AP = PD$, which tells us that $P^{-1}AP = D$ (that is, $P$ diagonalizes $A$).
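For the matrix from this thread, the identity $AP = PD$ can be checked directly (a numpy sketch; `np.linalg.eig` returns the eigenvectors as the columns of $P$):

```python
import numpy as np

A = np.array([[3, -1j], [1j, 3]])
lams, P = np.linalg.eig(A)    # columns of P are eigenvectors
D = np.diag(lams)

print(np.allclose(A @ P, P @ D))                 # True: AP = PD
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True: P^{-1} A P = D
```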

In this particular case, we have a $2\times 2$ matrix with two distinct eigenvalues, so we're guaranteed an eigenbasis.

You should become comfortable with the fact that an expression like $B^{-1}AB$ (a SIMILARITY transform) essentially represents changing a matrix $A$ (which represents some linear transformation relative to some basis) to its matrix in another basis (and $B$ is called a "change-of-basis matrix"). The goal of employing such a technique is usually to get an easier form of $A$ to calculate with. In real-life situations, this often means we can recover matrix information by just measuring scalars (because linear transformations just scale eigenvectors).

If (as unfortunately happens) we don't have an eigenbasis, we can still (via the Jordan form) use a similarity transform to bring $A$ to a form $D + N$, where $D$ is diagonal and $N$ is nilpotent. In a sense, $N$ measures "how far from diagonal" our "standardized" form for $A$ is, and $P$ is then composed of eigenvectors and generalized eigenvectors.
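For a matrix that is already in Jordan form, the $D + N$ split can be read off directly; a minimal numpy sketch (the example matrix is made up for illustration; for a general matrix one would first apply the similarity transform, e.g. via sympy's `Matrix.jordan_form`):

```python
import numpy as np

# A defective matrix: eigenvalue 2 repeated, but only one independent eigenvector
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

D = np.diag(np.diag(A))        # diagonal part, diag(2, 2)
N = A - D                      # remainder
print(np.allclose(N @ N, 0))   # True: N is nilpotent
print(np.allclose(A, D + N))   # True: A = D + N
```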