Extracting eigenvectors from a matrix

In summary, the thread works through showing that a real symmetric matrix has real eigenvalues and orthogonal eigenvectors. The discriminant of the characteristic equation shows that the eigenvalues must be real, and orthogonality of the eigenvectors follows from dot products together with the symmetry of the matrix. The discussion also touches on the definition of eigenvectors and eigenvalues.
  • #1
Seydlitz
Hello,

Homework Statement


I want to show that a real symmetric matrix will have real eigenvalues and orthogonal eigenvectors.

$$
\begin{pmatrix}
A & H\\
H & B
\end{pmatrix}
$$

The Attempt at a Solution


For the matrix shown above it's clear that the characteristic equation will be
##\lambda^2-\lambda(A+B)+AB-H^2=0##

I can show that the discriminant of the quadratic equation is nonnegative, implying that the eigenvalues must be real.
##b^2-4ac=(A+B)^2-4(AB-H^2)=A^2+2AB+B^2-4AB+4H^2##
##=(A-B)^2+4H^2##
Since ##A, B, H \in \mathbb{R}##, ##(A-B)^2+4H^2 \geq 0##

Hence ##\lambda## must be real for this matrix.
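
As a quick numerical sanity check of both the characteristic equation and the realness claim (a sketch assuming numpy; the values of ##A##, ##B##, ##H## below are arbitrary):

[code]
import numpy as np

# Arbitrary real entries; any values of A, B, H will do.
A, B, H = 2.0, -1.0, 3.0
M = np.array([[A, H],
              [H, B]])

# Characteristic polynomial coefficients: lambda^2 - (A+B) lambda + (AB - H^2)
print(np.allclose(np.poly(M), [1.0, -(A + B), A * B - H**2]))  # True

disc = (A - B)**2 + 4 * H**2             # the discriminant computed above
eigenvalues = np.linalg.eigvals(M)
print(disc >= 0)                         # True
print(np.allclose(eigenvalues.imag, 0))  # True: both eigenvalues are real
[/code]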

My only problem now is to show that the eigenvectors are orthogonal.

The matrix has eigenvalues of ##\lambda_1, \lambda_2##, and hence eigenvectors ##\lambda_1v_1, \lambda_2v_2##.

How can I show,

##\lambda_1\lambda_2x_1x_2+\lambda_1\lambda_2y_1y_2=0##?

I know ##\lambda_1\lambda_2=\det(M)##.

It could become,

##\det(M)(x_1x_2+y_1y_2)=0##

Then it's clear the vectors are orthogonal because ##\det(M)## cannot be 0. But the problem is that this is not a proof, because I explicitly assume the dot product is 0 in the first place.

I tried substituting the full quadratic solution for ##\lambda## back into the matrix, but the matrix could not be simplified in any straightforward way and it turned into a mess very quickly.
 
  • #2
Seydlitz said:
The matrix has eigenvalues of ##\lambda_1, \lambda_2##, and hence eigenvectors ##\lambda_1v_1, \lambda_2v_2##.
The eigenvectors are ##v_1## and ##v_2##, not ##\lambda_1 v_1## and ##\lambda_2 v_2##. This matters because the eigenvalue could be 0 and ##\vec{0}## can't be an eigenvector by definition.

I know ##\lambda_1\lambda_2=\det(M)##.

It could become,

##\det(M)(x_1x_2+y_1y_2)=0##

Then it's clear the vectors are orthogonal because ##\det(M)## cannot be 0.
##\det(M)## could be 0 if either of the eigenvalues is 0.


Assume ##v_1## and ##v_2## are eigenvectors corresponding to distinct eigenvalues, and then consider the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2##. Using the fact that M is symmetric, you can show the two products are equal.
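
A quick numerical illustration of that identity (a sketch assuming numpy; the symmetric matrix and the two vectors are arbitrary choices):

[code]
import numpy as np

M = np.array([[2.0, 3.0],
              [3.0, -1.0]])    # symmetric: M == M.T
v1 = np.array([1.0, 2.0])
v2 = np.array([-3.0, 0.5])

# For symmetric M, v1 . (M v2) equals (M v1) . v2
print(np.isclose(v1 @ (M @ v2), (M @ v1) @ v2))  # True
[/code]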
 
  • #3
vela said:
The eigenvectors are ##v_1## and ##v_2##, not ##\lambda_1 v_1## and ##\lambda_2 v_2##. This matters because the eigenvalue could be 0 and ##\vec{0}## can't be an eigenvector by definition.

##\det(M)## could be 0 if either of the eigenvalues is 0.

Assume ##v_1## and ##v_2## are eigenvectors corresponding to distinct eigenvalues, and then consider the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2##. Using the fact that M is symmetric, you can show the two products are equal.

OK, I'll keep the notation in mind; it never occurred to me that ##\vec{0}## is not valid.

I also forgot about the case where an eigenvalue is 0.

So in matrix notation ##v_1 \cdot M v_2## can be written as ##v_1^{\top}Mv_2## and ##(M v_1)\cdot v_2## as ##(Mv_1)^{\top}v_2##.

By the transpose theorem and the fact that ##M## is symmetric, ##(Mv_1)^{\top}v_2 = v_1^{\top}Mv_2##. Because ##v_2## is an eigenvector, we get the equality ##v_1^{\top}v_2=v_1^{\top}v_2##, which implies that the dot product is zero. Is this correct?

Edit: I forgot that applying the transformation matrix produces an unknown factor of ##\lambda## instead of just 1.

So because ##v_1## and ##v_2## are eigenvectors, applying the transformation matrix gives ##\lambda_1v_1^{\top}v_2## and ##\lambda_2v_1^{\top}v_2##; hence ##\lambda_1v_1^{\top}v_2=\lambda_2v_1^{\top}v_2##, which implies the dot product is 0 because the two eigenvalues are different.
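
A numerical check of this conclusion (a sketch assuming numpy, whose eigh routine is the eigensolver for symmetric matrices; the matrix below is arbitrary):

[code]
import numpy as np

M = np.array([[2.0, 3.0],
              [3.0, -1.0]])   # real symmetric with H != 0, so distinct eigenvalues

# eigh handles symmetric matrices; columns of the result are eigenvectors
eigenvalues, eigenvectors = np.linalg.eigh(M)
v1, v2 = eigenvectors[:, 0], eigenvectors[:, 1]

print(eigenvalues)               # two distinct real eigenvalues
print(np.isclose(v1 @ v2, 0.0))  # True: the eigenvectors are orthogonal
[/code]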
 
Last edited:
  • #4
vela said:
##\vec{0}## can't be an eigenvector by definition.

Seydlitz said:
it never occurred to me that ##\vec{0}## is not valid.
Actually that's a bit of an argument, and I take the other side. Yes, the definition of "eigenvalue" is: ##\lambda## is an eigenvalue of linear operator ##A## if and only if there exists a non-zero vector ##v## such that ##Av = \lambda v##.

But some textbooks define "eigenvector corresponding to eigenvalue ##\lambda##" as "a non-zero vector ##v## such that ##Av = \lambda v##", while other textbooks do NOT require "non-zero". I prefer the latter because, with the former, you have to keep saying "and the 0 vector" in statements about eigenvectors. For example, I think it is preferable to be able to say "the set of all eigenvectors corresponding to eigenvalue ##\lambda## forms a vector space" rather than "the set of all eigenvectors corresponding to eigenvalue ##\lambda##, together with the zero vector, forms a vector space".

In practice, of course, it doesn't make any difference. We still need to use non-zero eigenvectors to form a basis of that subspace.
 
  • #5
HallsofIvy said:
But some textbooks define "eigenvector corresponding to eigenvalue ##\lambda##" as "a non-zero vector ##v## such that ##Av = \lambda v##", while other textbooks do NOT require "non-zero".

I hope I'll come to fully appreciate this difference in definition as I go further in my study; it hasn't sunk in yet. Maybe that's because my textbook isn't aimed at a rigorous study of linear algebra.

Additionally, if one of the mentors is still reading this thread, I'd like to know whether this dot-product method with ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2## can be used to prove facts about eigenvectors in other cases, e.g. when ##M## is real but not symmetric (where I'd want to show the two products are not equal), or when ##M## is Hermitian.
 
  • #6
You've shown that if the eigenvalues are distinct, the eigenvectors are orthogonal. Now you have to deal with the case where ##\lambda_1 = \lambda_2##. The first thing you need to consider is whether you can find two independent eigenvectors in this case.
 
  • #7
vela said:
You've shown that if the eigenvalues are distinct, the eigenvectors are orthogonal. Now you have to deal with the case where ##\lambda_1 = \lambda_2##. The first thing you need to consider is whether you can find two independent eigenvectors in this case.

If ##\lambda_1 = \lambda_2##, then ##H## must be 0 and ##A=B##. The matrix is then ##kI##, where ##k## is a constant and ##I## is the unit matrix, so ##\lambda_1 = \lambda_2 = k##. Because the matrix is a multiple of the unit matrix, every nonzero vector in the space is an eigenvector. In the two-dimensional case we can pick two orthogonal basis vectors such as ##(1,0)## and ##(0,1)##.
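
A small numerical illustration of this degenerate case (a sketch assuming numpy; the constant k = 5 is arbitrary):

[code]
import numpy as np

k = 5.0                      # arbitrary constant
M = k * np.eye(2)            # H = 0 and A = B = k, i.e. the matrix kI

eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)           # [5. 5.]: the repeated eigenvalue k
print(eigenvectors)          # an orthonormal pair, here (1,0) and (0,1)

# Any nonzero vector is an eigenvector, since M v = k v for every v:
v = np.array([2.0, -7.0])
print(np.allclose(M @ v, k * v))  # True
[/code]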
 

Related to Extracting eigenvectors from a matrix

1. How do you extract eigenvectors from a matrix?

To extract eigenvectors from a matrix, you first need to find the eigenvalues of the matrix. Then, for each eigenvalue, you can solve the equation ##(A-\lambda I)x = 0##, where ##A## is the original matrix, ##\lambda## is the eigenvalue, and ##x## is the eigenvector. This will give you a set of eigenvectors corresponding to each eigenvalue.
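
In practice this is usually done with a library routine rather than by hand. A minimal sketch assuming numpy (the matrix here is an arbitrary example):

[code]
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, -1.0]])   # example matrix

# np.linalg.eig works for any square matrix; for symmetric/Hermitian
# matrices np.linalg.eigh is preferred (it guarantees real eigenvalues).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Column i of `eigenvectors` corresponds to eigenvalues[i]:
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True: A v = lambda v
[/code]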

2. Why is it important to extract eigenvectors from a matrix?

Eigenvectors are important because they represent the directions along which a linear transformation acts by pure scaling: vectors in those directions are only stretched or compressed (or flipped, for negative eigenvalues), not rotated. They are also used in various applications such as data analysis, image processing, and machine learning.

3. Can a matrix have more than one eigenvector?

Yes, a matrix can have multiple eigenvectors corresponding to the same eigenvalue. In fact, every eigenvalue has infinitely many eigenvectors, since any nonzero scalar multiple of an eigenvector is again an eigenvector; an eigenspace can also have dimension greater than one, in which case there are several linearly independent eigenvectors for a single eigenvalue.

4. Can a matrix have zero eigenvectors?

Over the real numbers, yes: a matrix whose characteristic polynomial has no real roots has no real eigenvectors; a plane rotation (by anything other than 0 or 180 degrees) is the standard example. Over the complex numbers, every square matrix has at least one eigenvalue and therefore at least one eigenvector. Note that being non-invertible does not prevent a matrix from having eigenvectors; it just means 0 is an eigenvalue.
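
For instance, a minimal sketch of the rotation example, assuming numpy:

[code]
import numpy as np

# Rotation by 90 degrees: no real vector keeps its direction under R,
# so R has no real eigenvectors.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(R))  # the pair +i, -i: purely imaginary
[/code]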

5. How are eigenvectors used in principal component analysis (PCA)?

In PCA, eigenvectors are used to determine the principal components, which are new variables that are linear combinations of the original variables and capture the most variation in the data. These principal components are then used to reduce the dimensionality of the data and simplify the analysis.
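
A minimal sketch of that procedure, assuming numpy and randomly generated data (all variable names are illustrative):

[code]
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # 200 samples, 3 variables

Xc = X - X.mean(axis=0)              # center the data
cov = np.cov(Xc, rowvar=False)       # 3x3 covariance matrix (symmetric)

# Eigenvectors of the covariance matrix are the principal directions.
# eigh returns eigenvalues in ascending order, so sort descending.
eigenvalues, eigenvectors = np.linalg.eigh(cov)
order = np.argsort(eigenvalues)[::-1]
components = eigenvectors[:, order]

X_reduced = Xc @ components[:, :2]   # keep the top 2 principal components
print(X_reduced.shape)               # (200, 2)
[/code]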
