Eigenvalue with multiplicity k resulting in k orthogonal eigenvectors?

In summary, the conversation discusses the properties of eigenvalues and eigenvectors when a matrix is symmetric. It is stated that for a symmetric matrix, eigenvectors corresponding to different eigenvalues are orthogonal. Additionally, for an eigenvalue with multiplicity k, there will be k orthogonal eigenvectors corresponding to this root. However, it is also noted that there are an infinite number of eigenvectors for any eigenvalue, and not all of them will be orthogonal. The author clarifies that when discussing "the two eigenvectors," they are referring to the two specific eigenvectors that were found, not the only two eigenvectors that exist.
  • #1
el_llavero
I am somewhat confused about this property of eigenvalues when A is a symmetric matrix; I will state it exactly as it was presented to me.

"Properties of the eigenvalue when A is symmetric.
If an eigenvalue λ has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."

So I decided to test this property with a few matrices and I encountered one particular matrix that may provide a counterexample to this property.


()^T == column vector

A symmetric matrix: A = (5,4,2),(4,5,2),(2,2,2), with corresponding eigenvalue/eigenvector pairs:
(-1,1,0)^T, (-1/2,0,1)^T map to eigenvalue 1
(2,2,1)^T maps to eigenvalue 10

Eigenvalue 1 has multiplicity 2. The eigenvectors corresponding to 1 are (-1,1,0)^T and (-1/2,0,1)^T, and their dot product is (-1,1,0)^T (dot) (-1/2,0,1)^T = 1/2, not 0. Therefore the eigenvectors corresponding to this root are not orthogonal to each other; however, they are linearly independent.
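(For reference, here is a quick numerical check of the above. This sketch, and the choice of NumPy, are additions for illustration, not part of the original post.)

```python
import numpy as np

# The symmetric matrix from the example above.
A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)

v1 = np.array([-1.0, 1.0, 0.0])   # claimed eigenvector for eigenvalue 1
v2 = np.array([-0.5, 0.0, 1.0])   # claimed eigenvector for eigenvalue 1
v3 = np.array([2.0, 2.0, 1.0])    # claimed eigenvector for eigenvalue 10

print(A @ v1)          # [-1.  1.  0.]   = 1 * v1
print(A @ v2)          # [-0.5  0.  1.]  = 1 * v2
print(A @ v3)          # [20. 20. 10.]   = 10 * v3
print(np.dot(v1, v2))  # 0.5 -> the two eigenvectors found for eigenvalue 1 are not orthogonal
```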

So I asked the author of these notes whether I had perhaps misinterpreted this property. Is the correct interpretation perhaps:

If an eigenvalue has multiplicity k, there will be k (repeated k times), eigenvectors corresponding to this root that are orthogonal to eigenvectors corresponding to different roots?

and the author replied

"Yes, you are right that the two eigenvectors corresponding root one are not orthogonal. But the property says there ARE orthogonal ones. So you can keep looking for orthogonal ones.

How about the pair of (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?"

So just to restate and compare

"If an eigenvalue λ has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."

but the author says "that the two eigenvectors corresponding root one are not orthogonal"
and then says that the property says "there ARE orthogonal ones. So you can keep looking for orthogonal ones."

Can someone help me make sense of this?

- I found the eigenvectors corresponding to the eigenvalue (root),
- there were two, so the eigenvalue has multiplicity 2,
- however, the 2 eigenvectors corresponding to that eigenvalue (root) were not orthogonal: v1 (dot) v2 != 0.
**** But I'm supposed to keep looking for orthogonal ones?

The property says:

"there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."


The author says:

that the two eigenvectors corresponding root one are not orthogonal

(The property says) != (The author says)

but I'm supposed to keep looking for orthogonal ones?? I'm supposed to somehow derive the pair (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?

Does this make sense to anyone? If so, please help me make sense of it, and enlighten me as to how I can derive the pair (1,-1,0) and (1,1,-4). Thanks in advance.
 
  • #2
I'm not sure what part is causing you trouble. It is true that, for a symmetric matrix, eigenvectors corresponding to different eigenvalues will be orthogonal. But you seem to be under the impression that the two eigenvectors you found are the only eigenvectors. You quote the author as saying "Yes, you are right that the two eigenvectors corresponding root one are not orthogonal" (emphasis added). I doubt he/she said exactly that; more likely he/she said that those two eigenvectors that you found were not orthogonal. It is just wrong to talk about "the" two eigenvectors. There are always an infinite number of eigenvectors for any eigenvalue.

It is also true that the eigenvectors corresponding to a single eigenvalue of multiplicity k (again for a symmetric matrix) span a subspace of dimension k. Of course, you can always construct an orthonormal basis for that subspace, so there will be k orthogonal eigenvectors.

But any such subspace will also contain eigenvectors that are NOT orthogonal. In your example, any eigenvector corresponding to eigenvalue 1 must satisfy 2x + 2y + z = 0, or z = -2x - 2y. The easy way to get eigenvectors that span the "eigenspace" is to take x = 1, y = 0 to get z = -2, giving an eigenvector <1, 0, -2>, and then take x = 0, y = 1 to get z = -2 again, giving another eigenvector <0, 1, -2>. Those are not orthogonal, since their dot product is 4. But if, instead of x = 1, y = 0, you take x = -4, y = 5, you get z = 8 - 10 = -2, so <-4, 5, -2> is an eigenvector, and <1, 0, -2>.<-4, 5, -2> = 0. Those eigenvectors are orthogonal.
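(A small NumPy sketch, added here for illustration and not part of the original reply, that checks the three vectors above all lie in the eigenvalue-1 eigenspace and compares their dot products:)

```python
import numpy as np

A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)

u = np.array([1.0, 0.0, -2.0])    # x = 1,  y = 0,  z = -2x - 2y
v = np.array([0.0, 1.0, -2.0])    # x = 0,  y = 1
w = np.array([-4.0, 5.0, -2.0])   # x = -4, y = 5

for vec in (u, v, w):
    # Each satisfies A vec = 1 * vec, i.e. all three are eigenvectors for eigenvalue 1.
    assert np.allclose(A @ vec, vec)

print(np.dot(u, v))   # 4.0 -> not orthogonal
print(np.dot(u, w))   # 0.0 -> orthogonal
```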
 
  • #3
HallsofIvy said:
I'm not sure what part is causing you trouble. It is true that, for a symmetric matrix, eigenvectors corresponding to different eigenvalues will be orthogonal. But you seem to be under the impression that the two eigenvectors you found are the only eigenvectors. You quote the author as saying "Yes, you are right that the two eigenvectors corresponding root one are not orthogonal" (emphasis added). I doubt he/she said exactly that; more likely he/she said that those two eigenvectors that you found were not orthogonal. It is just wrong to talk about "the" two eigenvectors. There are always an infinite number of eigenvectors for any eigenvalue.


Let me add a disclaimer to my post:
I have not taken a linear algebra class; however, the course I will be taking this Fall '09 semester uses a lot of linear algebra, and the main linear algebra techniques are covered in a preliminary PDF. I think I have done a good job of self-learning many of the topics in that PDF, but in the resources I've been using there is nothing regarding properties of eigenvalues when a matrix is symmetric, or properties of eigenvalues and eigenvectors in general, other than how to derive them from a matrix and some other basic information.

HallsofIvy,


For not being sure what part was causing me trouble, you really clarified many things for me. Unfortunately, I did quote the author word for word, and he did say "the two eigenvectors corresponding root one." This was probably my main source of confusion: I was assuming there were only two eigenvectors, which I now know is wrong. I never encountered any information that explicitly stated "there are always an infinite number of eigenvectors for any eigenvalue," and the author's reply didn't shed much light on this either. I am familiar with many of the concepts you talk about in the rest of your post; I'll comment on the rest of it after I've gone through it more thoroughly and gotten a handle on the infinite-eigenvectors concept.
 
  • #4
I went back to the primary book I'm using and saw that I was dealing with a homogeneous system of linear equations (all the constant terms are zero), so there are infinitely many solutions. Then I read further into deriving general solutions; in this case:

In terms of row vectors, we can express the general solution as (-x₂-(1/2)x₃, x₂, x₃), and separating the variables in the general solution results in

(-x₂-(1/2)x₃, x₂, x₃) = x₂(-1,1,0) + x₃(-1/2,0,1).

However, to make the vectors easier to work with, we can turn the fraction into whole numbers by writing the general solution in terms of x₁ and x₂: from x₁ = -x₂-(1/2)x₃ we get -(1/2)x₃ = x₁+x₂, so x₃ = -2x₁-2x₂, giving the general solution (x₁, x₂, -2x₁-2x₂). To get the basis, separate the variables in the general solution:

(x₁, x₂, -2x₁-2x₂) = x₁(1,0,-2) + x₂(0,1,-2)
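(As a cross-check, and purely as my addition: the same eigenspace basis can be read off from the null space of A - I. SymPy is an assumed tool here, not something used in the thread.)

```python
from sympy import Matrix, eye

A = Matrix([[5, 4, 2],
            [4, 5, 2],
            [2, 2, 2]])

# (A - 1*I) x = 0 is the homogeneous system solved above.
basis = (A - eye(3)).nullspace()
print(basis)
# SymPy treats x2 and x3 as the free variables, so it returns (-1, 1, 0) and (-1/2, 0, 1);
# re-parametrising with x1 and x2 free gives the equivalent basis (1, 0, -2), (0, 1, -2).
```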

For two vectors to be orthogonal, they must have the property that their dot product is equal to zero. For a two-dimensional subspace W of R³,

where the basis for W is the set of vectors {(a1, a2, a3), (b1, b2, b3)},
the basis vectors are orthogonal when a1*b1 + a2*b2 + a3*b3 = 0.

In our particular case, the basis for the subspace is the set of vectors {(1,0,-2), (0,1,-2)},
with the corresponding general solution (x₁, x₂, -2x₁-2x₂).

To get two vectors orthogonal to each other with the same basis, solve for one of the variables in the equation x₁+x₂-4x₁x₂ = 0:

x₁ = x₂/(4x₂-1), provided x₂ ≠ 1/4

Plug values for x₁ and x₂ that satisfy x₁ = x₂/(4x₂-1) (with x₂ ≠ 1/4) into the general solution (x₁, x₂, -2x₁-2x₂), then separate the variables in the general solution, and you have two orthogonal vectors.
 
  • #5
I think I may have some inaccuracies in the above post.

Nevertheless, I've cleared up my confusion about these orthogonal eigenvectors and how they preserve all the properties of the space. I saw how you plugged in values for x and y in order to cancel out the 4. However, I did some more research and found a general way of obtaining an orthogonal set from a set of linearly independent vectors, and then producing an orthonormal set from it. It's called the Gram-Schmidt orthogonalization process.
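(Here is a minimal Gram-Schmidt sketch, my addition in NumPy with a helper name of my own choosing, applied to the two eigenvectors originally found for eigenvalue 1:)

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthogonal set spanning the same subspace as the input vectors."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in ortho:
            w = w - (np.dot(w, u) / np.dot(u, u)) * u   # subtract the projection onto u
        ortho.append(w)
    return ortho

v1 = np.array([-1.0, 1.0, 0.0])
v2 = np.array([-0.5, 0.0, 1.0])

u1, u2 = gram_schmidt([v1, v2])
print(u1, u2)            # u2 = v2 - proj_{u1}(v2) = (-0.25, -0.25, 1.0)
print(np.dot(u1, u2))    # 0.0 -> orthogonal
```

Scaling u2 by -4 gives (1, 1, -4), which matches the second vector of the pair the author suggested.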
 

Related to Eigenvalue with multiplicity k resulting in k orthogonal eigenvectors?

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important concepts in linear algebra. An eigenvector of a linear transformation is a non-zero vector whose direction is unchanged by the transformation, and the corresponding eigenvalue is the scalar factor by which that vector is scaled: Av = λv.
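(A quick illustration, my addition using NumPy as an assumed tool, with the symmetric matrix from the thread:)

```python
import numpy as np

A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)
v = np.array([2.0, 2.0, 1.0])

print(A @ v)   # [20. 20. 10.] = 10 * v, so v is an eigenvector with eigenvalue 10
```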

2. What is the significance of an eigenvalue with multiplicity k?

For a diagonalizable matrix (and in particular for a symmetric matrix), an eigenvalue with multiplicity k has k linearly independent eigenvectors corresponding to it. This can be interpreted as having k independent directions in which the linear transformation only rescales vectors rather than changing their direction.
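(A short SymPy sketch, my addition and not part of the original Q&A, showing the multiplicities and the matching number of independent eigenvectors for the thread's matrix:)

```python
from sympy import Matrix

A = Matrix([[5, 4, 2],
            [4, 5, 2],
            [2, 2, 2]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis of the eigenspace).
for eigenvalue, multiplicity, vectors in A.eigenvects():
    print(eigenvalue, multiplicity, len(vectors))
# 1  2 2   -> eigenvalue 1 has multiplicity 2 and two independent eigenvectors
# 10 1 1
```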

3. How are eigenvalues and eigenvectors calculated?

Eigenvalues and eigenvectors can be calculated by solving the characteristic equation of a given matrix, det(A − λI) = 0, which is obtained by setting the determinant of the matrix minus a scalar multiple of the identity matrix equal to zero. The roots λ of this equation are the eigenvalues, and solving (A − λI)v = 0 for each root gives the corresponding eigenvectors.
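(A SymPy sketch of this route, my addition and only one of several ways to do it:)

```python
from sympy import Matrix, eye, symbols, solve

lam = symbols('lambda')
A = Matrix([[5, 4, 2],
            [4, 5, 2],
            [2, 2, 2]])

char_poly = (A - lam * eye(3)).det()   # det(A - lambda*I)
print(char_poly.factor())              # factors as -(lambda - 1)**2 * (lambda - 10)
print(solve(char_poly, lam))           # the roots are 1 (a double root) and 10
```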

4. What is the relationship between eigenvalues, eigenvectors, and diagonalization?

Eigenvalues and eigenvectors are crucial in diagonalization, which is the process of finding a diagonal matrix that is similar to a given matrix. The eigenvalues of the given matrix form the diagonal entries of the diagonal matrix D, and the eigenvectors form the columns of the change-of-basis matrix P, so that A = PDP⁻¹.
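(A small SymPy sketch of that factorization, my addition, using the same matrix:)

```python
from sympy import Matrix

A = Matrix([[5, 4, 2],
            [4, 5, 2],
            [2, 2, 2]])

P, D = A.diagonalize()          # columns of P are eigenvectors, D is diagonal
print(D)                        # diagonal entries are the eigenvalues 1, 1, 10 (order may vary)
print(P * D * P.inv() == A)     # True: A = P D P^(-1)
```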

5. How are orthogonal eigenvectors related to eigenvalues with multiplicity k?

For a symmetric matrix, an eigenvalue with multiplicity k has a k-dimensional eigenspace, so one can always choose k mutually orthogonal (perpendicular) eigenvectors for it, for example by the Gram-Schmidt process, and the transformation preserves their orthogonality. This is useful in applications such as principal component analysis, where we want a set of orthogonal eigenvectors that capture the most variance in a dataset.
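(For symmetric matrices, numpy.linalg.eigh returns a full orthonormal set of eigenvectors directly; a minimal sketch, my addition, with the thread's matrix:)

```python
import numpy as np

A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)

eigenvalues, Q = np.linalg.eigh(A)        # columns of Q are orthonormal eigenvectors
print(eigenvalues)                        # approximately [ 1.  1. 10.]
print(np.allclose(Q.T @ Q, np.eye(3)))    # True -> the chosen eigenvectors are mutually orthogonal
```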
