el_llavero
I am somewhat confused about a property of eigenvalues when A is a symmetric matrix. I will state it exactly as it was presented to me:
"Properties of the eigenvalue when A is symmetric.
If an eigenvalue [tex]\lambda[/tex] has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."
So I decided to test this property with a few matrices and I encountered one particular matrix that may provide a counterexample to this property.
(...)^T denotes a column vector.
A symmetric matrix: A = (5,4,2),(4,5,2),(2,2,2) (listed by rows), with corresponding eigenvalue/eigenvector pairs:
(-1,1,0)^T,(-1/2,0,1)^T map to eigenvalue 1
(2,2,1)^T maps to eigenvalue 10
Eigenvalue 1 has multiplicity 2. The eigenvectors corresponding to 1 are (-1,1,0)^T and (-1/2,0,1)^T, and their dot product is (-1,1,0)^T (dot) (-1/2,0,1)^T = 1/2, not 0. Therefore the eigenvectors corresponding to this root are not orthogonal to each other, although they are linearly independent.
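The eigenpairs and the dot product above are easy to check numerically. A quick sketch with numpy (my tooling choice, not something from the original notes):

```python
import numpy as np

# The symmetric matrix from the example.
A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)

v1 = np.array([-1.0, 1.0, 0.0])   # claimed eigenvector for eigenvalue 1
v2 = np.array([-0.5, 0.0, 1.0])   # claimed eigenvector for eigenvalue 1
v3 = np.array([2.0, 2.0, 1.0])    # claimed eigenvector for eigenvalue 10

print(np.allclose(A @ v1, 1 * v1))   # True: A v1 = 1 * v1
print(np.allclose(A @ v2, 1 * v2))   # True: A v2 = 1 * v2
print(np.allclose(A @ v3, 10 * v3))  # True: A v3 = 10 * v3
print(v1 @ v2)                       # 0.5 -- v1 and v2 are not orthogonal
```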
So I asked the author of these notes whether I had perhaps misinterpreted this property. Is the correct interpretation perhaps:
If an eigenvalue has multiplicity k, there will be k (repeated k times), eigenvectors corresponding to this root that are orthogonal to eigenvectors corresponding to different roots?
and the author replied
"Yes, you are right that the two eigenvectors corresponding root one are not orthogonal. But the property says there ARE orthogonal ones. So you can keep looking for orthogonal ones.
How about the pair of (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?"
So, just to restate and compare:
"If an eigenvalue [tex]\lambda[/tex] has multiplicity k, there will be k (repeated k times)
orthogonal eigenvectors corresponding to this root."
but the author says "that the two eigenvectors corresponding root one are not orthogonal"
and then says that the property says "there ARE orthogonal ones. So you can keep looking for orthogonal ones."
Can someone help me make sense of this?
- I found the eigenvectors corresponding to the eigenvalue (root),
- there were two, so the eigenvalue has multiplicity 2,
- however, the two eigenvectors corresponding to this eigenvalue (root) were not orthogonal: v1 (dot) v2 != 0.
**** But I'm supposed to keep looking for orthogonal ones?
The property says:
"there will be k (repeated k times)
orthogonal eigenvectors corresponding to this root."
The author says:
that the two eigenvectors corresponding root one are not orthogonal
(The property says) != (The author says)
but I'm supposed to keep looking for orthogonal ones?? I'm supposed to somehow derive the pair (1,-1,0) and (1,1,-4) instead of the pair (1,-1,0) and (-1/2,0,1)?
Does this make sense to anyone? If so, please help me make sense of it, and enlighten me as to how I can derive the pair (1,-1,0) and (1,1,-4). Thanks in advance.
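For what it's worth, the author's suggested vector can be recovered mechanically with one Gram-Schmidt step: within the eigenvalue-1 eigenspace, subtract from the second eigenvector its projection onto the first. A numpy sketch (assuming Gram-Schmidt is what the author had in mind; the notes don't name the method):

```python
import numpy as np

# The matrix and the eigenvalue-1 eigenvectors from the post.
A = np.array([[5, 4, 2],
              [4, 5, 2],
              [2, 2, 2]], dtype=float)
v1 = np.array([1.0, -1.0, 0.0])   # first eigenvector for eigenvalue 1
v2 = np.array([-0.5, 0.0, 1.0])   # second eigenvector, not orthogonal to v1

# Gram-Schmidt step: remove the component of v2 along v1.
u2 = v2 - (v2 @ v1) / (v1 @ v1) * v1

print(u2)        # (-1/4, -1/4, 1), a scalar multiple of (1, 1, -4)
print(v1 @ u2)   # 0.0 -- orthogonal to v1
print(A @ u2)    # equals u2, so u2 is still an eigenvector for eigenvalue 1
```

This works because the eigenspace for eigenvalue 1 is a subspace: any linear combination of v1 and v2 is again an eigenvector for 1, so u2 stays in the eigenspace while becoming orthogonal to v1. Scaling u2 by -4 gives exactly (1,1,-4).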
"Properties of the eigenvalue when A is symmetric.
If an eigenvalue [tex]\lambda[/tex] has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."
So I decided to test this property with a few matrices and I encountered one particular matrix that may provide a counterexample to this property.
()^T == column vector
A symmetric matrix: A= (5,4,2),(4,5,2),(2,2,2) with corresponding eigenvalue eigenvector pairs:
(-1,1,0)^T,(-1/2,0,1)^T map to eigenvalue 1
(2,2,1)^T map to eigenvalue 10
eigenvalue 1 has multiplicity 2, the eigenvectors corresponding to 1: (-1,1,0)^T,(-1/2,0,1)^T , their dot product: (-1,1,0)^T (dot) (-1/2,0,1)^T = 1/2, not 0, therefore eigenvectors corresponding to this root are not orthogonal to each other, however they are linearly idependent.
So I asked the author of these notes if i perhaps misinterpreted this property, is the correct interpretation perhaps,
If an eigenvalue has multiplicity k, there will be k (repeated k times), eigenvectors corresponding to this root that are orthogonal to eigenvectors corresponding to different roots?
and the author replied
"Yes, you are right that the two eigenvectors corresponding root one are not orthogonal. But the property says there ARE orthogonal ones. So you can keep looking for orthogonal ones.
How about the pair of (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?"
So just to restate and compare
If an eigenvalue [tex]\lambda[/tex] has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."
but the author says "that the two eigenvectors corresponding root one are not orthogonal"
and then says that the property says "there ARE orthogonal ones. So you can keep looking for orthogonal ones."
Can someone help me make sense of this?
- I found the eigenvectors corresponding to the eigenvalue(root),
- there were two so the eigenvalue has multiplicity 2
- however the 2 eigenvectors (corresponding to) eigenvalue (root) were not orthogonal v1 (dot) v2 != 0
**** But I'm supposed to keep looking for orthogonal ones ?
The property says:
there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."
The author says:
that the two eigenvectors corresponding root one are not orthogonal
(The property says) != (The author says)
but I'm supposed to keep looking for orthogonal ones?? I'm supposed to somehow derive the pair (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?
Does this make sense to anyone? If so, please help me make sense of it. and enlighten me as to how I can derive the pair (1,-1,0) and (1,1,-4). thanks in advance