General Solution for Eigenvalues for a 2x2 Symmetric Matrix

In summary: suppose ##M## is hermitian with eigenvectors ##x_1## and ##x_2## and corresponding eigenvalues ##\lambda_1## and ##\lambda_2##. Then:$$\lambda_1 x_1^* x_2 = x_1^* Mx_2 = x_1^* \lambda_2 x_2 = \lambda_2 x_1^* x_2$$where I've used the fact that ##M## is hermitian (so ##\lambda_1## is real) and that ##x_1## and ##x_2## are eigenvectors. Now subtract the two sides and get:$$(\lambda_1 - \lambda_2) x_1^* x_2 = 0,$$so eigenvectors belonging to distinct eigenvalues are orthogonal.
  • #1
kq6up

Homework Statement

From Mary Boas' "Mathematical Methods in the Physical Sciences", 3rd Edition, Chapter 3, Section 11, Problem 33 (3.11.33).

Find the eigenvalues and the eigenvectors of the real symmetric matrix.

$$M=\begin{pmatrix} A & H \\ H & B \end{pmatrix}$$

Show the eigenvalues are real and the eigenvectors are perpendicular.

Homework Equations

$$D={ C }^{ -1 }MC$$

The Attempt at a Solution

The second part of the problem was easily proven using a variation of the standard proof for hermitian matrices.

The first part produces horrible algebraic messes with the two different ways I have approached this. For example, click the link:

https://www.wolframalpha.com/input/?i=determinant+{{a-x,H},{H,b-x}}=0

Is there an elegant way to find a general solution for the 2x2 symmetric matrix? No spoilers, but hints appreciated.

Thanks,
Chris Maness
 
  • #2
kq6up said:
The first part produces horrible algebraic messes with the two different ways I have approached this. Is there an elegant way to find a general solution for the 2x2 symmetric matrix? No spoilers, but hints appreciated.
We can answer this part of the question without having to calculate the eigenvalues at all:
Show the eigenvalues are real
Indeed, this is true for any real symmetric matrix (or more generally, any hermitian matrix), not just 2x2. Let ##M## be any hermitian matrix, so ##M^* = M##. If ##\lambda## is an eigenvalue and ##x## is a corresponding eigenvector, then by definition ##Mx = \lambda x##. Therefore ##x^* M x = \lambda(x^* x)##. Since ##x^* M x## and ##x^* x## are both real numbers (why?), this means ##\lambda## is also real.

This part is not necessarily true without further assumptions:
and the eigenvectors are perpendicular
For example, if ##A=B=1## and ##H = 0## then ##M## is the identity matrix. Every nonzero vector is an eigenvector of the identity matrix, with eigenvalue equal to 1. But it is possible to choose two orthogonal eigenvectors.

Finally, as for explicitly calculating the eigenvalues in the 2x2 case, did you try simply calculating the determinant of ##M - \lambda I## and setting it equal to zero?
$$\det\left(\begin{matrix}
A - \lambda & H \\
H & B - \lambda \\
\end{matrix}\right) = 0$$
This should give you a quadratic equation, and you know how to solve those...
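For anyone who wants to sanity-check a hand computation, here is a minimal Python sketch (my own illustration, not from the thread, assuming numpy is available) that solves this characteristic quadratic directly and compares against numpy's symmetric eigensolver:

```python
import math
import numpy as np

def symmetric_2x2_eigenvalues(A, H, B):
    """Roots of det(M - t*I) = t^2 - (A+B)*t + (A*B - H^2) = 0."""
    mean = (A + B) / 2.0
    # The discriminant rewrites as (A-B)^2 + 4H^2, a sum of squares,
    # so the square root below is always real.
    radius = math.hypot((A - B) / 2.0, H)
    return mean - radius, mean + radius

A, H, B = 2.0, 1.0, 3.0
print(symmetric_2x2_eigenvalues(A, H, B))
# Cross-check with numpy's solver for symmetric matrices:
print(np.linalg.eigvalsh(np.array([[A, H], [H, B]])))
```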
 
  • #3
kq6up said:

The first part produces horrible algebraic messes with the two different ways I have approached this. For example, click the link:

https://www.wolframalpha.com/input/?i=determinant+{{a-x,H},{H,b-x}}=0

Is there an elegant way to find a general solution for the 2x2 symmetric matrix? No spoilers, but hints appreciated.

Show your work; for all we know, maybe what you have already is about the best way to do the question---or maybe not. How can we tell? Your wolfram link gets the eigenvalues OK, but what about the eigenvectors?
 
  • #4
Yeah, maybe there is no nice way to do it. You could use your relevant equation, ##D = C^{-1}MC##, but I don't think that is super-simple to do by hand either.
 
  • #5
Ok, that is all I wanted to know. I know it could be done in a long, tedious way, but I get the concept, so I don't see too much point in dealing with a huge algebra mess. I am self-studying, so I am the one picking which questions I do. I can almost see a pattern with the simple symmetric matrices, but none of my little rules worked for all of them; I worked at that for some time last night.

Chris Maness
 
  • #6
kq6up said:
Is there an elegant way to find a general solution for the 2x2 symmetric matrix? No spoilers, but hints appreciated.

The only possible problem would be with the radical ##\sqrt{a^2-2ab+b^2+4H^2}##, and that is obviously real (why?).

jbunniii's hint is much more elegant. It works on any NxN real symmetric matrix. Dealing with determinants gets very messy with N=3, and downright ugly for any larger N.
 
  • #7
I thought that is what I did in the wolfram link -- that is, set the determinant of the aforementioned matrix equal to zero. Yes, I worked two 5x5 matrices before, and I swore them off. That is why God made computers :D

Thanks,
Chris Maness
 
  • #8
You didn't. Look at your radicand, ##a^2-2ab+b^2+4H^2##. The ##4H^2## term is obviously positive. What can you say about ##a^2-2ab+b^2##?

Hint: The sum of two non-negative numbers is ... ?
 
  • #9
Yes, real and positive. I got that earlier (stated above), but here it is for fun:

Start with two copies of the same eigenvalue equation:

$$\hat { H } r_{1} = \lambda_{1} r_{1}$$

For the first copy, take the ##\dagger## of both sides. This yields:

$$(\hat { H } r_{1})^{\dagger} = (\lambda_{1} r_{1})^{\dagger},$$

which, because ##\hat{H}## is hermitian (##\hat{H}^{\dagger} = \hat{H}##), is identical to:

$$r_{1}^{\dagger} \hat { H } = \lambda_{1}^{\star} r_{1}^{\dagger}$$

Multiply both sides of this equation on the right by regular ol' ##r_{1}##, and multiply the non-daggered equation on the left by ##r_{1}^{\dagger}##. Both left sides are now ##r_{1}^{\dagger} \hat{H} r_{1}##, and ##r_{1}^{\dagger} r_{1}## is the inner product, a nonzero real/positive number. Subtract the two equations and the left side goes to zero, so you have:

$$0 = \left( \lambda_{1} - \lambda_{1}^{\star} \right) r_{1}^{\dagger} r_{1}$$

Since the inner product is not zero, the lambdas have to be equal. Since one is starred, ##\lambda_{1}## equals its own complex conjugate, so it is wholly real.
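As a quick numerical illustration of this fact (a minimal sketch of my own, assuming numpy; the matrix is randomly generated, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random complex matrix, then symmetrize it into a hermitian one.
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (Z + Z.conj().T) / 2          # hermitian by construction

# Use the *general* eigensolver, which is free to return complex values:
eigenvalues = np.linalg.eigvals(H)
print(eigenvalues)
print(np.max(np.abs(eigenvalues.imag)))  # ~1e-16, i.e. numerically real
```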

Thanks,
Chris Maness
 
  • #10
Regarding the orthogonality of the different eigenvectors, as I noted above it's not automatically true: the identity matrix is a counterexample where every nonzero vector is an eigenvector.

But what is true is that for a real symmetric (or more generally, hermitian) matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal. To see this, suppose that ##\lambda## and ##\mu## are eigenvalues, with corresponding eigenvectors ##x## and ##y##. Assume further that ##\lambda \neq \mu##. Then we have, by definition,
$$Mx = \lambda x \text{ and } My = \mu y$$
Now premultiply the first equation by ##y^*## and the second equation by ##x^*## to obtain
$$y^*Mx = \lambda y^*x \text{ and } x^*My = \mu x^* y$$
Now conjugate the first equation and compare with the second, using the fact that the eigenvalues are real.
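To see the conclusion numerically (a minimal sketch of my own, assuming numpy; the matrix below is just an arbitrary symmetric test case):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])     # arbitrary real symmetric test matrix

# eigh is numpy's solver for symmetric/hermitian matrices: it returns
# real eigenvalues in ascending order, eigenvectors as columns of V.
eigenvalues, V = np.linalg.eigh(M)
x, y = V[:, 0], V[:, 1]

print(eigenvalues)        # two distinct real eigenvalues
print(np.dot(x, y))       # ~0: eigenvectors for distinct eigenvalues are orthogonal
```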
 
  • #11
jbunniii said:
Regarding the orthogonality of the different eigenvectors, as I noted above it's not automatically true: the identity matrix is a counterexample where every nonzero vector is an eigenvector.

That is true, but the only 2x2 symmetric (or hermitian) matrices with two equal eigenvalues are multiples of the identity matrix, which is a rather trivial special case.

Of course bigger hermitian matrices can have equal eigenvalues and have non-zero off-diagonal terms as well.

When a hermitian matrix has repeated eigenvalues, you can always find a complete set of orthogonal eigenvectors (which is a very useful property of the matrix), even though the eigenvectors corresponding to the repeated values are not unique.
 
  • #12
AlephZero said:
That is true, but the only 2x2 symmetric (or hermitian) matrices with two equal eigenvalues are multiples of the identity matrix, which is a rather trivial special case.
True, but this special case was not excluded in the problem statement, as far as I can tell.
When a hermitian matrix has repeated eigenvalues, you can always find a complete set of orthogonal eigenvectors (which is a very useful property of the matrix), even though the eigenvectors corresponding to the repeated values are not unique.
Yes, I assume the intent of the question is that it's always possible to make the eigenvectors orthogonal. This is clearly true in the 2x2 case, since as you said, the only situation where it's not automatic is for multiples of the identity. But the general case is more involved - essentially the spectral theorem.
 
  • #13
Even if we just keep to the 2x2 case, and exclude ##M## from being a multiple of the identity matrix, there is still no 'nice' way to find the eigenvectors or eigenvalues, right? I think this was kq6up's main question. I would also be interested to know if there is a nice way to do it, since I can't think of any, and some elegant method would be super-useful :)

edit: I guess it's not too much work to just compute the determinant to find the eigenvalues, then plug them in to find the eigenvectors. But it's annoying to do all this by hand for a problem that looks like it could have a nice method to solve it.
 
  • #14
I guess it depends what you call "nice". Personally I wouldn't call the quadratic equation you have to solve a "horrible algebraic mess." The form of the equation ##x^2 - (a+b)x + ab - h^2 = 0## gives you some insight into what is going on. For example, the sum of the roots is ##a+b##, independent of ##h##; that is an example of a general result for any size of matrix (the sum of the eigenvalues equals the trace). An eigenvalue is zero only if ##ab - h^2 = 0##, which is what you would expect, since ##ab - h^2## is the determinant of the matrix.

I think it's optimistic to expect any "nice" method here. If you know all the eigenvalues and vectors of a matrix, then any calculation involving the matrix is simple, because you can easily diagonalize the matrix. So you shouldn't expect a "free lunch" from finding a simple way to solve the eigenproblem! Any general method of finding all the eigenvalues and vectors for an ##n\times n## matrix requires on the order of ##n^3## arithmetic operations.
 
  • #15
BruceW said:
edit: I guess it's not too much work to just compute the determinant to find the eigenvalues, then plug them in to find the eigenvectors. But it's annoying to do all this by hand for a problem that looks like it could have a nice method to solve it.
Probably the only truly "nice" case is 1x1. :biggrin:

Things become rapidly more complicated for 3x3 and 4x4, since the formulas for solving cubic and quartic equations are nasty. For 5x5 and above, there is no general formula at all, and it becomes a rather hard problem in numerical analysis. Here is a dense 680-page classic treatise on the subject, and it's old enough that it is nowhere near the current state of the art: The Algebraic Eigenvalue Problem by Wilkinson. The symmetric case gets its own treatment at a mere 416 pages: The Symmetric Eigenvalue Problem by Parlett.
 
  • #16
You don't have to read the whole of Wilkinson or Parlett to get started. This is a perfectly good method for "small" symmetric matrices (i.e. less than about 100 x 100): http://en.wikipedia.org/wiki/Jacobi_eigenvalue_algorithm

Unlike some methods that are faster, it guarantees the calculated eigenvectors are orthogonal, even when there are closely spaced eigenvalues. That property means it is still used as part of more complicated algorithms.

If you work through the algebra, it gives an alternative method of solving the 2x2 case (no iterations are required for a 2x2 matrix). The solution again involves a square root, which is no surprise, because the right answer is independent of how you do the algebra.
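Here is what that algebra looks like in code, a minimal Python sketch of my own (assuming numpy) of the single Jacobi rotation that diagonalizes the 2x2 symmetric case in one step; the angle is the standard Jacobi choice that zeroes the off-diagonal entry:

```python
import math
import numpy as np

def jacobi_2x2(A, H, B):
    # Choose theta so the off-diagonal entry of R M R^T vanishes:
    # tan(2*theta) = 2H / (A - B).
    theta = 0.5 * math.atan2(2.0 * H, A - B)
    c, s = math.cos(theta), math.sin(theta)
    R = np.array([[c, s],
                  [-s, c]])       # rows of R are orthonormal eigenvectors
    M = np.array([[A, H],
                  [H, B]])
    D = R @ M @ R.T               # diagonal up to rounding error
    return np.diag(D), R

eigenvalues, R = jacobi_2x2(2.0, 1.0, 3.0)
print(eigenvalues)   # note: not necessarily in ascending order
print(np.linalg.eigvalsh(np.array([[2.0, 1.0], [1.0, 3.0]])))  # ascending
```

As expected, the result again involves a square root (hidden inside the atan2/cos/sin), matching the quadratic-formula answer.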
 
  • #17
AlephZero said:
I think it's optimistic to expect any "nice" method here. If you know all the eigenvalues and vectors of a matrix, then any calculation involving the matrix is simple, because you can easily diagonalize the matrix. So you shouldn't expect a "free lunch" from finding a simple way to solve the eigenproblem! Any general method of finding all the eigenvalues and vectors for an ##n\times n## matrix requires on the order of ##n^3## arithmetic operations.
It's not possible to diagonalize every general square matrix, mind you. But yeah, it is pretty optimistic to expect a nice solution even for symmetric matrices. I was hoping that the 2x2 matrix would be a special case with a nice way to get the answer. But I guess not. I'm always looking for that free lunch, anyway :)
 
  • #18
Yes, I have noticed in working problems in Boas that if I get even a relatively ugly answer, I am usually missing the point; she gives many problems with clean/elegant solutions. I don't have a solution manual for the third edition (if one even exists, since she died shortly after its publication date). Chapter 3 ends earlier in the 2nd edition, but it mostly matches the problems in the third edition; the third just adds more sections at the end of chapter 3.

Regards,
Chris Maness
 

Related to General Solution for Eigenvalues for a 2x2 Symmetric Matrix

1. What is a symmetric matrix?

A symmetric matrix is a square matrix whose elements above and below the main diagonal are mirror images of each other, i.e. ##M_{ij} = M_{ji}##. In other words, the matrix is equal to its own transpose.

2. What is an eigenvalue?

An eigenvalue is a scalar (single value), often denoted by the Greek letter lambda (λ), associated with a matrix: it is the factor by which the matrix stretches a corresponding nonzero vector (its eigenvector), so that Mx = λx.

3. Why is finding eigenvalues important?

Finding eigenvalues allows us to understand how a matrix affects a vector in a given direction. This is useful in many applications, including physics, engineering, and computer graphics.

4. How do you find the general solution for eigenvalues of a 2x2 symmetric matrix?

The general solution for eigenvalues of a 2x2 symmetric matrix can be found by solving the characteristic equation, which is given by det(A-λI)=0, where A is the matrix and λ is the eigenvalue. This will result in a quadratic equation, which can be solved using the quadratic formula. The resulting eigenvalues will be the general solution.
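As a concrete worked example (an arbitrary matrix of my choosing, not from the problem):

$$M=\begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}, \qquad \det(M-\lambda I)=\lambda^2-5\lambda+5=0, \qquad \lambda=\frac{5\pm\sqrt{5}}{2}$$

Both roots are real, as the theory guarantees.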

5. Can a 2x2 symmetric matrix have complex eigenvalues?

No, not if the matrix is real. As the thread shows, the eigenvalues of a real symmetric matrix are always real: the discriminant of the characteristic quadratic is ##(A-B)^2+4H^2##, a sum of squares, which can never be negative. (A complex symmetric matrix can have complex eigenvalues; the complex matrices with guaranteed real eigenvalues are the hermitian ones, such as those used in quantum mechanics.)
