Eigenvalues of a symmetric operator

In summary, the spectrum of any bounded symmetric operator is real, and in particular all of its eigenvalues are real. However, some linear operators have no eigenvalues at all: over the real numbers, the 2D rotation matrix is one example (and it is not symmetric or Hermitian in any case). While every square matrix of size n has exactly n eigenvalues (counted with multiplicity) once complex numbers are allowed, this guarantee does not extend to operators on infinite-dimensional Hilbert spaces. Symmetric operators on infinite-dimensional spaces may have no eigenvalues, as exemplified by the operator of multiplication by a strictly increasing, bounded function.
  • #1
psholtz
I'm reading from Wikipedia:
The spectrum of any bounded symmetric operator is real; in particular all its eigenvalues are real, although a symmetric operator may have no eigenvalues.

http://en.wikipedia.org/wiki/Self-adjoint_operator

I thought linear operators always had eigenvalues, since you could always form a characteristic equation for the corresponding matrix and solve it?

Is that not the case? Are there linear operators that don't have eigenvalues?
 
  • #2
A 2D rotation matrix has no real eigenvalues. You can write down the equation, but quickly find there are no (real) solutions.
 
  • #3
That's true, but a 2D rotation matrix still has eigenvalues; they just aren't real. Over the complex numbers they certainly exist.

Moreover, the 2D rotation matrix isn't symmetric/Hermitian. It's usually of the form:

[tex] T = \left(\begin{array}{cc} \cos\phi & \sin\phi \\ -\sin\phi & \cos\phi \end{array} \right) [/tex]

which is not symmetric/Hermitian.
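For what it's worth, this is easy to check numerically. A minimal sketch using NumPy (the angle [itex]\phi = \pi/3[/itex] is just an arbitrary choice):

```python
import numpy as np

phi = np.pi / 3  # arbitrary rotation angle
T = np.array([[np.cos(phi),  np.sin(phi)],
              [-np.sin(phi), np.cos(phi)]])

# Over the complex numbers the eigenvalues do exist: they are
# exp(i*phi) and exp(-i*phi), a conjugate pair on the unit circle.
eigs = np.linalg.eigvals(T)
print(eigs)
print(np.allclose(np.abs(eigs), 1.0))  # both have modulus 1 -> True
```

The characteristic polynomial is [itex]\lambda^2 - 2\lambda\cos\phi + 1 = 0[/itex], whose roots [itex]\cos\phi \pm i\sin\phi[/itex] are real only when [itex]\sin\phi = 0[/itex].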
 
  • #4
Reading more from Wikipedia:
It follows that we can compute all the eigenvalues of a matrix A by solving the equation pA(λ) = 0. If A is an n-by-n matrix, then pA has degree n and A can therefore have at most n eigenvalues. Conversely, the fundamental theorem of algebra says that this equation has exactly n roots (zeroes), counted with multiplicity.

http://en.wikipedia.org/wiki/Eigenvalue_algorithm

To me, it would seem that there must be n roots (counting multiplicities) for the characteristic polynomial for every square matrix of size n. In other words, every square matrix of size n must have n eigenvalues (counting multiplicities, i.e., eigenvalues are possibly non-distinct).

The only way I can reconcile this with the statement above, that "symmetric operators may have no eigenvalues," is if the symmetric operator so described is not square? Is it possible to have a non-square matrix and call it "symmetric" if only the square part of it is symmetric?
 
  • #5
Complex numbers are not always automatically assumed. If you are working in a real vector space, the rotation matrix is a simple example of a linear operator with no eigenvalues (and hence no eigenvectors).
 
  • #6
Do you think that's what they were getting at in the Wikipedia article?

If we suppose the existence of complex numbers, or allow them at any rate, is it safe to say that a square matrix of size n will always have n eigenvalues (counting multiplicities)?
 
  • #7
psholtz said:
Do you think that's what they were getting at in the Wikipedia article?

If we suppose the existence of complex numbers, or allow them at any rate, is it safe to say that a square matrix of size n will always have n eigenvalues (counting multiplicities)?

Yes, and yes. (At least, I think so!)
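A quick numerical sanity check of that claim, sketched with NumPy (the random 4x4 matrix is just an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # an arbitrary real 4x4 matrix

# Allowing complex numbers, the characteristic polynomial has exactly
# n = 4 roots counted with multiplicity, so A has exactly 4 eigenvalues.
eigs = np.linalg.eigvals(A)
print(len(eigs))  # 4

# Sanity check: those eigenvalues are the roots of det(A - lambda*I).
char_poly = np.poly(A)  # coefficients of the characteristic polynomial of A
roots = np.roots(char_poly)
print(np.allclose(np.sort_complex(roots), np.sort_complex(eigs)))
```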
 
  • #8
Nonono, this is not what the article is trying to say at all!

You are completely correct that a symmetric matrix will always have eigenvalues. But the article isn't talking about matrices, it's talking about operators: that is, bounded linear maps on a possibly infinite-dimensional Hilbert space.

Matrices correspond to operators on a finite-dimensional Hilbert space. What the article is saying is that operators on an infinite-dimensional Hilbert space need not have eigenvalues.

The standard example: take a strictly increasing, bounded function [itex]u:[a,b]\rightarrow \mathbb{R}[/itex]. Then the operator

[tex]T:L^2([a,b])\rightarrow L^2([a,b]):f\mapsto uf[/tex]

is called the multiplication operator. This is a Hermitian/symmetric operator without eigenvalues. Indeed, suppose T(f)=cf for a complex number c and a nonzero [itex]f\in L^2([a,b])[/itex]. Then uf=cf, so [itex]u(x)=c[/itex] wherever [itex]f(x)\neq 0[/itex]. Since f is nonzero in [itex]L^2[/itex], this happens for more than one value of x, so u takes the value c at multiple points, which contradicts that u is strictly increasing.

So the point the Wikipedia article is getting at is that symmetric operators on infinite-dimensional Hilbert spaces do not necessarily have eigenvalues. Symmetric operators on finite-dimensional spaces, however, always have eigenvalues, since they are just symmetric matrices.
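One way to see what goes wrong in the limit is to discretize the multiplication operator. This NumPy sketch (with the hypothetical choice u(x) = x on [0,1] and an 8-point grid) shows that the finite truncations do have eigenvalues, but their eigenvectors degenerate into single-point spikes:

```python
import numpy as np

# Discretize the multiplication operator (Tf)(x) = x*f(x) on [0,1]
# (hypothetical choice u(x) = x): on an n-point grid it becomes the
# diagonal matrix diag(x_1, ..., x_n).
n = 8
x = np.linspace(0.0, 1.0, n)
T = np.diag(x)

vals, vecs = np.linalg.eigh(T)  # T is real symmetric, so eigh applies

# The finite truncation does have eigenvalues -- namely u(x_i) = x_i...
print(np.allclose(vals, x))  # True

# ...but every eigenvector is a spike supported on a single grid point.
# Refining the grid pushes these toward Dirac deltas, which are not L^2
# functions: the full operator has spectrum [0,1] but no eigenvectors.
print(np.count_nonzero(np.abs(vecs[:, 0]) > 1e-12))  # 1
```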
 
  • #9
Thanks for that, micromass. A very instructive example and discussion! I was wrong, and I learned something.
 
  • #10
Thanks, micromass... excellent explanation.
 

Related to Eigenvalues of a symmetric operator

1. What are eigenvalues of a symmetric operator?

Eigenvalues of a symmetric operator are the scalars λ for which the operator sends some nonzero vector to λ times that same vector. For a symmetric operator these values are always real, and they carry important information about the behavior of the system the operator describes.

2. How are eigenvalues of a symmetric operator calculated?

The eigenvalues of a symmetric operator can be calculated by solving the characteristic equation det(A − λI) = 0, where A is the operator's matrix and I is the identity. The roots of this polynomial are the eigenvalues.

3. What is the significance of eigenvalues of a symmetric operator in linear algebra?

Eigenvalues of a symmetric operator play a crucial role in linear algebra as they provide information about the behavior and stability of a system. They also help in understanding the transformation that the operator performs on a vector and can be used to find the eigenvectors of the operator.

4. Can the eigenvalues of a symmetric operator be negative?

Yes, the eigenvalues of a symmetric operator can be negative; a symmetric operator can have both positive and negative eigenvalues, depending on the system it represents. For example, the diagonal matrix diag(1, −1) is symmetric and has eigenvalues 1 and −1.

5. How do eigenvalues of a symmetric operator relate to its eigenvectors?

The eigenvalues and eigenvectors of a symmetric operator are closely related: each eigenvalue is the scaling factor the operator applies to its corresponding eigenvector, a vector whose direction is unchanged by the transformation. When the operator is a covariance matrix, the eigenvectors corresponding to the largest eigenvalues are known as the principal components of the data.
