What's the utility of the eigenvectors of a matrix?

In summary, the eigenvectors of a matrix have many applications in physics: they put matrices into a simple (diagonal) form, simplify the solution of systems of equations, and can supply a basis of eigenvectors for the space a matrix acts on. In quantum mechanics they yield the quantum numbers of a system and underpin the operator language in which modern physical theories are formulated. They also appear in rotation and symmetry operations and, through Noether's theorem, in the connection between symmetries and conserved quantities. In general, eigenvectors are central to representing and understanding a wide range of physical properties and systems.
  • #1
meteor
What's the utility of the eigenvectors of a matrix?
I know it has something to do with quantum mechanics
 
  • #2
Then you know wrong.


It is true that, in one form of Quantum Mechanics, we can represent "operators" by (infinite dimensional) matrices and then all properties of matrices and linear operators are important.


One point of the "eigenvalue, eigenvector" problem is this:

If we can find a "complete set of eigenvectors" (a basis for the vector space consisting entirely of eigenvectors) then the matrix has a very simple form: it is diagonal with the eigenvalues on the diagonal.

For example, consider the matrix

    A = [ 0  1 ]
        [-2  3 ]

Its eigenvalues are 1 and 2: the vector [1, 1] is an eigenvector with eigenvalue 1, and the vector [1, 2] is an eigenvector with eigenvalue 2.

Let P be the matrix with columns formed by those eigenvectors:

    P = [ 1  1 ]
        [ 1  2 ]

Its inverse matrix is

    P^(-1) = [ 2  -1 ]
             [-1   1 ]

Now you can calculate yourself that

    P^(-1) A P = [ 1  0 ]      and so      P [ 1  0 ] P^(-1) = A.
                 [ 0  2 ]                    [ 0  2 ]

Solving equations and similar problems connected with A can therefore be reduced to problems with the diagonal matrix diag(1, 2), which is much simpler.
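
Here is a minimal numerical check of that computation, sketched in Python with NumPy (the tool is my choice, not the original poster's):

```python
import numpy as np

# The matrix from the example above.
A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])

# Columns of P are the eigenvectors [1, 1] and [1, 2].
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])

D = np.linalg.inv(P) @ A @ P
print(D)  # [[1. 0.], [0. 2.]] -> diag(1, 2)

# Powers of A become trivial in the eigenbasis: A^10 = P D^10 P^(-1).
A10 = P @ np.diag(np.diag(D) ** 10) @ np.linalg.inv(P)
print(np.allclose(A10, np.linalg.matrix_power(A, 10)))  # True
```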
 
  • #3
Eigenvectors can also be used to find a basis (an eigenbasis) for the space a matrix acts on.

Fun stuff :smile:
 
  • #4


Originally posted by meteor
What's the utility of the eigenvectors of a matrix?
I know it has something to do with quantum mechanics

Start by thinking of a simple vector in 2-D. In general, it has two components. However, there is always a coordinate system in which one of the coordinates is zero (the system in which one axis is coincident with the vector).

Now think of a 3x3 matrix as representing something more complicated than a vector (it's called a tensor but that doesn't matter here). In general, the matrix will have 9 components (in 3-D), three of which are on the 'main diagonal' (top left to bottom right) and the other six of which are not. Again, there is always a coordinate system in which the 6 'off-diagonal' components are zero. In this system, the three components on the diagonal are the eigenvalues.

They have a whole range of applications. In vibration analysis, the eigenvalues are the natural frequencies (well, 'squared' but who cares), in buckling analysis the eigenvalues are the critical load factors (and the eigenvectors the buckled shapes), in stress analysis the eigenvalues are the 'principal stresses' that control fracture, and so on.
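
As a concrete sketch of that last point in Python/NumPy (the stress values are invented for the example): diagonalizing a symmetric stress tensor yields the principal stresses and principal directions.

```python
import numpy as np

# A made-up symmetric 3x3 stress tensor (units: MPa); symmetric matrices
# are always diagonalizable by an orthogonal change of coordinates.
sigma = np.array([[50.0, 30.0,  0.0],
                  [30.0, -20.0, 0.0],
                  [ 0.0,  0.0, 10.0]])

# eigh is for symmetric/Hermitian matrices; eigenvalues come out ascending.
principal_stresses, principal_dirs = np.linalg.eigh(sigma)
print(principal_stresses)  # the three principal stresses
print(principal_dirs)      # columns: the principal directions

# In the principal coordinate system the off-diagonal (shear) terms vanish.
print(np.round(principal_dirs.T @ sigma @ principal_dirs, 10))
```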

More abstractly, multiplying a vector by a matrix generally causes both rotation and dilation of the vector. However, for a given matrix there can be certain vectors that simply dilate, without rotating, i.e.:

    A x = m x

where A is the matrix, x is such a vector (an eigenvector) and m is a scalar called the eigenvalue.
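
A quick check of that distinction, sketched in Python/NumPy (the symmetric matrix A here is my own example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, 1.0])  # an eigenvector of A (eigenvalue 3)
y = np.array([1.0, 0.0])  # a generic vector

print(A @ x)  # [3. 3.] -> same direction as x, dilated by m = 3
print(A @ y)  # [2. 1.] -> points in a different direction than y
```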
 
  • #5
The eigenvectors of the operators in QM allow you to obtain the quantum numbers of the system. Consider, for example, the group SU(2): the eigenvalues of its Casimir operator on the irreducible representations give you the quantum number. This elementary example led to the operator language in which modern physical theories are formulated.
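
As a hedged sketch of that idea in Python/NumPy (spin-1/2 with ħ = 1, an example of my choosing): the Casimir operator S² of SU(2) has eigenvalue s(s+1), from which the spin quantum number s can be read off.

```python
import numpy as np

# Spin-1/2 generators S_i = sigma_i / 2 (hbar = 1).
Sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
Sy = np.array([[0, -1j], [1j, 0]]) / 2
Sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

# Casimir operator S^2 = Sx^2 + Sy^2 + Sz^2.
S2 = Sx @ Sx + Sy @ Sy + Sz @ Sz
print(np.round(S2.real, 10))  # (3/4) * identity

# Its eigenvalue s(s+1) = 3/4 gives the quantum number s = 1/2.
eigval = np.linalg.eigvalsh(S2)[0]
s = (-1 + np.sqrt(1 + 4 * eigval)) / 2
print(s)  # 0.5
```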
 
  • #6
Another simple example: if you have a rotation in space, then each vector along the axis is an eigenvector (with eigenvalue 1), since it's not affected by the rotation.
Related to that, there is Noether's theorem, which IIRC states that each continuous symmetry generates a conserved quantity. E.g. if a system is invariant under time translations, it has constant energy.
(Just a rough sketch of what it has to do with physics. Some may say I'm totally wrong here...)
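
A quick numerical illustration of the rotation-axis remark, sketched in NumPy (the angle is arbitrary):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle about the z-axis
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0,              0,             1]])

axis = np.array([0.0, 0.0, 1.0])
print(Rz @ axis)  # [0. 0. 1.] -> the axis is an eigenvector, eigenvalue 1

# The other two eigenvalues are complex, e^{+/- i*theta}: no other real
# direction is left unchanged by the rotation.
print(np.linalg.eigvals(Rz))
```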
 
  • #7


Originally posted by rdt2
Now think of a 3x3 matrix as representing something more complicated than a vector (it's called a tensor but that doesn't matter here). In general, the matrix will have 9 components (in 3-D), three of which are on the 'main diagonal' (top left to bottom right) and the other six of which are not. Again, there is always a coordinate system in which the 6 'off-diagonal' components are zero. In this system, the three components on the diagonal are the eigenvalues.

This is not true! Not all matrices are diagonalizable!
A restriction to symmetric matrices would be more appropriate here as an illustration of the physical properties of eigenvalues...
 
  • #8


Originally posted by dg
This is not true! Not all matrices are diagonalizable!
A restriction to symmetric matrices would be more appropriate here as an illustration of the physical properties of eigenvalues...

Point taken! I seldom have to deal with non-symmetric matrices.

ron.
 
  • #9
Any normal n×n matrix (one satisfying A A* = A* A) is diagonalisable, and by a unitary change of basis at that.
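
For instance (a NumPy sketch with a matrix I picked): a 2-D rotation matrix is normal but not symmetric, and it is unitarily diagonalisable with complex eigenvalues e^(±iθ).

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Normality check: R R^T == R^T R (R is real, so R^* = R^T).
print(np.allclose(R @ R.T, R.T @ R))  # True

w, V = np.linalg.eig(R)
print(w)  # e^{+i theta}, e^{-i theta}

# For a normal matrix the eigenvectors can be chosen orthonormal, so V is
# (numerically) unitary and V^* R V is diagonal.
print(np.allclose(V.conj().T @ V, np.eye(2)))  # True
print(np.round(V.conj().T @ R @ V, 10))        # diagonal matrix of eigenvalues
```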
 
  • #10
In QM:
The eigenvectors of the matrix representing an observable give the pure states for that observable. As the set of states is convex, every state can then be obtained as a convex mixture of those eigenstates.
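
A small sketch of that statement in Python/NumPy (the observable here is a toy Hermitian matrix I invented):

```python
import numpy as np

# A toy observable (Hermitian matrix).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

w, V = np.linalg.eigh(H)
# Each eigenvector |v_k> defines a pure state via the projector |v_k><v_k|.
projectors = [np.outer(V[:, k], V[:, k].conj()) for k in range(2)]

# A convex mixture of the pure states is a valid density matrix.
rho = 0.3 * projectors[0] + 0.7 * projectors[1]
print(np.trace(rho))                    # 1.0 -> properly normalized state
print(np.allclose(rho, rho.conj().T))   # True -> Hermitian
```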
 

What are eigenvectors and why are they important?

Eigenvectors are special vectors that a matrix merely scales rather than rotates: their direction (up to sign) is unchanged when the matrix is applied. They are important because they help us understand the behavior of a matrix and of the transformation it represents.

How are eigenvectors used in data analysis?

Eigenvectors are used in data analysis to reduce the dimensionality of a dataset and extract important features, most notably via principal component analysis (PCA). They can also be used for clustering and classification tasks.
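
A minimal PCA-style sketch in Python/NumPy (the data are randomly generated for illustration): the eigenvectors of the covariance matrix give the principal directions, and projecting onto the leading ones reduces the dimensionality.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # 200 samples, 3 features (toy data)
X -= X.mean(axis=0)            # center the data

cov = X.T @ X / (len(X) - 1)            # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # symmetric -> use eigh

# Keep the two eigenvectors with the largest eigenvalues (most variance).
top2 = eigvecs[:, np.argsort(eigvals)[::-1][:2]]
X_reduced = X @ top2  # 3-D data projected to 2-D
print(X_reduced.shape)  # (200, 2)
```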

What is the significance of the eigenvalues associated with eigenvectors?

The eigenvalues associated with eigenvectors represent the amount of stretching or shrinking that occurs along the direction of the eigenvector when the matrix acts on it. In data analysis they also indicate the variance captured by, and hence the importance of, the corresponding eigenvector in the transformation.

Can eigenvectors be negative?

Yes, the components of an eigenvector can be negative. Moreover, any nonzero scalar multiple of an eigenvector (including its negative) is still an eigenvector with the same eigenvalue, so the overall sign is just a convention.

How do eigenvectors relate to linear independence?

Eigenvectors corresponding to distinct eigenvalues are linearly independent. When a matrix has a full set of linearly independent eigenvectors, they form a basis in which the matrix is diagonal, which is useful for solving systems of linear equations and understanding the matrix's structure.
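
A small NumPy check of both halves of that answer (both matrices are examples of mine; the second echoes the point from post #7 that not every matrix is diagonalizable):

```python
import numpy as np

# Distinct eigenvalues -> independent eigenvectors (the matrix from post #2).
A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])
w, V = np.linalg.eig(A)
print(np.linalg.matrix_rank(V))  # 2: the eigenvectors span the plane

# Defective case: a shear matrix has eigenvalue 1 twice but only one
# independent eigenvector, so it cannot be diagonalized.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
w, V = np.linalg.eig(S)
print(np.linalg.matrix_rank(V))  # 1: no basis of eigenvectors exists
```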
