Eigenvalues and Eigenvectors

In summary, eigenvalues and eigenvectors allow matrices to be diagonalized, which simplifies calculations. Distinct eigenvalues are not necessary for diagonalization, but a full set of linearly independent eigenvectors is. Eigenvectors are the vectors on which the linear transformation acts like scalar multiplication. Determinants can be used to find eigenvalues via the characteristic equation det(A - λI) = 0, which also connects to the Kronecker delta in tensor calculus. A matrix is diagonalizable if and only if its characteristic polynomial splits into linear factors and, for each root c, the kernel of A - cI has dimension equal to the multiplicity of that root.
  • #1
shankar
Can anyone explain the real meaning and purpose of eigenvalues and eigenvectors?

 
  • #2
You know how easy it is to work with diagonal matrices, right?

Consider the fact that (nearly) every square matrix can, after a suitable change of basis, be written as a diagonal matrix whose entries are simply its eigenvalues.

So in one sense, using eigenvalues and eigenvectors lets you treat (nearly) any matrix like a diagonal matrix, making the work easier.
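To make that concrete, here is a minimal numerical sketch in Python with NumPy (the matrix A is a made-up example): diagonalize A as P D P^{-1}, then use the eigenbasis to compute a matrix power cheaply.

import numpy as np

# A made-up 2x2 matrix with eigenvalues 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; D holds the eigenvalues on its diagonal
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# After the change of basis, A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True

# Powers are easy in the eigenbasis: A^10 = P D^10 P^{-1}
A10 = P @ np.diag(eigenvalues**10) @ np.linalg.inv(P)
print(np.allclose(A10, np.linalg.matrix_power(A, 10)))  # True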
 
  • #3
This works provided the matrix has unique eigenvalues, right? Or do we need a full set of linearly independent eigenvectors?
 
  • #4
Actually, in order for a matrix to be diagonalizable, it is NOT necessary that all the eigenvalues be unique. It IS necessary that all the eigenvectors be independent; that is, that there exist a basis for the vector space consisting of eigenvectors.
Essentially, the eigenvectors are those vectors on which the linear transformation acts like simple scalar multiplication.
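To see "acts like scalar multiplication" concretely, here is a minimal check in NumPy (the matrix and vectors are made-up examples):

import numpy as np

# A made-up upper-triangular matrix with eigenvalues 3 and 2
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

v = np.array([1.0, 0.0])   # eigenvector for eigenvalue 3
w = np.array([1.0, -1.0])  # eigenvector for eigenvalue 2
u = np.array([1.0, 1.0])   # not an eigenvector

print(np.allclose(A @ v, 3 * v))  # True: A just scales v by 3
print(np.allclose(A @ w, 2 * w))  # True: A just scales w by 2
print(A @ u)                      # [4. 2.]: not a scalar multiple of u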
 
  • #5
If you have an n x n matrix A, then a non-zero vector x in R^n is an eigenvector of A if Ax is a scalar multiple of x. This scalar is called an eigenvalue of A.

Q) So multiplying an eigenvector by the matrix only 'stretches' or 'compresses' it by a factor of the eigenvalue? And how can we use determinants to find the eigenvalues of a given matrix?

(off-topic) Has this got anything to do with the Kronecker Delta in Tensor Calculus?
 
  • #6
Originally posted by Oxymoron
(off-topic) Has this got anything to do with the Kronecker Delta in Tensor Calculus?

Yes, it does.

When calculating the eigenvalues {λ_n} of a matrix A, you have to solve the equation

det(A - λI) = 0.

If we rewrite that in terms of matrix elements (in other words, with indices), we can write

det(A_ij - λI_ij) = 0,

where the identity matrix I_ij is none other than the Kronecker delta, δ_ij.
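For a worked illustration of solving det(A - λI) = 0, here is a symbolic sketch in Python with SymPy (the matrix is a made-up example):

import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 1],
               [2, 3]])

# Characteristic polynomial det(A - lambda*I)
char_poly = (A - lam * sp.eye(2)).det()
print(sp.expand(char_poly))      # lambda**2 - 7*lambda + 10
print(sp.solve(char_poly, lam))  # [2, 5]

# SymPy can also return eigenvalue -> multiplicity directly
print(A.eigenvals())             # {2: 1, 5: 1}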
 
  • #7
If we find the eigenvectors of a matrix A, will they be orthogonal to each other?
 
  • #8
An n by n matrix M is diagonalizable if and only if the space R^n has a basis of eigenvectors of M; equivalently, the minimal polynomial P of M is a product of distinct linear factors; equivalently, the characteristic polynomial Q splits into a product of linear factors and, for each root c of Q, the kernel of M - cI has dimension equal to the power with which the factor (X - c) occurs in Q.

see http://www.math.uga.edu/~roy/
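These equivalent conditions can be checked mechanically. Here is a sketch in SymPy contrasting a diagonalizable matrix with a defective one (both matrices are made-up examples):

import sympy as sp

# Defective: the characteristic polynomial (x - 1)^2 splits,
# but the kernel of M - 1*I is only one-dimensional
M = sp.Matrix([[1, 1],
               [0, 1]])
print(M.is_diagonalizable())        # False
print((M - sp.eye(2)).nullspace())  # a single basis vector

# Distinct roots guarantee a full eigenbasis
N = sp.Matrix([[1, 1],
               [0, 2]])
print(N.is_diagonalizable())        # True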
 

1. What are Eigenvalues and Eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra that are used to understand linear transformations. An eigenvector is a non-zero vector that, when multiplied by a transformation matrix, results in a scalar multiple of itself. That scalar is the corresponding eigenvalue, and it represents how much the eigenvector is stretched or compressed by the transformation.

2. How do you find Eigenvalues and Eigenvectors?

To find eigenvalues and eigenvectors, you must first create a square matrix representing the linear transformation. Then, you can use the characteristic equation (det(A - λI) = 0) to find the eigenvalues, where A is the transformation matrix and λ is the eigenvalue. Once the eigenvalues are found, you can use them to solve for the corresponding eigenvectors using the equation (A - λI)x = 0, where x is the eigenvector.
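Here is a short sketch of that two-step procedure in SymPy (the matrix is a made-up example): solve the characteristic equation for the eigenvalues, then find the null space of A - λI for each one.

import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 1],
               [2, 3]])

# Step 1: eigenvalues from det(A - lambda*I) = 0
eigenvalues = sp.solve((A - lam * sp.eye(2)).det(), lam)  # [2, 5]

# Step 2: for each eigenvalue, solve (A - lambda*I) x = 0
for ev in eigenvalues:
    basis = (A - ev * sp.eye(2)).nullspace()
    print(ev, [list(v) for v in basis])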

3. What is the significance of Eigenvalues and Eigenvectors?

Eigenvalues and eigenvectors are significant because they provide insight into how a linear transformation affects a given vector. They can also be used to simplify complex calculations and make it easier to understand the behavior of a system. Additionally, they are used in many real-world applications, such as image processing and data compression.

4. Can a matrix have more than one Eigenvalue and Eigenvector?

Yes, a matrix can have multiple eigenvalues and corresponding eigenvectors. In fact, an n x n matrix can have up to n distinct eigenvalues, and a repeated eigenvalue can have several linearly independent eigenvectors associated with it.
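For instance (a made-up 3x3 example in NumPy), a repeated eigenvalue can carry a whole plane of eigenvectors:

import numpy as np

# Eigenvalue 2 appears twice, eigenvalue 5 once
A = np.diag([2.0, 2.0, 5.0])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # [2. 2. 5.]

# Both of the first two eigenvectors satisfy A v = 2 v,
# so the eigenspace for eigenvalue 2 is two-dimensional
for i in range(2):
    v = eigenvectors[:, i]
    print(np.allclose(A @ v, 2 * v))  # True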

5. How are Eigenvalues and Eigenvectors used in data analysis?

Eigenvalues and eigenvectors are commonly used in data analysis to reduce the dimensionality of a dataset. This means that instead of representing data with a large number of variables, it can be represented with a smaller set of variables that capture the most important information. This is known as principal component analysis and is often used in fields such as machine learning and data mining.
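A minimal PCA sketch along those lines in NumPy (the dataset is hypothetical random data): the eigenvectors of the covariance matrix give the principal directions, and the eigenvalues rank them by how much variance they capture.

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dataset: 200 samples, 3 features, two of them correlated
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + 0.1 * rng.normal(size=200)

# Center the data and form the covariance matrix
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# Covariance matrices are symmetric, so eigh applies
# (it returns eigenvalues in ascending order)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Keep the two directions with the largest eigenvalues
top2 = eigenvectors[:, -2:]
X_reduced = Xc @ top2  # 200 x 2 representation of the data
print(eigenvalues)     # variance captured along each principal direction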
