Boorglar
In my last Linear Algebra class we covered eigenvalues and diagonalization. It turns out that an n x n matrix is diagonalizable if and only if it has n linearly independent eigenvectors.
If the characteristic equation for the matrix is [itex] (λ - λ_1)^{e_1}(λ - λ_2)^{e_2}\cdots(λ - λ_k)^{e_k} = 0 [/itex], then 1) eigenvectors corresponding to different eigenvalues are linearly independent, and 2) the dimension of the eigenspace corresponding to [itex]λ_i[/itex] is at most [itex]e_i[/itex]. That is, the algebraic multiplicity of an eigenvalue is greater than or equal to its geometric multiplicity.
But is there a general criterion for telling when the algebraic and geometric multiplicities are equal? Given a concrete matrix, I can use Gaussian elimination to solve for the eigenvectors, but is there a more general way of deciding this simply by looking at the form of the matrix?
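For what it's worth, for a concrete matrix the two multiplicities can be compared directly by computer algebra rather than by hand elimination. A small sketch using SymPy (assuming it is available), with the classic Jordan-block counterexample where the multiplicities differ:

```python
from sympy import Matrix

# A 2x2 Jordan block: eigenvalue 2 has algebraic multiplicity 2,
# but its eigenspace is only one-dimensional, so it is not diagonalizable.
M = Matrix([[2, 1],
            [0, 2]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for val, alg_mult, basis in M.eigenvects():
    geo_mult = len(basis)  # dimension of the eigenspace
    print(val, alg_mult, geo_mult)  # 2 2 1

print(M.is_diagonalizable())  # False, since 1 < 2
```

This only decides the question for one matrix at a time, of course, not by inspection of the matrix's form.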
EDIT: If I remember correctly, there was a theorem about real symmetric matrices, but I've forgotten the details.
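The result about real symmetric matrices is presumably the spectral theorem: a real symmetric matrix is always orthogonally diagonalizable, so its algebraic and geometric multiplicities agree. A quick numerical check of that claim with NumPy (a sketch, not a proof):

```python
import numpy as np

# A real symmetric matrix is orthogonally diagonalizable: A = Q D Q^T
# with Q orthogonal and D diagonal (the spectral theorem).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric

w, Q = np.linalg.eigh(A)  # eigenvalues (ascending) and orthonormal eigenvectors
D = np.diag(w)

print(w)                              # [1. 3.]
print(np.allclose(Q @ D @ Q.T, A))    # True
```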