Repeated eigenvalues of a symmetric matrix

In summary: By induction, the smaller matrix $B$ (the lower-right block of $P^{\text{T}}AP$) is diagonalisable, which means that there exists an orthogonal matrix $R$ such that $R^{\text{T}}(P^{\text{T}}AP)R$ is diagonal. But then $$(PR)^{\text{T}}A(PR) = R^{\text{T}}P^{\text{T}}APR = R^{\text{T}}(P^{\text{T}}AP)R$$ is also diagonal. The columns of $PR$ are eigenvectors of $A$, and the column space of $PR$ is all of $\mathbb{R}^n$, so an eigenvalue of multiplicity $m$ contributes $m$ linearly independent eigenvectors.
  • #1
matqkks
I have been trying to prove the following result:
If $A$ is a real symmetric matrix with an eigenvalue $\lambda$ of multiplicity $m$, then $\lambda$ has $m$ linearly independent eigenvectors.
Is there a simple proof of this result?
 
  • #2
what you are trying to do is essentially the same as proving that a real symmetric matrix is diagonalizable. i know "a" proof of that; how "easy" it is depends on your perspective.

the idea is to enlarge our scope, and consider complex matrices instead (so that our eigenvalues are always in our underlying field). furthermore, we need to consider the complex "analogues" of orthogonal matrices, and symmetric matrices:

unitary matrices, and hermitian matrices.

to define these, we need the analogue of the transpose for complex matrices: the conjugate-transpose:

$A^{\ast} = (\overline{a_{ji}})$ for $A = (a_{ij})$

so a unitary matrix is one for which:

$U^{\ast} = U^{-1}$

and a hermitian matrix is one for which:

$A^{\ast} = A$.
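
as a quick sanity check of these definitions, here's a minimal numpy sketch (the helper names are mine, just for illustration):

```python
import numpy as np

def conj_transpose(A):
    # the conjugate-transpose: A* = (conjugate of a_ji)
    return A.conj().T

def is_unitary(U, tol=1e-10):
    # U is unitary iff U*U = I, i.e. U* = U^{-1}
    return np.allclose(conj_transpose(U) @ U, np.eye(U.shape[0]), atol=tol)

def is_hermitian(A, tol=1e-10):
    # A is hermitian iff A* = A
    return np.allclose(conj_transpose(A), A, atol=tol)

# a 2x2 unitary matrix and a 2x2 hermitian matrix
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
A = np.array([[2, 1 - 1j], [1 + 1j, 3]])
print(is_unitary(U), is_hermitian(A))  # True True
```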

now, on to the business at hand. the hammer we're going to use to crack this nut is this lemma:

for every square complex matrix A, there exists a *unitary* matrix U, with:

$U^{\ast}AU = U^{-1}AU = T$, where $T$ is upper-triangular.
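
(incidentally, this lemma is the complex schur decomposition; if you have scipy handy, you can check it numerically. a rough sketch, assuming scipy is available:)

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# complex schur form: A = Z T Z*, with Z unitary and T upper-triangular
T, Z = schur(A, output='complex')
print(np.allclose(Z.conj().T @ A @ Z, T))  # True: Z*AZ = T
print(np.allclose(np.tril(T, -1), 0))      # True: T is upper-triangular
```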

we'll use induction on n. for a 1x1 matrix, this is trivial.

now we assume this is true for any (n-1)x(n-1) complex matrix. let A be an nxn complex matrix. now det(xI - A) is a complex polynomial of degree n, so it has a root $\lambda$, and thus A has an eigenvector u belonging to $\lambda$ (there is always at least one non-zero solution to $(A - \lambda I)x = 0$, because $A - \lambda I$ is singular). we can normalize this vector, giving a unit eigenvector, so we lose nothing by assuming u is a unit eigenvector to begin with.

we can extend {u} first to a basis for $\Bbb C^n$ and then, using gram-schmidt, to an orthonormal basis for $\Bbb C^n$. the orthonormality of this basis (under the standard inner product on $\Bbb C^n$) means that if we form a matrix $U_1$ whose first column is u, and whose other columns are the remaining basis vectors, then this matrix is unitary. let $e_1$ be the complex vector (1,0,...,0).
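
here is a small numpy sketch of that gram-schmidt step, extending a unit vector u to a unitary matrix with u as its first column (the helper name is mine):

```python
import numpy as np

def extend_to_unitary(u):
    # gram-schmidt on {u, e_1, ..., e_n}: the dependent candidates get
    # skipped, leaving n orthonormal columns with u first
    n = u.shape[0]
    basis = [u]
    for k in range(n):
        v = np.zeros(n, dtype=complex)
        v[k] = 1.0
        for b in basis:
            v = v - np.vdot(b, v) * b      # subtract projection onto b
        norm = np.linalg.norm(v)
        if norm > 1e-10:                   # skip a dependent candidate
            basis.append(v / norm)
    return np.column_stack(basis)

u = np.array([1, 1j, 1]) / np.sqrt(3)      # a unit vector in C^3
U1 = extend_to_unitary(u)
print(np.allclose(U1.conj().T @ U1, np.eye(3)))  # True: U1 is unitary
print(np.allclose(U1[:, 0], u))                  # True: first column is u
```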

now note that $U_1^{-1}AU_1e_1 = U_1^{-1}Au = U_1^{-1}(\lambda u) = \lambda U_1^{-1}u = \lambda e_1$ (using $U_1e_1 = u$, since u is the first column of $U_1$).

thus in block form, we have:

$U_1^{-1}AU_1 = \begin{bmatrix}\lambda&*\\0&B \end{bmatrix}$

where B is an (n-1)x(n-1) matrix.

applying our induction hypothesis, there is an (n-1)x(n-1) unitary matrix U' such that:

$(U')^{-1}BU' = (U')^{\ast}BU' = T'$ is upper-triangular.

define U2 to be the matrix (in block form):

$U_2 = \begin{bmatrix}1&0\\0&U' \end{bmatrix}$

then:

$U_2^{\ast} = \begin{bmatrix}1&0\\0&(U')^{\ast} \end{bmatrix}$

so $U_2U_2^{\ast} = \begin{bmatrix}1&0\\0&U'(U')^{\ast} \end{bmatrix} = I$

since U' is unitary, U2 is likewise unitary. ok, so now we have:

$(U_1U_2)^{-1}A(U_1U_2) = U_2^{-1}(U_1^{-1}AU_1)U_2 =$

$\begin{bmatrix}1&0\\0&(U')^{\ast} \end{bmatrix} \begin{bmatrix}\lambda&\ast \\0&B \end{bmatrix} \begin{bmatrix}1&0\\0&U' \end{bmatrix}$

$= \begin{bmatrix}\lambda&\ast \\0&(U')^{\ast}BU' \end{bmatrix} = \begin{bmatrix}\lambda&\ast \\0&T' \end{bmatrix}$

which is upper-triangular, since T' is upper-triangular.

so let $U = U_1U_2$. then this is the desired unitary matrix, since:

$U^{-1} = (U_1U_2)^{-1} = U_2^{-1}U_1^{-1} = U_2^{\ast}U_1^{\ast} = (U_1U_2)^{\ast} = U^{\ast}$

(the proof of the 4th equality is easy and analogous to the similar proof for the transpose, left to the interested reader).

this concludes the proof of the lemma (which actually does most of the heavy lifting for us). it's pretty much downhill from here.
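
to make the induction concrete: here's a rough numpy translation of the proof into a recursive function. the names are mine, QR stands in for gram-schmidt, and np.linalg.eig is used only to produce the one eigenpair each step needs; a sketch, not a serious implementation:

```python
import numpy as np

def unitary_with_first_column(u):
    # QR stands in for gram-schmidt: build a unitary matrix whose
    # first column is the unit vector u
    n = len(u)
    rng = np.random.default_rng(1)
    M = np.column_stack([u, rng.standard_normal((n, n - 1))])
    Q, R = np.linalg.qr(M)
    Q[:, 0] = u                    # Q[:,0] was u up to a unit phase; pin it down
    return Q

def triangularize(A):
    # returns unitary U with U^{-1} A U upper-triangular, mirroring
    # the induction in the proof
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex)
    w, v = np.linalg.eig(A)        # we just need one eigenpair (lambda, u)
    u = v[:, 0] / np.linalg.norm(v[:, 0])
    U1 = unitary_with_first_column(u)
    M = U1.conj().T @ A @ U1       # block form [[lambda, *], [0, B]]
    B = M[1:, 1:]
    U2 = np.eye(n, dtype=complex)
    U2[1:, 1:] = triangularize(B)  # induction hypothesis on the smaller block
    return U1 @ U2

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U = triangularize(A)
T = U.conj().T @ A @ U
print(np.allclose(np.tril(T, -1), 0))          # True: T is upper-triangular
print(np.allclose(U.conj().T @ U, np.eye(4)))  # True: U is unitary
```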

observation #1: if the matrix A is hermitian, so is the upper-triangular matrix T which is unitarily similar to A.

why? because $(U^{\ast}AU)^{\ast} = U^{\ast}A^{\ast}(U^{\ast})^{\ast} = U^{\ast}AU$ (since $A^{\ast} = A$, if A is hermitian).

but...if T is upper-triangular, AND hermitian, it must be LOWER triangular as well. which means when A is hermitian, T is DIAGONAL. furthermore, since the diagonal elements of T equal their own complex-conjugates, T is a REAL matrix.
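
a quick numerical illustration of observation #1, assuming scipy: take the complex schur form of a hermitian matrix, and the triangular factor comes out real and diagonal:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = X + X.conj().T                 # hermitian by construction

T, Z = schur(A, output='complex')
print(np.allclose(T, np.diag(np.diag(T))))  # True: T is diagonal
print(np.allclose(np.diag(T).imag, 0))      # True: its diagonal is real
```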

observation #2: any real symmetric matrix is hermitian. thus any real symmetric matrix is unitarily similar to a real diagonal matrix, D.

observation #3: the eigenvalues of A are the eigenvalues of D, and vice-versa.

proof: $\det(xI - A) = \det(U^{-1}(xI - A)U) = \det(U^{-1}(xI)U - U^{-1}AU) = \det(xI - D)$
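
a small sympy check of this (the 2x2 matrix is just an arbitrary example):

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, 1], [1, 2]])
P, D = A.diagonalize()             # A = P D P^{-1}

lhs = sp.expand((x * sp.eye(2) - A).det())
rhs = sp.expand((x * sp.eye(2) - D).det())
print(lhs, lhs == rhs)             # x**2 - 4*x + 3  True
```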

observation #4: since the eigenvalues of A (a real symmetric matrix) are real, the eigenvectors may be chosen real as well (they solve the real linear system $(A - \lambda I)x = 0$). thus we may take U to be a real unitary matrix, that is, an orthogonal one.

now suppose that a real symmetric matrix A has an eigenvalue λ of (algebraic) multiplicity m. since A is orthogonally similar to a diagonal matrix D with the same eigenvalues (via an orthogonal matrix Q), D has m diagonal entries equal to λ.

note that $Q^{-1}AQ = D$ means $AQ = QD$. since Q is invertible, its columns are linearly independent (it is square, of full rank).

suppose $\{u_{j_1},\dots,u_{j_m}\}$ are the columns of Q corresponding to the diagonal entry λ of D repeated m times (the eigenvalue of algebraic multiplicity m). then:

$Au_{j_k} = \lambda u_{j_k},\quad k = 1,\dots,m$ (compare the $j_k$-th columns of $AQ$ and $QD$)

giving m linearly independent eigenvectors in the eigenspace $E_\lambda$.
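
to see the conclusion in action, here's a small numpy example (the construction is mine, chosen so that the eigenvalue 2 has multiplicity 3):

```python
import numpy as np

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # a random orthogonal matrix
D = np.diag([2.0, 2.0, 2.0, 5.0])                 # eigenvalue 2 of multiplicity 3
A = Q @ D @ Q.T                                   # real symmetric by construction

w, V = np.linalg.eigh(A)                          # eigh is for symmetric matrices
E = V[:, np.isclose(w, 2.0)]                      # columns spanning E_2
print(E.shape[1])                                 # 3 linearly independent eigenvectors
print(np.allclose(A @ E, 2.0 * E))                # True: each column is an eigenvector
```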
 
  • #3
matqkks said:
I have been trying to prove the following result:
If $A$ is a real symmetric matrix with an eigenvalue $\lambda$ of multiplicity $m$, then $\lambda$ has $m$ linearly independent eigenvectors.
Is there a simple proof of this result?
This is a slight variation of Deveno's argument. I will assume you already know that the eigenvalues of a real symmetric matrix are all real.

Let $A$ be an $n\times n$ real symmetric matrix, and assume as an inductive hypothesis that all $(n-1)\times(n-1)$ real symmetric matrices are orthogonally diagonalisable (the base case $n=1$ is trivial). Let $\lambda$ be an eigenvalue of $A$, with a normalised eigenvector $x_1$. Using the Gram–Schmidt process, form an orthonormal basis $\{x_1,x_2,\ldots,x_n\}$ of $\mathbb{R}^n$ with that eigenvector as its first element.

Let $P$ be the $n\times n$ matrix whose columns are $x_1,x_2,\ldots,x_n$, and denote by $T:\mathbb{R}^n \to \mathbb{R}^n$ the linear transformation whose matrix with respect to the standard basis is $A$. Then $P$ is an orthogonal matrix ($P^{\text{T}} = P^{-1}$), and the matrix of $T$ with respect to the basis $\{x_1,x_2,\ldots,x_n\}$ is $P^{\text{T}}AP.$ The $(i,j)$-element of that matrix is $(P^{\text{T}}AP)_{ij} = \langle Ax_j,x_i\rangle.$ In particular, the elements in the first column are $$(P^{\text{T}}AP)_{i1} = \langle Ax_1,x_i\rangle = \langle\lambda x_1,x_i\rangle = \begin{cases}\lambda&(i=1) \\0&(i>1)\end{cases}$$ (because the vectors $x_i$ are orthonormal). Thus the first column of $P^{\text{T}}AP$ has $\lambda$ as its top element, and $0$ for each of the other elements. Since $P^{\text{T}}AP$ is symmetric, the top row also consists of $\lambda$ followed by zeros. Hence the matrix $P^{\text{T}}AP$ looks like this: $$\begin{bmatrix}\lambda&0&\ldots&0 \\0 \\\vdots&&\large B \\ 0 \end{bmatrix},$$ where $B$ is an $(n-1)\times(n-1)$ real symmetric matrix. By the inductive hypothesis, $B$ is orthogonally diagonalisable, so there is an orthogonal $(n-1)\times(n-1)$ matrix $Q$ such that $Q^{\text{T}}BQ$ is diagonal. Let $$R = \begin{bmatrix}1&0&\ldots&0 \\0 \\\vdots&&\large Q \\ 0 \end{bmatrix}.$$ Then $R^{\text{T}}P^{\text{T}}APR$ is diagonal, as required.
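
In case it helps to see the induction run, here is a rough numerical sketch of it in NumPy. The helper names are mine, QR takes the place of the Gram–Schmidt step, and `np.linalg.eigh` is used only to supply the single eigenpair that each inductive step requires:

```python
import numpy as np

def orthogonal_with_first_column(x):
    # QR plays the role of Gram-Schmidt: an orthogonal matrix P
    # whose first column is the unit vector x
    n = len(x)
    M = np.column_stack([x, np.random.default_rng(0).standard_normal((n, n - 1))])
    P, _ = np.linalg.qr(M)
    P[:, 0] = x                          # P[:,0] was +/- x; fix the sign
    return P

def diagonalise(A):
    # orthogonal U with U^T A U diagonal, following the induction above
    n = A.shape[0]
    if n == 1:
        return np.eye(1)
    w, v = np.linalg.eigh(A)             # only one eigenpair (lambda, x_1) is needed
    P = orthogonal_with_first_column(v[:, 0])
    B = (P.T @ A @ P)[1:, 1:]            # the symmetric (n-1)x(n-1) block
    R = np.eye(n)
    R[1:, 1:] = diagonalise(B)           # inductive hypothesis applied to B
    return P @ R

rng = np.random.default_rng(5)
X = rng.standard_normal((5, 5))
A = X + X.T                              # a real symmetric test matrix
U = diagonalise(A)
D = U.T @ A @ U
print(np.allclose(D, np.diag(np.diag(D))))  # True: D is diagonal
```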
 

Related to Repeated eigenvalues of a symmetric matrix

What is a repeated eigenvalue of a symmetric matrix?

A repeated eigenvalue of a symmetric matrix is an eigenvalue that appears more than once as a root of the matrix's characteristic equation. For a symmetric matrix, this means that there is more than one linearly independent eigenvector corresponding to that eigenvalue.

How do you find the repeated eigenvalues of a symmetric matrix?

To find the repeated eigenvalues of a symmetric matrix, you can either use the characteristic equation or use a computer program such as MATLAB. The characteristic equation is obtained by setting the determinant $\det(\lambda I - A)$ equal to $0$; its repeated roots are the repeated eigenvalues.
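
For example, here is a minimal SymPy sketch (the matrix is an arbitrary symmetric example):

```python
import sympy as sp

A = sp.Matrix([[2, 1, 1],
               [1, 2, 1],
               [1, 1, 2]])            # a real symmetric example

print(A.charpoly().as_expr())         # lambda**3 - 6*lambda**2 + 9*lambda - 4
print(A.eigenvals())                  # {4: 1, 1: 2} -- eigenvalue 1 is repeated
```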

What is the significance of repeated eigenvalues in a symmetric matrix?

Repeated eigenvalues in a symmetric matrix indicate that the matrix has at least one eigenspace with dimension greater than 1. This means that there are multiple linearly independent eigenvectors associated with that eigenvalue, which can provide valuable information about the matrix's properties.

Can a symmetric matrix have repeated complex eigenvalues?

No, a real symmetric matrix cannot have complex eigenvalues at all: all of its eigenvalues are real (non-real eigenvalues of a real matrix come in conjugate pairs, but for a symmetric matrix none occur). However, a symmetric matrix can have repeated real eigenvalues.

What is the relationship between the multiplicity of a repeated eigenvalue and its corresponding eigenspace?

For a symmetric matrix, the multiplicity of a repeated eigenvalue is equal to the dimension of its corresponding eigenspace. This means that the number of linearly independent eigenvectors associated with a repeated eigenvalue is equal to the number of times that eigenvalue appears as a root of the characteristic equation.
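
A short SymPy check of this relationship, reusing the example matrix above:

```python
import sympy as sp

A = sp.Matrix([[2, 1, 1],
               [1, 2, 1],
               [1, 1, 2]])

# eigenvalue 1 has algebraic multiplicity 2; its eigenspace is the
# null space of A - 1*I, which indeed has dimension 2
eigenspace = (A - sp.eye(3)).nullspace()
print(len(eigenspace))                # 2
```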
