- matqkks
- #1
- Feb 15, 2012

I have been trying to prove the following result: if $A$ is a real symmetric matrix with an eigenvalue $\lambda$ of multiplicity $m$, then $\lambda$ has $m$ linearly independent eigenvectors. Is there a simple proof of this result?

- Deveno
- #2

the idea is to enlarge our scope, and consider complex matrices instead (so that our eigenvalues always lie in our underlying field). furthermore, we need to consider the complex "analogues" of orthogonal matrices and symmetric matrices:

unitary matrices, and hermitian matrices.

to define these, we need the analogue of the transpose for complex matrices: the conjugate-transpose:

$A^{\ast} = (\overline{a_{ji}})$ for $A = (a_{ij})$

so a unitary matrix is one for which:

$U^{\ast} = U^{-1}$

and a hermitian matrix is one for which:

$A^{\ast} = A$.
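as a quick sanity check, here is a minimal numpy sketch illustrating both definitions (the 2x2 matrices are made-up examples, nothing special about them):

```python
import numpy as np

# made-up example matrices, purely to illustrate the definitions above
A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])            # candidate hermitian matrix
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)   # candidate unitary matrix

A_star = A.conj().T                    # the conjugate-transpose A*
print(np.allclose(A_star, A))                  # True: A* = A (hermitian)
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U* = U^{-1} (unitary)
```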

now, on to the business at hand. the hammer we're going to use to crack this nut is this lemma:

for every square complex matrix A, there exists a *unitary* matrix U, with:

$U^{\ast}AU = T$

where T is upper-triangular.

we'll use induction on n. for a 1x1 matrix, this is trivial.

now we assume this is true for any (n-1)x(n-1) complex matrix. let A be an nxn complex matrix. now det(xI - A) is a complex polynomial of degree n, which has a root $\lambda$, and thus A has an eigenvector u belonging to $\lambda$ (there is always at least one non-zero solution to $(A - \lambda I)x = 0$, because $A - \lambda I$ is singular). we can normalize this vector, giving a unit eigenvector, so we lose nothing by assuming u is a unit eigenvector to begin with.

we can extend {u} first to a basis for $\Bbb C^n$, and then, using gram-schmidt, to an orthonormal basis for $\Bbb C^n$ whose first vector is still u. the orthonormality of this basis (under the standard inner product for $\Bbb C^n$) means that if we form a matrix $U_1$ whose columns are these basis vectors, then $U_1$ is unitary.

now note that the first column of $AU_1$ is $Au = \lambda u$, while the remaining columns of $U_1$ are orthogonal to u, so the first column of $U_1^{-1}AU_1 = U_1^{\ast}AU_1$ is $(\lambda,0,\dots,0)^T$.

thus in block form, we have:

$U_1^{-1}AU_1 = \begin{bmatrix}\lambda&*\\0&B \end{bmatrix}$

where B is an (n-1)x(n-1) matrix.
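as an aside, this step is easy to reproduce numerically; here is a hedged numpy sketch (using a random complex matrix, with numpy's householder QR standing in for gram-schmidt) confirming the block form:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# take any eigenpair and normalise the eigenvector
w, V = np.linalg.eig(A)
lam, u = w[0], V[:, 0] / np.linalg.norm(V[:, 0])

# extend {u} to an orthonormal basis of C^n: QR on a matrix whose
# first column is u (householder QR plays the role of gram-schmidt)
U1, _ = np.linalg.qr(np.column_stack([u, rng.normal(size=(n, n - 1))]))

block = U1.conj().T @ A @ U1
print(np.allclose(block[0, 0], lam))   # True: lambda sits in the corner
print(np.allclose(block[1:, 0], 0))    # True: zeros below it, as claimed
```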

applying our induction hypothesis to B, there is an (n-1)x(n-1) unitary matrix U' such that:

$(U')^{\ast}BU' = T'$

is upper-triangular. now define $U_2$ in block form:

$U_2 = \begin{bmatrix}1&0\\0&U' \end{bmatrix}$

then:

$U_2^{\ast} = \begin{bmatrix}1&0\\0&(U')^{\ast} \end{bmatrix}$

so $U_2U_2^{\ast} = \begin{bmatrix}1&0\\0&U'(U')^{\ast} \end{bmatrix} = I$

since U' is unitary; thus $U_2$ is likewise unitary. now compute:

$U_2^{\ast}(U_1^{\ast}AU_1)U_2 =$

$\begin{bmatrix}1&0\\0&(U')^{\ast} \end{bmatrix} \begin{bmatrix}\lambda&\ast \\0&B \end{bmatrix} \begin{bmatrix}1&0\\0&U' \end{bmatrix}$

$= \begin{bmatrix}\lambda&\ast \\0&(U')^{\ast}BU' \end{bmatrix} = \begin{bmatrix}\lambda&\ast \\0&T' \end{bmatrix}$

which is upper-triangular, since T' is upper-triangular.

so let $U = U_1U_2$. then:

$U^{\ast}AU = U_2^{\ast}(U_1^{\ast}AU_1)U_2 = \begin{bmatrix}\lambda&\ast \\0&T' \end{bmatrix} = T$

is upper-triangular, and U is unitary, since:

$U^{\ast} = (U_1U_2)^{\ast} = U_2^{\ast}U_1^{\ast} = U_2^{-1}U_1^{-1} = (U_1U_2)^{-1} = U^{-1}$

(the proof of the equality $(U_1U_2)^{\ast} = U_2^{\ast}U_1^{\ast}$ is easy, and analogous to the corresponding proof for the transpose; it is left to the interested reader).

this concludes the proof of the lemma (which actually does most of the heavy lifting for us). it's pretty much downhill from here.
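incidentally, this lemma is the (complex) schur decomposition, and scipy ships it directly as scipy.linalg.schur; a short sketch (with a random complex matrix) checking the three claimed properties:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))

# complex schur form: A = U T U*, with U unitary and T upper-triangular
T, U = schur(A, output='complex')

print(np.allclose(U.conj().T @ U, np.eye(5)))  # U is unitary
print(np.allclose(T, np.triu(T)))              # T is upper-triangular
print(np.allclose(U @ T @ U.conj().T, A))      # and A = U T U*
```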

observation #1: if the matrix A is hermitian, so is the upper-triangular matrix T which is unitarily similar to A.

why? because $T^{\ast} = (U^{\ast}AU)^{\ast} = U^{\ast}A^{\ast}(U^{\ast})^{\ast} = U^{\ast}AU = T$ (since $A^{\ast} = A$, if A is hermitian).

but...if T is upper-triangular AND hermitian, it must be LOWER triangular as well, which means that when A is hermitian, T is DIAGONAL. furthermore, since the diagonal elements of T equal their own complex-conjugates, T is a REAL matrix.

observation #2: any real symmetric matrix is hermitian. thus any real symmetric matrix is unitarily similar to a real diagonal matrix, D.
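numerically, this is exactly what np.linalg.eigh computes for a symmetric input; a short sketch (random symmetric matrix, purely an illustration) showing the orthogonal Q and real diagonal D:

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.normal(size=(5, 5))
A = (S + S.T) / 2                     # a random real symmetric matrix

# eigh is designed for hermitian/symmetric input: it returns real
# eigenvalues w and an orthogonal Q with Q.T @ A @ Q = diag(w)
w, Q = np.linalg.eigh(A)

print(w.dtype)                                # float64: eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(5)))        # Q is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(w)))   # D is real and diagonal
```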

observation #3: the eigenvalues of A are the eigenvalues of D, and vice-versa.

proof: $\det(xI - A) = \det(U^{\ast}(xI - A)U) = \det(xU^{\ast}U - U^{\ast}AU) = \det(xI - D)$, since $\det(U^{\ast})\det(U) = \det(U^{\ast}U) = \det(I) = 1$. so A and D have the same characteristic polynomial, hence the same eigenvalues.
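a one-line numerical check of this (np.poly(M) returns the coefficients of det(xI - M) for a square matrix M):

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.normal(size=(4, 4))
A = (S + S.T) / 2
w, Q = np.linalg.eigh(A)

# A and D = diag(w) have the same characteristic polynomial
print(np.allclose(np.poly(A), np.poly(np.diag(w))))   # True
```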

observation #4: since the eigenvalues of A (a real symmetric matrix) are real, the eigenvectors may be taken to be real as well: for real $\lambda$, the null space of the real matrix $A - \lambda I$ has a real basis. thus we may take U to be a real unitary matrix, that is, an orthogonal one.

now suppose that a real, symmetric matrix A has an eigenvalue of (algebraic) multiplicity m. since A is orthogonally similar to a diagonal matrix D with the same eigenvalues (via an orthogonal matrix Q), D has m diagonal entries that are the same.

note that $Q^TAQ = D$ implies $AQ = QD$.

suppose $\{u_{j_1},\dots,u_{j_m}\}$ are the columns of Q corresponding to the diagonal entry λ of D repeated m times (the eigenvalue of algebraic multiplicity m). comparing the $j_k$-th columns of $AQ$ and $QD$ gives:

$Au_{j_k} = \lambda u_{j_k},\quad k = 1,\dots,m$

giving m linearly independent eigenvectors in the eigenspace $E_{\lambda}$ (the columns of an orthogonal matrix are orthonormal, hence linearly independent). this is precisely what we set out to show.
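putting it all together, here is a small sketch (an engineered 3x3 example, nothing more) that builds a symmetric matrix with a repeated eigenvalue and confirms that the geometric multiplicity matches the algebraic one:

```python
import numpy as np

# engineer a symmetric matrix with eigenvalue 2 of multiplicity 2:
# conjugate D = diag(2, 2, 5) by a random orthogonal matrix
rng = np.random.default_rng(4)
Q0, _ = np.linalg.qr(rng.normal(size=(3, 3)))
A = Q0 @ np.diag([2.0, 2.0, 5.0]) @ Q0.T

w, Q = np.linalg.eigh(A)
mask = np.isclose(w, 2.0)
E = Q[:, mask]                        # eigenvectors for the repeated eigenvalue

print(mask.sum())                             # 2 = algebraic multiplicity
print(np.linalg.matrix_rank(E))               # 2 = geometric multiplicity
print(np.allclose(A @ E, 2.0 * E))            # each column is an eigenvector
```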

- Moderator
- #3


matqkks said:

I have been trying to prove the following result: if $A$ is a real symmetric matrix with an eigenvalue $\lambda$ of multiplicity $m$, then $\lambda$ has $m$ linearly independent eigenvectors. Is there a simple proof of this result?

This is a slight variation of Deveno's argument. I will assume you already know that the eigenvalues of a real symmetric matrix are all real.

Let $A$ be an $n\times n$ real symmetric matrix, and assume as an inductive hypothesis that all $(n-1)\times(n-1)$ real symmetric matrices are orthogonally diagonalisable (the $1\times1$ case is trivial). Let $\lambda$ be an eigenvalue of $A$, with a normalised eigenvector $x_1$. Using the Gram–Schmidt process, form an orthonormal basis $\{x_1,x_2,\ldots,x_n\}$ with that eigenvector as its first element.

Let $P$ be the $n\times n$ matrix whose columns are $x_1,x_2,\ldots,x_n$, and denote by $T:\mathbb{R}^n \to \mathbb{R}^n$ the linear transformation whose matrix with respect to the standard basis is $A$. Then $P$ is an orthogonal matrix ($P^{\text{T}} = P^{-1}$), and the matrix of $T$ with respect to the basis $\{x_1,x_2,\ldots,x_n\}$ is $P^{\text{T}}AP.$ The $(i,j)$-element of that matrix is $(P^{\text{T}}AP)_{ij} = \langle Ax_j,x_i\rangle.$ In particular, the elements in the first column are $$(P^{\text{T}}AP)_{i1} = \langle Ax_1,x_i\rangle = \langle\lambda x_1,x_i\rangle = \begin{cases}\lambda&(i=1) \\0&(i>1)\end{cases}$$ (because the vectors $x_i$ are orthonormal). Thus the first column of $P^{\text{T}}AP$ has $\lambda$ as its top element, and $0$ for each of the other elements. Since $P^{\text{T}}AP$ is symmetric, the top row also consists of a $\lambda$ followed by all zeros. Hence the matrix $P^{\text{T}}AP$ looks like this: $$\begin{bmatrix}\lambda&0&\ldots&0 \\0 \\\vdots&&\large B \\ 0 \end{bmatrix},$$ where $B$ is an $(n-1)\times(n-1)$ real symmetric matrix. By the inductive hypothesis, $B$ is orthogonally diagonalisable, so there is an orthogonal $(n-1)\times(n-1)$ matrix $Q$ such that $Q^{\text{T}}BQ$ is diagonal. Let $$R = \begin{bmatrix}1&0&\ldots&0 \\0 \\\vdots&&\large Q \\ 0 \end{bmatrix}.$$ Then $PR$ is orthogonal and $(PR)^{\text{T}}A(PR) = R^{\text{T}}P^{\text{T}}APR$ is diagonal, as required. In particular, if an eigenvalue $\lambda$ has multiplicity $m$, it appears $m$ times on that diagonal, and the corresponding $m$ columns of $PR$ are orthonormal eigenvectors for $\lambda$, which answers the original question.
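As a footnote, the proof above is constructive, and one possible transcription into numpy is sketched below. This is only an illustration under my own assumptions: the function name `symmetric_diagonalize` is hypothetical, and calling `eigh` merely to extract a single eigenpair is wasteful; a serious implementation would find one eigenpair directly.

```python
import numpy as np

def symmetric_diagonalize(A):
    """Orthogonally diagonalize a real symmetric matrix by induction on n,
    mirroring the proof above: returns orthogonal P with P.T @ A @ P diagonal."""
    n = A.shape[0]
    if n == 1:
        return np.eye(1)                    # base case: already diagonal
    w, V = np.linalg.eigh(A)                # grab one real eigenpair
    x1 = V[:, 0]                            # normalised eigenvector for w[0]
    # extend {x1} to an orthonormal basis of R^n (householder QR plays
    # the role of Gram-Schmidt); the first column of P1 spans x1
    P1, _ = np.linalg.qr(np.column_stack([x1, np.eye(n)[:, :n - 1]]))
    A1 = P1.T @ A @ P1                      # block form [[lam, 0], [0, B]]
    Q = symmetric_diagonalize(A1[1:, 1:])   # inductive step on B
    R = np.eye(n)
    R[1:, 1:] = Q
    return P1 @ R                           # P = P1 R; P.T A P is diagonal

# quick check on a random symmetric matrix
rng = np.random.default_rng(5)
S = rng.normal(size=(4, 4))
A = (S + S.T) / 2
P = symmetric_diagonalize(A)
D = P.T @ A @ P
print(np.allclose(P.T @ P, np.eye(4)))        # P is orthogonal
print(np.allclose(D, np.diag(np.diag(D))))    # D is diagonal (up to round-off)
```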