Why is the Hamiltonian written as H|x> = E|x> instead of H|x> = |x>E?

  • Thread starter quietrain
  • Start date
  • Tags
    Matrix
In summary, the thread discusses diagonalizable matrices and their properties. The original poster mistakenly concludes that A * x is not equal to λ * x but rather that x * A = λ * x, questions the notation used for the Hamiltonian, and asks about the role of orthogonal matrices in diagonalizing matrices. It is clarified that P^T * P = P * P^T = 1 for orthogonal matrices and that this property is what allows P^T to stand in for P^{-1} in the diagonalization. The original mistake of confusing the matrix of eigenvectors P with an individual eigenvector is also cleared up.
  • #1
quietrain

Homework Statement



I have a matrix A which is diagonalisable.

By working through an example on the wiki page, under the section "How to diagonalise a matrix":
http://en.wikipedia.org/wiki/Diagonalizable_matrix

I realize that

A x is not equal to [itex]\lambda[/itex] x, where x are the eigenvectors of A and [itex]\lambda[/itex] the eigenvalues;

instead

x A = [itex]\lambda[/itex] x

and I think

A x = x [itex]\lambda[/itex]

QUESTION 1)
If this is so, why do they always write the Hamiltonian as H |x> = E |x>? Shouldn't it be H |x> = |x> E?

If I remember correctly, for matrix multiplication AB ≠ BA in general, right?

But I read the wiki and it says something like (I can't remember the exact phrasing)

"it is equal if both A and B are diagonalisable matrices, and are both n by n matrices."


QUESTION 2)
Also, for [itex]P^T A P = \lambda[/itex] (with [itex]\lambda[/itex] the diagonal matrix of eigenvalues),

if I want to "bring over" the P's, is it like this:

[itex]A = P \lambda P^T[/itex]

but why is it like this?



thanks!
 
  • #2
You are right that matrices don't commute in general; however, multiplication of a matrix (or vector) by a scalar (a real or complex number) always commutes, so λx = xλ. What you wrote above is not correct, though: Ax is not at all the same as xA (for a column vector x, the product xA is not even defined). The wiki page is correct.

Question 2):

This works for so-called orthogonal matrices, for which P^T P = P P^T = 1 (the identity matrix), i.e. P^T = P^{-1}. It is not true for general matrices.
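
As a quick numerical sketch (assuming numpy is available; the symmetric matrix below is just a made-up example), one can check that Ax = λx, that the scalar commutes with the vector, and that the eigenvector matrix of a symmetric A is orthogonal:

[code]
import numpy as np

# made-up symmetric (hence diagonalisable) example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns the eigenvalues and an orthonormal matrix P of eigenvectors
eigvals, P = np.linalg.eigh(A)

for lam, x in zip(eigvals, P.T):
    # A x equals lambda x, and the scalar can sit on either side of x
    print(np.allclose(A @ x, lam * x), np.allclose(lam * x, x * lam))

# for this symmetric A, P is orthogonal: P^T P = P P^T = 1
I = np.eye(2)
print(np.allclose(P.T @ P, I), np.allclose(P @ P.T, I))
[/code]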
 
  • #3
You do not have to choose orthonormal eigenvectors, but you can. If you do, then the matrix P is "orthogonal"; in particular, [itex]P^T = P^{-1}[/itex]. Without P being orthogonal that is not true, but it is still true that [itex]P^{-1}AP = D[/itex], so, multiplying on the left by P and on the right by [itex]P^{-1}[/itex],
[tex]P(P^{-1}AP)P^{-1}= PDP^{-1}[/tex]
[tex](PP^{-1})A(PP^{-1})= IAI= A= PDP^{-1}[/tex]

In the case that P is "orthogonal", you can replace [itex]P^{-1}[/itex] with [itex]P^T[/itex].
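
A minimal numerical sketch of the same identity (assuming numpy; the non-symmetric matrix below is just an example), where P is not orthogonal, so only [itex]P^{-1}[/itex], not [itex]P^T[/itex], undoes the change of basis:

[code]
import numpy as np

# a diagonalisable but non-symmetric example, so P is NOT orthogonal
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)            # diagonal matrix of eigenvalues
Pinv = np.linalg.inv(P)

print(np.allclose(Pinv @ A @ P, D))   # P^{-1} A P = D
print(np.allclose(P @ D @ Pinv, A))   # "bringing over" the P's: A = P D P^{-1}
print(np.allclose(P @ D @ P.T, A))    # False here: P^T only works when P is orthogonal
[/code]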
 
  • #4
Oh shucks ... for question 1, I realize I multiplied P, the matrix of eigenvectors, with the diagonal matrix of eigenvalues,

but in fact I think they are talking about an individual eigenvalue and eigenfunction, not the combined P.

So yup, I get it. Thanks everyone!
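
To spell out that last point (an illustrative sketch, assuming numpy, with a made-up matrix): with P the matrix whose columns are eigenvectors and D the diagonal matrix of eigenvalues, the combined relation is AP = PD, with D on the right because it scales the columns of P; column by column this is just the familiar Av = λv, which is the form the Hamiltonian equation H|x> = E|x> takes.

[code]
import numpy as np

# made-up diagonalisable example matrix
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# combined form: A P = P D (the diagonal matrix sits on the RIGHT of P)
print(np.allclose(A @ P, P @ D))   # True
print(np.allclose(A @ P, D @ P))   # generally False

# column by column it is the individual eigenvalue equation A v = lambda v
for lam, v in zip(eigvals, P.T):
    print(np.allclose(A @ v, lam * v))
[/code]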
 

Related to Why is the Hamiltonian written as H|x> = E|x> instead of H|x> = |x>E?

1. What is matrix diagonalisation?

Matrix diagonalisation is a process in linear algebra where a square matrix is transformed into a diagonal matrix by changing to a basis of its eigenvectors. The resulting matrix has non-zero entries only on the diagonal, and all other entries are zero. This process is useful for simplifying calculations and solving systems of equations.

2. What is the importance of matrix diagonalisation?

Matrix diagonalisation is important because it allows for easier computation and analysis of systems of linear equations. It also simplifies the process of finding the eigenvalues and eigenvectors of a matrix, which have many applications in fields such as physics, engineering, and computer science.

3. How is matrix diagonalisation performed?

Matrix diagonalisation is performed by finding the eigenvalues and eigenvectors of the original matrix and using them to construct two new matrices: a diagonal matrix D whose diagonal elements are the eigenvalues, and a change-of-basis matrix P whose columns are the corresponding eigenvectors, so that P^{-1}AP = D. This process can be done by hand or using computer software.
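
As a worked illustration (hand-computed, purely as an example), take [itex]A = \begin{pmatrix}2 & 1\\ 1 & 2\end{pmatrix}[/itex]. Its eigenvalues are 1 and 3, with eigenvectors (1, -1) and (1, 1), so

[tex]P = \begin{pmatrix}1 & 1\\ -1 & 1\end{pmatrix}, \qquad D = P^{-1}AP = \begin{pmatrix}1 & 0\\ 0 & 3\end{pmatrix}.[/tex]

The eigenvalues appear on the diagonal of D in the same order in which their eigenvectors are placed as columns of P.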

4. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important concepts in linear algebra that are used in matrix diagonalisation. An eigenvector of a matrix is a non-zero vector that, when multiplied by the matrix, is simply scaled rather than changed in direction; the corresponding eigenvalue is the scalar factor by which it is scaled, so that Av = λv. They are useful in understanding the behavior of linear transformations and systems of equations.
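
For a concrete example (hand-computed, purely illustrative): the matrix [itex]A = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix}[/itex] swaps the two components of a vector, and

[tex]A\begin{pmatrix}1\\ 1\end{pmatrix} = 1\cdot\begin{pmatrix}1\\ 1\end{pmatrix}, \qquad A\begin{pmatrix}1\\ -1\end{pmatrix} = -1\cdot\begin{pmatrix}1\\ -1\end{pmatrix},[/tex]

so (1, 1) and (1, -1) are eigenvectors of A with eigenvalues 1 and -1 respectively.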

5. What are the applications of matrix diagonalisation?

Matrix diagonalisation has many applications in fields such as physics, engineering, and computer science. It is used for solving systems of linear equations, finding the eigenvalues and eigenvectors of a matrix, and simplifying calculations involving matrices. It is also used in data analysis and machine learning algorithms.
