Matrix Multiplication and Commuting

In summary, two diagonalizable matrices commute if and only if they are simultaneously diagonalizable, meaning there exists a single invertible matrix M that diagonalizes both. Equivalently, the two matrices share a common basis of eigenvectors. When at least one matrix is not defective, this equivalence is straightforward to prove; when both matrices are defective, the proof becomes much harder. A simple example of commuting matrices is B = kA, a constant multiple of A: the eigenvalues differ by the factor k, the eigenvectors coincide, and any matrix that diagonalizes A also diagonalizes B.
  • #1
Little Devil
Why might two matrices commute? I.e Why would AB=BA because in general, matrices usually do not commute. What are the properties of matrices that do commute?
Ben
 
  • #2
Diagonal matrices commute.
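This is easy to check numerically. A minimal NumPy sketch (the diagonal entries below are arbitrary choices for illustration):

```python
import numpy as np

# Multiplying two diagonal matrices just multiplies their diagonal
# entries elementwise, so the order of multiplication cannot matter.
D1 = np.diag([1.0, 2.0, 3.0])
D2 = np.diag([4.0, 5.0, 6.0])

assert np.allclose(D1 @ D2, D2 @ D1)
print(np.diag(D1 @ D2))  # the product's diagonal: [ 4. 10. 18.]
```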
 
  • #3
I forgot to mention there is a geometrical way of looking at this - suppose the diagonal is filled with scalars; then each entry a_ii stretches any object that the matrix acts on in the i-th direction. For example, if a_11 is 5, then any object acted on by the matrix will be stretched by a factor of 5 in the x direction, if you are using an x, y, z coordinate system. In a diagonal matrix the eigenvalues are precisely the entries on the diagonal.
 
  • #4
I completely left out the geometric intuition I sought to express - stretching in the x direction, then the y, is the same as stretching in the y direction, then the x - the two operations commute, and so do any matrices associated with the operations.
 
  • #5
More generally: two diagonalizable matrices, A and B, commute if and only if they are "simultaneously diagonalizable": that is, if there exists some invertible matrix M such that MAM⁻¹ = D1 and MBM⁻¹ = D2, where D1 and D2 are diagonal matrices. (The case where A or B is defective is more delicate; see the discussion below.)

It follows then that A = M⁻¹D1M and B = M⁻¹D2M.
Then AB = M⁻¹D1MM⁻¹D2M = M⁻¹D1D2M
= M⁻¹D2D1M (since diagonal matrices commute)
= M⁻¹D2MM⁻¹D1M
= BA.
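The algebra above can be verified numerically. Here is a small NumPy sketch; M, D1, and D2 are arbitrary choices for illustration:

```python
import numpy as np

# Pick an invertible M and two diagonal matrices, then build
# A = M^-1 D1 M and B = M^-1 D2 M, as in the proof.
M = np.array([[2.0, 1.0],
              [1.0, 1.0]])       # det = 1, so M is invertible
Minv = np.linalg.inv(M)
D1 = np.diag([3.0, -1.0])
D2 = np.diag([5.0, 2.0])

A = Minv @ D1 @ M
B = Minv @ D2 @ M

# The M M^-1 factors cancel in the middle, and the diagonal
# factors D1, D2 commute, so AB = BA.
assert np.allclose(A @ B, B @ A)
```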
 
  • #6
That is the proof I couldn't remember.
 
  • #7
How does the proof in the other direction go? I fiddled around with it a bit and wasn't getting very far.
 
  • #8
It wasn't a proof, just geometric intuition, given by the fact that it doesn't matter which order something is stretched in - the final result will be the same, and the fact that diagonal matrices stretch any object they act on, by the degree given by the ith eigenvalue, in the ith direction. There is an excellent geometric discussion of how eigenvalues and eigenvectors act on an object available here:
http://hverrill.net/courses/linalg/linalg8.html
I don't have anything better to add to it.
 
  • #9
Ok... I can prove that if A is not defective, then A and B commute iff they are simultaneously diagonalizable. Anyone know how to do it when both are defective?


Here's a sketch of what I have so far:

Assume A is not defective and finite dimensional. Then choose a basis that diagonalizes A.

The i,k-th entry of AB is A_ii B_ik
The i,k-th entry of BA is B_ik A_kk

So AB = BA iff, for all (i, k), B_ik = 0 or A_ii = A_kk.


If A_ii = A_jj, then A acts as a scalar multiple of the identity on the subspace spanned by the i-th and j-th basis vectors, so we can replace these two vectors with basis vectors that diagonalize B over that subspace.

By repeating this process, we can produce a basis that simultaneously diagonalizes both A and B.


I don't know what to do if A is defective or infinite dimensional. (though I admit not having taken a crack at modifying the above proof to use transfinite induction to tackle the infinite dimensional case) :frown:
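The procedure sketched above can be coded up for the friendly case of commuting symmetric matrices (where eigh gives an orthonormal eigenbasis). The helper name `simultaneously_diagonalize` and the example matrices are my own, just for illustration:

```python
import numpy as np

def simultaneously_diagonalize(A, B, tol=1e-9):
    # Sketch of the argument above, for commuting symmetric matrices:
    # diagonalize A, then diagonalize B restricted to each eigenspace of A.
    evals, V = np.linalg.eigh(A)          # eigh sorts eigenvalues ascending
    P = np.zeros_like(V)
    i = 0
    while i < len(evals):
        # group indices with (numerically) equal eigenvalue of A
        j = i
        while j < len(evals) and abs(evals[j] - evals[i]) < tol:
            j += 1
        Vsub = V[:, i:j]                  # basis of this eigenspace of A
        # Since AB = BA, B maps this eigenspace into itself,
        # so we may restrict B to it and diagonalize the restriction.
        Bsub = Vsub.T @ B @ Vsub
        _, W = np.linalg.eigh(Bsub)
        P[:, i:j] = Vsub @ W
        i = j
    return P                              # columns are common eigenvectors

# Example: commuting symmetric matrices (A has a repeated eigenvalue 2)
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
B = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
assert np.allclose(A @ B, B @ A)

P = simultaneously_diagonalize(A, B)
DA = P.T @ A @ P
DB = P.T @ B @ P
assert np.allclose(DA, np.diag(np.diag(DA)))   # both are now diagonal
assert np.allclose(DB, np.diag(np.diag(DB)))
```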
 
  • #10
That proof should work for infinite-dimensional cases as is, because it already proceeds by induction, repeatedly converting pairs of basis vectors into basis vectors that diagonalize B on the relevant subspaces, as long as A is not defective. The proof when both A and B are defective is not easy. Good luck.
 
  • #11
rick1138 said:
Diagonal matrices commute.


this is not an iff, is it?

AB can equal BA without A being diagonal
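Right - for instance, any matrix commutes with any polynomial in itself, diagonal or not. A quick NumPy check (the matrix A is an arbitrary choice for illustration):

```python
import numpy as np

# A non-diagonal matrix always commutes with any polynomial in itself,
# e.g. with B = A^2 + 3A -- so AB = BA does not require diagonal matrices.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = A @ A + 3.0 * A          # clearly not diagonal

assert np.allclose(A @ B, B @ A)
```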
 
  • #12
Can you give an example of how to find the matrices that commute with a given matrix A?
 
  • #13
Tola said:
Can you give an example of how to find the matrices that commute with a given matrix A?

Clearly an example is all matrices for which B = kA, where k is a scalar. Then the eigenvalues differ by the constant factor k, and any matrix M that diagonalizes A also diagonalizes B, as required.
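A quick numerical check of this (A and k below are arbitrary choices for illustration):

```python
import numpy as np

# If B = kA, then AB = A(kA) = kA^2 = (kA)A = BA for any matrix A.
A = np.array([[0.0, 1.0],
              [2.0, 3.0]])
k = 7.0
B = k * A

assert np.allclose(A @ B, B @ A)
```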
 

FAQ: Matrix Multiplication and Commuting

Q1: What is Matrix Multiplication?

Matrix multiplication is an operation used to combine two matrices to produce a third matrix. It is a fundamental operation in linear algebra and is defined for matrices where the number of columns in the first matrix matches the number of rows in the second matrix.

Q2: How is Matrix Multiplication Performed?

To multiply two matrices A and B, you take the dot product of rows from matrix A with columns from matrix B. The result is a new matrix C, where each element C[i][j] is computed as the sum of the products of elements from row i of A and column j of B.
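The definition above, written out with explicit loops in plain Python (no libraries), just to show where each entry C[i][j] comes from:

```python
def matmul(A, B):
    # C[i][j] is the dot product of row i of A with column j of B.
    n, m, p = len(A), len(B), len(B[0])
    assert len(A[0]) == m, "columns of A must match rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```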

Q3: What Does it Mean for Matrices to "Commute"?

Two matrices, A and B, are said to "commute" if their product AB is equal to the product BA. In other words, if AB = BA, then A and B commute.

Q4: Do All Matrices Commute?

No, not all matrices commute. Whether matrices commute or not depends on their specific properties and elements. In general, matrix multiplication is not commutative, meaning that AB is not necessarily equal to BA for arbitrary matrices A and B.

Q5: Are There Special Cases Where Matrices Do Commute?

Yes, there are special cases where matrices commute. Diagonal matrices (matrices where all off-diagonal elements are zero) commute with one another, and scalar multiples of the identity matrix commute with every matrix. In particular, the identity matrix (a special diagonal matrix) commutes with any matrix.
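These special cases are easy to verify numerically (a NumPy sketch with arbitrary example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])       # an arbitrary matrix
I = np.eye(2)

# The identity commutes with every matrix: IA = AI = A.
assert np.allclose(I @ A, A @ I)

# Two diagonal matrices commute with each other...
D1, D2 = np.diag([1.0, 2.0]), np.diag([3.0, 4.0])
assert np.allclose(D1 @ D2, D2 @ D1)

# ...but a diagonal matrix need not commute with a general matrix.
assert not np.allclose(D1 @ A, A @ D1)
```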

Q6: What Are the Implications of Matrices That Do Commute?

When matrices commute (AB = BA), it simplifies certain mathematical operations and computations involving those matrices. Commuting matrices often have special properties that can be useful in linear algebra and applications in physics, engineering, and other fields.

Q7: Are There Applications of Matrix Commutation in Real-World Problems?

Yes, matrix commutation has applications in various fields, including quantum mechanics, quantum computing, physics, and control theory. In quantum mechanics, for example, operators representing physical observables often need to commute to have simultaneous eigenstates, which is crucial in quantum physics calculations.
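As a concrete illustration of the quantum-mechanics point, the Pauli matrices X and Z are standard examples of observables that do not commute, so they cannot share a complete set of eigenstates:

```python
import numpy as np

# Observables commute exactly when their commutator [A, B] = AB - BA vanishes.
X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z

commutator = X @ Z - Z @ X
assert not np.allclose(commutator, 0)   # X and Z do not commute

# Any observable commutes with itself, so [Z, Z] = 0.
assert np.allclose(Z @ Z - Z @ Z, 0)
```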

Q8: How Can I Learn More About Matrix Multiplication and Commutation?

You can learn more about matrix multiplication and commutation by studying linear algebra textbooks, taking courses in linear algebra and matrix theory, and exploring online resources and tutorials dedicated to these topics. Practical exercises and problem-solving can enhance your understanding.
