Is the identity matrix basis dependent?

In summary: only the first matrix is an identity matrix; the second is a rotation matrix. Since the identity transformation is represented by the identity matrix in every basis, no change of basis turns the second matrix into the identity.
  • #1
zonde
To me it seems a basic question, or even obvious, but as I am not a mathematician I would rather check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix} ## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal with the right change of basis, and the diagonal one would not be diagonal any more with that change of basis, right?
 
  • #2
No. Only the first is an identity matrix. The second is a rotation matrix.
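
A quick numerical check makes the distinction concrete (a minimal numpy sketch, not from the original post): the second matrix rotates the basis vector (1, 0) by 45 degrees instead of leaving it fixed.

import numpy as np

I2 = np.eye(2)                            # the first matrix: the identity
R = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)  # the second matrix: a 45-degree rotation

e1 = np.array([1.0, 0.0])
print(I2 @ e1)  # [1. 0.]          -- the identity leaves e1 unchanged
print(R @ e1)   # [0.7071 0.7071]  -- R rotates e1, so it is not the identity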
 
  • #3
As soon as you write down a matrix and interpret it as a linear function, you have already fixed a basis! Two, to be exact: one for the domain and one for the codomain.
 
  • #4
zonde said:
To me it seems a basic question, or even obvious, but as I am not a mathematician I would rather check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix} ## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal with the right change of basis, and the diagonal one would not be diagonal any more with that change of basis, right?

Your question is based on a fundamental misunderstanding. The set of 2x2 matrices is a well-defined ring. It has a multiplicative identity element, which is:

##\begin{pmatrix}1&0\\0&1\end{pmatrix} ##

Now, the set of linear transformations of ##\mathbb{R}^2## is also a ring, and it is isomorphic to the ring of 2x2 matrices. Given any basis for ##\mathbb{R}^2##, every linear transformation maps to a specific 2x2 matrix. The mapping depends on the basis. But (as with all ring isomorphisms) the identity linear transformation must map to the identity matrix. In other words, the identity linear transformation is always represented by the identity matrix.
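
A minimal numpy sketch of this point (the change-of-basis matrix P below is an arbitrary choice of mine): conjugating by P changes the representation of a generic matrix, but the identity's representation never changes.

import numpy as np

I2 = np.eye(2)
R = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)  # the rotation matrix from the question

# Columns of P are the new basis vectors written in the old basis.
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # any invertible matrix will do
P_inv = np.linalg.inv(P)

print(P_inv @ I2 @ P)  # still the identity: its representation is basis independent
print(P_inv @ R @ P)   # a different matrix: R's representation is basis dependent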
 
  • #5
Thanks Orodruin, fresh_42 and PeroK for your answers.
I suppose my misunderstanding comes from thinking about matrices as a kind of set of vectors. For now I will try to absorb what you said.
 
  • #6
In order for an arbitrary matrix A to be an identity matrix, we must have, for all vectors x,

Ax = xA = x.

The first is clearly the 2x2 identity matrix. The second isn't, since ##Ax \neq x##.
 
  • #7
Your equations do not make any sense whatsoever if x is not a square matrix of the same dimension as A.
 
  • #8
Orodruin said:
Your equations do not make any sense whatsoever if x is not a square matrix of the same dimension as A.

True, I should have specified that x must have the correct dimensions for multiplication by A to be possible; for the two-sided condition above to make sense, x would have to be a square 2x2 matrix. That is, if A is an identity matrix, then

Ax = xA = x, provided the dimensions of x match those of A.

Thanks for adding the clarification regarding the dimensions of x.
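
For the one-sided test on column vectors, here is a short numerical version (a minimal numpy sketch; the helper name is my own): check Ax = x on a handful of random vectors.

import numpy as np

rng = np.random.default_rng(0)

def acts_as_identity(A, trials=5, tol=1e-12):
    """Return True if A @ x == x holds for several random vectors x."""
    n = A.shape[0]
    return all(np.allclose(A @ x, x, atol=tol)
               for x in rng.standard_normal((trials, n)))

I2 = np.eye(2)
R = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)

print(acts_as_identity(I2))  # True
print(acts_as_identity(R))   # False: R x != x in general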
 
  • #9
"... the diagonal one would not be diagonal any more with that change of basis, right?"

Interesting question raised here.

As you may know: say that some vector space V with dim(V) = n has a specified basis A = {##a_1, \dots, a_n##}, and suppose we have a linear transformation

L: V → V

that is represented by a square n x n matrix M acting on column vectors placed to its right, so that the kth column of M, which is the transpose of ##(m_{k1}, \dots, m_{kn})##, represents the vector

##L(a_k) = m_{k1} a_1 + \cdots + m_{kn} a_n.##

(Note that it is necessary to specify these things before we can make sense of questions about matrices, linear transformations, and change-of-basis matrices.)

Now suppose B = {##b_1, \dots, b_n##} is another basis for V, and that we would now like to re-express the same linear transformation L in terms of this new basis.

Question 1: How do we do this? Answer: First we need to write each basis vector of the new basis B as a linear combination of the basis vectors of the old basis A. There is always some way to do this. That will give n linear coefficients of the A vectors for each one of the n B vectors, and these need to be arranged in a matrix: call it ##C_{AB}##. (Sub-question: Exactly how should these ##n^2## numbers be arranged?) Now let ##C_{AB}^{-1}## denote the inverse of the matrix ##C_{AB}##.

Then with respect to the new basis B, the linear transformation L can be expressed as the product of the three matrices

##C_{AB}^{-1}\, M\, C_{AB}##

in that order. Now, if the matrix ##C_{AB}## is arranged correctly, we can apply this product to a column vector denoting an element v of V expressed with respect to the new basis B, and the result will be the column vector L(v), also expressed in terms of the new basis B.

Question 2: Suppose D is a diagonal n x n matrix such that there exists some invertible n x n matrix C with the property that

##C^{-1} D C##

is not diagonal. What can be said about the matrix D?

(Perhaps it's easier to think of the complementary question: suppose D has the property that for every invertible n x n matrix C,

##C^{-1} D C##

is again a diagonal matrix. For which diagonal matrices D is this true?)
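
For Question 2, a small numerical experiment is suggestive (my own sketch, with an arbitrarily chosen invertible C): a diagonal matrix with distinct diagonal entries generally loses its diagonal form under conjugation, while a scalar multiple of the identity never does.

import numpy as np

C = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # an arbitrary invertible matrix
C_inv = np.linalg.inv(C)

D_distinct = np.diag([1.0, 2.0])  # distinct diagonal entries
D_scalar = 3.0 * np.eye(2)        # a scalar multiple of the identity

print(C_inv @ D_distinct @ C)  # off-diagonal entries appear
print(C_inv @ D_scalar @ C)    # stays 3*I: scalar matrices commute with everything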
 
  • #10
zonde said:
To me it seems a basic question, or even obvious, but as I am not a mathematician I would rather check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix} ## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal with the right change of basis, and the diagonal one would not be diagonal any more with that change of basis, right?
If you think of the matrix as two row vectors written out in terms of a basis, then rewriting the rows of the second matrix in terms of the two vectors of that same matrix (that is, using its own rows as the new basis) would give you the identity back again. The first matrix, though, would no longer be the identity matrix.

If you think of a matrix as describing a linear transformation, then the identity transformation will be represented by the identity matrix in every basis: this is because the identity is the identity on every vector.
 
  • #11
lavinia said:
If you think of the matrix as two row vectors written out in terms of a basis, then rewriting the rows of the second matrix in terms of the two vectors of that same matrix (that is, using its own rows as the new basis) would give you the identity back again. The first matrix, though, would no longer be the identity matrix.

If you think of a matrix as describing a linear transformation, then the identity transformation will be represented by the identity matrix in every basis: this is because the identity is the identity on every vector.
Thanks for your answer. I got the idea that, taking these matrices as transformation matrices, there is only one identity matrix. I have a feeling that in the context where I came across them, they were not exactly transformation matrices (they are density matrices in quantum mechanics). Rather, they describe averages of squares of eigenvectors over a set of complex vectors. Well, something like that.
 
  • #12
Consider a change of basis applied to the identity matrix I (in whatever basis or representation you are using; at least, assume the identity is described by the matrix above), giving a matrix I', through the use of an invertible matrix B. Then:

## I'=BIB^{-1} =IBB^{-1}=I ##
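
A one-line numerical confirmation of this (my own sketch, with a random matrix B, which is almost surely invertible):

import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((2, 2))  # almost surely invertible
I2 = np.eye(2)

print(np.allclose(B @ I2 @ np.linalg.inv(B), I2))  # True: I is basis independent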

I hope I did not misunderstand your question.
 

Related to: Is the identity matrix basis dependent?

What is an identity matrix?

An identity matrix is a square matrix in which all elements on the main diagonal are equal to 1 and all other elements are equal to 0.

What does it mean for a matrix to be basis dependent?

A matrix representation is basis dependent if its entries change when the underlying linear transformation is expressed with respect to a different basis. The transformation itself does not depend on any basis; only the matrix of coordinates representing it does.

How do you determine if a matrix is basis dependent?

Pick an invertible change-of-basis matrix C and compare ##C^{-1} M C## with M. If the two differ for some choice of C, the representation of M is basis dependent. The only matrices left unchanged by every such conjugation are the scalar multiples of the identity.
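
As a sketch of this test in code (my own; the function name and sampling scheme are illustrative), conjugate by a few random invertible matrices and compare:

import numpy as np

rng = np.random.default_rng(2)

def representation_is_basis_independent(M, trials=10, tol=1e-8):
    """Conjugate M by random invertible matrices and compare with M."""
    n = M.shape[0]
    for _ in range(trials):
        C = rng.standard_normal((n, n))
        if abs(np.linalg.det(C)) < 1e-8:
            continue  # skip (near-)singular samples
        if not np.allclose(np.linalg.inv(C) @ M @ C, M, atol=tol):
            return False
    return True

print(representation_is_basis_independent(np.eye(2)))            # True
print(representation_is_basis_independent(np.diag([1.0, 2.0])))  # False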

Is an identity matrix always basis dependent?

No, the identity matrix is never basis dependent. For every invertible matrix C we have ##C^{-1} I C = C^{-1} C = I##, so the identity transformation is represented by the identity matrix in every basis.

Why is it important to know if a matrix is basis dependent?

Knowing which features of a matrix are basis dependent separates properties of the underlying transformation from artifacts of a particular coordinate choice. Basis-independent quantities such as the determinant, trace, and eigenvalues describe the transformation itself; the individual entries generally do not. This matters when changing coordinates, diagonalizing, or comparing representations in linear algebra.
