Eigenvalue of a Matrix: Proof Involving Nonsingular Matrices

In summary, the conversation discussed a proof involving nonsingular matrices. The goal was to show that if (I + A) is nonsingular, then (I - A)(I + A)^-1 = (I + A)^-1(I - A), and hence that (I - A)/(I + A) is well defined for the matrix. The original proof was questioned because it assumed that (I - A)^-1 exists; the poster then added that A is known to be skew-symmetric, so (I - A)^T = (I + A) is nonsingular, which guarantees that (I - A) is nonsingular as well.
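For reference, a minimal sketch of one standard way to obtain the identity, assuming only that (I + A)^-1 exists:

$$(I - A)(I + A) = I - A^2 = (I + A)(I - A).$$

Multiplying this equation by (I + A)^{-1} on the left and on the right gives (I + A)^{-1}(I - A) = (I - A)(I + A)^{-1}, with no need to invert (I - A).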
  • #1
chocolatefrog
Proof involving nonsingular matrices.

Homework Statement



If (I + A) is nonsingular, prove that (I - A)(I + A)^-1 = (I + A)^-1(I - A), and hence (I - A)/(I + A) is defined for the matrix.

I've proved it like this:

Let (I - A)(I + A)^-1 = A, and (I + A)^-1(I - A) = B.
B^-1 = (I - A)^-1(I + A)
B^-1A = I
Premultiplying by B, we get A = B.

Is this proof correct?
 
  • #2


chocolatefrog said:

Homework Statement



If (I + A) is nonsingular, prove that (I - A)(I + A)^-1 = (I + A)^-1(I - A), and hence (I - A)/(I + A) is defined for the matrix.

I've proved it like this:

Let (I - A)(I + A)^-1 = A, and (I + A)^-1(I - A) = B.
B^-1 = (I - A)^-1(I + A)
B^-1A = I
Premultiplying by B, we get A = B.

Is this proof correct?

You don't know that I - A is invertible. So (I - A)^-1 might not exist.
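A quick example of how this can fail:

$$A = I \;\Rightarrow\; I + A = 2I \ \text{(nonsingular)}, \qquad I - A = 0 \ \text{(singular)},$$

so the hypothesis on I + A alone does not guarantee that (I - A)^-1 exists.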
 
  • #3


micromass said:
You don't know that I - A is invertible. So (I - A)^-1 might not exist.

Oh, I forgot to mention that A is known to be skew-symmetric. So (I - A)^T = (I + A), which is nonsingular. And since a matrix is nonsingular iff its transpose is nonsingular, we can conclude that (I - A)^-1 exists.

I can't seem to think beyond this point. If there's still an error somewhere in the proof, could you please point to it?
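One way to finish from here, using the fact just established that (I - A)^-1 exists (and writing X = (I - A)(I + A)^-1 for the first product, to avoid reusing the letter A): since

$$(I + A)(I - A) = I - A^2 = (I - A)(I + A),$$

we get

$$B^{-1}X = (I - A)^{-1}(I + A)(I - A)(I + A)^{-1} = (I - A)^{-1}(I - A)(I + A)(I + A)^{-1} = I,$$

and premultiplying by B gives X = B, which is the required identity.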
 

Related to Eigenvalue of a Matrix: Proof Involving Nonsingular Matrices

What is an eigenvalue of a matrix?

An eigenvalue of a square matrix A is a scalar λ such that Av = λv for some nonzero vector v; it describes how the matrix scales vectors along certain special directions. It is often denoted by the Greek letter lambda (λ).

How is an eigenvalue of a matrix calculated?

An eigenvalue of a matrix is calculated by solving the characteristic equation det(A-λI) = 0, where A is the matrix and I is the identity matrix. The solutions to this equation are the eigenvalues.
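For example, for a 2 × 2 matrix:

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad \det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0 \;\Rightarrow\; \lambda = 1 \ \text{or} \ \lambda = 3.$$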

What is the significance of eigenvalues in linear algebra?

Eigenvalues are important in linear algebra because they provide information about the behavior of a matrix. They can determine whether a matrix is invertible (0 is an eigenvalue exactly when the matrix is singular), the stability of a system, and, together with the eigenvectors, how the matrix stretches vectors.

Can a matrix have multiple eigenvalues?

Yes, a matrix can have multiple eigenvalues. Counted with algebraic multiplicity over the complex numbers, an n x n matrix has exactly n eigenvalues, although some of them may coincide.

How are eigenvalues and eigenvectors related?

Eigenvalues and eigenvectors are closely related. An eigenvector of a matrix A is a nonzero vector v that is mapped to a scalar multiple of itself, Av = λv; the scalar λ is the corresponding eigenvalue. In other words, the eigenvector gives a direction that the transformation preserves, and the eigenvalue gives the factor by which vectors in that direction are scaled.
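Continuing the 2 × 2 example above, v = (1, 1)^T is an eigenvector with eigenvalue 3:

$$\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$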
