
\(\displaystyle A^{-1}= \frac{1}{det(A)}adj(A)\)

\(\displaystyle A = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}\)

----------

By cofactor expansion along the 3rd row, I find \(\displaystyle det(A) = (1)(\cos^2\theta + \sin^2\theta) = 1\), which is nonzero and implies that A is invertible.
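As a quick numerical sanity check of this (not part of the proof, just a sketch assuming NumPy and a helper `rotation_z` I made up for this post), the determinant comes out as 1 for every \(\displaystyle \theta\) sampled:

```python
import numpy as np

# rotation_z is a hypothetical helper building the matrix A from the post.
def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# det(A) should equal cos^2(theta) + sin^2(theta) = 1 for any theta.
for theta in (0.0, 0.3, np.pi / 2, 2.0, -1.7):
    assert abs(np.linalg.det(rotation_z(theta)) - 1.0) < 1e-12
```

Of course a finite sample of angles proves nothing by itself; the identity \(\displaystyle \cos^2\theta + \sin^2\theta = 1\) is what carries the argument.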

To get the adjugate of A, Adj(A), I form the cofactor matrix C and transpose it.

\(\displaystyle Adj(A) = C^{T} = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}\)

which also happens to be \(\displaystyle A^{-1}\)
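I also checked this numerically (again just a sketch with NumPy; `rotation_z` and `adjugate_z` are names I invented here): since det(A) = 1, the adjugate and the inverse should coincide.

```python
import numpy as np

# Hypothetical helpers: A and its adjugate as computed in the post.
def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def adjugate_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

theta = 0.8
A = rotation_z(theta)
# adj(A) should match numpy's inverse, and A * adj(A) should be the identity.
assert np.allclose(adjugate_z(theta), np.linalg.inv(A))
assert np.allclose(A @ adjugate_z(theta), np.eye(3))
```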

My question is: how am I supposed to prove A is invertible for all values of \(\displaystyle \theta\)?

My gut tells me I am supposed to state that \(\displaystyle det(A) = \cos^2\theta + \sin^2\theta = 1\) does not depend on \(\displaystyle \theta\), so \(\displaystyle A^{-1}= \frac{1}{det(A)}adj(A)\) always exists. Is there a more definitive way of showing A is invertible for all values of \(\displaystyle \theta\)?