Uniqueness of Inverse Matrices: Proof and Explanation

In summary: the dot product does not define a unique inverse for a vector. The cancellation step in the proposed proof amounts to multiplying on the left by $(BA)^{-1}$, which can only be done if $BA$ is invertible, and in fact we know that it's not.
  • #1
ognik
I have an exercise which says to show that for vectors, $ A \cdot A^{-1} = A^{-1} \cdot A = I $ does NOT define $ A^{-1}$ uniquely.

But let's assume there are at least two such inverses, say $A^{-1} = B$ and $A^{-1} = C$.

Then $A \cdot B = I = A \cdot C$, $\therefore BAB = BAC$, $\therefore B = C$, therefore $A^{-1}$ is unique? (I got lazy with the dots.)
 
  • #2
ognik said:
I have an exercise which says to show that for vectors, $ A \cdot A^{-1} = A^{-1} \cdot A = I $ does NOT define $ A^{-1}$ uniquely.

But let's assume there are at least two such inverses, say $A^{-1} = B$ and $A^{-1} = C$.

Then $A \cdot B = I = A \cdot C$, $\therefore BAB = BAC$, $\therefore B = C$, therefore $A^{-1}$ is unique? (I got lazy with the dots.)

Hi ognik, :)

I am not quite following your question. Do you mean that you have to show that $A^{-1}$ is unique? Could you please elaborate a bit?
 
  • #3
I'm not sure, to be honest. I wonder if it means that the identity does not define the inverse uniquely, i.e. there could be more than one inverse...
 
  • #4
You'll have to be a bit more explicit in exactly what you're asking.

If, for a matrix $A$, there exists a matrix $B$ such that $AB = BA = I$, then $B$ is unique (invertible $n \times n$ matrices form a *group*, called the general linear group of degree $n$ over whatever field you're working with, and inverses in a group are unique).

However, if $B$ is merely a one-sided inverse, that is $AB = I$ or $BA = I$, but not both, $B$ may not be unique. This can only happen when $A$ is non-square (for square matrices over a field, $AB = I$ forces $BA = I$).

Furthermore, your question begins "For vectors...": it is hard to imagine what the (multiplicative) inverse of a vector might be.
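
To see the one-sided case concretely, here is a minimal numerical sketch (in Python with numpy; my own illustration, not part of the thread): a $2 \times 3$ matrix $A$ with two distinct right inverses.

```python
import numpy as np

# A is 2x3 (non-square), so a right inverse B satisfies A @ B = I_2,
# but B @ A cannot equal I_3 (the rank of B @ A is at most 2).
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Two different right inverses: the third row is completely free.
B1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])
B2 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [5.0, -7.0]])

print(np.allclose(A @ B1, np.eye(2)))  # True
print(np.allclose(A @ B2, np.eye(2)))  # True
print(np.allclose(B1, B2))             # False -- the right inverse is not unique
```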
 
  • #5
It seems to me that a multiplicative inverse of a vector is a vector such that the dot product is equal to $1$.
So for instance:
$$\begin{pmatrix}2 \\ 3\end{pmatrix} \cdot \begin{pmatrix}1/2 \\ 0\end{pmatrix} = 1$$
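
A quick numerical check of this (a Python/numpy sketch added for illustration), including a second vector that works just as well:

```python
import numpy as np

# (2, 3) . (1/2, 0) = 1, so (1/2, 0) acts as a "dot-product inverse" of (2, 3).
a = np.array([2.0, 3.0])
b = np.array([0.5, 0.0])
print(np.dot(a, b))  # 1.0

# But so does any vector (x, y) with 2x + 3y = 1, e.g. (-1, 1):
c = np.array([-1.0, 1.0])
print(np.dot(a, c))  # 1.0
```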
 
  • #6
I had what ILS suggests in mind, but I found the question hard to follow myself. Hoping this is helpful, the exercise reads exactly as follows:

View attachment 4980
 

Attachments

  • vectorInverse.png (1.8 KB)
  • #7
As the dot product is commutative, the condition $A\cdot A^{-1}=A^{-1}\cdot A$ is superfluous. Choose
$$A=\begin{bmatrix}1 \\ -1\end{bmatrix}.$$
Choose
$$B=\begin{bmatrix}1 \\ 0\end{bmatrix} \qquad \text{and} \qquad C=\begin{bmatrix}2 \\ 1\end{bmatrix}.$$
Then $A\cdot B=1$ and $A\cdot C=1$, but $B\not=C$.
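
Verifying this counterexample numerically (again a Python/numpy sketch, not from the original thread):

```python
import numpy as np

# Both B and C are "dot-product inverses" of A, yet B != C.
A = np.array([1.0, -1.0])
B = np.array([1.0, 0.0])
C = np.array([2.0, 1.0])

print(np.dot(A, B))          # 1.0
print(np.dot(A, C))          # 1.0
print(np.array_equal(B, C))  # False -- two distinct inverses
```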
 
  • #8
I see that, thanks, but then what is wrong with my original 'proof'? It applies to matrices, and a vector is a type of matrix?
 
  • #9
ognik said:
I see that, thanks, but then what is wrong with my original 'proof'? It applies to matrices, and a vector is a type of matrix?

I presume you're referring to $BAB = BAC \therefore B = C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.
 
  • #10
You may or may not know this, but any nonzero vector $v$ in a finite-dimensional inner product space determines (uniquely) a *hyperplane* (a subspace of dimension $\dim(V) - 1$):

$E_v = \{w \in V: \langle v,w\rangle = 0\}$.

The vector $\dfrac{1}{\|v\|}v$ serves as a (unit) *normal* to $E_v$. If we choose an orientation for $V$, then either

$\dfrac{1}{\|v\|}v$ or $-\dfrac{1}{\|v\|}v$ can be called *the* normal to the hyperplane $E_v$. (The "usual" orientation for $\Bbb R^3$ is chosen so that $\mathbf{i} \times \mathbf{j} = \mathbf{k}$; this is often called the "right-hand" orientation, since it corresponds to forming the (positive) $x$-axis with the right-hand index finger and the (positive) $y$-axis with the right-hand middle finger, with the thumb, pointing up, as the (positive) $z$-axis. This is a purely arbitrary convention, which is why cross products are often called "pseudo-vectors": their sign isn't independent of the axis orientation.)

So...where was I?

Pick any vector in $E_v$, say, $w$, and consider $w + v$.

Then $\langle w+v,v\rangle = \langle w,v\rangle + \langle v,v\rangle = 0 + \|v\|^2 \neq 0$ (unless $v = 0$).

Thus $\langle\dfrac{1}{\|v\|^2}(w+v),v\rangle = 1$, and it is clear we have just as many such vectors as we have elements of $E_v$.
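
Here is a small numerical sketch of this construction (Python/numpy, added for illustration; the vectors chosen are my own):

```python
import numpy as np

# Pick w orthogonal to v; then (w + v) / ||v||^2 is a "dot-product
# inverse" of v -- one such inverse for every choice of w in E_v.
v = np.array([1.0, -1.0, 2.0])

for w in [np.array([1.0, 1.0, 0.0]),      # w . v = 0
          np.array([2.0, 0.0, -1.0])]:    # w . v = 0
    assert np.isclose(np.dot(w, v), 0.0)
    candidate = (w + v) / np.dot(v, v)
    print(np.dot(candidate, v))           # 1.0 each time
```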

Why does your result not hold when $\dim(V) = 1$?
 
  • #11
I like Serena said:
I presume you're referring to $BAB = BAC \therefore B = C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.
That's a surprise. I thought we effectively had $(BA)B = (BA)C$, therefore $B$ must equal $C$?
 
  • #12
ognik said:
I see that, thanks, but then what is wrong with my original 'proof'? It applies to matrices, and a vector is a type of matrix?

I like Serena said:
I presume you're referring to $BAB = BAC \therefore B = C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.

I would also add that it's not at all clear that $BAB$ is well-defined. If the multiplication is a dot product, then the result of $A\cdot B$ is a number, not a vector; in that case, it's unclear what $B\cdot A \cdot B$ would even be. How would you define that?
 
  • #13
Ackbach said:
I would also add that it's not at all clear that $BAB$ is well-defined. If the multiplication is a dot product, then the result of $A\cdot B$ is a number, not a vector; in that case, it's unclear what $B\cdot A \cdot B$ would even be. How would you define that?
Indeed.

ognik said:
That's a surprise. I thought we effectively had $(BA)B = (BA)C$, therefore $B$ must equal $C$?
If $(BA)$ is for instance the zero matrix, we cannot conclude that $B=C$.
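
Concretely (a Python/numpy sketch of the zero-matrix case, added for illustration):

```python
import numpy as np

# Two nonzero matrices whose product is the zero matrix:
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
print(B @ A)  # the 2x2 zero matrix

# So (BA)X = (BA)Y holds for *any* X and Y; cancelling BA is invalid.
X = np.eye(2)
Y = 2 * np.eye(2)
print(np.allclose(B @ A @ X, B @ A @ Y))  # True, yet X != Y
```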

Just for fun:
$$a^2-a^2=a^2-a^2 \Rightarrow a(a-a) =(a+a)(a-a) \Rightarrow a=a+a \Rightarrow a=2a$$
This holds for any $a$, therefore $1=2$.
 
  • #14
I like Serena said:
Indeed.
If $(BA)$ is for instance the zero matrix, we cannot conclude that $B=C$.
I had postulated that both $B$ and $C$ equal $A^{-1}$, where we are looking at a non-singular $A$, but I need to remember that $BA$ could be $0$ even if $B$ and $A$ aren't.

Is $AA^{-1}=1$ by definition only?
 

Related to Uniqueness of Inverse Matrices: Proof and Explanation

What is an inverse matrix?

An inverse matrix is a matrix that, when multiplied by the original matrix, results in an identity matrix. It essentially undoes the effects of the original matrix.

Why is an inverse matrix important?

An inverse matrix is important because it allows us to solve systems of linear equations, which are commonly used in fields such as engineering, physics, and economics. It also has applications in computer graphics and cryptography.
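
For instance, a minimal sketch of solving a linear system (assuming Python with numpy; the particular system is made up for illustration):

```python
import numpy as np

# Solve the system  2x + y = 3,  x + 3y = 5  in the matrix form A x = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)                    # preferred in practice
print(x)
print(np.allclose(np.linalg.inv(A) @ b, x))  # x = A^{-1} b gives the same answer
```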

How do you find the inverse of a matrix?

The inverse of a matrix can be found by using various methods, such as elementary row operations, Gaussian elimination, or matrix inversion algorithms. The specific method used depends on the size and complexity of the matrix.
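
A minimal sketch (Python/numpy, with an arbitrarily chosen matrix) of computing an inverse and checking the defining property:

```python
import numpy as np

# numpy computes the inverse via an LU factorization internally.
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```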

Is there always an inverse matrix?

No, not all matrices have an inverse. Only square matrices (matrices with the same number of rows and columns) can have an inverse. Additionally, a square matrix must be non-singular (its determinant cannot be zero) in order to have an inverse.
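
A short sketch of what happens for a singular matrix (Python/numpy; the matrix is an arbitrary example):

```python
import numpy as np

# A singular matrix (determinant 0) has no inverse:
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # second row is twice the first
print(np.linalg.det(A))     # 0.0 (up to rounding)

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("no inverse:", e)
```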

Can an inverse matrix be unique?

Yes. If a square matrix has an inverse, it is unique: there is only one inverse matrix for that particular matrix. However, not all square matrices have an inverse, and a merely one-sided inverse of a non-square matrix need not be unique.
