#### find_the_fun

##### Active member

- Thread starter
- #1

- Admin
- #2
However, this sort of inverse only works with square matrices: for $A^{-1}A$ and $AA^{-1}$ to both make sense and equal the same identity matrix, $A$ must be square. Since a vector (aside from $1 \times 1$ matrices, also known as numbers) is not a square matrix, you cannot do this kind of inversion. The key here is that you're trying to achieve some sort of multiplicative identity, $I$ in this case, and you can't do that with non-square matrices like vectors.
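As a quick illustration (a Python/NumPy sketch, just one way to see this), asking for the inverse of a non-square array fails outright:

```python
import numpy as np

# A 3x1 column vector is not square, so it has no two-sided inverse.
v = np.array([[1.0], [2.0], [3.0]])

try:
    np.linalg.inv(v)          # inv() is only defined for square matrices
    inverted = True
except np.linalg.LinAlgError:
    inverted = False

print(inverted)  # False
```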

- #3

First of all, for "division" to even make sense, you need some kind of multiplication, first. And this multiplication has to be of the form:

vector times vector = same kind of vector.

It turns out that this is only possible in certain dimensions: 1, 2, 4 (and, if you allow certain "strangenesses", 8). This is a very "deep" theorem, due to Frobenius (and, for the non-associative dimension-8 case, Hurwitz), and requires a bit of high-powered algebra to prove.
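For the dimension-2 case this is just complex-number arithmetic: identify a pair $(a, b)$ with $a + bi$, and every nonzero pair has an inverse. A small Python sketch:

```python
# In dimension 2, a pair (a, b) multiplies like the complex number a + bi,
# and every nonzero pair has a multiplicative inverse -- so division works.
z = complex(3, 4)
w = complex(1, 2)
q = z / w                      # "vector divided by vector" in dimension 2
print(abs(q * w - z) < 1e-9)   # multiplying back recovers z (up to rounding)
```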

Now matrices only have such a multiplication when they are $n \times n$ (otherwise we get:

matrix times matrix = matrix of different size, which turns out to matter).

However, it turns out we can have "bad matrices", like so:

$AB = 0$ where neither $A$ nor $B$ are the 0-matrix. For example:

$A = \begin{bmatrix}1&0\\0&0 \end{bmatrix}$

$B = \begin{bmatrix}0&0\\0&1 \end{bmatrix}$
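These "bad matrices" (zero divisors) are easy to check numerically; a NumPy sketch with the two matrices above:

```python
import numpy as np

# Neither factor is the zero matrix...
A = np.array([[1, 0], [0, 0]])
B = np.array([[0, 0], [0, 1]])

# ...yet their product is:
print(A @ B)  # [[0 0]
              #  [0 0]]
```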

Now suppose, just for the sake of argument, we had a matrix we could call:

$\dfrac{1}{A}$.

Such a matrix should satisfy:

$\dfrac{1}{A}A = I$, the identity matrix.

Then:

$B = IB = \left(\dfrac{1}{A}A\right)B = \dfrac{1}{A}(AB) = \dfrac{1}{A}0 = 0$

which is a contradiction, since $B \neq 0$

In other words, "dividing by such a matrix" is rather like dividing by zero: it leads to nonsense.

It turns out that the condition:

$AB = 0, A,B \neq 0$

is equivalent to:

$Av = 0$ for some vector $v \neq 0$.
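In matrix language, such an $A$ is exactly a singular matrix: its determinant is zero and it sends some nonzero vector to zero. A NumPy check, using the $A$ from above:

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 0.0]])
v = np.array([0.0, 1.0])   # a nonzero vector in A's null space

print(np.linalg.det(A))    # 0.0 -- A is singular
print(A @ v)               # [0. 0.] -- A sends v to the zero vector
```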

Let's see why this is important by comparing matrix multiplication with scalar multiplication:

If $rA = rB$, we have:

$\dfrac{1}{r}(rA) = \left(\dfrac{1}{r}r\right)A = 1A = A$

and also:

$\dfrac{1}{r}(rA) = \dfrac{1}{r}(rB) = \left(\dfrac{1}{r}r\right)B = 1B = B$

provided $r \neq 0$ (which is almost every scalar).

This allows us to conclude $A = B$, in other words, the assignment:

$A \to rA$ is one-to-one.

However, if we take matrices:

$RA = RB$ does NOT imply $A = B$, for example let

$R = \begin{bmatrix} 1&0\\0&0 \end{bmatrix}$

$A = \begin{bmatrix} 0&0\\0&1 \end{bmatrix}$

$B = \begin{bmatrix} 0&0\\0&2 \end{bmatrix}$

Then we see, $RA = RB = 0$, but clearly $A$ and $B$ are different matrices.
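This failure of cancellation is easy to verify; a NumPy sketch with the matrices above:

```python
import numpy as np

R = np.array([[1, 0], [0, 0]])
A = np.array([[0, 0], [0, 1]])
B = np.array([[0, 0], [0, 2]])

print(np.array_equal(R @ A, R @ B))  # True: both products are the zero matrix
print(np.array_equal(A, B))          # False: yet A and B differ
```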

So "left-multiplication by a matrix" is no longer 1-1, which means we can't uniquely "undo" it (which is what, at its heart, "division" is: the "un-doing" of multiplication).

I hope this made sense to you.

- #4

> The answer is: you CAN, but only in certain dimensions, under certain limited circumstances.
>
> ...
>
> vector times vector = same kind of vector.
>
> It turns out that this is only possible in certain dimensions: 1, 2, 4 (and, if you allow certain "strangenesses", 8).
>
> ...
>
> In other words, "dividing by such a matrix" is rather like dividing by zero: it leads to nonsense.
>
> ...
>
> So "left-multiplication by a matrix" is no longer 1-1, which means we can't uniquely "undo" it (which is what, at its heart, "division" is: the "un-doing" of multiplication).

And despite all this, one can divide by almost all square matrices of any dimension, not just those with 1, 2, 4, 8 or 16 components.

- Admin
- #5

Regular division is denoted a÷b.

The same operator is used for the reciprocal: ÷b means 1/b.

Typically operations for matrices are denoted with a square block around the operator.

In particular the matrix inverse is ⌹B.

And matrix division is: A⌹B. This means the inverse of B applied to A; since order matters for matrices, that is $B^{-1}A$.
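In numerical libraries this kind of matrix division is usually computed by solving a linear system rather than forming an explicit inverse. A NumPy sketch (the invertible matrix below is made up for illustration), interpreting the quotient as $B^{-1}A$, i.e. solving $BX = A$:

```python
import numpy as np

B = np.array([[2.0, 0.0],
              [0.0, 4.0]])
A = np.array([[2.0],
              [8.0]])

# "A divided by B" in the sense of solving B @ X = A, i.e. X = inv(B) @ A:
X = np.linalg.solve(B, A)
print(X)  # [[1.]
          #  [2.]]
```

This only works because B is invertible; for a singular B (like the "bad matrices" above), `solve` raises an error, which is the numerical face of "you can't divide by that matrix."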