# Best method for LU decomposition

#### Jameson

Staff member
Hi all,

I realize there might not be a "best method" but I want to ask if anyone has any improvements to the method taught in my class. I've looked at the Wikipedia page on this already.

Let's use the matrix $$\displaystyle \left( \begin{array}{ccc} 2 & -1 & 2 \\ -6 & 0 & -2 \\ 8 & -1 & 5 \end{array} \right)$$.

My teacher said to get the upper triangular matrix, $U$, we should simply find an echelon form of this matrix.

The quickest one for this appears to be $$\displaystyle \left( \begin{array}{ccc} 2 & -1 & 2 \\ 0 & -3 & 4 \\ 0 & 0 & 1 \end{array} \right)$$.

Now we need to find the lower triangular matrix, $L$. The way I was told to do this is by setting up another matrix in the following form: $$\displaystyle \left( \begin{array}{ccc} 1 & 0 & 0 \\ a & 1 & 0 \\ b & c & 1 \end{array} \right)$$.

Finally if we use the fact that $$\displaystyle \left( \begin{array}{ccc} 1 & 0 & 0 \\ a & 1 & 0 \\ b & c & 1 \end{array} \right) \left( \begin{array}{ccc} 2 & -1 & 2 \\ 0 & -3 & 4 \\ 0 & 0 & 1 \end{array} \right) = \left( \begin{array}{ccc} 2 & -1 & 2 \\ -6 & 0 & -2 \\ 8 & -1 & 5 \end{array} \right)$$

then we should be able to solve for $a,b,c$. Is there a better method that anyone uses, or is this one just fine?
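For what it's worth, the multipliers used in the row reduction are exactly $a$, $b$, $c$, so the setup above is quick to check numerically. Here's a sketch in NumPy (my own check, not part of the class method):

```python
import numpy as np

# The matrix from the post and the echelon form U found above.
A = np.array([[2., -1., 2.],
              [-6., 0., -2.],
              [8., -1., 5.]])
U = np.array([[2., -1., 2.],
              [0., -3., 4.],
              [0., 0., 1.]])

# Solving L @ U = A row by row:
# row 2: 2a = -6                 -> a = -3
# row 3: 2b = 8                  -> b = 4
#        (-1)b + (-3)c = -1      -> c = -1
L = np.array([[1., 0., 0.],
              [-3., 1., 0.],
              [4., -1., 1.]])

print(np.allclose(L @ U, A))  # True
```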

Thank you!

#### Ackbach

##### Indicium Physicus
Staff member
Cholesky works faster, but is only applicable to Hermitian positive definite matrices. LU is mostly used when you have multiple right-hand sides of the equation $Ax=b$ to solve. If you have a single equation $Ax=b$, then nothing, absolutely nothing, beats Gaussian elimination with back substitution (GEBS) among "exact" methods. I put exact in quotes because for very large systems, GEBS has worse round-off error than many iterative methods, so the iterative methods can give you more accurate results. In particular, there are some very nice algorithms for sparse matrices that can be useful.
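To illustrate the Cholesky-versus-LU point, here is a sketch using SciPy's factorization routines (the specific matrix is made up; I just add a multiple of the identity to force positive definiteness so Cholesky applies):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve, lu_factor, lu_solve

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A_spd = M @ M.T + n * np.eye(n)   # symmetric positive definite by construction
b = rng.standard_normal(n)

# Cholesky: roughly half the flops of LU, but requires a Hermitian
# positive definite matrix.
x_chol = cho_solve(cho_factor(A_spd), b)

# LU works for any invertible matrix.
x_lu = lu_solve(lu_factor(A_spd), b)

print(np.allclose(x_chol, x_lu))      # both solve the same system
print(np.allclose(A_spd @ x_lu, b))   # and the residual checks out
```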

#### Jameson

Staff member
> Cholesky works faster, but is only applicable to Hermitian positive definite matrices. LU is mostly used when you have multiple right-hand sides of the equation $Ax=b$ to solve. If you have a single equation $Ax=b$, then nothing, absolutely nothing, beats Gaussian elimination with back substitution (GEBS) among "exact" methods. I put exact in quotes because for very large systems, GEBS has worse round-off error than many iterative methods, so the iterative methods can give you more accurate results. In particular, there are some very nice algorithms for sparse matrices that can be useful.

We covered it with the explanation that if you have many equations of the form $Ax=b$ for multiple $b$, then it's faster to use $x=A^{-1}b$ where $A$ is invertible, and when it isn't, or $A$ isn't a square matrix, then $LU$ decomposition can be employed. So if I have a non-square matrix, then the method I outlined seems to be the fastest for decomposition, correct?
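The multiple-right-hand-sides case mentioned above looks like this in practice: factor once, then reuse the factors for each new $b$. A sketch with SciPy (the matrices here are made up for illustration):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Factor once -- this is the expensive O(n^3) step.
lu, piv = lu_factor(A)

# Each new right-hand side then costs only two triangular
# solves, which is O(n^2).
for _ in range(3):
    b = rng.standard_normal(4)
    x = lu_solve((lu, piv), b)
    assert np.allclose(A @ x, b)
```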

#### Ackbach

##### Indicium Physicus
Staff member
Well, I could be wrong, but I was under the impression that $LU$ is faster than $x=A^{-1}b$ even when (obviously) $A$ is invertible. I'd have to count the operations for solving the $L$ and $U$ part, and compare to matrix multiplication.
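If I remember the standard operation counts correctly, they support this impression:

$$\text{factoring } A = LU: \approx \tfrac{2}{3}n^3 \text{ flops}, \qquad \text{forming } A^{-1}: \approx 2n^3 \text{ flops},$$

since inverting amounts to one factorization plus $n$ triangular solves. After the one-time cost, each new $b$ costs roughly $2n^2$ either way (two triangular solves versus one matrix-vector product), so the $LU$ route is about three times cheaper up front, and working with an explicit inverse is generally also less accurate.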

#### Jameson

Staff member
> Well, I could be wrong, but I was under the impression that $LU$ is faster than $x=A^{-1}b$ even when (obviously) $A$ is invertible. I'd have to count the operations for solving the $L$ and $U$ part, and compare to matrix multiplication.

It very well might be. Didn't mean to imply otherwise in my previous post, sorry. My main point was that when $A$ isn't invertible and you are solving for multiple $b$, then some kind of decomposition is necessary.

I'm more interested in the method of finding $L$ and $U$. On Wikipedia there are a couple of methods outlined and in my book there is a different one. I think among all of these I like the one in the OP best. Is that how you would find the $LU$ decomposition?