# Extend to an orthonormal basis for R^3

#### Petrus

##### Well-known member
Hello MHB,
(I hope the picture is readable.)

This is an example from my book (I am supposed to find a singular value decomposition). My question: when the book uses Gram-Schmidt to extend the basis, it uses $$\displaystyle (u_1,u_2,e_3)$$, but I would use $$\displaystyle (u_1,u_2,e_1)$$, since $e_1$ is already orthogonal to $u_2$, which makes the Gram-Schmidt computation easy! Does my method work as well?
Edit: I am pretty sure it works, since the result is orthonormal to the other vectors! I just want it confirmed.
Regards,
$$\displaystyle |\pi\rangle$$


#### HallsofIvy

##### Well-known member
MHB Math Helper
Re: Extend to an orthonormal basis for R^3

When you are asked to derive an orthonormal set from a set of vectors, different orders will give different results but they will all be orthonormal sets. Any of those would be a correct answer. (Unless the order is specifically mentioned in the problem.)
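This claim is easy to check numerically. A minimal sketch of classical Gram-Schmidt (the spanning set below is a made-up example, not the one from Petrus's book): feeding the same vectors in two different orders yields two different, but equally valid, orthonormal bases.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a sequence of vectors via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        # subtract the projections onto the orthonormal vectors found so far
        w = v - sum((v @ b) * b for b in basis)
        if np.linalg.norm(w) > 1e-12:      # skip linearly dependent inputs
            basis.append(w / np.linalg.norm(w))
    return basis

# the same spanning set, fed in two different orders
vecs = [np.array([1.0, 1.0, 0.0]),
        np.array([1.0, 0.0, 1.0]),
        np.array([0.0, 1.0, 1.0])]
b1 = gram_schmidt(vecs)
b2 = gram_schmidt(vecs[::-1])
```

Here `b1` and `b2` start with different vectors (so the bases differ), yet each has an identity Gram matrix, i.e. each is orthonormal.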

#### Petrus

##### Well-known member
Re: Extend to an orthonormal basis for R^3

Thanks for the answer, and thanks for taking the time! Have a nice day!

Regards,
$$\displaystyle |\pi\rangle$$

#### Deveno

##### Well-known member
MHB Math Scholar
Petrus, what you say is true, but in the case of a standard basis vector we have:

$\text{proj}_{\mathbf{v}}(\mathbf{e}_j) = \dfrac{\mathbf{v}\cdot\mathbf{e}_j} {\mathbf{v}\cdot \mathbf{v}}\mathbf{v}$

If $\mathbf{v}$ is already a unit vector, this becomes:

$v_j\mathbf{v}$.

So if we pick $\mathbf{v}_3 = \mathbf{e}_3$ (as your text does), then Gram-Schmidt gives:

$\mathbf{u}_3 = \mathbf{e}_3 - \text{proj}_{\mathbf{u}_1}(\mathbf{e}_3) - \text{proj}_{\mathbf{u}_2}(\mathbf{e}_3)$

$= (0,0,1) -\dfrac{1}{\sqrt{6}}\left(\dfrac{2}{\sqrt{6}}, \dfrac{1}{\sqrt{6}},\dfrac{1}{\sqrt{6}}\right) - \dfrac{1}{\sqrt{2}}\left(0,\dfrac{-1}{\sqrt{2}},\dfrac{1}{\sqrt{2}}\right)$

$= (0,0,1) - \left(\dfrac{1}{3},\dfrac{1}{6},\dfrac{1}{6}\right) - \left(0,\dfrac{-1}{2},\dfrac{1}{2}\right)$

$= \left(\dfrac{-1}{3},\dfrac{1}{3},\dfrac{1}{3}\right)$

which upon normalization clearly becomes the $\mathbf{u}_3$ in your text.

Yes, you are correct that if we pick $\mathbf{v}_3 = \mathbf{e}_1$, then one of the projection terms we subtract is 0, and we get:

$\mathbf{u}_3 = (1,0,0) - \dfrac{2}{\sqrt{6}}\left(\dfrac{2}{\sqrt{6}}, \dfrac{1}{\sqrt{6}},\dfrac{1}{\sqrt{6}}\right)$

$= (1,0,0) - \left(\dfrac{2}{3},\dfrac{1}{3},\dfrac{1}{3}\right)$

$= \left(\dfrac{1}{3},\dfrac{-1}{3},\dfrac{-1}{3}\right)$

This is the negative (upon normalization) of the vector your book found, and is clearly also perpendicular to the plane spanned by $\{\mathbf{u}_1,\mathbf{u}_2\}$.

Now it's largely a matter of preference which $U$ you use: one will be orientation-preserving, and one will be orientation-reversing. I'm a bit surprised your text chose the orientation-reversing matrix, but as you can see, both methods work out the same (up to a sign difference), and it turns out the sign of the 3rd column of $U$ doesn't matter, because the 3rd row of $\Sigma$ is 0.
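The two computations above can be checked numerically. A minimal sketch, using the $u_1, u_2$ from the worked example: extending with $e_3$ (the book's choice) and with $e_1$ (Petrus's choice) produce the same unit normal up to sign.

```python
import numpy as np

# the two orthonormal vectors from the example
u1 = np.array([2.0, 1.0, 1.0]) / np.sqrt(6)
u2 = np.array([0.0, -1.0, 1.0]) / np.sqrt(2)

def extend(v):
    """One Gram-Schmidt step: remove the u1, u2 components of v, normalize."""
    w = v - (v @ u1) * u1 - (v @ u2) * u2
    return w / np.linalg.norm(w)

u3_book = extend(np.array([0.0, 0.0, 1.0]))  # book's choice: e_3
u3_alt  = extend(np.array([1.0, 0.0, 0.0]))  # Petrus's choice: e_1
```

Both results are orthogonal to $u_1$ and $u_2$; they differ only by a sign, matching the $\pm\left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right)$ found by hand.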

#### Petrus

##### Well-known member
Thanks a LOT, I always like your posts! Thanks for taking the time, and have a nice day!

Regards,
$$\displaystyle |\pi\rangle$$