
Self-adjoint transformation

smile

New member
Oct 15, 2013
19
Hello everyone

I hope someone can check the solution for me.

Here is the problem:
Let $V=V_1\oplus V_2$ and let $f$ be the projection of $V$ onto $V_1$ along $V_2$ (i.e. if $v=v_1+v_2$ with $v_i\in V_i$, then $f(v)=v_1$). Prove that $f$ is self-adjoint iff $\langle V_1,V_2\rangle=0$.
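To make sure I understand the statement, here is a small example I tried (just my own illustration, not part of the assigned problem): in $\mathbb{R}^2$ with the dot product, take $V_1=\operatorname{span}\{(1,0)\}$ and $V_2=\operatorname{span}\{(1,1)\}$. Decomposing $(x,y)=(x-y,0)+(y,y)$ gives

$$f(x,y)=(x-y,0), \qquad [f]=\begin{pmatrix}1&-1\\0&0\end{pmatrix},$$

which is not symmetric, so this $f$ is not self-adjoint; and indeed $\langle(1,0),(1,1)\rangle=1\neq0$, so the subspaces are not orthogonal.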

My solution is this:

Proof. "$\Rightarrow$": Let $v_1\in V_1$ and $v_2\in V_2$. Since $f$ is self-adjoint, $\langle f(v_1),v_2\rangle=\langle v_1,f^*(v_2)\rangle=\langle v_1,f(v_2)\rangle$. Because $f(v_1)=v_1$ and $f(v_2)=0$, it follows that $\langle v_1,v_2\rangle=\langle v_1,0\rangle=0$.

"$\Leftarrow$" let $v_1\in V_1, v_2\in V_2$,
$<f(v_1),v_2>=<v_1,v_2>=0$, since $<v_1,v_2>=0$
$<v_1,f(v_2)>=<v_1,0>=0$
hence $<f(v_1),v_2>=<v_1,f(v_2)>$, $f$ is self adjoint.

It seems like something might be wrong with my proof, but I really don't know what. I hope someone can check it.

Thanks

Opalg

MHB Oldtimer
Staff member
Feb 7, 2012
2,713
The "$\Rightarrow$" proof is fine. To show the converse implication, you need to take two vectors, $v_1\oplus v_2$ and $w_1\oplus w_2$ say, in $V_1\oplus V_2$, and check that $\langle f(v_1\oplus v_2),w_1\oplus w_2\rangle = \langle v_1\oplus v_2,f(w_1\oplus w_2)\rangle.$