Proving Self-Adjointness: Product of Operators on Inner-Product Space

  • Thread starter evilpostingmong
In summary, the question is asking for a proof or counterexample for the statement: "the product of any two self-adjoint operators on a finite-dimensional inner-product space is self-adjoint." After examining the definition of self-adjoint and using the given objects, it is shown that a counterexample exists, proving the statement to be false.
  • #1
evilpostingmong

Homework Statement


Prove or give a counterexample: the product of any two self-adjoint
operators on a finite-dimensional inner-product space is
self-adjoint.

Homework Equations


The Attempt at a Solution


I'd say that if we let a diagonal matrix represent T (after all, its transpose, which represents
T*, equals the matrix representing T) and multiply it by a diagonal matrix representing
the transformation S, then we'd end up with a diagonal matrix as the product.
So the product is self-adjoint, since all diagonal matrices are equal to their
transposes.
Another case is an nxn matrix where all entries are equal. This matrix represents T, and its
transpose is T*. Its matrix equals its transpose, so it's self-adjoint. Now multiplying it with another nxn matrix
representing S with all entries equal to each other would obviously produce a matrix with all entries
equal to each other. Or multiplying the matrix for T with a diagonal matrix would produce a diagonal matrix.
 
  • #2
So what about,

[tex]
\left(\begin{matrix} 2 & 0 \\ 0 & 1 \end{matrix}\right) \left(\begin{matrix} 1 & 1 \\ 1& 1 \end{matrix}\right)
[/tex]

I must say I don't quite understand what you're trying to do. What is this transformation S, and how is it relevant? Also, there exist more self-adjoint matrices than just diagonal matrices and matrices where every entry is the same. You're given two self-adjoint operators on a finite-dimensional inner product space. You must prove that a product of two such operators is also self-adjoint on that inner product space.
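That product can be checked directly. A quick sketch in plain Python, with nested lists standing in for matrices (purely illustrative):

```python
# Multiply the two real symmetric (hence self-adjoint) matrices from the
# example above and compare the product with its transpose.

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(M):
    """Transpose a 2x2 matrix."""
    return [[M[j][i] for j in range(2)] for i in range(2)]

A = [[2, 0], [0, 1]]  # diagonal, equal to its transpose
B = [[1, 1], [1, 1]]  # all entries equal, also equal to its transpose

P = matmul(A, B)
print(P)                  # [[2, 2], [1, 1]]
print(transpose(P))       # [[2, 1], [2, 1]]
print(P == transpose(P))  # False: the product is not symmetric
```

The product fails to equal its transpose, so it is not self-adjoint even though both factors are.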

So try to answer these questions mathematically, that is, without using any words.

What is the definition of self adjoint?
What do you want to prove?
 
  • #3
Cyosis said:
So what about,

[tex]
\left(\begin{matrix} 2 & 0 \\ 0 & 1 \end{matrix}\right) \left(\begin{matrix} 1 & 1 \\ 1& 1 \end{matrix}\right)
[/tex]
I must say I don't quite understand what you're trying to do. What is this transformation S, and how is it relevant? Also, there exist more self-adjoint matrices than just diagonal matrices and matrices where every entry is the same. You're given two self-adjoint operators on a finite-dimensional inner product space. You must prove that a product of two such operators is also self-adjoint on that inner product space.
So try to answer these questions mathematically, that is, without using any words.

What is the definition of self adjoint?
What do you want to prove?

Yeah, self-adjoint means Tv = T*v, so T does the same to v as T* does.
You map from V to V to make this possible. I didn't really know that there were other matrices besides
the ones I mentioned that were self-adjoint. That's why I attempted the proof the way I did,
since I thought those were the only types of matrices. I mean, for T = T*, the matrix for T needs to equal
its transpose for T*; that's what I thought. I just started learning this stuff last night, so I guess I jumped
in too quickly. I thought it was as easy as that.
 
  • #4
This doesn't really have anything to do with whether you just learned it or not. This has to do with your methodology. When you start a proof and are given a certain object, you use the definition of that object. You don't add something to it because you think that it may be true. If you want to add some extra property to an object, you'd better prove that it actually holds before assuming it.

I will answer those two questions myself:
1) [itex]A=A^*[/itex]
2)if [itex]A=A^*[/itex] and [itex]B=B^* [/itex] then [itex]AB=(AB)^*[/itex]

You want to prove 2). However, 2) is only true in very special cases; you can see that by using the inner product. Now, all we have used is what you're given; we didn't invent any irrelevant extras that only make a proof more opaque and most likely wrong.

evilpostingmong said:
I mean for T=T*, the matrix for T needs to equal
its transpose for T*, that's what I thought.

It's the conjugate transpose. For real-valued matrices this is the same as the transpose.
 
  • #5
Cyosis said:
This doesn't really have anything to do with whether you just learned it or not. This has to do with your methodology. When you start a proof and are given a certain object, you use the definition of that object. You don't add something to it because you think that it may be true. If you want to add some extra property to an object, you'd better prove that it actually holds before assuming it.

I will answer those two questions myself:
1) [itex]A=A^*[/itex]
2)if [itex]A=A^*[/itex] and [itex]B=B^* [/itex] then [itex]AB=(AB)^*[/itex]

You want to prove 2). However, 2) is only true in very special cases; you can see that by using the inner product. Now, all we have used is what you're given; we didn't invent any irrelevant extras that only make a proof more opaque and most likely wrong.



It's the conjugate transpose. For real-valued matrices this is the same as the transpose.

Ok, thanks! So by the example you gave in post #2, it isn't always true that the product
of two matrices representing self-adjoint operators is self-adjoint.
 
  • #6
The problem is that (AB)* = B*A* = BA if A and B are both self-adjoint, not AB.

Saying that A is self-adjoint means that <Au, v> = <u, Av> for any u and v in the vector space. Now, <ABu, v> = <A(Bu), v> = <Bu, Av> = <u, BAv>, not <u, ABv>.
 
  • #7
Well, yes, one counterexample is enough to discard a lemma. What worries me is that you managed to "prove" that it is true while it is not true.

Using the objects you were given, your proof should have led you to the following.

Let A and B be self adjoint, we want to prove that AB is self adjoint as well, that is [itex]AB=(AB)^*[/itex].

Using the inner product:
[tex]<v,(AB)^*w>=<(AB)v,w>=<Bv,A^*w>=<v,B^{*}A^{*}w>=<v,(BA)w>[/tex]

Therefore [itex] (AB)^*=BA\neq AB[/itex] in general. When does [itex]BA=AB[/itex] hold?
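For contrast, here is a sketch of a pair that does commute. Diagonal matrices always commute, and when A and B commute, (AB)* = B*A* = BA = AB, so the product is self-adjoint (illustrative Python with nested lists):

```python
# If A and B commute, then (AB)^T = B^T A^T = BA = AB, so the product of
# two real symmetric matrices is again symmetric.

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 0], [0, 3]]  # diagonal, hence symmetric
B = [[5, 0], [0, 7]]  # diagonal, hence symmetric

AB = matmul(A, B)
BA = matmul(B, A)
print(AB == BA)  # True: A and B commute
print(AB)        # [[10, 0], [0, 21]], equal to its own transpose
```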
 
  • #8
evilpostingmong said:
Ok, thanks! So by the example you gave in post #2, it isn't always true that the product
of two matrices representing self-adjoint operators is self-adjoint.

Right. That's exactly what the question is asking you to find out. You cannot assume you know the answer, you either have to find a counterexample to show that it is not true in some specific case, or make a logical argument (aka find a proof) to show that it must be true in all situations.
 

Related to Proving Self-Adjointness: Product of Operators on Inner-Product Space

1. What is self-adjointness?

Self-adjointness refers to a property of operators on an inner-product space, where the operator is equal to its own adjoint. This means that the operator and its adjoint have the same action on the vectors in the inner-product space.

2. How do you prove self-adjointness for a product of operators?

To prove self-adjointness for a product of operators, you must show that the product equals its own adjoint. Using the reversal property of adjoints, (AB)* = B*A* = BA when A and B are self-adjoint, so the product is self-adjoint exactly when the operators commute; in general it is not. This can be checked using the definition of the adjoint and properties of inner products.

3. Why is self-adjointness important?

Self-adjointness is important because it guarantees that the operator has real eigenvalues and orthogonal eigenvectors. This makes it easier to analyze and solve problems involving the operator, as well as providing important insights into the properties of the operator.

4. Can all operators on an inner-product space be proven to be self-adjoint?

No, not all operators on an inner-product space are self-adjoint. An operator T is self-adjoint only if it equals its own adjoint, that is, if <Tu, v> = <u, Tv> for all vectors u and v.

5. What are some applications of self-adjointness?

Self-adjoint operators have many applications in physics, engineering, and mathematics. They are used to solve differential equations, analyze quantum systems, and model physical phenomena. They also play a crucial role in the study of functional analysis and linear algebra.
