Exploring Linear Combinations in Quantum Mechanics

In summary, the conversation discusses the definition of linear combinations in the context of operators and Hilbert spaces. It is clarified that a linear combination involves only addition and scalar multiplication, so a product of states or operators is not a linear combination.
  • #1
SeM
Hi, I read that a linear combination of a state, ##\psi##, can be written as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can this, however, also be a valid linear combination?

\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}

Thanks!
 
  • #3
Thanks. Is there a definition of this? Such as "linear ..." something?

Thanks
 
  • #5
No, it's really me who is writing it. I want to say that ##AA' = T## and ##A'A = T##, and both ##A## and ##A'## are linearly dependent on some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
 
  • #6
SeM said:
No, it's really me who is writing it. I want to say that ##AA' = T## and ##A'A = T##, and both ##A## and ##A'## are linearly dependent on some other operators, so that makes ##T## also dependent on some other operator, given that ##A## and ##A'## are a "linear ..." of ##T##.
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You must not conclude any linear dependencies from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but are not linearly dependent on anything (except ##0##).
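(A quick numerical sketch of this counterexample, not from the thread, using numpy:)

```python
import numpy as np

# The two 3x3 matrices from the counterexample above
A = np.array([[0, 1, 0],
              [0, 0, 0],
              [0, 0, 0]])
Ap = np.array([[0, 0, 0],
               [0, 0, 0],
               [0, 0, 1]])

# They multiply to T = 0 from either side ...
assert (A @ Ap == 0).all()
assert (Ap @ A == 0).all()

# ... yet neither is a scalar multiple of the other:
# stacking them as rows gives rank 2, i.e. they are linearly independent.
assert np.linalg.matrix_rank(np.vstack([A.ravel(), Ap.ravel()])) == 2
```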
 
  • #7
fresh_42 said:
##AA'=A'A## means ##A## and ##A'## commute: ##[A,A']=0##. This is a non-linear property. You must not conclude any linear dependencies from this equation alone. E.g. take ##A= \begin{bmatrix}0&1&0\\0&0&0\\0&0&0\end{bmatrix}## and ##A'=\begin{bmatrix}0&0&0\\0&0&0\\0&0&1\end{bmatrix}##. They both multiply to ##T=0## from either side, but are not linearly dependent on anything (except ##0##).
Thanks! Is this why ##xA'A = xAA'## is the only allowed combination with another operator ##x##?
 
  • #8
SeM said:
Thanks! Is this why ##xA'A = xAA'## is the only allowed combination with another operator ##x##?
No. However, I'm not quite sure what you mean. Usually small letters indicate vectors and capital letters matrices or operators in this context. As you had ##A'A=T=AA'##, you surely also have ##xA'A=xT=xAA'## and ##A'Ax=Tx=AA'x## for all vectors ##x##, and also ##XA'A=XT=XAA'## and ##A'AX=TX=AA'X## for any operator, i.e. matrix, ##X##. I got the impression, also from your other threads, that you should review linear algebra, since it seems you confuse fundamental concepts which are necessary to understand functional analysis. I have explained some basics about operators and such here:
https://www.physicsforums.com/insights/tell-operations-operators-functionals-representations-apart/
but the concept of linearity, subspaces and linear independence, which is crucial to understand functional analysis, is not explained. One could say that linear algebra is often finite dimensional and functional analysis infinite dimensional. Of course this is rather short and provocative, but there is some truth in it. So the basics of linear algebra are important, since they occur everywhere.

Linearity is addition and scalar multiplication (stretching and compressing vectors). What you wrote was ##AA'##, which is a multiplication of mappings and thus not linear. It is linear in a single argument, because the distributive laws hold: ##A(A'+B')=AA'+AB'## and ##(A+B)A'=AA'+BA'##. But this requires one factor to be fixed. It is therefore called bilinear (i.e. linear in both arguments). Taken as a whole, that is, as a multiplication in contrast to addition, ##AA'## is a non-linear concept.
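(A quick numerical sketch of these distributive laws, my own illustration rather than part of the thread, using random numpy matrices for the operators:)

```python
import numpy as np

# Random 3x3 matrices standing in for the operators A, A', B, B'
rng = np.random.default_rng(0)
A, Ap, B, Bp = (rng.standard_normal((3, 3)) for _ in range(4))

# Linear in each argument separately (the distributive laws):
assert np.allclose(A @ (Ap + Bp), A @ Ap + A @ Bp)
assert np.allclose((A + B) @ Ap, A @ Ap + B @ Ap)

# But the product is not linear in the pair as a whole:
# (A+B)(A'+B') picks up the cross terms AB' + BA'.
lhs = (A + B) @ (Ap + Bp)
rhs = A @ Ap + B @ Bp
assert not np.allclose(lhs, rhs)
assert np.allclose(lhs, rhs + A @ Bp + B @ Ap)
```

The cross terms ##AB' + BA'## are exactly what the word "bilinear" keeps track of.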

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha , \beta ## resp.), whereas ##\Psi = \alpha \psi \times \beta \psi ## is not, because it is a multiplication.
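(The distinction can also be seen numerically. A sketch of my own, treating ##\psi## as a plain numpy vector and reading ##\times## as elementwise multiplication: the linear combination scales linearly when ##\psi## is rescaled, while the product scales quadratically.)

```python
import numpy as np

alpha, beta = 2.0, -0.5
psi = np.array([1.0, 2.0, 3.0])   # stand-in for the state psi

combo = alpha * psi + beta * psi        # linear combination
prod = (alpha * psi) * (beta * psi)     # elementwise product, for comparison

c = 3.0  # rescale psi by c
# The linear combination responds linearly ...
assert np.allclose(alpha * (c * psi) + beta * (c * psi), c * combo)
# ... but the product responds quadratically, so it is not linear in psi:
assert np.allclose((alpha * c * psi) * (beta * c * psi), c**2 * prod)
```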
 
  • #9
By the way, ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
 
  • #10
martinbn said:
By the way, ##\Psi = \alpha \psi + \beta \psi = (\alpha+\beta)\psi##. Anyway, why is this question in the differential geometry forum?
Good question. Moved to linear algebra.
 
  • #11
fresh_42 said:
It is therefore called bilinear (i.e. linear in both arguments). Taken as a whole, that is, as a multiplication in contrast to addition, ##AA'## is a non-linear concept.

This also answers your introductory question: ##\Psi = \alpha \psi + \beta \psi## is a linear combination (addition and stretching by ##\alpha , \beta ## resp.), whereas ##\Psi = \alpha \psi \times \beta \psi ## is not, because it is a multiplication.
Thanks fresh_42 for the thorough outline on this. Indeed I am confused. This operator is a differential operator with a constant, and it appears twice, as in the factorized form of the operator ##H## of the Schrödinger equation.

So I was trying to say something about the two factorized components of some form of Hamiltonian (similar to the Schrödinger ##H##), and whether they were bounded/unbounded and/or linear. But it turns out they are neither bounded nor unbounded, because they both yield complex norms, so they can simply be called complex operators. And if I have:

##B = (ih\, d/dx + g)## and ##B' = (-ih\, d/dx + g)##, they combine as ##BB' = B'B##, but are neither bilinear nor linear?
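(As a side check, not from the original thread: with constant ##g## the two orderings do agree. A symbolic sketch with sympy, writing ##h## for the constant in ##ih\,d/dx##:)

```python
import sympy as sp

x, h, g = sp.symbols('x h g', real=True)
f = sp.Function('f')(x)

# B = ih d/dx + g and B' = -ih d/dx + g, applied to a test function f(x)
B = lambda u: sp.I * h * sp.diff(u, x) + g * u
Bp = lambda u: -sp.I * h * sp.diff(u, x) + g * u

BBp = sp.expand(B(Bp(f)))   # B B' f
BpB = sp.expand(Bp(B(f)))   # B' B f

# The cross terms +ihg f' and -ihg f' cancel because g is constant,
# and both orderings give h**2 f'' + g**2 f, so BB' = B'B here:
assert sp.simplify(BBp - BpB) == 0
assert sp.expand(BBp - (h**2 * sp.diff(f, x, 2) + g**2 * f)) == 0
```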

Thanks!
 
  • #12
SeM said:
Hi, I read that a linear combination of a state, ##\psi##, can be written as:

\begin{equation}
\Psi = \alpha \psi + \beta \psi
\end{equation}

where ##\alpha## and ##\beta## are arbitrary constants.

Can this, however, also be a valid linear combination?

\begin{equation}
\Psi = \alpha \psi \times \beta \psi
\end{equation}
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).
 
  • #13
Mark44 said:
The term "linear combination" is one that is defined in just about every linear algebra textbook. It is used in conjunction with another concept, the span of a set of vectors. The linear combination concept also comes up in differential equations in which the set of all solutions of a homogeneous differential equation is the set of linear combinations of a basic set of solutions.

Since you are working with operators and Hilbert spaces and such (based on other threads), I would strongly advise you to review the more basic concepts of linear algebra and differential equations.

Also, don't write $\alpha$ and similar -- this doesn't do anything on this site. Instead, use ##\alpha## (for inline TeX) or $$\alpha$$ (for standalone TeX).

Thanks, will do!
 
  • #14
fresh_42 said:
I have read it today, and it was very useful and pedagogically well written. I learned a few things I was not aware of. Thanks. There is a typo, however, at:
"
A common example are matrix groups ##G## and vector spaces ##V## where the operation is the application of the transformation represented by the matrix. An orthogonal matrix (representing a rotation) ##g=\begin{bmatrix}\cos\varphi & \sin\varphi \\ -\sin\varphi & \cos\varphi\end{bmatrix}## (3) transforms a two dimensional vector in its by ##\varphi## rotated version.
"

it says "in its", which is missing some word after it; or should it say "in it"?

Cheers and thanks for the link
 
  • #15
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, the "by ##\varphi## rotated" version of itself. So maybe "in its own version which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" by the prepositional phrase "its by ##\varphi## rotated version", and the genitive "version of itself" by the possessive "its version".
 
  • #16
fresh_42 said:
Maybe it's not the best English, but it is correct: a vector is transformed into a new one. The new one is the same as the old one, just rotated by an angle of ##\varphi##, so it is, in a way, the "by ##\varphi## rotated" version of itself. So maybe "in its own version which is rotated by ##\varphi##" might have been better. I simply substituted the subordinate clause "which is rotated by ##\varphi##" by the prepositional phrase "its by ##\varphi## rotated version", and the genitive "version of itself" by the possessive "its version".

I think the article was excellent anyway!

Thanks!
 

Related to Exploring Linear Combinations in Quantum Mechanics

1. What is a linear combination in quantum mechanics?

A linear combination in quantum mechanics refers to a mathematical operation where two or more wavefunctions, each multiplied by a scalar coefficient, are added together to create a new wavefunction. This allows for the exploration of different states and properties of a quantum system.

2. How do linear combinations relate to superposition in quantum mechanics?

Superposition is a key concept in quantum mechanics, which describes how a quantum system can exist in multiple states simultaneously. Linear combinations allow us to create and manipulate these superposition states by combining different wavefunctions.
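As an illustration (my own, using numpy column vectors for the basis states ##|0\rangle## and ##|1\rangle##), an equal superposition is just a linear combination with coefficients of modulus ##1/\sqrt{2}##:

```python
import numpy as np

# Computational basis states |0> and |1> as vectors
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# A superposition is a linear combination of basis states:
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# |alpha|^2 + |beta|^2 = 1, so the state is normalized:
norm_sq = np.vdot(psi, psi).real
assert abs(norm_sq - 1.0) < 1e-12
```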

3. What are the applications of exploring linear combinations in quantum mechanics?

Exploring linear combinations in quantum mechanics has a wide range of applications, including in quantum computing, quantum cryptography, and quantum sensing. Understanding and manipulating linear combinations also plays a crucial role in the development of new quantum technologies.

4. Can linear combinations be used to solve problems in quantum mechanics?

Yes, linear combinations are a fundamental tool in solving problems in quantum mechanics. They allow us to express complex wavefunctions in terms of simpler ones, making it easier to calculate and understand the behavior of quantum systems.
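As a small sketch (made-up numbers, with ordinary ##\mathbb{R}^3## vectors standing in for states), expressing a vector in a given basis amounts to solving a linear system for the coefficients of the combination:

```python
import numpy as np

# Columns of M form a (made-up) basis of R^3; v is the target vector
M = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 1.0])

# Find coefficients c with  c1*b1 + c2*b2 + c3*b3 = v,  i.e.  M @ c = v
c = np.linalg.solve(M, v)

assert np.allclose(M @ c, v)          # v really is this linear combination
assert np.allclose(c, [0.0, 2.0, 1.0])
```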

5. Are there any limitations to using linear combinations in quantum mechanics?

While linear combinations are a powerful tool in quantum mechanics, they do have limitations. Not every process is linear: measurement, for example, cannot be described as a linear operation on states. Additionally, in a finite-dimensional system the number of linearly independent wavefunctions that can appear in a combination is limited by the dimension of the Hilbert space being studied.
