Does Commutativity Affect Linearity?

In summary, the conversation discusses the case of two operators, T and T', that commute with each other, so their commutator is zero and they can be measured at the same time. The question is whether anything follows about these operators being linear or non-linear. It is argued that commutativity alone says nothing about linearity, and that the linearity of T depends on how the constant ##\gamma## in its definition is interpreted: as a constant translation ("plus ##\gamma##"), which is non-linear, or as the constant operator ##\gamma \cdot \operatorname{id} = \gamma \cdot I## ("times ##\gamma##"), which is linear.
  • #1
SemM
Hi, I have in a previous thread discussed the case where:

\begin{equation}
TT' = T'T
\end{equation}

and someone said that this was a case of non-linear operators. Evidently, they commute, so their commutator is zero and therefore they can be measured at the same time. What, however, makes them non-linear?

Thanks!
 
  • #2
SemM said:
Hi, I have in a previous thread discussed the case where:

\begin{equation}
TT' = T'T
\end{equation}

and someone said that this was a case of non-linear operators. Evidently, they commute, so their commutator is zero and therefore they can be measured at the same time. What, however, makes them non-linear?

Thanks!
First you tell me what ##T## is, and then I'll tell you whether it is linear or not, not the other way around. E.g. if ##T\; , \;T'## stand for matrices, then they are linear. The fact that they commute doesn't allow any conclusions besides that they commute. To say they are linear or non-linear just from the equation ##TT'=T'T## is nonsense, unless they have one fixed, unique meaning which I'm not aware of.
 
  • #3
fresh_42 said:
First you tell me what ##T## is, and then I'll tell you whether it is linear or not, not the other way around. E.g. if ##T\; , \;T'## stand for matrices, then they are linear. The fact that they commute doesn't allow any conclusions besides that they commute. To say they are linear or non-linear just from the equation ##TT'=T'T## is nonsense, unless they have one fixed, unique meaning which I'm not aware of.

Thanks for that. I am not going to point out who said what in other posts and will get right to the point:

##T = \bigg(i\hbar \frac{d}{dx} + \gamma\bigg)##
##T' = \bigg(-i\hbar \frac{d}{dx} + \gamma\bigg)##

where ##\gamma## is a constant.

and ##TT' = T'T##. About linearity, I am not sure how to prove it.
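A quick symbolic sanity check of the commutation (a sketch only, assuming ##\gamma## acts by multiplication by a constant; the sympy setup and names below are illustrative, not from any reference):

```python
# Sketch: verify TT'ψ = T'Tψ symbolically, reading γ as multiplication by a constant.
import sympy as sp

x = sp.symbols('x', real=True)
hbar, gamma = sp.symbols('hbar gamma', real=True)
psi = sp.Function('psi')(x)

T  = lambda f: sp.I*hbar*sp.diff(f, x) + gamma*f    # T  = iħ d/dx + γ
Tp = lambda f: -sp.I*hbar*sp.diff(f, x) + gamma*f   # T' = -iħ d/dx + γ

print(sp.simplify(T(Tp(psi)) - Tp(T(psi))))  # 0, i.e. [T, T'] = 0 for any smooth ψ
```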

However, regarding another property, Hermiticity: a proof that these operators are Hermitian has not yet been given. I am going to follow this approach for T:

http://www.colby.edu/chemistry/PChem/notes/MomentumHermitian.pdf

to check that.

Thanks
 
  • #4
SemM said:
Thanks for that. I am not going to point out who said what in other posts and will get right to the point:

##T = i\hbar d/dx - \gamma##
##T' = -i\hbar d/dx - \gamma##

where ##\gamma## is a constant.
In this case, ##\gamma## makes it non-linear, unless ## \gamma = 0##.
$$T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) - \gamma = \left( {i}{\hbar} \dfrac{d}{dx} f \right) + \left( {i}{\hbar} \dfrac{d}{dx} g \right) - \gamma $$ but $$T(f)+T(g) = \left( {i}{\hbar} \dfrac{d}{dx} f \right) + \left( {i}{\hbar} \dfrac{d}{dx} g \right) - 2\gamma $$ I learned to call this affine-linear, because it is still flat like a straight line, just not through the origin; therefore ##T(0) = -\gamma##, which is in general ##\neq 0##, and this makes it non-linear.

Can you prove why linearity of (an arbitrary) ##T## would imply ##T(0)=0## ?

Edit: ##\dfrac{i}{\hbar} \longrightarrow i \hbar ## corrected.
 
  • #5
fresh_42 said:
In this case, ##\gamma## makes it non-linear, unless ## \gamma = 0##.
$$T(f+g) = \dfrac{i}{\hbar} \dfrac{d}{dx} (f+g) - \gamma = \left( \dfrac{i}{\hbar} \dfrac{d}{dx} f \right) + \left( \dfrac{i}{\hbar} \dfrac{d}{dx} g \right) - \gamma $$ but $$T(f)+T(g) = \left( \dfrac{i}{\hbar} \dfrac{d}{dx} f \right) + \left( \dfrac{i}{\hbar} \dfrac{d}{dx} g \right) - 2\gamma $$ I learned to call this affine-linear, because it is still flat like a straight line, just not through the origin; therefore ##T(0) = -\gamma##, which is in general ##\neq 0##, and this makes it non-linear.

Can you prove why linearity of (an arbitrary) ##T## would imply ##T(0)=0## ?
Fantastic, let me try to answer this tomorrow! Thanks Fresh!
 
  • #6
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how one reads ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma.f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma.f = \gamma \cdot f##, which is linear).
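A small illustration of the two readings (a sketch only, using sympy and generic smooth functions ##f, g##; only the second reading passes the additivity test):

```python
# Sketch of the two readings of γ:
#   "plus γ"  : T(f) = iħ f' + γ      (affine, fails additivity)
#   "times γ" : T(f) = iħ f' + γ·f    (linear)
import sympy as sp

x = sp.symbols('x', real=True)
hbar, gamma = sp.symbols('hbar gamma', real=True)
f, g = sp.Function('f')(x), sp.Function('g')(x)

T_affine = lambda u: sp.I*hbar*sp.diff(u, x) + gamma      # constant translation
T_linear = lambda u: sp.I*hbar*sp.diff(u, x) + gamma*u    # multiplication by γ

print(sp.simplify(T_affine(f + g) - T_affine(f) - T_affine(g)))  # -gamma, not 0
print(sp.simplify(T_linear(f + g) - T_linear(f) - T_linear(g)))  # 0
```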
 
  • #7
fresh_42 said:
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how one reads ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma.f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma.f = \gamma \cdot f##, which is linear).
Thanks for this Fresh!

##\gamma## is a real physical constant. However, you mention ##\gamma \cdot \operatorname{id} = \gamma \cdot I##; does this mean that ##\gamma## should be a complex number in order to make T linear? Also, what does "id" stand for in this context?
 
  • #8
SemM said:
##\gamma## is a real physical constant. However, you mention ##\gamma \cdot \operatorname{id} = \gamma \cdot I##; does this mean that ##\gamma## should be a complex number in order to make T linear?
It doesn't matter whether ##\gamma## is real or complex. What matters is what it means. For example, let ##\gamma = -i \hbar c## for some constant number ##c \in \mathbb{C}##, which means ##c## can also be real; it doesn't matter. Then what is ##T.e^{cx}=T(e^{cx})\,##? Is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma = i \hbar c\cdot (e^{cx} -1)
$$
or is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma \cdot e^{cx} = i \hbar c\cdot (e^{cx} - e^{cx}) = 0
$$
The first case is non-linear, because ##\gamma ## is a constant translation away from the origin, whereas the second case is linear and ##e^{cx}## is an eigenvector of ##T## with eigenvalue ##0##.

Both possibilities are usually denoted simply by ##T = i \hbar \dfrac{d}{dx} + \gamma##. But the first case means ##T.f=T(f)=i \hbar f' + \gamma## and the second means ##T.f =T(f)= i \hbar f' + \gamma \cdot f##. In the second case, the operator is ##T= i \hbar \dfrac{d}{dx} + \gamma = i \hbar \dfrac{d}{dx} + \gamma \cdot I## with the identity operator ##I=\operatorname{id}=1##. Whether ##T## is linear or not depends on how ##\gamma## is applied to a function ##f##: is ##\gamma## not applied to ##f## at all (it is merely added), or does it act as ## f \longmapsto \gamma \cdot f##? The latter is linear.
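To make the two cases concrete (a sketch only; the symbols and the choice of ##e^{cx}## mirror the example above):

```python
# Sketch: with γ = -iħc, e^{cx} is annihilated by T under the "times γ" reading,
# but not under the "plus γ" reading.
import sympy as sp

x = sp.symbols('x', real=True)
hbar, c = sp.symbols('hbar c')
gamma = -sp.I*hbar*c
f = sp.exp(c*x)

times_gamma = sp.I*hbar*sp.diff(f, x) + gamma*f   # second reading: γ·f
plus_gamma  = sp.I*hbar*sp.diff(f, x) + gamma     # first reading:  + γ

print(sp.simplify(times_gamma))  # 0 -> e^{cx} is an eigenvector with eigenvalue 0
print(sp.simplify(plus_gamma))   # equals iħc·(e^{cx} - 1), not identically 0
```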
 
  • #9
fresh_42 said:
It doesn't matter whether ##\gamma## is real or complex. What matters is what it means. For example, let ##\gamma = -i \hbar c## for some constant number ##c \in \mathbb{C}##, which means ##c## can also be real; it doesn't matter. Then what is ##T.e^{cx}=T(e^{cx})\,##? Is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma = i \hbar c\cdot (e^{cx} -1)
$$
or is it
$$
T.e^{cx}=T(e^{cx})= i \hbar \dfrac{d}{dx} e^{cx} + \gamma.e^{cx} = i \hbar c \cdot e^{cx} + \gamma \cdot e^{cx} = i \hbar c\cdot (e^{cx} - e^{cx}) = 0
$$
The first case is non-linear, because ##\gamma ## is a constant translation away from the origin, whereas the second case is linear and ##e^{cx}## is an eigenvector of ##T## with eigenvalue ##0##.

Both possibilities are usually denoted simply by ##T = i \hbar \dfrac{d}{dx} + \gamma##. But the first case means ##T.f=T(f)=i \hbar f' + \gamma## and the second means ##T.f =T(f)= i \hbar f' + \gamma \cdot f##. In the second case, the operator is ##T= i \hbar \dfrac{d}{dx} + \gamma = i \hbar \dfrac{d}{dx} + \gamma \cdot I## with the identity operator ##I=\operatorname{id}=1##. Whether ##T## is linear or not depends on how ##\gamma## is applied to a function ##f##: is ##\gamma## not applied to ##f## at all (it is merely added), or does it act as ## f \longmapsto \gamma \cdot f##? The latter is linear.
Thanks Fresh, ##\gamma## is indeed applied to the function f. This becomes inevitably nonlinear. The solution to the ODE ##TT'\psi= 0## therefore involves complex constants, and it has no Hermitian counterpart. This becomes a nontrivial case, where no physically sensible properties can be derived.
 
  • #10
Dear Fresh42, can one relate non-linearity to asymmetry of the operator? I noticed in an attachment given by Dr Du in another thread that linear operators for which the integral

\begin{equation}
\langle\psi, -i\hbar d/dx\phi\rangle - \langle-i\hbar d/dx\phi, \psi\rangle
\end{equation}

vanishes are considered symmetric.

In this thread, considering T for instance in the expression above, it gives:

\begin{equation}
\langle\psi, (i\hbar d/dx+\gamma)\phi\rangle - \langle (i\hbar d/dx+\gamma) \phi, \psi\rangle
\end{equation}

and this is still symmetric.
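As a side check of the usual symmetry condition ##\langle\psi, A\phi\rangle = \langle A\psi, \phi\rangle## (a sketch only; the interval and test functions below are arbitrary choices that vanish at the boundary, not taken from the attachment):

```python
# Sketch: on [0, 1], with functions vanishing at the boundary, A = -iħ d/dx
# satisfies <ψ, Aφ> - <Aψ, φ> = 0 (integration by parts kills the boundary term).
import sympy as sp

x = sp.symbols('x', real=True)
hbar = sp.symbols('hbar', real=True, positive=True)

psi = x*(1 - x)        # vanishes at x = 0 and x = 1
phi = x**2*(1 - x)     # vanishes at x = 0 and x = 1

A = lambda f: -sp.I*hbar*sp.diff(f, x)
inner = lambda u, v: sp.integrate(sp.conjugate(u)*v, (x, 0, 1))

print(sp.simplify(inner(psi, A(phi)) - inner(A(psi), phi)))  # 0
```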

Do you have a reference on symmetry and operator linearity, or are these two unrelated?

Thanks!
 
  • #11
SemM said:
Do you have a reference on symmetry and operator linearity, or are these two unrelated?
Symmetry (or anti-symmetry) is unrelated to the linearity of the operators. An operator is a mapping, a function. It can have several properties, and linearity is one of them. On the other hand, the symmetry in the expressions above is a relation between two operators which has nothing to do with the special nature of either of them. If ## \langle a,b \rangle = \langle b,a \rangle## then you can call ##a## and ##b## symmetric, regardless of what ##a## or ##b## are (as long as the product ##\langle\; , \; \rangle## is defined, of course).

It is as if someone said: "Tarjei Bø and Johannes Thingnes Bø are brothers." This is a symmetric relationship and it doesn't matter if we say Tarjei is Johannes Thingnes' brother or the other way around. It is a property between them. Now whether they are biathletes or not is a completely different matter.

And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.
 
  • #12
fresh_42 said:
Symmetry (or anti-symmetry) is unrelated to the linearity of the operators. An operator is a mapping, a function. It can have several properties, and linearity is one of them. On the other hand, the symmetry in the expressions above is a relation between two operators which has nothing to do with the special nature of either of them. If ## \langle a,b \rangle = \langle b,a \rangle## then you can call ##a## and ##b## symmetric, regardless of what ##a## or ##b## are (as long as the product ##\langle\; , \; \rangle## is defined, of course).

It is as if someone said: "Tarjei Bø and Johannes Thingnes Bø are brothers." This is a symmetric relationship and it doesn't matter if we say Tarjei is Johannes Thingnes' brother or the other way around. It is a property between them. Now whether they are biathletes or not is a completely different matter.

And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.
Thanks for the excellent illustration of symmetry. So one can conclude that symmetry of two operators has no implications for the properties of either operator itself, only for their relationship to one another; and if they are, for instance, related in a matrix as pairs a, b, symmetry may be used in some fashion, should it be necessary or simplifying. But this is, according to what I read from your answer, not directly relevant to the properties of the operator itself and to what it does to a function in its domain.

About the notation: these two operators are parts of a Hamiltonian that appears in a paper. Based on your point, I am thinking that, given that T must be paired with T' (thus TT'), the identity part I becomes redundant, and the result is:

\begin{equation}
(h^2 \frac{d^2}{dx^2} - 2i \gamma \frac{d}{dx} + \gamma^2)\psi = 0
\end{equation}

However, in order to study this whole operator, the two operators T and T' were investigated to see how the overall operator behaves.

If this is not sufficient, however, and some parts remain in question, please let me know!

Cheers
 
  • #13
fresh_42 said:
And by the way, the longer I think about your operator ##i \hbar \dfrac{d}{dx} - \gamma## the more I have the feeling that it has to be read as ##i \hbar \dfrac{d}{dx} - \gamma\cdot I##, in which case it is a linear operator. But I don't have the book, so I can't know for sure.

Hi Fresh, I have looked into this in Kreyszig's Functional Analysis, and he writes the same as you here:

quote:
##"A^{*} = \beta(\alpha Q-i/\alpha D)"##
##"A = \beta(\alpha Q+i/\alpha D)"##

and

"##A^{*} A=\pi/h\big(\alpha^2Q^2+1/\alpha^2D^2-h/2\pi Ĩ \big)##"
"##A^{*} A=\pi/h\big(\alpha^2Q^2+1/\alpha^2D^2+h/2\pi Ĩ \big)##"

and hence:
##AA^{*} - A^{*} A=Ĩ##
end quote

So applied to the operators here:
T = ##i \hbar \dfrac{d}{dx} - \gamma##
T* = ##-i \hbar \dfrac{d}{dx} - \gamma##

using the identity matrix on the constant (Why?) one gets:

T = ##i \hbar \dfrac{d}{dx} - \gamma Î##
T* =##-i \hbar \dfrac{d}{dx} - \gamma Î##

and then
##AA^{*} - A^{*} A=Ĩ##

as in Kreyszig. This means T and T* are one another's Hilbert-adjoint operators. But does it mean that if they form a Hilbert-adjoint pair, they are self-adjoint as well?
 
  • #14
Hey SemM.

Linear operators have the property that they are invertible [operators have to be] and that they have the properties f(A + B) = f(A) + f(B) and f(sA) = s*f(A).

The stuff with Hermitian operators means that you have complex numbers, so the scalar is s = a + b*i and you have to make it all consistent with complex numbers, but the idea is the same as in the real-number case.

Non-linear operators will have some sort of Taylor series expansion where you have a spectrum that is applied to multiple powers of the diagonalized operator itself.

There is a result in linear algebra where you have f(O) = P*f(D)*P_inverse, where D contains the eigenvalues and P contains the eigenvectors, with f() being some transformation [including non-linear ones] that you apply to the operator O itself. As long as O can be properly diagonalized, you can transform the operator with some function.

There is a result in operator algebras which generalizes this to an operator on a Hilbert space [which may be infinite-dimensional, with the relevant sums converging to finite real numbers rather than positive or negative infinity], and you can expand a non-linear operator in the same way you do a Taylor series expansion of a function - but you need its spectrum [eigenvalues and eigenvectors], and you can take enough terms to get an approximation if it is highly non-linear.

I'd read about diagonalization first, then about applying a function to the eigenvalues, before looking at the result of the Taylor-like expansion in a book on operator algebras or C*-algebras, and see what a non-linear operator does to the eigenvalues. Engineering and pure mathematics books can both do this for you.

You can do things like e^A where A is an operator, and provided you have the right conditions on A, any smooth/analytic function can be applied to the operator and you can calculate it to some precision.
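A finite-dimensional illustration (a sketch only; the matrix is an arbitrary symmetric example and scipy is assumed available): compute ##f(A) = P f(D) P^{-1}## for ##f = \exp## and compare it with a library matrix exponential.

```python
# Sketch: f(A) = P f(D) P^{-1} for a diagonalizable matrix, here with f = exp,
# compared against scipy's matrix exponential.
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # symmetric, hence orthogonally diagonalizable

evals, P = np.linalg.eigh(A)        # A = P @ diag(evals) @ P.T
exp_A = P @ np.diag(np.exp(evals)) @ P.T

print(np.allclose(exp_A, expm(A)))  # True, up to floating-point error
```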

When you look at the derivative terms for the non-linear operator, they don't drop off after the linear term - just like you would expect for a non-linear function.
 
  • #15
chiro said:
Linear operators have the property that they are invertible [operators have to be]
Sorry, but where did you read this?
 
  • #16
SemM said:
But does it mean that if they form a Hilbert-adjoint pair, they are self-adjoint as well?
Look up the definition of "self-adjoint operators."
 
  • #17
Mark44 said:
Look up the definition of "self-adjoint operators."

I did, and it does not imply that.
 
  • #18
SemM said:
I did, and it does not imply that.
Hilbert-adjoint is a bit of a strange name for it. Drop "Hilbert" here. Imagine an operator ##T \, : \, H_1 \longrightarrow H_2## and its adjoint operator ##T^*##. Then ##\langle Tx,y \rangle_{H_2} = \langle x,T^*y \rangle_{H_1}##. Now what does self-adjoint require, and what could it mean?
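A finite-dimensional sketch of this defining relation (the matrix and vectors below are random and purely illustrative):

```python
# Sketch: for matrices, the adjoint is the conjugate transpose, and
# <Tx, y> = <x, T*y> holds for all vectors x, y.
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3)) + 1j*rng.standard_normal((3, 3))
x = rng.standard_normal(3) + 1j*rng.standard_normal(3)
y = rng.standard_normal(3) + 1j*rng.standard_normal(3)

T_star = T.conj().T                  # adjoint = conjugate transpose
inner = lambda a, b: np.vdot(a, b)   # <a, b> = sum(conj(a_i) * b_i)

print(np.isclose(inner(T @ x, y), inner(x, T_star @ y)))  # True
```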
 
  • #19
fresh_42 said:
Hilbert-adjoint is a bit of a strange name for it. Drop "Hilbert" here. Imagine an operator ##T \, : \, H_1 \longrightarrow H_2## and its adjoint operator ##T^*##. Then ##\langle Tx,y \rangle_{H_2} = \langle x,T^*y \rangle_{H_1}##. Now what does self-adjoint require, and what could it mean?

A self-adjoint operator requires a symmetric relation to its own Hermitian counterpart, so that ##TT^*=T^*T##, and therefore if these were two observables, they would be measurable at the same time, by the commutation relation ##TT^*-T^*T=0##.
 
  • #20
SemM said:
A self-adjoint operator requires a symmetric relation to its own Hermitian counterpart, so that ##TT^*=T^*T##, and therefore if these were two observables, they would be measurable at the same time, by the commutation relation ##TT^*-T^*T=0##.
No, it doesn't have anything to do with commutativity. Self-adjoint means ##T=T^*##; the operator is adjoint to itself. This requires ##H_1=H_2##. Whether ##[T,T^*]=0## or not is a different question and not always true. Of course it holds if ##T=T^*##.
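A small matrix illustration of the distinction (a sketch only; the matrices are arbitrary examples): a rotation matrix satisfies ##[T, T^*] = 0## without being self-adjoint, while a Jordan block satisfies neither.

```python
# Sketch: self-adjoint (T == T*) is strictly stronger than [T, T*] = 0.
import numpy as np

def commutes_with_adjoint(T):
    Ts = T.conj().T
    return np.allclose(T @ Ts, Ts @ T)

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by 90 degrees: [R, R*] = 0, but R != R*
J = np.array([[0.0, 1.0],
              [0.0, 0.0]])    # nilpotent Jordan block: J != J* and [J, J*] != 0

print(np.allclose(R, R.conj().T), commutes_with_adjoint(R))  # False True
print(np.allclose(J, J.conj().T), commutes_with_adjoint(J))  # False False
```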
 
  • #21
fresh_42 said:
No, it doesn't have anything to do with commutativity. Self-adjoint means ##T=T^*##; the operator is adjoint to itself. This requires ##H_1=H_2##. Whether ##[T,T^*]=0## or not is a different question and not always true. Of course it holds if ##T=T^*##.
Ok , thanks for clarifying that!
 
  • #23
chiro said:
For the invertibility of a linear operator:

https://en.wikipedia.org/wiki/Continuous_linear_operator

Note the inverse property - you can't have that unless it's invertible.

The notation on that page uses ##A^{-1}## to represent the pre-image of the operator ##A##, not its inverse.

Note that the linear operator that maps a space to the zero vector is trivially continuous and clearly not invertible.

In general, there is no requirement for a continuous linear operator, or any continuous function, to be one-to-one, hence invertible.
 
  • #24
Check that the closure of the kernel implies invertibility.
 
  • #25
##\mathcal{H} \longrightarrow \{0\}## has a closed kernel and is definitely not invertible. The best argument against requiring inverses is that the dual space ##\mathcal{H}^* =\{\, T\, : \,\mathcal{H} \rightarrow \mathbb{R}\text{ or }\mathbb{C} \text{ linear and continuous}\,\}## is again a complete vector space, so the zero functional is really needed, and closure under addition is incompatible with invertibility.
 
  • #26
fresh_42 said:
One further remark:
In case ##\gamma ## is actually ##\gamma \cdot \operatorname{id} = \gamma \cdot I## the operator ##T## becomes linear, as the sum of two linear operators is linear again. In this case we have:
$$
T(f+g) = {i}{\hbar} \dfrac{d}{dx} (f+g) + \gamma \cdot I (f+g)= {i}{\hbar} \dfrac{d}{dx}f + \gamma \cdot f + {i}{\hbar} \dfrac{d}{dx}g + \gamma \cdot g = T(f)+T(g)
$$
so it all depends on how one reads ##\gamma##: as a constant translation "plus ##\gamma##" (##\gamma.f = \gamma##, which is non-linear) or as the constant linear operator "times ##\gamma##" (##\gamma.f = \gamma \cdot f##, which is linear).
You should also specify: linear over what? The complexes, the reals, etc.
 
  • #27
chiro said:
Check that the closure of the kernel implies invertibility.
A function is invertible iff it is one-to-one. For a linear operator this is equivalent to the kernel being the zero vector only.
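A small sympy sketch of this equivalence for square matrices (the matrices below are arbitrary examples):

```python
# Sketch: for a square matrix, a trivial kernel (nullity 0) is equivalent to
# invertibility; a nontrivial kernel rules out an inverse.
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])   # det = -2: kernel is {0}, A.inv() exists
B = sp.Matrix([[1, 2], [2, 4]])   # det = 0: one-dimensional kernel, no inverse

print(A.nullspace(), A.det() != 0)   # []  True
print(B.nullspace(), B.det() != 0)   # [Matrix([[-2], [1]])]  False
```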
 
  • #28
Do you know how to link the nullity to the invertibility of the matrix? Do you understand why I said that?
 
  • #29
Post #27:
PeroK said:
A function is invertible iff it is one-to-one. For a linear operator this is equivalent to the kernel being the zero vector only.

Post #28:
chiro said:
Do you know how to link the nullity to the invertibility of the matrix?
Didn't @PeroK just do that in the post before yours?
 
  • #30
The point is that it is invertible - which is what I said a long time ago.

Do you agree or not?

If there is no nullity in the matrix and it is square it must be invertible.

Do you agree or not?
 
  • #31
chiro said:
The point is that it is invertible - which is what I said a long time ago.

Do you agree or not?

If there is no nullity in the matrix and it is square it must be invertible.

Do you agree or not?
Your post doesn't make any sense. I am banning you from this thread.
 
  • #32
I agree it makes life harder, but I suggest we not block people just because they don't understand what they are saying, even if they don't realize it. At some point they may get the message. Just a suggestion; of course, I am suggesting work for other people! Feel free to ignore.
 
  • #33
mathwonk said:
I agree it makes life harder, but I suggest we not block people just because they don't understand what they are saying, even if they don't realize it. At some point they may get the message. Just a suggestion; of course, I am suggesting work for other people! Feel free to ignore.
This is not the right place to discuss this. The question about invertibility had already been off-topic, since it has nothing to do with linearity, which is the subject of this thread.

The OP's question was: Does commutativity have something to do with linearity?
The answer is: no.

Thread closed.
 

Related to Does Commutativity Affect Linearity?

1. What is commutativity?

Commutativity is a property of a mathematical operation: the two operands can be exchanged without changing the result. In other words, if an operation such as addition or multiplication is commutative, the order in which the two numbers are combined does not affect the final answer.

2. What is linearity?

Linearity is a property of a function, equation, or operator. A linear map satisfies additivity, f(x + y) = f(x) + f(y), and homogeneity, f(cx) = c·f(x) for any constant c. The homogeneity part is also known as the "scaling" or "proportionality" property.

3. How does commutativity affect linearity?

Commutativity does not affect linearity. Linearity is a property of a single function or operator, while commutativity concerns whether two operations (or operators) can be applied in either order with the same result. Whether two operators commute says nothing about whether either of them is linear.

4. Can a function be commutative and linear at the same time?

Yes. For example, the function f(x) = 2x is linear, and the multiplication it is built from is commutative (2·x is the same as x·2). It is linear because f(x + y) = f(x) + f(y) and, when the input is multiplied by a constant, the output is multiplied by the same constant.

5. Are all mathematical operations commutative?

No, not all mathematical operations are commutative. Addition and multiplication are commutative, but subtraction and division are not. For example, 3 - 2 is not the same as 2 - 3, and 10 ÷ 5 is not the same as 5 ÷ 10. In general, operations involving subtraction and division are not commutative.
