Entropy in isolated quantum systems

In summary, the von Neumann entropy of an isolated quantum system is constant under unitary time evolution, since it depends only on the eigenvalues of the density matrix, which unitary transformations leave unchanged. If the system undergoes a relaxation process, its entropy need not remain constant, because the environment impacts the system and the effective time evolution is no longer unitary.
  • #1
kith
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has unitary time evolution, which implies that the von Neumann entropy remains constant at all times. (Contrast this with the entropy increase due to mixing in isolated classical systems, for example.)

So whenever the entropy of a system increases, there must be other systems or an environment present to compensate for this increase. For example, the entropy of the whole universe should not be allowed to increase.

Any Thoughts?
 
  • #2
kith said:
I'm still puzzled by Loschmidt's paradox. [...]

Hmmm ... I am not sure why you say unitary time evolution implies that the von Neumann entropy is constant for an isolated quantum system at all times. Consider the case of a molecule whose highest-energy normal mode has been excited by a photon. At t=0, the energy from the photon is localized as one quantum of excitation in a single vibrational mode (pure state, zero von Neumann entropy). Over time, intramolecular vibrational redistribution (IVR), driven by the anharmonic couplings of the other normal modes to the initially excited one, will cause that excitation energy to become "randomly" distributed over all the modes in the molecule (mixed state, non-zero von Neumann entropy). ("Randomly" is colloquial use only here ... I doubt that it is really a stochastic process, particularly since we are already in the context of microscopic reversibility.)

Have I somehow misunderstood something in the above example?
 
  • #3
SpectraCat said:
Hmmm .. I am not sure why you say unitary time-evolution implies that the von-Neumann entropy is constant for an isolated quantum system at all times.
Isolated implies unitary time evolution. The von Neumann entropy depends only on the eigenvalues of the density matrix, and these are not changed by any unitary transformation.

Your example is a relaxation process; such processes usually can't be described by unitary time evolution. As an environment, you have at least the electromagnetic field present.

[Typical environments are much larger than the system of interest, so the impact of a change in the system on the environment can be neglected. This leads to an effective time evolution of the system density matrix which is not unitary (see the Lindblad equation, http://en.wikipedia.org/wiki/Lindblad_equation, for example).]
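This invariance is easy to check numerically: the eigenvalues of U ρ U⁺ equal those of ρ, so S is unchanged. A minimal Python/NumPy sketch (the random mixed state and the random unitary are arbitrary illustrative choices):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S = -Tr(rho ln rho), computed from the eigenvalues of rho
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # drop zero eigenvalues (0 ln 0 = 0)
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)

# A random 4x4 mixed state: rho = A A^+ normalized to unit trace
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# A random unitary from the QR decomposition of a complex matrix
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)

S_before = von_neumann_entropy(rho)
S_after = von_neumann_entropy(U @ rho @ U.conj().T)
print(S_before, S_after)  # equal up to floating-point error
```

The same check works for any dimension, since a unitary conjugation is a similarity transformation and cannot change the spectrum.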
 
  • #4
In discussions about the classical, thermodynamic arrow of time, many people didn't consider Loschmidt's paradox to be a paradox at all. So what about the quantum case? Maybe I should have chosen a more spectacular title like "the quantum arrow of time" to get more comments. ;)
 
  • #5
kith said:
Isolated implies unitary time evolution. The von Neumann entropy depends only on the eigenvalues of the density matrix, and these are not changed by any unitary transformation.

Your example is a relaxation process; such processes usually can't be described by unitary time evolution. [...]

Ok ... I was not trying to create an example of a relaxation process. I was only trying to fix the total internal energy of the isolated quantum system, and have all of the energy start out in a single mode. It seems to me that that particular starting state (all of the energy localized in a single mode) ought to have a lower entropy than a state where the energy is arbitrarily distributed over several modes of the molecule, since the latter case will likely have several degeneracies (assuming the density of states is sufficiently high). The IVR process by which the vibrational energy becomes redistributed over the various modes is well understood and should be describable by unitary time evolution. In terms of the density matrix, it will simply evolve with time as the populations of the different states change. Based on the definition of the von Neumann entropy:

[tex]S = -k_B \, \mathrm{Tr}[\rho \ln \rho][/tex]

it seems like it would only stay constant if the normal modes of the molecule were strictly orthogonal so that the density matrix remained diagonal at all times. However, since the IVR process proceeds through cross-anharmonicities that couple the nominally orthogonal normal modes, this will not be the case.

Note that I am not that familiar with the ins and outs of density matrix formulations, or of von Neumann entropy, so it is entirely possible that I have made a mistake in the above analysis. If I have, I would be happy to learn more.
 
  • #6
kith said:
In quantum mechanics, an isolated system has a unitarian time-evolution which implies that the (von-Neumann-) entropy remains constant at all times.

I don't understand von Neumann entropy very well, but I do know that if you have a particle described by a gaussian wavefunction (momentum and position wave functions are Fourier transforms of each other), the sum of the information entropies of position and momentum increases in time.
 
  • #7
kith said:
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has a unitary time evolution which implies that the von Neumann entropy remains constant at all times. [...]
First, Loschmidt's paradox is a paradox in classical statistical mechanics, not quantum mechanics.

Second, you are right that unitary evolution of the whole system conserves entropy.

Third, when a quantum subsystem is entangled to another subsystem, then entropy of each subsystem may increase with time. If you see a paradox with it, note that the entropy of the whole system is not the sum of entanglement entropies of its subsystems.
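This third point can be made concrete with a two-qubit toy example: a Bell state is pure as a whole (entropy zero), while each qubit taken by itself is maximally mixed (entropy ln 2). A small NumPy sketch, with the partial trace written via einsum as one convenient way to do it:

```python
import numpy as np

def entropy(rho):
    # von Neumann entropy from the eigenvalues of rho (in nats)
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Bell state (|00> + |11>)/sqrt(2): a pure state of the two-qubit system
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)  # full density matrix, S = 0

# Reduced density matrix of qubit A: trace out qubit B
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

print(entropy(rho))    # ~0: the whole system is pure
print(entropy(rho_A))  # ln 2 ~ 0.693: the subsystem is maximally mixed
```

The sum of the two subsystem entropies is 2 ln 2, while the entropy of the whole system is zero, illustrating that the whole-system entropy is not the sum of the entanglement entropies of its parts.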
 
  • #8
kith said:
I'm still puzzled by Loschmidt's paradox.



Check out http://en.wikipedia.org/wiki/Mutual_information#Applications_of_mutual_information

Basically, from an information-theoretic point of view, in the classical case, there are two entropies, the marginal entropy that you get from assuming the particle energies are uncorrelated, and the mutual entropy that you get from correlations of the energies among the particles. Liouville's theorem says that their sum remains constant. If you start out with uncorrelated energies, all the entropy is in the marginal entropy. As time goes by, collisions occur and the energies become correlated; marginal entropy increases, the mutual entropy decreases. The thermodynamic entropy is the marginal entropy - specifying a thermodynamic state gives you no information about correlations. So the thermodynamic entropy increases, while the total information entropy remains constant. When Boltzmann derived his H-theorem, this was what he was basically doing - assuming that the particle energies were forever uncorrelated, which gave an increase in entropy.
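This bookkeeping is easy to illustrate for two discrete variables: the sum of the marginal entropies equals the joint entropy plus the mutual information, so if the fine-grained joint entropy is conserved, growing correlations show up as growing marginal entropy. A toy sketch with an arbitrarily chosen correlated binary pair:

```python
import numpy as np

def H(p):
    # Shannon entropy of a discrete distribution (in nats)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Joint distribution of two correlated binary variables (arbitrary numbers)
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

Px, Py = P.sum(axis=1), P.sum(axis=0)

H_joint = H(P.ravel())        # fine-grained ("total") entropy
H_marginal = H(Px) + H(Py)    # entropy if correlations are ignored
I = H_marginal - H_joint      # mutual information, >= 0

print(H_marginal, H_joint, I)
```

Here the marginals are uniform, so H_marginal = ln 4, while the correlations make H_joint smaller by exactly the mutual information I.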
 
  • #9
Maybe I should emphasize my main questions. I'm not sure if the answers lie entirely in quantum mechanics, maybe my knowledge of classical statistical mechanics needs some refreshing.

1) Is it possible that a time asymmetry arises in an isolated quantum system? If not, this should mean that there is no universal arrow of time, and also no heat death of the universe. Shouldn't it?
2) The arrow of time we perceive in our systems of interest would ultimately be due to interactions with the environment, which lead to non-unitary time evolution. Why does this arrow always point in the same direction; that is, why do typical environments always increase the entropy of the system?

[Thanks for all replies so far, I need some time to answer them.]
 
  • #10
Rap said:
I don't understand von Neumann entropy very well, but I do know that if you have a particle described by a gaussian wavefunction (momentum and position wave functions are Fourier transforms of each other), the sum of the information entropies of position and momentum increases in time.
Just to understand you correctly: by the "information entropy of position" you mean the Shannon entropy of the probability distribution given by
[tex]|\Psi(\vec{r})|^2[/tex]
(according to wikipedia, there are technical difficulties to extend the concept of the Shannon entropy to the continuous case but I guess that's not important for us now)

I have to think about this. Right now, I don't see where the time asymmetry in this broadening of wave packets comes from, since the time evolution is unitary.

However, this only remotely touches my initial question. The von Neumann entropy S is a measure for the purity of states. An isolated wave packet remains a pure state at all times, so S=0 at all times.
 
  • #11
Demystifier said:
First, the Loschmidt's paradox is a paradox in classical statistical mechanics, not quantum mechanics.
Why not? There's the quantum mechanical H-theorem, which leads to the second law of thermodynamics.

Demystifier said:
Third, when a quantum subsystem is entangled to another subsystem, then entropy of each subsystem may increase with time. If you see a paradox with it, note that the entropy of the whole system is not the sum of entanglement entropies of its subsystems.
I had completely forgotten to consider that. Thanks for pointing it out!
 
  • #13
SpectraCat said:
Ok ... I was not trying to create an example of a relaxation process. I was only trying to fix the total internal energy of the isolated quantum system, and have all of the energy start out in a single mode. [...]

The full density matrix of an isolated system evolves as [tex] \rho \rightarrow U \rho U^+ [/tex]. It follows that [tex] S = - \mbox{tr}(\rho \log{\rho}) [/tex] is invariant. For example, the eigenvalues [tex] p_i [/tex] of [tex] \rho [/tex] and [tex] U \rho U^+ [/tex] are identical and [tex] S = \sum_i - p_i \log{p_i} [/tex] (just compute the trace by going to the diagonal basis).

Of course, as others here have mentioned, the entropy of a subsystem certainly can increase even if the entire system is evolving unitarily.
 
  • #14
Physics Monkey said:
The full density matrix of an isolated system evolves as [tex] \rho \rightarrow U \rho U^+ [/tex]. It follows that [tex] S = - \mbox{tr}(\rho \log{\rho}) [/tex] is invariant. [...]

Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time-dependent ... i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes depend on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramification of this is that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low, however molecules belonging to symmetric point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...
 
  • #15
kith said:
Just to understand you correctly: by the "information entropy of position" you mean the Shannon entropy of the probability distribution given by
[tex]|\Psi(\vec{r})|^2[/tex]
(according to wikipedia, there are technical difficulties to extend the concept of the Shannon entropy to the continuous case but I guess that's not important for us now)

Yes. If [itex]\Psi_x(x,t)[/itex] is the position wavefunction (one dimensional), and [itex]\Psi_p(p,t)[/itex] is the momentum wavefunction, and the associated probabilities [itex]|\Psi_x(x,t)|^2[/itex] and [itex]|\Psi_p(p,t)|^2[/itex] are Gaussian, such that Heisenberg uncertainty holds exactly at time t=0, then the entropies are:
[tex]H_x = -\int_{-\infty}^\infty |\Psi_x(x,t)|^2 \ln(|\Psi_x(x,t)|^2)dx[/tex]
[tex]H_p = -\int_{-\infty}^\infty |\Psi_p(p,t)|^2 \ln(|\Psi_p(p,t)|^2)dp[/tex]
and (setting [itex]\hbar=1[/itex]):
[tex]H_x+H_p = \ln(e\pi\sqrt{1+\tau^2})[/tex]
where
[tex]\tau=\frac{t}{2m\sigma^2}[/tex]
where [itex]\sigma^2[/itex] is the variance of the position Gaussian.
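This closed form can be cross-checked against the standard fact that a Gaussian of variance v has differential entropy (1/2) ln(2 pi e v). For a freely spreading packet (with hbar = 1), the position variance grows as sigma^2 (1 + tau^2) while the momentum distribution is time-independent with variance 1/(4 sigma^2), saturating the uncertainty relation at t = 0. A quick numerical check, with arbitrary parameter values:

```python
import numpy as np

def gaussian_entropy(var):
    # differential (Shannon) entropy of a Gaussian with variance var, in nats
    return 0.5 * np.log(2 * np.pi * np.e * var)

sigma2 = 0.7  # initial position variance (arbitrary)
tau = 1.3     # dimensionless time t / (2 m sigma^2)

var_x = sigma2 * (1 + tau**2)  # spreading position distribution
var_p = 1 / (4 * sigma2)       # momentum distribution: time-independent

H_sum = gaussian_entropy(var_x) + gaussian_entropy(var_p)
formula = np.log(np.e * np.pi * np.sqrt(1 + tau**2))

print(H_sum, formula)  # agree
```

At tau = 0 the sum reduces to ln(e pi), its minimum; only H_x grows with time, which is where the increase comes from.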
 
  • #16
kith said:
Why not? There's the quantum mechanical H-Theorem which leads to the second law of thermodynamics.
OK, I admit that there is also a quantum analogue of the Loschmidt paradox.
 
  • #17
kith said:
1) Is it possible, that a time asymmetry arises in an isolated quantum system?
Certainly yes. It simply means that there exists a solution of the Schrodinger equation psi(x,t) such that |psi(x,t)| is not equal to |psi(x,-t)|. In fact, most solutions are such.

kith said:
2) The arrow of time we perceive in our systems of interest, would ultimately be due to interactions with the environment which lead to non-unitarian time evolution.
I don't think so. I think it is due to the initial condition, which, for some reason, was "ordered".

kith said:
Why does this arrow always point in the same direction respectively why do typical environments always increase the entropy of the system?
Perhaps the best answer is provided by
http://arxiv.org/abs/1011.4173
This is a classical explanation, but the idea can easily be extended to the quantum case as well. Indeed, the classical explanation above has been partially motivated by a work in quantum mechanics
http://arxiv.org/abs/0802.0438 [Phys.Rev.Lett.103:080401,2009]

Note in particular (in the first paper above) that there are many inequivalent definitions of entropy. Some of them do not depend on time, while others do. So one must be very careful about what one means by "entropy" when one claims that entropy does or does not increase with time.
 
  • #18
kith said:
I'm still puzzled by Loschmidt's paradox. [...]

So whenever the entropy of a system increases there must be other systems or an environment present to compensate this increase. For example, the entropy of the whole universe should not be allowed to increase.
Systems observable by us are never isolated, so the entropy increases.

The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.
 
  • #19
kith said:
2) The arrow of time we perceive in our systems of interest, would ultimately be due to interactions with the environment which lead to non-unitarian time evolution. Why does this arrow always point in the same direction respectively why do typical environments always increase the entropy of the system?

Because the interaction with the environment consists of a huge number of small contributions of nearly independent tiny subsystems of the environment, so that the law of large numbers applies (as evidenced by the techniques used in nonequilibrium statistical mechanics).
 
  • #20
SpectraCat said:
However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant.
I'm not very familiar with molecular dynamics, but it usually goes like this: if you consider an isolated system like your molecule and you start in an eigenstate of the corresponding Hamiltonian, the system stays in this state forever, because all eigenstates are stationary states. From a physical point of view, this is counterintuitive because in real experiments, you always have the electromagnetic field present which gives you finite lifetimes and relaxation rates.
 
  • #21
A. Neumaier said:
The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.
Ok. So the picture of a heat death of the universe is wrong?

A. Neumaier said:
Because the interaction with the environment consists of a huge number of small contributions of nearly independent tiny subsystems of the environment, so that the law of large numbers applies (as evidenced by the techniques used in nonequilibrium statistical mechanics).
Can you elaborate on this? I still don't get how the time-symmetric interaction laws of these subsystems can lead to a time-asymmetric law like the H-theorem, and how the law of large numbers explains this.
 
  • #22
For the other answers -especially to Rap and Demystifier- I need more time.
 
  • #23
kith said:
I'm not very familiar with molecular dynamics, but it usually goes like this: if you consider an isolated system like your molecule and you start in an eigenstate of the corresponding Hamiltonian, the system stays in this state forever, because all eigenstates are stationary states. From a physical point of view, this is counterintuitive because in real experiments, you always have the electromagnetic field present which gives you finite lifetimes and relaxation rates.

This is because real experiments are observed, and hence interact with the environment. Thus the unitary evolution applies only approximately, to the extent you can neglect the interaction.
 
  • #24
A. Neumaier said:
This is because real experiments are observed, and hence interact with the environment. Thus the unitary evolution applies only approximately, to the extent you can neglect the interaction.
There is no need of observation. The interaction with the electromagnetic field as environment is sufficient to lose the unitarity of time evolution.
 
  • #25
kith said:
Ok. So the picture of a heat death of the universe is wrong?
Yes. There is no hint at all that the universe should suffer a heat death. The story of the heat death arose from overinterpreting the second law, at a time when its connection to mechanics was not yet understood.

kith said:
Can you elaborate on this? I still don't get how the time-symmetric interaction laws of these subsystems can lead to a time-asymmetric law like the H-theorem, and how the law of large numbers explains this.
Any derivation of the second law makes some form of assumption on the (mixed) initial state (and often also on each later state). If this assumption is satisfied then the evolution from this state into the future satisfies the second law. But so does the time-reversed evolution - i.e., going from that state into the past also increases the entropy. This shows that there is something artificial about this assumption.
The truth is that the assumption is only approximately satisfied at any time, and since the dynamics is chaotic, the uncertainty can have arbitrarily large consequences, but with arbitrarily small probability.

In Boltzmann's analysis the environment remains hidden, but acts by restoring the independence assumption at _all_ times. In the more refined version that works for more complex systems than ideal gases, one typically assumes that the state at some initial time is Gaussian (or, more technically, quasi-free), which is just of this kind. In this case, the environment consists of all the stuff that is not characterized by the comparatively few variables chosen to describe the macroscopic system, to which the Gaussian assumption is applied.

To understand things in an extremely simplified but appropriate setting, consider the Lorenz attractor - it is a time-reversal invariant ordinary differential equation in 3 variables, and nevertheless it shows an increase of entropy for every initial ensemble close to the attractor, no matter whether you run the dynamics forward or backward. You can easily program the Lorenz attractor yourself in systems like Matlab or Mathematica, and convince yourself of the fact that once approximations are made, irreversibility ''follows'' easily from time-reversibility plus chaoticity.
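The same experiment can be sketched in Python. The snippet below pushes a tight ensemble of points through the Lorenz equations with a crude Euler step and estimates a coarse-grained Shannon entropy on a fixed grid; the step size, grid, and ensemble size are all arbitrary choices, and any points that wander off the grid are simply ignored by the histogram:

```python
import numpy as np

def lorenz_step(X, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    # one explicit Euler step of the Lorenz equations for an ensemble X of shape (N, 3)
    x, y, z = X[:, 0], X[:, 1], X[:, 2]
    dX = np.stack([s * (y - x), x * (r - z) - y, x * y - b * z], axis=1)
    return X + dt * dX

def coarse_entropy(X, bins=20):
    # Shannon entropy of the ensemble on a fixed coarse-grained grid
    hist, _ = np.histogramdd(X, bins=bins,
                             range=[(-30, 30), (-30, 30), (0, 60)])
    p = hist.ravel() / len(X)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
# tight initial ensemble near the attractor: low coarse-grained entropy
X = np.array([1.0, 1.0, 20.0]) + 0.01 * rng.normal(size=(2000, 3))

S0 = coarse_entropy(X)
for _ in range(2000):  # integrate to t = 10
    X = lorenz_step(X)
S1 = coarse_entropy(X)

print(S0, S1)  # the ensemble has spread over the attractor, so S1 > S0
```

The fine-grained dynamics is deterministic; the entropy growth comes entirely from the coarse-graining, which is the point of the example.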
 
  • #26
kith said:
There is no need of observation. The interaction with the electromagnetic field as environment is sufficient to lose the unitarity of time evolution.

This _is_ already observation. The system is observed once photons can leave the system, no matter whether some recording device such as a photographic plate or a human eye is there to receive the photons.
 
  • #27
A. Neumaier said:
This _is_ already observation. The system is observed once photons can leave the system, no matter whether some recording device such as a photographic plate or a human eye is there to receive the photons.

That statement seems to contradict the idea of entanglement, Bell's theorem, and the experiments by groups such as Aspect and Zeilinger. Those experiments show that, until one member of a photon pair has been destructively detected, the pair remains entangled. Furthermore, the entanglement seems to not be an exclusive feature of the original pair, but rather a property that can be transferred to other pairs of particles.

So from that point of view, there seems to be a definite distinction between the emission of a photon, and the detection of that photon.
 
  • #28
SpectraCat said:
That statement seems to contradict the idea of entanglement, Bell's theorem, and the experiments by groups such as Aspect and Zeilinger. Those experiments show that, until one member of a photon pair has been destructively detected, the pair remains entangled. Furthermore, the entanglement seems to not be an exclusive feature of the original pair, but rather a property that can be transferred to other pairs of particles.

So from that point of view, there seems to be a definite distinction between the emission of a photon, and the detection of that photon.

Of course there is such a distinction. But this doesn't contradict my statement.

Arguments about environment become meaningless if one changes the meaning of the terms ''system'' and ''environment'' during the argument. One therefore needs to specify beforehand what counts as the system and what counts as the environment, and then stick to that.

In Zeilinger's experiment, the system is the pair of entangled photons. The experimental arrangement ensures that nothing leaves this system until the measurement. (This is not easy. Without special precautions, the system decoheres long before it is measured. That's why entanglement experiments over large distances are difficult to perform.)

The context of my previous statement included the assumption that the system is coupled to the e/m environment (which therefore does not belong to the system). Thus photons leave _that_ system, which decoheres the latter.
 
  • #29
A. Neumaier said:
Systems observable by us are never isolated, so the entropy increases.

The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.

I would not dare to assign an entropy to the whole universe, nor a wave function, etc.
I would even hesitate to call it isolated. Isolated means isolated from the effects of the surroundings. But the universe has no surroundings.
 
  • #30
DrDu said:
I would not dare to assign an entropy to the whole universe, neither a wavefunction etc.
I know that I am more daring, but with good grounds.

It is an undecidable question whether the state of the universe is pure or mixed, so, yes, one cannot necessarily assign a wave function to the universe, since this is possible only if the universe is in a pure state.

However, the universe must have a state. For if not, there would have to be a limit on the size of a system to be described by a state. This size would be completely arbitrary.

Statistical mechanics describes ordinary macroscopic matter very successfully by a mixed state, and the resulting hydrodynamic description seems to be an excellent model for the visible part of the universe. It would be very strange if such large systems are described by quantum mechanics (and hence have a state) but no such description would apply to even bigger systems - just because we cannot observe them. Cosmology would not make sense without allowing the universe to have a state.


DrDu said:
I would even hesitate to call it isolated. Isolated means isolated from the effects of the surroundings. But the universe has no surroundings.

Everything in the universe is coupled to its complement in the universe, hence not isolated - except if this complement is empty. Thus if the universe were not isolated then nothing is. Thus the term would be vacuous.

On the other hand, if we define the universe of an observer O as the smallest isolated system containing O, it is conceivable that there are many universes - they just don't interact, so we cannot know anything about any universe except ours. If many universes existed, then their environment would not be empty but would consist of all the other universes (from which they are isolated).

In the spirit of Ockham's razor, we can however ignore all other universes and deny their existence, without _any_ loss of predictivity for the inhabitants of _our_ universe.
 
  • #31
A. Neumaier said:
However, the universe must have a state. For if not, there would have to be a limit on the size of a system to be described by a state. This size would be completely arbitrary.

I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.
 
  • #32
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

In the Copenhagen spirit, a classical system is not an observer. The wave function is a device to encapsulate knowledge gained from previous measurements. A classical system has no "knowledge" unless it is a human or machine making QM calculations.
 
  • #33
A. Neumaier said:
In Boltzmann's analysis the environment remains hidden, but acts by restoring the independence assumption at _all_ times.

Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions. The total entropy is equal to the marginal entropy (the entropy resulting from assuming independence at all times) plus the mutual entropy (the entropy arising from the correlations). The total entropy is constant, while the marginal entropy is identified with the thermodynamic entropy, and always increases.
 
  • #34
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

What defines an observer, in physical terms? This is an ill-defined notion.

The advantage of my thermal interpretation (see https://www.physicsforums.com/showthread.php?t=490492) is that one doesn't need such ill-defined terms to make sense of quantum mechanics. Classical systems are simply quantum mechanical systems in which all observables of interest are almost certain in a well-defined sense. And an observer is nowhere needed, since the whole physics is observer-independent.
 
  • #35
Rap said:
Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions.
And ignoring this is the reason for the increase of entropy.
 
