Observer independent collapses?

In summary, the conversation discusses various interpretations and theories related to the collapse of wave functions in quantum mechanics. The concept of an observer is debated, with some arguing that it is necessary for wave function collapse while others believe it is only a hypothetical construct. Decoherence theory and the possibility of a hypothetical alien being with superior abilities to perceive reality are also mentioned. Overall, there is no consensus on the role of consciousness in wave function collapse and the concept remains open to interpretation.
  • #1
reynolds2
Are there any instances where, by examining the trajectory a subatomic particle took, experiments show wave functions collapsing independently of an observer?
Also, I was wondering whether wave functions collapse naturally in the universe when they collide with other subatomic particles or such things.

A thought experiment: blindfold someone, tell them you are going to put them in a square room with nothing in it, put earphones on them, stick them in a corner of the room, and let them wander aimlessly looking for a way out. All the while there is a chair in the middle of the room. According to the "observer" interpretation of QM, the person should walk right through the chair. Obviously they don't; they bump right into the chair. How do those who preach the conscious-observer interpretation of QM explain this?
 
  • #2
I think you have a misunderstanding of what an Observer is. EVERYTHING is an observer. Every particle in your body observes every other particle that it interacts with.
 
  • #3
Actually there is no real consensus on what an "observation" actually is, other than the kind we use in physics. Those require human observers. As for any other meaning of the word, some kind of interpretation is required, and the language can become obscure.

What we can all agree on is that when we do an observation of a particle that has interacted strongly with its environment, we get different kinds of outcomes. The classic example of this is the two-slit experiment wherein there is "which way" information about the slit the particle went through-- that gives a different pattern over many trials than if there is no which way information, and it doesn't matter if the observer of the pattern is privy to that which way information.

So that's why they bump into the chair whether they expect it to be there or not-- the observer does not need to know the information, if the environmental setting has established it as true. All the same, the chair is not observed until the observer bumps into it-- but the information that the chair is there is already encoded in the environment. Whether that counts as an "observation" is interpretation dependent.
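The difference which-way information makes can be sketched numerically: with no which-way information we add the two slit amplitudes and then square, giving fringes; with which-way information we add the probabilities, and the fringes vanish. A minimal toy illustration (the slit geometry and wavenumber are arbitrary illustrative values, not from any particular experiment):

```python
import numpy as np

# Screen positions and toy geometry: slit separation d, wavenumber k,
# slit-to-screen distance L (all arbitrary illustrative values).
x = np.linspace(-5.0, 5.0, 2001)
d, k, L = 1.0, 100.0, 50.0

# Path lengths from each slit to each screen point.
r1 = np.sqrt(L**2 + (x - d / 2) ** 2)
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)
psi1 = np.exp(1j * k * r1)  # amplitude via slit A
psi2 = np.exp(1j * k * r2)  # amplitude via slit B

# No which-way information: add amplitudes first, then square.
I_coh = np.abs(psi1 + psi2) ** 2
# Which-way information present: add the probabilities instead.
I_inc = np.abs(psi1) ** 2 + np.abs(psi2) ** 2

def visibility(I):
    """Fringe contrast: (max - min) / (max + min)."""
    return (I.max() - I.min()) / (I.max() + I.min())

vis_coh, vis_inc = visibility(I_coh), visibility(I_inc)
print(round(vis_coh, 2), round(vis_inc, 2))  # high contrast vs. none
```

The incoherent sum is flat regardless of whether anyone reads out the detector, which is the point made above: the pattern depends on whether which-way information exists in the environment, not on who knows it.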
 
  • #4
An interaction with another quantum system is NOT a measurement, as far as I can tell from the various literature I've read. Squires noted in his book 'Conscious Mind in a Physical World' that anything described by the Schrödinger equation cannot transform into a mixed (definite) state.
 
  • #5
I think this should demonstrate empirically that a human observer is not necessary.
http://arxiv.org/PS_cache/arxiv/pdf/1009/1009.2404v2.pdf

I think that "observer" is an outdated term used in the early 1900s to describe something else entirely. When interaction takes place, the system decoheres independent of observation (a sloppy word). That's why I asked if there were instances where wave functions collapse naturally in nature without human observers.
 
  • #6
reynolds2 said:
I think this should demonstrate empirically that a human observer is not necessary.
http://arxiv.org/PS_cache/arxiv/pdf/1009/1009.2404v2.pdf
This type of argument misses a very important issue-- the difference between a real observer and a hypothetical observer (by the latter I mean a kind of conceptual "mini me" that we insert in our descriptions of some physical event). It requires a rather radical idealist to assert that reality is only actualized when it enters the mind of a physicist or is in some other way registered or perceived by an intelligence. Most scientists are realists enough to be able to imagine a reality that exists outside of their intelligence, but as Bohr put it, physics is about what we can say about nature, not about nature itself. So if we recognize the wisdom in Bohr's remark, we see immediately that physics involves forming concepts about reality, and it is those concepts that require the insertion of a hypothetical consciousness, or a hypothetical observer if you prefer, in order to give meaning to the very language we rely on so completely to do physics.
So we might not hold that a real consciousness is required to collapse a wave function (we have so little idea of what consciousness is that it is not yet terribly informative to associate it with some physical process that we also have little idea about presently), yet we may still hold that the entire concept of collapse acquires meaning only in the context of the insertion of a hypothetical consciousness. This hypothetical observer is a device that is commonly used in physics, and can still be used in quantum mechanics, as long as we do not attribute it superhuman powers-- it must be privy only to the very same information that a human consciousness would actually be privy to were it actually present. It is in this sense that collapse still requires consciousness-- not to be physically actualized, but something even more fundamental than that: to have meaning in the first place.
 
  • #7
reynolds2 said:
I think that "observer" is an outdated term used in the early 1900s to describe something else entirely. When interaction takes place, the system decoheres independent of observation (a sloppy word). That's why I asked if there were instances where wave functions collapse naturally in nature without human observers.

There is an interesting analysis of decoherence and realism by Bernard d’Espagnat in his books “Veiled Reality” and “On Physics and Philosophy”.

Within this analysis, decoherence does not restore realism because the outcomes of decoherence that we perceive are improper mixtures, not proper mixtures.

The mathematical formalism of decoherence theory predicts both improper and proper mixtures, but the complexity of those proper mixtures makes measuring them impossible for humans. In principle, however, they could be measured if, as part of our human make-up, we had the ability to perform the measurements in question. If a hypothetical alien being with vastly superior abilities to those we possess were capable of doing such measurements, then it would not agree with our perception of reality as involving only the improper mixtures; it would maintain that its reality consists of all the outcomes of decoherence theory, proper and improper mixtures alike.

Thus just as quantum mechanics is inherently weakly objective (i.e. the observer is an essential part of the conclusion), so decoherence theory is also inherently weakly objective (i.e. the abilities, or lack of them, of humans are an essential part of the conclusion).

So it seems that the formalism suggests that whilst decoherence itself may not require an observer, the observer is actually an integral part of what the interaction with the environment actually decoheres to in terms of our reality.

For a positivist, none of this is of any consequence, but for a realist who maintains that decoherence removes the observer, it is, I think, of considerable consequence.

I don't understand the technicalities of all of this, but I have come to see the philosophical value of the analysis in terms of human observer dependence.
 
  • #8
Decoherence does not solve the measurement problem in any way. Any interaction with the environment just creates a complex superposition. Not a mixture.
 
  • #9
StevieTNZ said:
Decoherence does not solve the measurement problem in any way. Any interaction with the environment just creates a complex superposition. Not a mixture.

I never said decoherence solves the measurement problem. I am simply saying (or at least d'Espagnat is saying) that the outcomes are not independent of our involvement - decoherence does not provide a philosophically realist framework in terms of outcomes (i.e. outcomes that can be thought of as existing independently of our involvement).

As far as mixtures are concerned, I thought a mixture is essentially what is arrived at through decoherence, i.e. a particle in a superposition of A and B which, upon interacting with the environment, changes to a mixture of state A or state B. D'Espagnat simply states that this is an improper mixture and that the proper mixture cannot be determined by humans.

Anyway, this is how I understand things from d'Espagnat:

Looking very closely at d’Espagnat’s writings, I am sure that he quite deliberately connects the model of decoherence with the aptitudes of human beings. He demonstrates this connection by means of a system involving an electron interacting with a molecule. In terms of this experiment, a human observer would see an improper mixture of the electron in state A or in state B. The quantum formalism predicts many more outcomes, but the complexity of the electron interacting with the molecule is so enormous that we are unable to carry out the measurements – the only thing we could measure would be “trivial” (d’Espagnat’s terminology) measurements corresponding to the electron being in state A or state B at the detector.

Thus, from our perspective, we can legitimately say that our improper mixture is essentially (but not accurately) a proper one, since the predicted outcomes that are part of the proper mixture cannot ever be measured by us. However, if we employ the services of a “super-physicist” (d’Espagnat’s terminology), then from their perspective they would not agree with our conclusion – they would quite forcefully state that our improper mixture cannot be considered to be very similar to the proper mixture, since they have performed all the measurements, not just those of the electron being in state A or state B. The "super-physicist" has at their disposal a proper mixture.
 
  • #10
I have some problems understanding the Copenhagen Interpretation. Consider the following modified 2-slit experiment. We have a low intensity electron source, and some obstruction with 2 slits in it, and then a screen. Whenever an electron hits the screen, a dot appears, but we suppose that we can remove a dot if we want to. Now we put a detector behind only one of the slits (say, slit A). When a dot appears on the screen, we check whether the detector saw it. Now we remove the dot from the screen if the detector did detect an electron coming through slit A.

The question is: Does an interference pattern develop from the dots we don't touch?

We could argue 2 ways:

1) All the dots on the screen are from electrons whose wave functions have not been disturbed by the detector, since nothing was seen by the detector. So we expect an interference pattern.

or:

2) Since the detector did not detect the electrons that caused the remaining dots, we know through which slit they went, namely the other slit B, which has no detector behind it. So we know the path the electrons have taken, and no interference pattern will appear.

If I have understood the Feynman Lectures Vol. 3 correctly, the second argument is the correct one. But this seems to indicate that the Copenhagen Interpretation is not quite right, because that requires some kind of interaction for the wavefunction to collapse, and there hasn't been one.

Could someone please shed some light on this?
 
  • #11
Your point #2 is indeed correct, but the CI has no problem with it. CI treats the wavefunction as a calculational tool, which works alongside of any other information you have in the problem. So collapse of the wavefunction happens automatically any time you have information that requires it to be collapsed, there is no need to account for any physical interaction. Indeed this is one of the features of the CI, not a problem with it. For example, if I have a particle in a box, and I shine a super-bright light pulse on one half of the box, and see no sign of the particle, then I know the particle is on the other side. I instantly update its wavefunction to reflect that information-- even though no light interacted with the particle in any direct or empirically establishable way. That is a form of collapse, but mostly it is a change in the information I have about the system. The same holds for when you put a detector near one slit, and get no detection-- that is what is known as "which way information", regardless of whether or not the detector registers a detection.
 
  • #12
Len M said:
As far as mixtures are concerned, I thought a mixture is essentially what is arrived at through decoherence, i.e. a particle in a superposition of A and B which, upon interacting with the environment, changes to a mixture of state A or state B. D'Espagnat simply states that this is an improper mixture and that the proper mixture cannot be determined by humans.
What does D'Espagnat mean by a proper mixture, that is different from a superposition? Maybe he means that a projection from a superposition onto a subspace is a proper mixture, but taking that projection and grouping all the outcomes of the macroscopic pointer together as if they were all the same state of the subsystem is an improper mixture. Is that what he means? If so, then an improper mixture has to do with how we are choosing to treat the system. The problem is, everything in physics is how we are choosing to treat the system-- so it's not clear his "superphysicist" isn't doing something different from what we would call physics, to the extent that we don't know all the other differences that might appear if one really could do physics that way. I think that's the objection that Bohr might have raised.
 
  • #13
Ken G said:
What does D'Espagnat mean by a proper mixture, that is different from a superposition? Maybe he means that a projection from a superposition onto a subspace is a proper mixture, but taking that projection and grouping all the outcomes of the macroscopic pointer together as if they were all the same state of the subsystem is an improper mixture. Is that what he means? If so, then an improper mixture has to do with how we are choosing to treat the system. The problem is, everything in physics is how we are choosing to treat the system-- so it's not clear his "superphysicist" isn't doing something different from what we would call physics, to the extent that we don't know all the other differences that might appear if one really could do physics that way. I think that's the objection that Bohr might have raised.

I’m not sure I would use the term “choose” here. It seems to be a case of having no choice in the matter – we are not choosing an improper mixture of A and B, that’s all we can actually do. The “super physicist” (and by this d’Espagnat is referring to a “being” that transcends our abilities) on the other hand would be able to measure all of the predicted outcomes, so for them their mixture is different to ours.

The only point of d’Espagnat’s argument is to establish that decoherence does not restore realism, as some would have it. The outcomes of decoherence are linked to the abilities (or lack of them) of us as observers.

I think I understand your point though - the physics done by us is the same as that done by the “super physicist”, the only difference being the ability of the “super physicist” to measure all of the predicted outcomes. From this vantage point, the physics in both cases involve an intelligence.

I think d’Espagnat is simply providing a means to show that decoherence cannot be taken on board by realists as a defence of their philosophy, because the formalism of decoherence points to differing outcomes from a “super physicist” and from humans – the mixture we have access to (the improper mixture) cannot be thought of as being in that form independently of our involvement; rather, it refers to our involvement. You personally perhaps might say that in any event, realists could not take decoherence as a defence of their philosophy because all of physics stems from the observer and intelligence.

My point in posting this was to refute an earlier post that suggested the process of decoherence was independent of us as human observers and practitioners. I take your point (and agree) that no physics is independent of the observer; it’s just that d’Espagnat takes this to a level occupied by realists who would dispute the notion that all physics is observer dependent – they may dispute that notion, but they cannot dispute that the formalism of decoherence refers to humans; it is not independent of us.

As far as what d’Espagnat means by a proper mixture, I’m not sure about your interpretation. So just for the record, and apologies for the length of it, my understanding of his reasoning is as below, taken from his book “On Physics and Philosophy”. Note that my usage of the word "choice" here is invoked in the simple case of an electron and atom, whereby I can choose not to do all of the measurements; but in the more complex case of electron-atom-molecule I no longer have any choice – I am unable to do all of the measurements; it is impossible.

In terms of an ensemble E of N systems consisting of an electron-atom composition, we can express the state of each system as being of the form aA + bB. In this pure case, the quantum formalism can inform us of the many different measurements that we could perform on the electron-atom composition other than trivial measurements such as “position”.

Now, in addition to the just-described ensemble E, we also consider an ensemble E' consisting of strictly proper mixtures (d’Espagnat is quite specific here in distinguishing proper from improper), of which N/2 systems are in state aA and N/2 systems are in state bB.

What can now be said about the similarities between ensembles E and E’? If, on ensemble E, we were to forget about all possible measurements other than the trivial ones, then we can say that E is a very good approximation to E', and thus that E can be considered to be an improper mixture of N/2 systems in state aA and N/2 systems in state bB. If however we think that the quantum predictions concerning everything that could be measured are all correct, we cannot consider E to be a mixture in the manner we describe E’.

Thus we have a notion of invoking the observer (in this simple electron-atom case, through choice) as a means of establishing the final status of ensemble E. If we ignore all possible measurements (apart from the trivial ones), then E becomes a mixture in the form of ensemble E’, but if we take on board all of the possible measurements that we can carry out, then E is not a mixture in the manner of E’.

D’Espagnat now takes this a step further in that he examines an ensemble consisting of N electron-atom-molecule systems. When we apply the quantum formalism to this ensemble, it is the case that we can deliberately ignore the practical and possible measurements (other than the trivial ones) on the electron-atom system, but we have no choice but to ignore all of the predicted measurements on the atom-molecule system, since we will never be able to measure them given the degree of complexity involved. This does not imply that in principle the measurements could not be done; rather it implies that no humans will ever be able to do them. So, in terms of the abilities of humans, the ensemble of electron-atom-molecule systems residing in a pure state of aAA’ + bBB’ (where A’ and B’ are the large molecules) will be transformed via decoherence into a mixture of N/2 states of aA and N/2 states of bB, provided we reference these outcomes specifically to our inability to carry out the predicted very complex measurements.
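The step from the pure state aAA’ + bBB’ to an improper mixture can be made concrete with a partial trace: tracing the unmeasurable molecule out of the density matrix leaves a diagonal matrix with weights |a|² and |b|², the interference terms gone. A toy sketch with two-level systems standing in for the electron-atom composite and the molecule (the amplitudes are arbitrary illustrative values, not d’Espagnat’s):

```python
import numpy as np

# |A>, |B>: two-level stand-ins for the electron-atom system; the
# molecule states |A'>, |B'> use the same basis. Amplitudes are
# arbitrary illustrative values with |a|^2 + |b|^2 = 1.
a, b = np.sqrt(0.7), np.sqrt(0.3)
A, B = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Pure entangled state a|A>|A'> + b|B>|B'> and its density matrix.
psi = a * np.kron(A, A) + b * np.kron(B, B)
rho = np.outer(psi, psi.conj())

# Partial trace over the molecule: reshape to (sys, mol, sys', mol')
# and sum the two molecule indices pairwise.
rho_sys = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

print(np.round(rho_sys, 3))  # diagonal weights |a|^2, |b|^2; off-diagonals zero
```

The global state is still pure (nothing has "collapsed"), but any observer restricted to the electron-atom subsystem sees the diagonal reduced matrix, which is exactly what "improper mixture" means here.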
 
  • #14
Len M said:
As far as mixtures are concerned, I thought a mixture is essentially what is arrived at through decoherence, i.e. a particle in a superposition of A and B which, upon interacting with the environment, changes to a mixture of state A or state B. D'Espagnat simply states that this is an improper mixture and that the proper mixture cannot be determined by humans.

StevieTNZ said:
Decoherence does not solve the measurement problem in any way. Any interaction with the environment just creates a complex superposition. Not a mixture.

Again, it just creates a complex superposition, not a mixture.
 
  • #15
Len M said:
I’m not sure I would use the term “choose” here.
But there are always choices. First we choose the kinds of things we will measure, like position and energy, which is really a choice in how we will measure them (the rest is just labeling).
We have found certain types of things are "empirical", and that is what we have chosen to base physics on. Those choices begin to shape what physics is, it doesn't somehow exist independently of them. It's true we cannot choose to measure what we cannot measure, but we can, and do, choose to measure what we can measure, and that's what gives us quantum mechanics. This is a point I feel Bohr was absolutely right about-- our fingerprints are all over quantum mechanics, and so we must not be surprised when we encounter observer dependent effects-- they are the effects of our choice to be empirical.
The “super physicist” (and by this d’Espagnat is referring to a “being” that transcends our abilities) on the other hand would be able to measure all of the predicted outcomes, so for them their mixture is different to ours.
I feel the "super physicist" is a problematic construction. This physicist is making fundamentally different choices about what to track, what to care about, and what to ignore, what to average over. They are not doing physics better than us, they are doing different physics. We have no idea what physics would look like for them-- for example, we have no idea if quantum mechanics would work. This is the problem that people had in Newton's day when they thought the universe was completely deterministic-- they imagined a "super physicist" that could measure the exact position and velocity of every particle, and had the computing power to extrapolate forward Newton's laws for an arbitrary amount of time, and predict everything that happens-- only trouble is, Newton's laws emerged from making the choice not to do exactly that, and Newton's laws simply don't work if you do try to do that-- Newton's laws are not the laws of the "super physicist." So why should quantum mechanical laws be? We always make the mistake of thinking the laws we got from certain choices of how to do physics would still work if we made different choices.
The only point of d’Espagnat’s argument is to establish that decoherence does not restore realism, as some would have it. The outcomes of decoherence are linked to the abilities (or lack of them) of us as observers.
Yes, that I agree with completely. I'm just saying that our choices are also linked to our abilities, and so is physics itself. There's really no such thing as decoherence as a physical truth, decoherence is a choice about how we do physics-- we have chosen what not to track, and that choice leads to what we call decoherence. In actual truth, it seems everything is coherence, there's no "de" until we make choices the "super physicist" is not making. I'm just saying if we don't make those choices, we have something more than just different observations-- we have different physics. (A classical example of this same effect is the second law of thermodynamics, which applies even when all the fundamental laws of physics are completely time reversible, so the second law stems not from the world itself but from our choices about how to describe the world-- what to track, what we choose to care about. D'Espagnat's "super physicist" has no second law of thermodynamics, so that's something different from physics as we know it.)
I think I understand your point though - the physics done by us is the same as that done by the “super physicist”, the only difference being the ability of the “super physicist” to measure all of the predicted outcomes. From this vantage point, the physics in both cases involve an intelligence.
More than that-- the physics is determined by the intelligence, and a vastly different intelligence, making fundamentally different choices about what to care about, ends up with a vastly different physics. We can ask what kind of "physics" an artist commune would come up with, for example-- you might argue it wouldn't be physics at all, but that's just my point.
I think d’Espagnet is simply providing a means to show that decoherence cannot be taken on board by realists as defence of their philosophy, because the formalism of decoherence points to differing outcomes from a “super physicist” and from that of humans – the mixture we have access to (the improper mixture) cannot be thought of as being in that form independently of our involvement, rather it refers to our involvement.
Yes, I agree with him there, I am noting the connection between the words "involvement" and "choices of what to care about." When we do that, the whole concept of the "super physicist" becomes questionable, but we can say it is merely a device to make a similar point. I do think it is quite important to recognize, though, that we would not view that person as a "super" physicist, we would view them as a bad physicist, or at least a different one. They wouldn't have anything to tell us that we would understand, their physics would sound excruciatingly tedious when passed through our different choices about what to care about-- like if you were reading a suspense novel and in the pivotal scene the author decided to describe in painstaking detail the location of every particle of dust in the room as the bad guy was chasing the heroine with a knife!
You personally perhaps might say that in any event, realists could not take decoherence as a defence of their philosophy because all of physics stems from the observer and intelligence.
Actually, I would say that this is why realists can take decoherence as a defence of their philosophy-- they just couldn't take it as a defence of the absence of a need for philosophy, or the absence of a need for making choices. I would say the choices a realist makes only work because of decoherence, but that doesn't make them "right," because a choice is something kind of opposite of being right. Physics involves a lot of intentional choices to be wrong-- but wrong about the things that do not seem to matter to us. I would say that is not a bad definition of physics-- the careful separation of things to be wrong about from things to be right about so as to achieve some simplifying purpose.
My point of posting this was to refute an earlier post that suggested the process of decoherence was independent of us as human observers and practitioners.
Yes, and compared to that overarching point, I am perhaps nitpicking, but I feel it is an important distinction to make. We must not embrace the concept of observer dependence without also embracing all of the ways we leave our fingerprints on our physics.
In terms of an ensemble E of N systems consisting of an electron- atom composition, we can express the state of each system as being of the form aA + bB.
Not to belabor the point, but a more correct way to say that is that we can choose to express the state that way. That is not really going to be the state, we already know that, but we can't do physics theory on the real state; our only access to the real state is via observation, and even that requires making pertinent choices.
Now, in addition to the just described ensemble E, we also consider an ensemble E' consisting of strictly proper mixtures (d’espagnat is quite specific here in distinguishing proper from improper), of which N/2 systems are in state aA and N/2 systems are in bB.
OK, that's what he means by a proper mixture, but that brings up a problem-- normally our ensembles are of identical particles. Identical particles don't make proper mixtures, not even for "super physicists", because they are indistinguishable. This is quite important; it is the reason we have a periodic table of elements-- carbon acts much differently than oxygen because the extra electrons in oxygen couldn't be in the same states as the electrons in carbon, and the reason they couldn't is that they are indistinguishable from each other, and indistinguishable fermions (like electrons) cannot be in the same state.

The indistinguishability means that all the electrons respond to a single wave function-- they don't have their own wave functions. But we usually make the choice to treat them as if they had their own wave functions, because it makes our physics doable. This is just an example of how problematic the "super physicist" construct is; it's not formally consistent with how physics works.

In summary, what I'm saying is that his points about an "improper mixture" are valid, but can be addressed simply by pointing to how this concept emerges from how we choose to do physics. There is not actually any such thing as a "proper mixture" that a "super physicist" would have access to, which speaks to the dangers of assuming we know anything about what the "super physicist" would perceive. It cuts to the heart of interpretations of quantum mechanics-- some hold that the wave function is real, and is accessible to "super physicists" (superpositions, not proper mixtures); others hold that the wave function is a tool for making calculations by physicists who have made the choice to accept something different from "super." As you can tell, I tend to hold to the latter camp.
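The indistinguishability point can be seen in a couple of lines: antisymmetrizing a two-fermion product state gives zero amplitude whenever the two single-particle states coincide, which is the Pauli exclusion behind the periodic table. A minimal sketch (the two-level single-particle space is just a toy choice):

```python
import numpy as np

def antisymmetrize(phi, chi):
    # Two-fermion combination (|phi>|chi> - |chi>|phi>) / sqrt(2).
    return (np.kron(phi, chi) - np.kron(chi, phi)) / np.sqrt(2.0)

up = np.array([1.0, 0.0])    # toy single-particle state 1
down = np.array([0.0, 1.0])  # toy single-particle state 2

distinct = antisymmetrize(up, down)  # a legitimate two-fermion state
same = antisymmetrize(up, up)        # both fermions in the same state

print(np.linalg.norm(distinct))  # nonzero: the state exists
print(np.linalg.norm(same))      # zero: Pauli exclusion
```

There is no way to write the antisymmetric state as "particle 1 in this state, particle 2 in that state", which is why the particles cannot be handed their own wave functions, and why ensembles of identical particles resist the "proper mixture" description.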
D’Espagnat now takes this a step further in that he examines an ensemble consisting of N electron-atom-molecule systems. When we apply the quantum formalism to this ensemble, it is the case that we can deliberately ignore the practical and possible measurements (other than the trivial ones) on the atom-electron system, but we have no choice but to ignore all of the predicted measurements on the atom-molecule system since we will never, ever be able to measure them given the degree of complexity involved.
Perhaps here we are saying something similar in different ways. If someone holds gun to my head and says "give me all your money", I might say I have "no choice" but to do it. Or, I might simply say I am making the choice to do it because it is clearly the best option I have available! The latter is more how I am talking about our physics, but I think the main point is the same-- we do physics a certain way, and that way determines how our physics comes out. This is why Bohr said that physics is not about nature, it is about what we can say about nature. Thus the observer dependence is much more fundamental than is understood by those who try to rule out some kind of definite physical influence of the participation of a conscious mind, as in the linked paper above. I believe that is what you are saying D'Espagnat is also arguing-- I just don't like the "super physicist" device, because I think it is being used to establish a truth about physics by essentially ignoring that same truth!
This does not imply that in principle, the measurements could not be done, rather it implies that no humans will ever be able to do them.
The more important issue is that no humans would ever want to do them, as they would not be doing a useful form of physics and might well end up with something that looks completely different than anything we would recognize as physics.
So, in terms of the abilities of humans, the ensemble of electron-atom-molecule systems residing in a pure state of aAA’ + bBB’ (where A’ and B’are the large molecules) will be transformed via decoherence into a mixture of N/2 states of aA and N/2 of states bB, providing we reference these outcomes specifically to our inabilities in carrying out the predicted very complex measurements.
This is essentially the von Neumann approach to describing measurement theory, which is the best we have, but since it has never been tested, it is not at all clear if we can really take the wave function of macroscopic systems that literally. It is what the formalism of quantum mechanics says, if we adopt the idealization that a macroscopic system can be isolated from its environment, but even then it is just like the way the formalism of Newtonian mechanics said particles have definite positions and momenta. I'm not sure if there is any physical significance in noting that this is what the formalism of quantum mechanics says. In other words, once we have agreed that the quantum mechanics we test is a quantum mechanics of improper mixtures, it is somewhat inconsistent to imagine that the "reality" of quantum mechanics deals with anything but improper mixtures. Call it my empiricism peeking out.
 
  • #16
StevieTNZ said:
Again, it just creates a complex superposition, not a mixture.
This is the distinction between a "proper mixture" and an "improper mixture". The complex superposition you refer to is treated by our physics as an "improper mixture". That's also what I mean about the "choices" we make: we have a theory that seems to give us complex superpositions, but we don't use or test those; we project onto a tiny subspace of the "full reality", and this gives us the concept of an improper mixture, which is the only physics we've ever tested or indeed ever wanted. We agree that there is actually no such thing as a "proper mixture" in quantum mechanics, and so we don't think D'Espagnat needs that concept to make his point-- it suffices to understand what an "improper mixture" is, and just use the term "mixture" for it. Hence we agree with D'Espagnat's core thesis-- by dropping the word "improper", we have entered into an idealization that allows realism to work, but at the cost of throwing out the reality. This shouldn't surprise us at all-- the first step of realism is always replacing reality with something else. If that seems ironic, then we need to think more deeply about how physics, and indeed human intelligence, works in the first place!
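The proper/improper distinction can be made concrete with a toy density-matrix calculation. The following sketch is my own illustration, not anything from d'Espagnat's text: the amplitudes and the two-level spaces are arbitrary assumptions. It builds the entangled post-measurement state a|aA> + b|bB> and then traces out the pointer; the electron's reduced state comes out diagonal, so no measurement on the electron alone can distinguish it from a proper mixture, even though the global state is still a pure superposition:

```python
import numpy as np

# Hypothetical amplitudes for the superposition a|aA> + b|bB>.
a_amp, b_amp = np.sqrt(0.3), np.sqrt(0.7)

# Two-level bases: electron states |a>, |b>; pointer states |A>, |B>.
e_a, e_b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
p_A, p_B = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Entangled post-interaction state: a pure state of the composite system.
psi = a_amp * np.kron(e_a, p_A) + b_amp * np.kron(e_b, p_B)
rho = np.outer(psi, psi.conj())  # 4x4 density matrix, purity Tr(rho^2) = 1

# Partial trace over the pointer gives the electron's reduced state.
rho_e = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.round(rho_e, 3))       # diagonal: diag(|a|^2, |b|^2)
print(np.trace(rho_e @ rho_e))  # purity < 1: locally it looks like a mixture
```

The off-diagonal terms of `rho_e` vanish because the pointer states are orthogonal; the composite state remains pure (the "complex superposition"), yet every local description of the electron is the improper mixture diag(|a|^2, |b|^2).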
 
  • #17
StevieTNZ said:
Again, it just creates a complex superposition, not a mixture.


Well, I can’t really argue properly about this, but to the best of my current understanding of the decoherence phenomenon as outlined by d’Espagnat, decoherence is fundamental to mixtures. He says:

Bernard d’Espagnat said:
…. In other words, to the “just human” physicist, ensemble E appears as if it were a “real” mixture in the strong (i.e. realist) sense of the epithet. (For this reason it has indeed been called a mixture, but only an “improper” one.) Basically this is the essence of the decoherence phenomenon (so named by reference to the “coherence” that quantum states of the Aa+Bb type are said to possess).
If now we get back to the initial problem, the one relative to pointers, we observe that such a decoherence does indeed take place in them, for it is never the case that pointers – nor, quite generally, macroscopic systems – are absolutely isolated. They interact with other systems same as our “atoms” and “molecules”…..
…..It was found (Zeh 1970; Joos and Zeh 1985) that in this respect, even a speck of dust lying far away in interstellar space is not totally isolated, due to its interaction with cosmic background radiation. Now, the complexity of this is fabulous. It makes it strictly impossible that we should be able to perform, on the electron-pointer-environment composite system, measurements analogous to the ones that we should have to make if we aimed at experimentally disproving representation R on the electron-atom-molecule triplet...

(my note: Representation R refers to the impression we get that the measurements performed on the ensemble E triplet are a good representation of the ensemble E’ triplet because we ignore all but the trivial measurements of the triplet. (see my post 13 above that outlines E and E’). )

...In other terms, from an observational (or operational) point of view practically everything takes place as if the above defined ensemble were a proper mixture.
Correlatively, everything takes place as if both the ensemble of pointers and the one of electrons were (correlated) proper mixtures, the first one of pointers in state A and pointers in state B, the other one of electrons in “state” a or in “state” b. In still other terms, whatever measurements (not involving the environment) we might imagine to perform, subsequently to the electron-instrument interaction process, on the electron-pointer pairs, everything will look as if the instrument had set the electrons either in “state” a or in “state” b, thus turning the initial electron ensemble into a mixture.
(My bold, only for highlighting d'Espagnat's reference to the term “mixtures” in relation to decoherence)


I did have some correspondence with Erich Joos (one of the pioneers of decoherence theory) in order to try to understand d’Espagnat’s reasoning. As best I could understand, he seemed to agree with d’Espagnat’s reasoning about proper and improper mixtures via decoherence, though he emphasised that the process of decoherence is not dependent on human abilities. That point I can see: d’Espagnat is talking more about the involvement of humans in being able (or not) to measure all of the predicted outcomes. Because of our inability to perform all of the measurements, the mixture is an improper one. This was just a brief part of what he said to me; I include it because he also describes our perception of state A or state B as an improper mixture.

From Erich Joos:
The superposition principle allows "non-classical" states of a stone being here and there at the same time. Our experience tells us otherwise.
From a physical point of view, what is the difference between (a) a stone being here and (b) a stone in state "here+there"? It is dynamical stability. Case (b) is "unstable" in the sense that the coupling to the environment immediately leads to a state where both components are global. For a local observer, then, the stone always appears to be either here or there (the improper mixture). The difference between (a) and (b) is a property of the interactions going on and the assumption of local observers (the latter is an immediate part of our experience).


My bold, just for reference to the term “mixtures”.
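Joos's point about dynamical stability can be illustrated with a standard toy decoherence model (my own sketch; the coupling angle and scattering counts are arbitrary assumptions). Each environment particle scattering off the "here+there" superposition leaves the environment in slightly different branch states for "here" and "there"; the overlap of those branch states, cos(theta)**N after N scatterings, multiplies the off-diagonal element of the stone's reduced density matrix, so local coherence dies off exponentially:

```python
import numpy as np

theta = 0.1  # assumed per-scattering distinguishability angle (toy value)

# Equal-amplitude "here + there" state; the environment-branch overlap
# <E_here|E_there> = cos(theta)**N scales the off-diagonal coherence.
for n_scatter in [0, 100, 1000, 5000]:
    overlap = np.cos(theta) ** n_scatter
    rho_stone = np.array([[0.5, 0.5 * overlap],
                          [0.5 * overlap, 0.5]])
    print(n_scatter, rho_stone[0, 1])
```

After a few thousand scattering events the reduced state is, for any local observer, indistinguishable from the improper mixture "here or there"; this is the dynamical-stability difference between Joos's cases (a) and (b).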

Now it is perfectly possible that I am not fully understanding d’Espagnat’s reasoning and that my use of the term "proper mixtures" in terms of decoherence is not what d’Espagnat intended to convey. But you seem to go even further than this in stating that mixtures (proper or improper) do not evolve through any part of decoherence theory. I thought I had begun to understand d’Espagnat reasonably well, so if you are stating (with authority) that decoherence does not invoke mixtures at all, then something is amiss with my understanding somewhere. If that is the case, it would be useful if you could give me a rough idea of where I may be going wrong, so that I can re-read his account from your perspective.

Edit to add: I see that whilst I have been writing this, Ken G has provided a very comprehensive reply to my post and also commented on the post from StevieTNZ that I have just replied to. There is much there that may change my perspective and the questions in this post, but I will leave it anyway; this post obviously relates to what I thought before I read Ken's detailed post.
 
  • #18
Ken G said:
In summary, what I'm saying is that his points about an "improper mixture" are valid, but can be addressed simply by pointing to how this concept emerges from how we choose to do physics. There is not actually any such thing as a "proper mixture" that a "super physicist" would have access to, which speaks to the dangers of assuming we know anything about what the "super physicist" would perceive. It cuts to the heart of interpretations of quantum mechanics-- some hold that the wave function is real, and is accessible to "super physicists" (superpositions, not proper mixtures), others hold that the wave function is a tool for making calculations by physicists who have made the choice to accept something different from "super." As you can tell, I tend to hold to the latter camp.

Thanks for that very informative reply.

To be fair to d’Espagnat, the idea of a “super physicist” is contained in his book “On Physics and Philosophy”, a book that, whilst carrying a very thorough analysis of physics and reality, was purposely written in a manner that made it accessible to the interested, educated non-specialist. So I wonder if I have really done justice to his arguments on a forum like this if I can only relate his verbal “layman” type explanations. I’m sure that if he were writing here, the term super physicist would not be part of his argument. So in this sense, whilst I take on board everything you say, I can’t be sure that your objections to the “super physicist” concept are true to what d’Espagnat is properly referring to.

He has a precursor to “On Physics and Philosophy”, a book called “Veiled Reality”, but this book is quite technical and one that I find difficult to follow at times. In this book no mention of a “super physicist” is made, but the issue of “sensitive observables” is extensively discussed in relation to decoherence. He outlines “sensitive observables” as measurements that can be performed in principle (there is no law of physics that forbids them, and the sequence of operations by means of which they would be made can be quite precisely stated) but cannot be done in practice because they are tremendously too complicated (they would induce a sudden, appreciable increase in the entropy of the whole Earth, for example).

Within this book there is a large section on decoherence (when the book was written it seems to have been referred to as the environment theory) and here he discusses the notion of sensitive observables as an argument against reconciling quantum mechanics with conventional realism via the environment theory (decoherence theory as it is known today).

So again just for the record, I would like to show an extract from this more technical book that I think outlines his concept of the predicted unmeasurable “sensitive observables” that renders decoherence theory as being weakly objective (i.e. the theory refers to the abilities of humans).

d’Espagnat in “Veiled Reality” page 343 said:
Within a Gibbsian ensemble of systems and instruments (and environments) let a measurement-like process be considered, in which the measured system S initially lies in a quantum superposition of several – say two – eigenstates of the measured observable, and let t be some time after this process has come to a close and the interaction between the instrument and the environment has produced the effects analyzed by the theory. There then exist some operators that correspond to in-principle observable quantities and have the property that the quantum predictions concerning the outcomes of possible measurements of these quantities, to be performed at time t or later, are incompatible with the assumption that the instrument pointers are in definite macrostates. It is true that these quantities – the “sensitive observables” – involve the environment in a way that makes them unmeasurable in practice. But this expression “in practice” clearly refers to limitations of the human abilities, and cannot therefore be part of the wording of any condition that makes a statement strongly objective.

He then goes on to discuss this verbal description in terms of the technical formalism, and finishes off by saying:

d’Espagnat in “Veiled Reality” page 345 said:
The only reason that could be invoked for discarding “sensitive observables” would, again, be that they cannot be measured in practice (perhaps because the limited lifetime of the universe will anyhow prevent “us” – i.e. humanity- from doing so). But, as stressed earlier, while this is a very good reason for considering that the environment theory accounts extremely well for the phenomena, it is not one that can be seen as making it strongly objective.

So I am not sure about the “Super Physicist” bit now, I may be using this term in a way that does not properly represent d’Espagnat’s reasoning when going deeper than his “layperson’s” account that I have previously summarised. But in any event, I think your arguments and those of d’Espagnat converge to a view of reality that is never disconnected from our actions.
 
  • #19
I think what he is basically doing is critiquing the "super physicist" concept, so in a sense it is a straw man he is creating. He seems to be saying that even if the super physicist concept made sense and were applicable to the quantum mechanics formalism (the part I questioned), there would still not be a recovery of "strong objectivity" (which he seems to associate with strong realism). I don't disagree with that; I'm just saying that even if the super physicist concept could restore strong objectivity in quantum mechanics, I wouldn't attach any importance to it-- quantum mechanics is not a theory done by super physicists, and was not invented to predict "sensitive observables"; it is a theory created and checked by us, on the kinds of observations we actually do. So we can call his argument an "answer to the rationalists", which is fine, but I would generally view the rationalist position as difficult to take literally anyway-- as the history of physics exhibits all too clearly.
 

Related to Observer independent collapses?

1. What is an "observer independent collapse"?

An "observer independent collapse" refers to the idea in quantum mechanics that the superposition of a particle's states can collapse into a single state without the involvement of an observer or measurement.

2. How does the concept of "observer independent collapse" differ from the Copenhagen interpretation?

The Copenhagen interpretation states that the collapse of a particle's superposition occurs only when an observer makes a measurement. In contrast, the idea of "observer independent collapse" suggests that the collapse can occur without an observer or measurement.

3. Is there evidence to support the concept of "observer independent collapse"?

There is currently no direct evidence for "observer independent collapse." Some interpretations of quantum mechanics, such as the Many Worlds Interpretation, do not require a collapse of the wave function at all. However, there is ongoing research and debate about the nature of quantum collapse and its relationship to observation.

4. How does "observer independent collapse" impact our understanding of reality?

The concept of "observer independent collapse" challenges our intuitive understanding of reality and the role of observation in shaping it. It suggests that the act of observation may not be necessary for physical events to occur, and raises questions about the nature of consciousness and its relationship to the physical world.

5. What are the implications of "observer independent collapse" for future technological advancements?

The concept of "observer independent collapse" has potential implications for future technologies, such as quantum computing, which rely on the principles of quantum mechanics. Understanding the nature of quantum collapse could lead to new ways of manipulating and controlling quantum systems, potentially unlocking new possibilities for technology and innovation.
