Implications of quantum foundations on interpretations of relativity

In summary: Einstein initially didn't like the spacetime interpretation, but he later embraced it in his formulation of the general theory of relativity.
  • #36
Demystifier said:
What about block universe? Is that a consequence or an interpretation?
An interpretation. In interpretations with a preferred frame, that preferred frame also defines the present objectively, and the relativity of simultaneity is reduced to an impossibility of identifying the preferred frame by local observations.
martinbn said:
How is that specific to relativity? It seems like a general philosophical position. In fact it seems very non-relativistic in spirit. What is present in relativity? A choice of simultaneity convention? Which one?
There is also a philosophical position that assumes a block universe; it is named fatalism. In fatalism, the future is predefined and thus already exists in the same way as the present. In what I would simply call common sense, the future, as well as the past, has a different status: only what is present exists.

This difference is an objective one, a property of the world, not of observations of the world. Once the preferred frame cannot be identified by observation, it cannot be a choice by an observer. The observer can only guess which is the correct preferred frame (and the CMBR frame gives a quite plausible guess).

The preferred frame interpretations are, indeed, very non-relativistic in spirit. Relativistic symmetry holds only for some observable effects, it is not a fundamental symmetry, and in particular not a symmetry of space and time. This is what makes them much better compatible with similarly non-relativistic interpretations of quantum theory.

A class of interpretations of QT which depend on a preferred frame for extensions into the relativistic domain can be easily identified: if we look at the Schrödinger equation in configuration space, it gives a continuity equation for the density ##\rho(q,t)##:
$$\partial_t \rho(q,t) + \partial_i ( \rho(q,t)v^i(q,t)) = 0. $$

All one needs is to give the corresponding ##\rho(q,t)v^i(q,t)## a physical interpretation, as a probability flow.
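The continuity-equation statement above is easy to check numerically. Below is a minimal sketch (my own illustration, not part of the original post; ##\hbar = m = 1## and all grid and packet parameters are arbitrary choices): a free 1D Gaussian packet is evolved exactly by a spectral step, the current ##j = \mathrm{Im}(\psi^*\partial_x\psi)## is computed, and ##\partial_t\rho + \partial_x j \approx 0## is verified by finite differences.

```python
import numpy as np

# Minimal check of the continuity equation d_t rho + d_x j = 0 for a free
# 1D wave packet (hbar = m = 1; grid and packet parameters are arbitrary).
N, L = 1024, 40.0
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = L / N
k = 2*np.pi*np.fft.fftfreq(N, d=dx)

k0, sigma = 2.0, 1.0   # mean momentum and width of the packet
psi = (np.pi*sigma**2)**(-0.25) * np.exp(-x**2/(2*sigma**2) + 1j*k0*x)

def step(psi, dt):
    """Exact free-particle evolution over time dt (spectral method)."""
    return np.fft.ifft(np.exp(-1j*k**2*dt/2) * np.fft.fft(psi))

def current(psi):
    """Probability current j = Im(psi* d_x psi)."""
    dpsi = np.fft.ifft(1j*k*np.fft.fft(psi))
    return np.imag(np.conj(psi)*dpsi)

dt = 1e-3
drho_dt = (np.abs(step(psi, dt))**2 - np.abs(psi)**2) / dt   # d_t rho
j_mid = current(step(psi, dt/2))                             # j at the midpoint
dj_dx = np.fft.ifft(1j*k*np.fft.fft(j_mid)).real             # d_x j

residual = np.max(np.abs(drho_dt + dj_dx))
print(residual)   # tiny compared to max|d_x j|: continuity holds
```

In a Bohmian-style reading, dividing this current by ##\rho## gives the velocity field that guides the configuration.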
 
  • Like
Likes physika
  • #37
Demystifier said:
What about block universe? Is that a consequence or an interpretation?
A consequence. The block universe has always seemed, to me, a consequence of pre-Minkowski classical physics, which describes time as a fourth dimension. Nothing in SR (or GR) changes this.

I don't get your point 4, though. There is no preferred foliation for time after Einstein and Minkowski, and no preferred foliation problem, just as there is no preferred basis problem in quantum mechanics.
 
  • #38
martinbn said:
How is that specific to relativity? It seems like a general philosophical position. In fact it seems very non-relativistic in spirit. What is present in relativity? A choice of simultaneity convention? Which one?
Just as there is no preferred position, so there is no preferred time. Seems entirely relativistic to me.
 
  • #39
Bell is also famous for the discovery of anomalies in relativistic quantum field theories (particularly the ##\mathrm{U}(1)_{\text{A}}## anomaly, known as the Adler-Bell-Jackiw anomaly).
 
  • #40
I don't think that it makes much of a difference whether you use photons or massive particles to check Bell's local deterministic HV result against QT results. Entanglement is a very universal phenomenon, for which it doesn't make much of a difference in which concrete way you realize it. That there are so many Bell tests using photons is simply because it's technically easier to prepare entangled states with photon Fock states and keep them unperturbed by interactions with "the environment".

Concerning the other questions, it's clear that one must be careful how to think about photons. First of all, you cannot divide photons. Of course you can have a process like parametric down conversion, where a laser photon interacting with a BBO crystal splits into an entangled photon pair. But that is not a split of the original photon; rather, you get two photons with about half the frequency and corresponding wave numbers (fulfilling the phase-matching condition for the wanted preparation of a biphotonic Bell state).

Photons have no wave functions since they do not have a position observable. A photon is by definition a single-quantum Fock state of the electromagnetic field. A true photon state must also be normalizable, i.e., it has a finite width in energy and momentum.

What happens with a photon in an experiment depends of course on its setup. E.g., if you want to make a polarization measurement you can just use a polarization filter and a photodetector behind it. The (ideal(ized)) polarization filter either absorbs the photon or lets it through, with probabilities depending on the polarization state of the incoming photon and the orientation of the polarization filter, given by Born's rule. The photons coming through have the corresponding linear polarization determined by the orientation of the polarization filter. The (idealized) polarization filter is in this case described by a corresponding projection operator, which you can interpret in a FAPP sense as a "collapse". I'd simply call it filtering ;-).
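A toy numerical illustration of that last point (my own sketch, not from the post; the angles are arbitrary example values): treating the photon's polarization as a two-component vector, an ideal polarizer at angle ##\theta## is the projector ##P = |\theta\rangle\langle\theta|##, the pass probability is Born's rule ##\langle\psi|P|\psi\rangle## (Malus's law at the single-photon level), and the transmitted state is re-prepared along the filter axis.

```python
import numpy as np

def lin_pol(theta):
    """Linear polarization state at angle theta (H/V basis)."""
    return np.array([np.cos(theta), np.sin(theta)])

def projector(theta):
    """Projection operator for an ideal polarizer oriented at theta."""
    e = lin_pol(theta)
    return np.outer(e, e)

psi = lin_pol(np.deg2rad(30))   # incoming photon polarized at 30 degrees
P = projector(np.deg2rad(75))   # polarizer oriented at 75 degrees

p_pass = psi @ P @ psi          # Born's rule: <psi|P|psi> = cos^2(45 deg)
print(p_pass)                   # 0.5

# The transmitted photon is "filtered": re-prepared along the filter axis.
out = P @ psi
out = out / np.linalg.norm(out)
```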

Another possibility is to use an (idealized, non-absorbing) birefringent crystal. Then the photon is deflected in different directions with probabilities again given by the incoming photon state and the orientation of the crystal, preparing states which are a superposition of the two possible outcomes, leading to an entanglement between the polarization and the momentum of the photon. Here the birefringent crystal can be formally described by a unitary operator. I think most collapse proponents would not call this a collapse, because what's prepared is a superposition. Which polarization state and momentum an individual photon has taken when going through the crystal must be subsequently measured with photodetectors placed at positions to measure the momentum of the photon; the collapse proponents then call this a collapse, though of course you don't have a photon left, because it's simply absorbed by the photodetector.
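A companion sketch for the birefringent crystal (again my own illustration, with arbitrary angles): modelling the two output directions as a two-dimensional "path" degree of freedom, the idealized crystal acts as a unitary sending H along one path and V along the other, which on polarization ⊗ path is formally a controlled-NOT. The output is a superposition entangling polarization with path, not a collapsed state.

```python
import numpy as np

# Polarization basis {H, V}, path basis {a, b}; the photon enters in path a.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])   # swaps the two output paths
PH = np.diag([1, 0])             # |H><H|
PV = np.diag([0, 1])             # |V><V|

# Idealized birefringent crystal: H stays in path a, V is deflected to path b.
# On polarization (x) path this is a controlled-NOT, hence unitary (no absorption).
U = np.kron(PH, I2) + np.kron(PV, X)

theta = np.deg2rad(30)
pol_in = np.array([np.cos(theta), np.sin(theta)])
path_in = np.array([1, 0])       # path a
state = U @ np.kron(pol_in, path_in)

# Result: cos(theta)|H,a> + sin(theta)|V,b>: polarization-path entanglement.
# A detector placed in path b fires with probability sin^2(theta).
p_b = state[1]**2 + state[3]**2
print(state, p_b)
```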
 
  • #41
Demystifier said:
the past, the presence and the future exist on an equal footing.

I would much appreciate your explaining the quote above in some detail. It seems to be quite ambiguous regarding the role of an observer.
 
  • #42
Buzz Bloom said:
I would much appreciate your explaining the quote above in some detail. It seems to be quite ambiguous regarding the role of an observer.
Why do you think so?
 
  • #43
maximus43 said:
Bell's "theorems" only apply to particles with spin

No, Bell's theorem says nothing about spin. The particular example Bell used to show that QM's predictions violate Bell's theorem used the spin of a spin-1/2 particle, but that does not mean the proof of the theorem itself involves spin. It doesn't. It is much more general than that.

maximus43 said:
his theorems and his derived inequalities do not capture all of classical physics

They do in the only way that matters for the theorem: every classical theory of physics satisfies the premises of the theorem.

maximus43 said:
and collapse to "theories" when applied to classical theories that reject the integrity of the photon

First, I have no idea what you mean by "collapse to theories" here. Bell's theorem is not a theory of physics. It is a mathematical theorem that puts a limitation on the predictions of any theory of physics that satisfies its premises.

Second, of course there is no such thing as a "photon" in classical physics. That has nothing whatever to do with what Bell's theorem says about the possible predictions of any classical physics theory.

maximus43 said:
95 % of experiments do not use Bell inequalities

If you mean they don't use the particular form of the inequalities that Bell put in his paper, yes, this is true. Other forms of the inequalities turn out to be easier to compare with experimental data. But all such inequalities are still derived from the general form of Bell's theorem.
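For concreteness, here is a sketch (my own illustration, not from the post) of the most commonly used such form, the CHSH inequality: local hidden-variable theories require ##|S| \le 2##, while the QM prediction for spin-singlet correlations, ##E(a,b) = -\cos(a-b)##, at the standard analyzer angles gives ##|S| = 2\sqrt{2}##.

```python
import numpy as np

def E(a, b):
    """QM correlation for a spin-1/2 singlet measured along angles a, b."""
    return -np.cos(a - b)

# Standard CHSH angle choices (radians)
a, ap = 0.0, np.pi/2
b, bp = np.pi/4, 3*np.pi/4

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))   # 2*sqrt(2) ~ 2.828, violating the local-HV bound |S| <= 2
```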

maximus43 said:
One interpretation of these results is that the integrity of the photon should be questioned.

Another interpretation is that photon detectors in those earlier experiments were not accurate enough to give a meaningful test of whether the relevant inequalities were violated or not. As detectors become more accurate, the experiments give better tests, and those tests are making it clearer and clearer that the predictions of QM are valid and that the relevant inequalities are violated.

(Note, btw, that the objections to the term "photon" in the Lamb paper you reference, while they are worth considering--@vanhees71, for example, has expressed similar concerns in this thread as well as many other threads here on PF--have nothing to do with Bell inequality tests. Bell inequality tests are about observables, such as clicks in photodetectors; you don't have to adopt a "photon" interpretation of the underlying theory in order to evaluate those observables and how their measured values in experiments compare to Bell-type inequalities.)
 
  • Like
Likes mattt and vanhees71
  • #44
Demystifier said:
the past, the presence and the future exist on an equal footing.
Buzz Bloom said:
It seems to be quite ambiguous regarding the role of an observer.
Demystifier said:
Why do you think so?

When I try to guess what you mean, I come up with the following.

Since the past is fixed, all events are facts. No conceptually possible alternative facts exist as a part of the past. I am guessing that you mean that for the future to be on an equal footing there can be no alternative possibilities actually occurring in the future. That is, the events of the future are completely deterministic.

The ambiguity which confuses me is that I also think you do not mean this, because of the randomness of QM allowing alternative possibilities to become the measurement outcomes of the future. For example, assuming the multi-world interpretation, when a measurement is made the observer exists in only one of two (or more) possible future worlds, depending on the actual value of the measurement observed.

I can offer some other examples of the ambiguity if you think that would be helpful to your understanding of my confusion regarding what you intend.

Regards,
Buzz
 
  • #45
Buzz Bloom said:
assuming the multi-world interpretation

This is not a good choice for your argument since the MWI is deterministic; there is no randomness at all in the MWI.
 
  • #46
Buzz Bloom said:
Since the past is fixed, all events are facts.
The point is that future events are also fixed facts, according to the block-universe interpretation. And it doesn't require determinism; probabilistic laws are also compatible with that. In the absence of determinism, we cannot compute the future events from the present ones. But it doesn't change the fact that the future event will be what it will be. If in the future a random event A will happen, then it is a fact that A will happen. It will happen randomly, but if it happens, then it happens. I don't know if it makes sense to you, but that's the idea of the block-universe interpretation. It's up to you to decide whether you like this interpretation or not.
 
  • #47
PeterDonis said:
(Note, btw, that the objections to the term "photon" in the Lamb paper you reference, while they are worth considering--@vanhees71, for example, has expressed similar concerns in this thread as well as many other threads here on PF--have nothing to do with Bell inequality tests. Bell inequality tests are about observables, such as clicks in photodetectors; you don't have to adopt a "photon" interpretation of the underlying theory in order to evaluate those observables and how their measured values in experiments compare to Bell-type inequalities.)
To clarify my point of view: I don't object to the use of photons. I only object to the bad habit of selling them in terms of "old quantum theory", i.e., Einstein's flawed point of view that photons can be qualitatively understood as if they were massless point-like particles. Einstein himself was very critical of his own "heuristic viewpoint", and as we know today, he was right to be sceptical of this mishmash of quantum and classical ideas.

A photon is a well-defined concept within modern relativistic local quantum field theory. It's part of the Standard Model of elementary particle physics and as such has withstood many attempts to disprove it (to the dismay of many HEP physicists who look for "physics beyond the Standard Model", because it seems pretty clear that it's incomplete; at least it's likely that there are more particles, explaining the nature of "dark matter", and some additional mechanism of CP violation to explain our very existence).

Also, almost all tree-level results of QED (like the photoelectric effect and Compton scattering) are identical to those of the semiclassical approximation (electrons/charged particles quantized; em. field classical). The simplest effect that really needs the quantization of the em. field is spontaneous emission (discovered by Einstein in 1917 when rederiving Planck's black-body radiation law from the kinetic-theory viewpoint).

To do historical justice, one should mention that Jordan already quantized the electromagnetic field within the scheme of "matrix quantum mechanics" in the famous "Dreimännerarbeit". At this very early time, however, most physicists disregarded the need for quantizing the em. field, mostly due to the fact that you come very far with the semiclassical theory. Usually today one cites Dirac as the discoverer of field quantization and spontaneous emission, but his work came somewhat later.
 
  • #48
PeterDonis said:
So what? We all agree that classical Maxwell electrodynamics works fine as an approximation. The 95% of experiments are those within the domain where that approximation works. The other 5% are not. And if we're talking about quantum foundations, as we are in this thread, approximations are irrelevant. Your theory needs to explain all the experimental results, not just 95% of them.
Indeed, and in particular everything related to entanglement and the violation of Bell's inequalities cannot be explained by the semiclassical theory (charges quantized, em. field classical). Other examples are the HOM experiment and quantum beats. For a very good and pedagogical discussion, see

J. Garrison and R. Chiao, Quantum Optics, Oxford University Press, New York (2008), https://doi.org/10.1093/acprof:oso/9780198508861.001.0001
 
  • #49
I had to delete some posts. Please remain focused on provable facts rather than opinions.

Also, please refresh your browser. You are replying to posts which aren't visible anymore!
 
Last edited:
  • Sad
Likes PeroK
  • #50
PeterDonis said:
This is not a good choice for your argument since the MWI is deterministic; there is no randomness at all in the MWI.
Hi Peter:

I was not trying to make any argument. I am trying to describe my confusion regarding the post in which @Demystifier said "the past, the presence and the future exist on an equal footing." I just searched the entire thread, and I cannot find the post in which Demystifier said this. The quote seems to have vanished, perhaps due to some recent editing.

I am now also confused by what you posted: "there is no randomness at all in the MWI." I may have misunderstood what I read in Wikipedia.
The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wavefunction collapse.[2] This implies that all possible outcomes of quantum measurements are physically realized in some "world" or universe.​
I interpret this to mean that an observer in one of the many worlds who makes a measurement (which has several or many possible values) will become a corresponding multiple of himself, each in a different world corresponding to a particular value being the result of the measurement. Therefore, each observer in one of the post-measurement worlds will be in a randomly chosen world of the many possibilities. If this is incorrect, would you please explain the correction.

Regards,
Buzz
 
Last edited:
  • #51
Buzz Bloom said:
I am now also confused by what you posted: "there is no randomness at all in the MWI." I may have misunderstood what I read in Wikipedia.

You should not be trying to understand QM in general, let alone the MWI, by reading Wikipedia.

We have had previous threads on this aspect of the MWI, and I'm pretty sure you were involved in at least one of them, though it might have been a while ago. If you want to rehash the issue again, it should be moved to a different thread.
 
Last edited:
  • Like
Likes vanhees71
  • #52
PeterDonis said:
We have had previous threads on this aspect of the MWI, and I'm pretty sure you were involved in at least one of them, though it might have been a while ago. If you want to rehash the issue again, it should be moved to a different thread.
Hi Peter:

At my advanced years I do forget things. I do not remember participating in a previous discussion of MWI.

What I would like to understand is whether or not I have misunderstood the Wikipedia text I quoted. If you think I should start a discussion of this topic in a new thread, I will do that. I would also like to understand the implication regarding randomness if you find my understanding of the Wikipedia MWI article to be correct.

Regards,
Buzz
 
  • #53
Buzz Bloom said:
What I would like to understand is whether or not I have misunderstood the Wikipedia text I quoted. If you think I should start a discussion of this topic in a new thread, I will do that.

Yes, please do. It would be too far off topic in this one.

Also, your question should not be whether you have misunderstood the Wikipedia text; Wikipedia is not a good source for actually learning the physics. You really need to look at a QM textbook or paper that talks about the MWI.
 
  • #54
Hello. If I remember correctly, Louis de Broglie wrote in his book about an experiment with a particle in a box. Say we divide the box in half and bring the halves to Tokyo and Paris. We will find the particle in the Tokyo half or the Paris half when they are opened. In this situation the particle, as a source of spacetime curvature, changes the geometry in Tokyo or in Paris, but this is not decided before opening. Can this be a case illustrating the relation between quantum entanglement and GR?
 
  • #55
mitochan said:
Can this be a case of a relation between quantum entanglement and GR?

No, because GR is not a quantum theory; there is no way in GR to represent a superposition of two different spacetime geometries, which is what the QM side of your thought experiment would require. We would need a quantum theory of gravity to model such an experiment.

(Note that a single particle's effect on the spacetime geometry would be many, many orders of magnitude too small to measure, now or for the foreseeable future; but it is possible to construct thought experiments where some kind of quantum uncertainty could lead to a superposition of possible positions for an object whose effect on spacetime geometry is measurable.)
 
  • Like
Likes PeroK
  • #56
Thanks. I see a difficulty in the experiment myself. The procedure of carrying the half boxes would amount to a measurement of their inertia and collapse the wavefunction before their arrival.
 
Last edited:
  • #57
Buzz Bloom said:
the post in which @Demystifier said "the past, the presence and the future exist on an equal footing." I just searched the entire thread, and I can not find the post in which Demystifier said this. The quote seems to have vanished, perhaps due to some recent editing.
It's in the first post, item 2.
 
  • Like
Likes Buzz Bloom
  • #58
maximus43 said:
vanhees71 said:

What was Bell's opinion of QM?

Barry
I'm not so sure about this. For me the great merit of Bell's idea is that he brought a pretty unsharp philosophical question about "reality", and the also pretty enigmatic ideas proposed in the (in)famous EPR paper (which Einstein himself didn't like too much), to a clear, empirically decidable scientific question: namely whether, with a local deterministic hidden-variable theory, starting from a clear mathematical definition of the statistical meaning of such a theory, all statistical predictions of quantum theory can be reproduced. The important point is that he could derive his famous inequality concerning measurements on ensembles, which holds within this class of local deterministic hidden-variable theories but is violated by the predictions of QT. In this way he found a general theoretical scheme which allows one to decide whether or not a local hidden-variable theory can always be constructed leading to the same statistical predictions as QT. I'm not sure whether Bell expected QT or the local deterministic hidden-variable theories to hold.

At this time it was very difficult to realize such experiments, but there were experimentalists who took up the challenge. The first to succeed was Alain Aspect, who prepared entangled photon pairs with an atomic cascade using a laser. That was a breakthrough in the preparation of entangled photon states, and he could successfully demonstrate the violation of the Bell inequality for a certain set of measurements on the polarization states of polarization-entangled photons. He thus showed that, within the uncertainty of the experiment, QT correctly predicts the correlations between the photon polarizations, contradicting the predictions of any local deterministic hidden-variable theory:

https://en.wikipedia.org/wiki/Aspect's_experiment

Today quantum opticians have much more efficient sources for entangled photons, making use of the non-linear optics possible with strong lasers: there you can produce entangled photon pairs in many kinds of entangled states at high rates, and the corresponding quantum-optics experiments have become very accurate, confirming the violation of Bell's inequalities at very high confidence levels. Many even more exciting experiments could also be done, including quantum-eraser experiments using postselection schemes à la Scully et al. (e.g., Kim et al.), "quantum teleportation", and "entanglement swapping" (e.g., Zeilinger et al.).

Today this field of "quantum informatics" is entering a phase where you can use it for practical purposes, with applications like quantum cryptography and also quantum computing.
 
  • #59
Demystifier said:
It's in the first post, item 2.
Hi @Demystifier:

Thank you very much for your response. From time to time my memory plays tricks on me. What I remember is that the item I quoted from was about MWI. I have no understanding at all of the "Spacetime interpretation".

Regards,
Buzz
 
  • #60
mitochan said:
Hello. If I remember correctly, Louis de Broglie wrote in his book about an experiment with a particle in a box. Say we divide the box in half and bring the halves to Tokyo and Paris. We will find the particle in the Tokyo half or the Paris half when they are opened. In this situation the particle, as a source of spacetime curvature, changes the geometry in Tokyo or in Paris, but this is not decided before opening. Can this be a case illustrating the relation between quantum entanglement and GR?
Maybe you have something like this in mind?
"GR=QM? Well why not? Some of us already accept ER=EPR [1], so why not follow it to its logical conclusion?"
-- Susskind, https://arxiv.org/pdf/1708.03040.pdf

In a more speculative setting, I think there are very interesting possible "interpretations" of spacetime, as well as of the observer-equivalence constraints, that are suggestive of particular research directions for QG and unification.

In my preferred interpretation, one cannot understand the constraints of either SR or GR without also considering how spacetime emerges among interacting "observers". I am closest to the operational interpretation mentioned in the first post. The only problem with Einstein's derivation from the two postulates, (observer equivalence) and (invariant upper bound on speed), is that the postulates implicitly contain assumptions about spacetime. My "interpretation" would be to relax postulate one and replace it with observer democracy rather than equivalence. In this case the constraints become emergent, along with spacetime and matter. There are also many indications that an upper bound on speeds follows naturally in information-geometric constructions, so the second postulate is likely not needed either. I admit that in my own work I should probably do better at maintaining a list of references, which is why I refrain from going too deep. But these suggestions have arisen in several published places by different authors, as well as from my own considerations. A quick search finds, for example, the following, which gives a hint of the general idea; I didn't analyse that paper in depth, but it gets you in the ballpark...

Stochastic Time Evolution, Information Geometry, and the Cramér-Rao Bound
"As a consequence of the Cramér-Rao bound, we find that the rate of change of the average of any observable is bounded from above by its variance times the temporal Fisher information. As a consequence of this bound, we obtain a speed limit on the evolution of stochastic observables: Changing the average of an observable requires a minimum amount of time given by the change in the average squared, divided by the fluctuations of the observable times the thermodynamic cost of the transformation."
- - https://arxiv.org/abs/1810.06832
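The quoted bound is easy to check numerically in a toy case (my own sketch, with arbitrary parameters, not from the paper): for a Gaussian distribution whose mean drifts in time at fixed width, the rate of change of ##\langle x\rangle## exactly saturates ##\sigma_x \sqrt{I_t}##, where ##I_t## is the temporal Fisher information.

```python
import numpy as np

# Gaussian p(x,t) with drifting mean mu(t) and fixed width s: for a pure
# translation the bound |d<x>/dt| <= sigma_x * sqrt(I_t) is saturated.
s = 1.5
def mu(t): return 0.3*t**2        # arbitrary smooth drift of the mean

x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
t, dt = 1.0, 1e-5

def p(t):
    return np.exp(-(x - mu(t))**2/(2*s**2)) / np.sqrt(2*np.pi*s**2)

p0, p1, pm = p(t - dt), p(t + dt), p(t)

# d<x>/dt by central finite differences
dmean_dt = (np.sum(x*p1) - np.sum(x*p0))*dx/(2*dt)

# temporal Fisher information I_t = E[(d ln p / dt)^2]
dlogp_dt = (np.log(p1) - np.log(p0))/(2*dt)
I_t = np.sum(pm*dlogp_dt**2)*dx

mean = np.sum(pm*x)*dx
sigma_x = np.sqrt(np.sum(pm*x**2)*dx - mean**2)
print(abs(dmean_dt), sigma_x*np.sqrt(I_t))   # equal here: pure translation
```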

If you consider a truly _intrinsically_ constructible measure of evolution for an agent, then a kind of stochastic (or probabilistic) evolution seems the only thing at hand.

/Fredrik
 
  • Like
Likes mitochan
  • #61
vanhees71 said:
I'm not so sure about this. For me the great merit of Bell's idea is that he brought a pretty unsharp philosophical question about "reality" and the also pretty enigmatic ideas proposed in the (in)famous EPR paper (which Einstein himself didn't like too much) to a clear scientific empirically decidable question, namely whether with a local deterministic hidden-variable theory, starting from a clear mathematical definition of the statistical meaning of such a theory, all statistical predictions of quantum theory...

Just to add that Bell's theorems rule out any hidden-variable model (deterministic or stochastic) that satisfies "local causality" (defined appropriately).
 
Last edited:
  • Like
Likes vanhees71
  • #64
Now this is an interesting topic, and one that I'd hope to see discussed seriously a little more often.

I'm going to react mostly to the OP, albeit I have skimmed the responses in the thread too. I'm impressed to see the OP already setting many things right, something that doesn't happen too often on this topic :)

Demystifier said:
Physicists often discuss interpretations of quantum mechanics (QM), but they rarely discuss interpretations of relativity. Which is strange, because the interpretations of quantum non-locality are closely related to interpretations of relativity.

Indeed, I find it extremely curious that Einstein never made any comments about how non-locality is very trivial to explain in Minkowski's spacetime interpretation. Obviously, getting feedback "from the future" is a completely unproblematic concept if you have already set all of reality as static (cf. the "Transactional Interpretation of Quantum Mechanics").

That's not to say Minkowski's idea is unproblematic - it's just to say that non-locality in the EPR circumstance is not an insurmountable problem.

Perhaps it's a sign that Einstein did not really view Minkowski's perspective as a necessarily ontologically realistic concept, but rather as a useful mental model, even after applying that model so comprehensively in the formulation of GR. I think this is rather likely; surely he was well aware that "ontologically real" relativistic simultaneity immediately requires a static universe, and a detachment of consciousness from that static reality.

But it gets more interesting than that. Let me segue to it via this:

Demystifier said:
3. Ether interpretation. This is not really one interpretation but a wide class of different physical theories. One simple version of the ether theory was developed by Lorentz, before Einstein developed his theory of relativity in 1905. According to ether theories, there are absolute space and absolute time, but under certain approximations some physical phenomena obey effective laws of motion that look as if absolute space and time did not exist. The original Lorentz version of ether theory was ruled out by the Michelson-Morley experiment, but some more sophisticated versions of ether theory are still alive.

This last sentence is quite inaccurate, as Lorentz's ether theory was actually created in response to the M&M experiment. His theory is what introduced the Lorentz transformation to us, and it differs from Special Relativity only in a philosophical sense (a fact that was well known back in the day, but seems to be often lost in modern descriptions of SR). In fact, Einstein's "On the Electrodynamics of Moving Bodies" was originally referred to as the Lorentz-Einstein theory. And that is why we still call the transformation in it the Lorentz transformation.

In modern descriptions the history of SR is often characterized as "first we had Lorentz Aether Theory, then along came M&M, and then Einstein explained it with SR". That is quite a caricature of the actual history.

Now think about this: the historical expectation that the M&M experiment should have revealed a universal reference frame for EM propagation completely hinges on the assumption that space and matter have a completely decoupled existence from one another (e.g. that the EM propagation that binds objects together is not dependent on the one-way speed of light).

Lorentz (and FitzGerald before him) started to hypothesize on the possibility that macroscopic objects, as manifestations of electromagnetism themselves, might be dependent on the one-way speed of light. That analysis yields the Lorentz transformation as a valid transformation between reference frames. This was a decade before Special Relativity, and it is 100% the same math as used in SR.

Obviously the idea that electromagnetic objects would be dependent on the propagation of electromagnetic fields is not such an "ad hoc" idea in itself; indeed it is exactly the idea behind one of our most successful modern theories, quantum field theory. I find the idea of a decoupled existence of matter and space much harder to reconcile into a self-consistent model.

Now, the order of historical events is somewhat relevant here;

Demystifier said:
1. Operational interpretation. According to this interpretation, relativity is basically about how the appearance of space, time and some related physical quantities depends on motion (and current position) of the observer. Essentially this is how Einstein originally interpreted relativity in 1905.

Einstein's original paper is indeed quite neutral in a philosophical sense. It employs the Lorentz transformation between reference frames exactly like Lorentz's own version, but the arguments to get there are different. The first part of the paper revolves around the well-known fact that the one-way speed of light is fundamentally impossible to measure, since you have no means to synchronize your two clocks (this is trivially true for the fastest information speed available to you; it was also a commonly known issue in the physics community at the time, but nowadays seems to be somewhat lost in bad education).

Einstein argues that since it is fundamentally impossible to synchronize two spatially separated clocks, it must always be logically and observationally valid to do all your calculations under the assumption that the speed of light is exactly c in any given reference frame you want; you will never be able to find yourself being wrong, so long as you apply the Lorentz transformation between the frames. (And why this is so should, logically speaking, be trivially understandable to anyone who understands the Lorentz transformation.)
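This bookkeeping consistency is easy to see numerically (my own sketch, in units with ##c = 1##; the velocities and events are arbitrary choices): events lying on a light ray stay on a light ray under any Lorentz boost, so assuming the speed of light is exactly c in whichever frame you pick can never be contradicted.

```python
import numpy as np

def boost(v, c=1.0):
    """Lorentz boost matrix acting on (ct, x) for frame velocity v."""
    g = 1.0/np.sqrt(1 - (v/c)**2)
    return np.array([[g, -g*v/c],
                     [-g*v/c, g]])

# Two events on a rightward light ray x = ct, as (ct, x) pairs
e1 = np.array([1.0, 1.0])
e2 = np.array([3.0, 3.0])

for v in (0.2, 0.5, 0.9, -0.99):
    f1, f2 = boost(v) @ e1, boost(v) @ e2
    dct, dxx = f2 - f1
    print(v, dxx/dct)   # the ratio stays 1: light speed is frame-invariant
```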

In Lorentz's theory the situation is the same - as an electromagnetic creature that is part of an electromagnetic universe, you are fundamentally unable to measure whether you are using the correct universal reference frame or not. The logical conclusions in the observational limit are all identical to those of SR.

The philosophical discussion between these two flavors of the same mathematical transformation really boils down to one question: does relativistic simultaneity represent merely a limit on our observations, or the natural structure of reality?

It is true that the SR version was philosophically more neutral - it did not claim to know "why" relativistic simultaneity is valid in an ontological sense, only in a purely logical sense.

But then this happened:

Demystifier said:
2. Spacetime interpretation. According to this interpretation, relativity is not so much about the appearance of space and time to observers, as it is about the 4-dimensional spacetime that does not depend on the observer. This interpretation was first proposed by Minkowski. Einstein didn't like it in the beginning, but later he embraced it in his formulation of general theory of relativity. The spacetime interpretation naturally leads to the block-universe interpretation of the world, according to which time does not flow, meaning that the past, the presence and the future exist on an equal footing.

In this interpretation Minkowski simply draws out the fact that, if you take relativistic simultaneity as a real feature of reality, it instantly leads to a completely static universe (the reality around you cannot have an instantaneous state, since observers coinciding with your location but not with your inertial frame would disagree with your idea of what that state is - none of the states would be real prior to observation).

Indeed Einstein expressed dissatisfaction with this idea, but he also came dangerously close to this conclusion himself the very moment he argued that there is no rational reason to posit that unobservable things, such as a universal reference frame for c, exist. It is very difficult to reconcile his version with realism without ending up with exactly Minkowski's idea of a static reality. (And yet in modern descriptions of relativity, exactly that idea is thrown around quite willy-nilly.)

The philosophical problem with Einstein's argument is that it was not actually neutral either - arguing that observational limits are also limits of existence leads to a specific structure of reality, which makes it a philosophical assumption in itself. As he found out when he argued the exact opposite perspective in the context of quantum mechanics ("surely the Moon must be there when you are not observing it").

This same philosophical stance is taken in many areas of modern physics - for example, when arguing that the Planck limit is not just an observational limit but also a limit on the "existence of things". And it leads to similar complications (which I could also discuss at length).

It would also be interesting to discuss the fact that Big Bang theory effectively suggests a universal reference frame (via the supposed simultaneity of emission of the cosmic microwave background radiation), and as such can be seen as establishing universal simultaneity (i.e. the frame of Lorentz's ether) if you wish to apply the "unobservable things do not exist" adage. But since the thread is about Bell's theorem and non-locality, let me cut to the chase and point out something rather interesting right there.

What Bell's theorem means is that no local realist hidden-variable theory can make the same predictions as quantum mechanics. That is, it only applies to theories where some hidden variables determine the state of a real object, where that real object exists prior to observation. It does not imply local realism is dead, as it only applies to the class of theories that assumes wave-behavior-exhibiting objects (such as photons) actually do exist, even though we only ever observe detection interactions!

It is very interesting to me that Einstein (and, basically, everyone) always missed the very real possibility that the objects we detect and call "particles" are merely quantized detection events, manifested by wave energies. This idea would have landed squarely on Einstein's "only observable things are real" philosophy. Perhaps his reluctance to consider this possibility had something to do with the fact that he played a very integral role in the conception of "photons" in the first place. At the end of the day, no one has ever seen a photon; we have only seen detection events that imply a model in which they exist. Detection events that could also be explained by positing a quantized mechanism for the interaction itself (as opposed to positing the existence of quantized carriers).

In my mind it's very simple: Bell's theorem is an explanation of why, in our models, "particles" cannot actually be placed where we see wave-like behavior, unless we are also ready to throw out either locality or realism.

So how about, instead of throwing away realism or locality, we throw away the idea of particles? In that case a local realist explanation of a Bell experiment actually becomes quite trivial. Place an observational limit (instead of an "existence limit") on quantized EM detection events (you can't observe one unless it manifests an interaction event), and what you get is fully wave-like propagation of EM energy from emission to the two detection sites. Modification of the wave-like energy through polarization filters (or any mechanism that does not cause a "collapse", i.e. yield an actual detection event) would yield a cosine correlation in the probabilities of quantized detection interactions occurring. Not a great surprise - the wave propagation is best described by Schrödinger's equation - so if we manage to keep the propagation as waves, from emission to detection, we expect to always get a result that is fully aligned with QM expectations, while maintaining fully ordinary local realist mechanisms.

The critical difference is that you no longer have the idea of a free-flight particle with definitive properties of its own - the properties we observe are only determined by the actual interaction event, which is itself quantized (so its occurrence is probabilistic, depending on the underlying wave reality - and it may not occur at all even when the underlying wave energy does exist). In this case, any modifications to the detection probabilities at the two sites will yield a cosine correlation (a completely ordinary wave feature), and this would not be possible if there were particles in free flight (without letting go of realism or locality).
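For concreteness, the "cosine correlation" at issue is the quantum prediction for polarization-entangled photon pairs, ##E(a,b) = \cos 2(a-b)##. A small sketch (illustrative only, not code from the linked article) shows why this particular correlation is the crux of Bell's theorem: at suitable analyzer angles it violates the CHSH bound ##|S| \le 2## that any local hidden-variable model must satisfy.

```python
import math

def qm_correlation(a_deg: float, b_deg: float) -> float:
    """QM correlation E(a, b) between +/-1 outcomes for a
    polarization-entangled photon pair, with linear analyzers
    at angles a and b (degrees): E = cos(2*(a - b))."""
    return math.cos(2 * math.radians(a_deg - b_deg))

def chsh(a: float, a2: float, b: float, b2: float) -> float:
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    Local hidden-variable theories require |S| <= 2."""
    return (qm_correlation(a, b) - qm_correlation(a, b2)
            + qm_correlation(a2, b) + qm_correlation(a2, b2))

# Standard angle choice that maximizes the quantum violation:
S = chsh(0, 45, 22.5, 67.5)
print(S)  # ~2.828 = 2*sqrt(2), exceeding the local-realist bound of 2
```

Any proposed local mechanism therefore has to explain not the cosine shape itself (which, as noted, is an ordinary wave feature) but the fact that the measured correlation exceeds this bound.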

If the above hand-wavy description doesn't explain the crux of it, I wrote a more complete description of this same point starting from page 8 here.
(Assuming you are familiar with the wave description of polarization filters; if not, a short description appears earlier in that same article.)

So, what's the point of all this? The point is: do not simply assume that "if it can't be measured, it does not exist". Contrary to popular belief, that philosophy does not automatically yield an ideally elegant "Occam's razor" outcome. It can be effective when applied right, but sometimes it just makes your models more convoluted down the line.

-Anssi
 
  • #65
AnssiH said:
I find it extremely curious that Einstein never made any comments about how non-locality is very trivial to explain in Minkowski's space-time interpretation.
Nonlocality by itself can be accommodated by an interpretation like the Transactional Interpretation, as you say, yes.

What cannot be accommodated by any interpretation involving classical spacetime is superposition. For example, suppose we set up a "Schrodinger's cat" type experiment where, instead of a random quantum event like a radioactive decay determining whether a cat is alive or dead, have it determine whether or not a significant change in the distribution of matter occurs--for example, whether a ball with enough mass to register in a Cavendish-type experiment goes to the left or to the right. No classical spacetime model can describe this experiment, because it involves a superposition of different spacetime geometries (more precisely, it involves the entanglement of the spacetime geometry with other degrees of freedom). In a classical spacetime model, there is only one spacetime geometry. The geometry can be determined dynamically by the distribution of matter, but there is no way to model a superposition of different matter distributions being entangled with the spacetime geometry and causing a superposition of different spacetime geometries.
 
  • #67
PeterDonis said:
Nonlocality by itself can be accommodated by an interpretation like the Transactional Interpretation, as you say, yes.

What cannot be accommodated by any interpretation involving classical spacetime is superposition. For example, suppose we set up a "Schrodinger's cat" type experiment where, instead of a random quantum event like a radioactive decay determining whether a cat is alive or dead, have it determine whether or not a significant change in the distribution of matter occurs--for example, whether a ball with enough mass to register in a Cavendish-type experiment goes to the left or to the right. No classical spacetime model can describe this experiment, because it involves a superposition of different spacetime geometries (more precisely, it involves the entanglement of the spacetime geometry with other degrees of freedom). In a classical spacetime model, there is only one spacetime geometry. The geometry can be determined dynamically by the distribution of matter, but there is no way to model a superposition of different matter distributions being entangled with the spacetime geometry and causing a superposition of different spacetime geometries.

Hi Peter :)

Actually, in the Transactional Interpretation there's no need to accommodate the concept of superposition. It is a mysterious concept only in Copenhagen (for reasons I outline in my post). In TI there are also probabilistic components to our expectations, but that wouldn't mean those objects are actually in superposition - it would just mean there are probabilistic outcomes to our expectations.

The only reason superposition is not viewed as a component of observer ignorance is the Bell experiments, and as soon as you have a mechanism to explain them, you have no superposition anymore.

Cheers,
-Anssi
 
  • #68
PeterDonis said:
This is not a valid reference for PF discussion. Are there any published papers that describe this model?

If you read it, you will see it's not even a model, but merely a discussion of an interpretation of QM.

And since it's written by me, it should be in accordance with the guidelines for me to discuss it, as far as I can tell.

If you feel otherwise, I can remove the link and replace it with the same text that is found behind the link (really, it's not that different from what I'm discussing in the post, just explaining the same issues in more detail).

Not trying to be facetious, but this area of the forum is decidedly about QM interpretation, and the text behind the link actually discusses the direct consequences of completely established models of modern physics (such as the standard view of refraction in transparent materials, or the behavior of polarization filters).

Surely it must be within the guidelines to discuss the impact of completely established theories on possible interpretations of QM. (I mean, if it isn't, then why are we here :smile:)

-Anssi
 
  • #69
AnssiH said:
in the Transactional Interpretation there's no need to accommodate the concept of superposition.
There is if you want to try to apply it to QM, as you are doing here.

AnssiH said:
The only reason why superposition is not viewed as a component of observer ignorance is Bell experiments, and as soon as you have a mechanism to explain them, you have no superposition anymore.
Superposition is part of the basics of QM. You can't just wave your hands and say it goes away.
 
  • #70
AnssiH said:
If you read it, you will see it's not even a model, but merely a discussion of an interpretation of QM.
Your post #64 goes well beyond "discussion of an interpretation of QM". It is your personal research unless you can give a reference to an already published, peer-reviewed paper that supports the claims you are making.

AnssiH said:
And since it's written by me, it should be in accordance with the guidelines for me to discuss it, as far as I can tell.
PF rules prohibit discussion of personal research. It's personal research unless and until you get it published in a peer-reviewed journal.

AnssiH said:
If you feel otherwise, I can remove the link and replace it with the same text that is found behind the link
That wouldn't change any of the above.

AnssiH said:
the text behind the link actually discusses the direct consequences of completely established models of modern physics (such as the standard view of refraction in transparent materials, or the behavior of polarization filters).
Then you need to give references to the "completely established models" that make the claims you are making.
 
