Is the wave function real or abstract statistics?

In summary: They use logic and mathematics to show that the wave function is in one-to-one correspondence with its "elements of reality."
  • #71
bhobba,

You said:

Aside from the Born rule a state tells us nothing at all. States are not probable - they are used to predict probabilities, but are themselves not probable.

Tell me, how can these states predict probabilities that are not an underlying reality?

These states have to describe the reality of the quantum system in order to predict its probable states. They tell you how the system behaves because they describe the underlying reality of the system. How can you say these states predict probabilities of the system if they don't describe an underlying reality of the system?

In order to predict probabilities the wave function has to contain all measurable states of the system. Guess what? It does.
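To make the point about states and the Born rule concrete, here is a minimal sketch (the qubit state here is my own toy choice, not from any post above): the state vector itself is not a probability, but the Born rule turns it into outcome probabilities.

```python
import numpy as np

# A made-up qubit state |psi> = (|0> + i|1>)/sqrt(2), normalized.
# The state itself is not a probability distribution.
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# Born rule: P(k) = |<k|psi>|^2 in the measurement basis.
probs = np.abs(psi) ** 2

print(probs)        # ~[0.5, 0.5]
print(probs.sum())  # ~1.0, summed over all measurable outcomes
```

Whether those probabilities refer to single systems or only to ensembles is exactly what this thread is arguing about; the calculation itself is the same either way.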
 
  • #72
matrixrising said:
cthugha,
When you look at Lundeen, he showed a one-to-one correspondence between the spatial wave function of a SINGLE PHOTON and the spatial wave function of an ensemble of photons. The spatial wave function of a single photon was reconstructed over an ensemble of photons. It just doesn't get much clearer than that.

A single particle does not have a wave function. A single photon state does. This is what Lundeen analyzed. It is that simple; this is basic QM. I have already given you a paper explicitly stating that you cannot meaningfully discuss these properties for single particles in post #31. There is a recent Nature Photonics paper by Boyd and his group (Nature Photonics 7, 316–321 (2013)) applying Lundeen's technique to measure the polarization state of light directly. They also make clear that it does not work for a single particle:

"For a single photon, the weak measurement has very large uncertainty, so the above procedure must be repeated on many photons, or equivalently on a classical light beam, to establish the weak value with a high degree of confidence."

matrixrising said:
Like I said, Ballentine shows zero evidence that the wave function isn't real. All I see is a bunch of conjecture that's born out of the desire to remove the mysteries of QM whatever that means. It's just shut up and calculate.

Of course it does not show evidence that the wave function is not real. There is no evidence that the wave function isn't real. There is also no evidence that it is real. This is why these are interpretations. None of them has better evidence. None is more valid than the others. You can interpret the wave function as realistic, but you do not have to.

matrixrising said:
"For the notion that probabilistic theories must be about ensembles implicitly assumes that probability is about ignorance. (The “hidden variables” are whatever it is that we are ignorant of.) But in a non-determinstic world probability has nothing to do with incomplete knowledge, and ought not to require an ensemble of systems for its interpretation".

Ought not... well, one can hold this opinion, yes. One does not have to. Personally, I like Mermin's Ithaca interpretation, although it is not fully consistent.

matrixrising said:
A minimalist interpretation of QM is another form of shut up and calculate which is lacking. Where's the evidence that the quantum system isn't in multiple "real" states prior to measurement? This is what gives rise to the quantum properties that we see in experiment after experiment.

There is no evidence. Nobody in this thread claimed there is. You just claimed the wave function has to be interpreted as real. Everybody else says, it can, but you do not have to.

matrixrising said:
In fact, how can we do calculations on probable states if these probable states are not real when it comes to quantum computing?

Yes.
matrixrising said:
The Ensemble Interpretation states that superpositions are nothing but subensembles of a larger statistical ensemble. That being the case, the state vector would not apply to individual cat experiments, but only to the statistics of many similar prepared cat experiments. Proponents of this interpretation state that this makes the Schrödinger's cat paradox a trivial non issue. However, the application of state vectors to individual systems, rather than ensembles, has explanatory benefits, in areas like single-particle twin-slit experiments and quantum computing. As an avowedly minimalist approach, the Ensemble Interpretation does not offer any specific alternative explanation for these phenomena.

Ehm, I do not see the point. Of course there can be explanatory benefits for special experiments in certain interpretations. This is why there are so many of them. In the ensemble interpretation, quantum computers just work because qm says so. Yes, I agree that this might not be great from a didactics point of view. The disadvantage of minimal interpretations for some people is that it says that things work because the math says so. The advantage of minimal interpretations for some people is that it says that things work because the math says so.

matrixrising said:
The single particle has to be in two real states in order for a calculation to occur. The single particle can be in two real states or a qubit prior to measurement.

No.

matrixrising said:
This is from a paper titled A single-atom electron spin qubit in silicon.

I do not get it. Electron spins make good qubits. I worked on some of them myself. What is this supposed to tell us?

matrixrising said:
Of course all of these states are real and that's the point. All of these states are coherent and real prior to measurement and this is why we can show single particles in a state of superposition. So the underlying reality of the system(particle) is real. This underlying reality is the wave function in a pure coherent state where pure states simultaneously exist prior to decoherence.

Maybe. Maybe not. Can you show that it must be this way? Do you have more than one account by the way? The number of people repeating almost the same sentences increased significantly during the last week.

matrixrising said:
Of course he was wrong when he said:

"Like the old saying "A watched pot never boils", we have been led to the conclusion that a continuously observed system never changes its state! This conclusion is, of course, false."

Wrong.

Yes, of course he was. It is the same common fallacy: thinking one interpretation is better than the others. That idea is always doomed.

matrixrising said:
One last thing. there was a poll taken by Anton Zeilinger at the Quantum Physics and Nature of Reality conference in Austria in 2011. Here's what they thought about ensemble interpretations.

Right interpretation of state vectors:

27%: epistemic/informational
24%: ontic
33%: a mix of epistemic and ontic
3%: purely statistical as in ensemble interpretation
12%: other


As you see, the ensemble interpretation got 3%.

Yes, indeed physicists interested in interpretations are usually seeking something fundamental from an interpretation. Maybe a good ontology like Bohmians. Or something else. These naturally find ensemble interpretations lacking. That is fine. The ensemble interpretation is a minimalist interpretation preferred usually by working physicists who want to stay clear of exactly the kind of discussion we have here.

matrixrising said:
So again, the ensemble interpretation flies in the face of experiment after experiment. It's a way of saying quantum weirdness can't be objectively real, but the truth is, it's an underlying reality for the quantum system, not the classical experience.

There is no experimental evidence against (or for) the ensemble interpretation - or any other standard interpretation.
 
  • #73
matrixrising said:
Tell me, how can these states predict probabilities that are not an underlying reality?

Well you keep using 'underlying reality'.

How about explaining what you mean by 'underlying reality'.

Generally, in QM 'reality' means something has a definite value with certainty. If that's what you mean, Kochen-Specker says that's impossible. How is that possible? Nature is just like that. Einstein didn't like it but was forced to accept it; there's no escaping it, because you can't argue with a theorem.

But that's perhaps not what you mean by reality.

So let's see your definition of it. Einstein gave a definition in his EPR paper but was proven wrong.

Thanks
Bill
 
  • #74
matrixrising said:
Like I said, Ballentine shows zero evidence that the wave function isn't real.

Exactly what books by Ballentine have you read? I have carefully studied his standard text, including chapter 9, where he explains why it can't be real. Precisely what part of his argument is wrong? IMHO it has a few outs, but let's hear your reasons.

And if you have in fact studied his book, you should be able to state the proper form of the Born rule (it's axiom 2 in Ballentine's book). Care to tell us what it is? And if you can't, exactly why are you arguing about something you do not know the details of?

Thanks
Bill
 
  • #75
bhobba said:
Well you keep using 'underlying reality'.

How about explaining what you mean by 'underlying reality'.

His naive interpretation of the experiments and papers (Lundeen, etc.) indicates that he is making the common novice mistake of identifying the wave function with real individual particles (the classical ones).
He then conflates this with the trivially obvious notion that the probabilities obtained in QM refer to the "underlying reality" (they would be of no use otherwise), which is just saying QM works in different words.
Threads like this might seem not very useful judging by what the specific OP takes out of them as he replies, but they are useful for people approaching QM for the first time, in terms of not falling for these mistakes.
 
  • #76
cthugha,

Again, the reason why ensemble interpretations got 3% at the conference is that it's an interpretation that says just look away and do the math, and, by the way, QM can't say this or that even though it does. It's like the example given of the unstable nuclei. He made a great point when he said:

However, you implicitly make statements that quantum mechanics can't achieve certain things – even though it can.

Lundeen directly measured the wave function of a single particle. There was a one-to-one correspondence between the spatial wave function of a single photon and that of an ensemble of photons.

There's a difference between direct measurement of the wave function of a single photon and knowing the position and momentum of a single photon. Let Lundeen explain:

The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule. At present, the wavefunction is determined through tomographic methods, which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement, so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method. We show that the concept is universal, being applicable to other degrees of freedom of the photon, such as polarization or frequency, and to other quantum systems—for example, electron spins, SQUIDs (superconducting quantum interference devices) and trapped ions. Consequently, this method gives the wavefunction a straightforward and general definition in terms of a specific set of experimental operations. We expect it to expand the range of quantum systems that can be characterized and to initiate new avenues in fundamental quantum theory.

This is a letter from Lundeen to Nature about the experiment. Here's the kicker:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


Again, there's a difference between the direct measurement of a single photon's wave function and knowing the position and momentum of a single photon. The first, Lundeen achieved; the second can't be known. He ends the letter with this:

In our direct measurement method, the wavefunction manifests itself as shifts of the pointer of the measurement apparatus. In this sense, the method provides a simple and unambiguous operational definition of the quantum state: it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable. We anticipate that the simplicity of the method will make feasible the measurement of quantum systems (for example, atomic orbitals, molecular wavefunctions, ultrafast quantum wavepackets) that previously could not be fully characterized. The method can also be viewed as a transcription of the quantum state of the system to that of the pointer, a potentially useful protocol for quantum information.

Again, the direct measurement of a single photon's wave function corresponds to an ensemble of photons with the same spatial wave function.

Again, ensemble interpretations basically say shut up and calculate, and if experiments say X, it's meaningless because it's just the math. It just doesn't make much sense, as with Ballentine and the quantum Zeno effect.

bhobba,

When I say underlying reality, I'm talking about the wave function. The wave function is the underlying reality of the quantum system. It contains all the measurable information about the system. It's the pool table analogy: the pool table contains all the measurable states that the pool balls can be in. The pool balls themselves don't have to be in a measured state (8 ball in the corner pocket) in order for the pool table to contain all measurable states that the pool balls can be in.

These are real states that we can perform quantum calculations on. So the pool table contains all measurable information about the system (pool balls). In this case, the pool balls would represent the measurable information of the pool table in a decohered state.
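For reference, the relation behind Lundeen's claim can be sketched in standard weak-value notation (this is my summary of the usual presentation of the scheme, not a quote from the paper): weakly measure the projector π_x = |x⟩⟨x|, then post-select on momentum p = 0. The resulting weak value is proportional to the wavefunction itself:

```latex
\langle \pi_x \rangle_W
  = \frac{\langle p{=}0 \mid x \rangle \, \langle x \mid \psi \rangle}
         {\langle p{=}0 \mid \psi \rangle}
  = k \, \psi(x),
\qquad
k = \frac{\langle p{=}0 \mid x \rangle}{\tilde{\psi}(p{=}0)}
\quad \text{(a constant, independent of } x\text{)} .
```

Since ⟨p=0|x⟩ is a constant at p = 0, scanning x traces out Re ψ(x) and Im ψ(x) in the pointer shifts. Note, though, that ⟨π_x⟩_W is an average over weak readings, which is exactly where the two sides of this debate diverge.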
 
  • #77
matrixrising said:
Again, the reason why ensemble interpretations got 3% at the conference is that it's an interpretation that says just look away and do the math, and, by the way, QM can't say this or that even though it does.

Ehm, does it? QM does not say anything testable about single realizations. Also, it is quite misleading to distinguish between ensemble and epistemic interpretations. The border is not that well defined. Actually the informational/subjectivist interpretations are even further away from your point of view than the ensemble one is and are led by the opinion that there is no reality without measurement.

matrixrising said:
Lundeen directly measured the wave function of a single particle. There was a one-to-one correspondence between the spatial wave function of a single photon and that of an ensemble of photons.

I already explained why that is wrong and what "directly" means in this case. What is so difficult about that?

matrixrising said:
There's a difference between direct measurement of the wave function of a single photon and knowing the position and momentum of a single photon. Let Lundeen explain:
[...] This is a letter from Lundeen to Nature about the experiment. Here's the kicker:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Yes, I know that paper well and the wording Lundeen used. The paper got criticized quite heavily at conferences for using that simplified wording (admittedly, it helps sell the paper, of course). The correct formulation would have been that they WEAKLY measured the wave function, which has a very different meaning. As I already cited: "For a single photon, the weak measurement has very large uncertainty, so the above procedure must be repeated on many photons, or equivalently on a classical light beam, to establish the weak value with a high degree of confidence." A single weak measurement does not give any useful information about the single particle; only the ensemble average does. This is necessarily so: if it gave more information, it would be a strong measurement. The averaged value you get comes from an ensemble measurement and is not obtained on a single particle. In fact, you get different weak values when measuring several different particles. If they had measured the identical true wave function on each single particle, they would have got exactly the same result every time.

matrixrising said:
Again, the direct measurement of a single photon's wave function corresponds to an ensemble of photons with the same spatial wave function.

There are interpretations which allow you to interpret a weak measurement as a measurement on a single photon, yes. But in these interpretations that result on its own is meaningless. The wave function values shown do not come from a single measurement on a single particle. They come from many measurements on single particles and averaging over the weak values. One can claim that the single measurement of the wave function should indeed be interpreted as a measurement of the wave function of a single particle. However, this automatically means that a result of, say, 5 +/- 739 is a reasonable result. Yes, you can consider this a measurement, but is the result on its own meaningful? In my opinion, it is misleading to say that the result has a correspondence with the wave function, because this single result does not tell you anything.

Seeing that is difficult, though, as weak measurements are a complicated topic. It helps to follow the topic from the initial paper on weak measurements (Phys. Rev. Lett. 60, 1351–1354 (1988), http://prl.aps.org/abstract/PRL/v60/i14/p1351_1) to the more modern viewpoint of quantum ergodicity (http://arxiv.org/abs/1306.2993).

In a nutshell, if you consider a measurement result of 100 as a good result when measuring the spin of a spin 1/2 particle, Lundeen has performed a measurement on a single particle. But that is pretty trivial and does not mean much. It is like measuring the opinion of all Americans by just asking Steve Miller from Arkansas and nobody else. Yes, you can consider that a measurement. No, it is not meaningful on its own. Therefore, calling a single weak measurement a measurement on a single particle is just semantics to me.
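As a numerical illustration of how a single weak value can land far outside the eigenvalue range (the states and angle here are my own toy choices, not taken from the papers under discussion):

```python
import numpy as np

# Pauli z; its eigenvalues are +1 and -1, so any "real" spin-1/2
# reading should lie in [-1, 1].
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

# Pre-selected state |psi> (spin along +x) and a post-selected state
# |phi> chosen nearly orthogonal to it (illustrative choice).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
theta = 0.78  # close to pi/4, where <phi|psi> -> 0
phi = np.array([np.cos(theta), -np.sin(theta)], dtype=complex)

# Aharonov-Albert-Vaidman weak value: A_w = <phi|A|psi> / <phi|psi>
weak_value = (phi.conj() @ sigma_z @ psi) / (phi.conj() @ psi)

print(weak_value.real)  # far outside the eigenvalue range [-1, 1]
```

The nearer the post-selected state is to being orthogonal to the pre-selected one, the larger the weak value, which is exactly the "spin 100" situation mentioned above.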
 
  • #78
cthugha,

I think you're misreading Lundeen because you're looking at it through the eyes of an ensemble interpretation. This is why I don't like ensemble interpretations: results are never results, even though they're results. It seems the goal of ensemble interpretations, or the small percentage that follow them, is to label every result meaningless, which makes ensemble interpretations meaningless.

This is what Lundeen said:

The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to Re Ψ(x) and Im Ψ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

That's a pretty straightforward and simple measurement of a single photon's wave function.

What happened?

Lundeen first performed a weak measurement and then a strong measurement. By reducing the disturbance, performing a weak measurement first and then a strong measurement, he measured the wave function of a single photon.

At each x (wave function of the individual photon), the observed position and momentum shifts of the measurement pointer were proportional to the real and imaginary parts of the wave function.

It's like the wave function is the blueprint to build a Lexus. It contains all the measurable information you need to build a Lexus. Each individual state (car door, trunk, hood) has to be real and proportional to a Lexus in order to build a Lexus.

So in this experiment, every direct measurement of a single photon's wave function is proportional to a Lexus (wave function). You don't get the car door of a 1983 Buick or the trunk of a Cadillac. The wave function of a single photon is proportional to the real and imaginary parts of the wave function.

This is why Lundeen said:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


So a single photon's wave function is proportional to an ensemble of photons, as it should be.
 
  • #79
matrixrising said:
I think you're misreading Lundeen because you're looking at it through the eyes of an ensemble interpretation. This is why I don't like ensemble interpretations: results are never results, even though they're results. It seems the goal of ensemble interpretations, or the small percentage that follow them, is to label every result meaningless, which makes ensemble interpretations meaningless.

No, I am not misreading it. I know that paper pretty well.

matrixrising said:
This is what Lundeen said:
The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to Re Ψ(x) and Im Ψ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

That's a pretty straightforward and simple measurement of a single photon's wave function.

You still need an average result. This is more than just a minor nuisance. See my comment below.

matrixrising said:
Lundeen first performed a weak measurement and then a strong measurement. By reducing the disturbance, performing a weak measurement first and then a strong measurement, he measured the wave function of a single photon.

At each x (wave function of the individual photon), the observed position and momentum shifts of the measurement pointer were proportional to the real and imaginary parts of the wave function.

This is of course misleading at best. It is not the observed shift, but the observed AVERAGE shift which is proportional to the wave function. This is a huge difference.

In particular it is even against your position. Results of a weak measurement do not follow what you call the underlying reality. If you measure the spin of a spin 1/2 particle, you can get a weak value of 100 which is known to be not possible. If you measure your position weakly, the weak measurement can tell you that you are on Lexaar. If you measure weakly what the pitcher on the baseball field will do, you may find that he skates around the goalie, raises his hockey stick and scores. If you perform a weak measurement of who will win the superbowl this year, you may get the Giants or Tampa Bay as the result.

One needs to follow the whole literature about weak wave function measurements to understand what is going on. The Nature paper has limited space and necessarily explains little, which is a standard problem when trying to publish in Nature or Science: you need to write a condensed manuscript. Additional explanations of the meaning of their wave function and the meaning of "direct" have been given by Lundeen and Bamber in PRL 108, 070402 (2012), explaining that they have shown a "general operational definition of the wave function based on a method for its direct measurement: 'it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable.' By 'direct' it is meant that a value proportional to the wave function appears straight on the measurement apparatus itself without further complicated calculations or fitting," and most importantly:
"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."
and also: "Surprisingly, the weak value can be outside the range of the eigenvalues of A and can even be complex".

It has also been investigated in Phys. Rev. A 84, 052107 (2011), which gives a more rigorous mathematical treatment, showing "that the weak values can be exhaustively explained within the quantum theory of sequential measurements" and also that one cannot measure arbitrary states using the technique.

The great thing about Lundeen's paper is the directness of the measurement: the fact that you do not have to run any tomography program afterwards. If you ever spend some time waiting for quantum state tomography to do its job, you will know what I mean. It can take weeks for large chunks of data.
 
  • #80
cthugha,

This is the point that was made about ensemble interpretations. There's no evidence, even when there is evidence. Everything is just meaningless, even though it has been shown to have meaning. You said:

"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."

The key here:

PROVIDES LITTLE INFORMATION, NOT MEANINGLESS INFORMATION.

It's like a small signal at a collider. When the measurement is repeated, it can give you a better understanding of the signal. What Lundeen is saying is that you can't look at the weak measurement of a single system in isolation in order to see the big picture. It doesn't mean that the direct measurement of the photon's wave function is meaningless. It's just like my example of the Lexus. You need the Lexus doors to make the car. The doors in isolation will not give you the car, but that doesn't make them meaningless. I quote Lundeen's letter to Nature one more time:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


How do you leap to the conclusion that this is meaningless?

Results of weak measurement do follow what I call underlying reality.

This can be explained by the Aharonov–Albert–Vaidman effect. This is from Wiki:

The weak value of the observable becomes large when the post-selected state, |φ₂⟩, approaches being orthogonal to the pre-selected state.

Here's more from a PDF by Lev Vaidman on weak values and weak measurement:

The real part of the weak value is the outcome of the standard measurement procedure in the limit of weak coupling. Unusually large outcomes, such as spin 100 for a spin-1/2 particle [2], appear from a peculiar interference effect (called the Aharonov–Albert–Vaidman (AAV) effect) according to which the superposition of the pointer wave functions shifted by small amounts yields a similar wave function shifted by a large amount. The coefficients of the superposition are universal for a large class of functions for which the Fourier transform is well localized around zero.

In the usual cases, the shift is much smaller than the spread Δ of the initial state of the measurement pointer. But for some variables, e.g., averages of variables of a large ensemble, for very rare events in which all members of the ensemble happen to be in the appropriate post-selected states, the shift is of the order of, and might be even larger than, the spread of the quantum state of the pointer [5]. In such cases the weak value is obtained in a single measurement which is not really "weak".

There have been numerous experiments showing weak values [7–11], mostly of photon polarization, and the AAV effect has been well confirmed. Unusual weak values have been used for the explanation of peculiar quantum phenomena, e.g., the superluminal velocity of tunneling particles [12,13].

When the AAV effect was discovered, it was suggested that the type of amplification effect which takes place for unusually large weak values might lead to practical applications. Twenty years later, the first useful application was made: Hosten and Kwiat [14] applied the weak measurement procedure to measure the spin Hall effect in light. This effect is so tiny that it cannot be observed without the amplification.

So again, saying things are meaningless doesn't mean they're meaningless.
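For reference, the AAV amplification quoted above can be written out explicitly with the standard weak-value formula (the states and angle below are illustrative choices, not taken from the cited experiments):

```latex
|\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle + |{\downarrow}\rangle\bigr),
\qquad
|\phi\rangle = \cos\theta \, |{\uparrow}\rangle - \sin\theta \, |{\downarrow}\rangle,

(\sigma_z)_W
  = \frac{\langle \phi | \sigma_z | \psi \rangle}{\langle \phi | \psi \rangle}
  = \frac{\cos\theta + \sin\theta}{\cos\theta - \sin\theta} .
```

As θ approaches 45°, the post-selected state becomes orthogonal to the pre-selected one and the denominator vanishes; (σ_z)_W = 100 is reached already at tan θ = 99/101, even though the eigenvalues of σ_z are only ±1.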
 
  • #81
Are you kidding me?

I gave you the reference to the original and complete Vaidman paper earlier and you cite from a summary about it?

So as you refuse to read the original paper, let me state explicitly what Vaidman himself states about the very measurement you are talking about in the real paper:

"In the opposite limit, where Δπ is much bigger than all the a_i, the final probability distribution will again be close to a Gaussian with the spread Δπ. The center of the Gaussian will be at the mean value of A: ⟨A⟩ = Σ_i |α_i|² a_i. One measurement like this will give no information because Δπ ≫ ⟨A⟩; but we can make this same measurement on each member of an ensemble of N particles prepared in the same state, and that will reduce the relevant uncertainty by the factor 1/√N, while the mean value of the average will remain ⟨A⟩. By enlarging the number N of particles in the ensemble, we can make the measurement of ⟨A⟩ with any desired precision."

Let me emphasize that he literally says "no information", not "little information". Essentially, this is how weak measurements work. If and only if the single measurement is so weak that it does not give you any information on its own, it can be performed without disturbing the system.

Your statements "It's like a small signal at a Collider" and "Results of weak measurement do follow what I call underlying reality" describe exactly what is not the case. A signal at a collider is still recorded via a strong measurement and thus governed by the eigenvalues of the measurement operator. A single weak measurement can give pretty much any result, mostly nonsensical ones like a spin of 100 for a spin-1/2 particle. It is a feature of weak measurements that the single results explicitly do NOT follow what you call underlying reality. A spin value of 100 is not possible for a spin-1/2 particle. Still, it is a possible (and not even rare) result of a weak measurement. If you think that this is the same as a collider signal, you really need to understand weak measurements first.
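Vaidman's averaging argument can be illustrated numerically with a toy model (this is my own sketch, not the actual optics): treat each weak reading as the true mean ⟨A⟩ plus Gaussian pointer noise of spread Δ much larger than ⟨A⟩.

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean = 0.5   # the ensemble expectation value <A> in this toy model
spread = 50.0     # pointer spread: much larger than <A>, as in a weak measurement
N = 1_000_000     # number of identically prepared systems

# Each single weak reading = <A> + large Gaussian pointer noise,
# so one reading on its own carries essentially no information.
readings = true_mean + spread * rng.standard_normal(N)

single = readings[0]       # one weak reading: dominated by noise
average = readings.mean()  # ensemble average: uncertainty ~ spread / sqrt(N)

print(single)   # typically tens of units away from <A>
print(average)  # close to 0.5
```

The single reading is worthless, while the average of a million readings pins down ⟨A⟩ to a few hundredths, exactly the 1/√N behavior Vaidman describes.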
 
  • #82
cthugha,

Let me quote it again.

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


A weak measurement followed by a strong measurement gives you enough information to directly measure the wave function of a single photon.
 
  • #83
It is enough to give you a single weak measurement value which, as Vaidman correctly states, contains no information.

By the way: is it the case that you do not have access to those papers? Usually I assume that posting links to papers is enough. However, it does not help the discussion if I just post them and they go unread.
 
  • #84
cthugha,

Again, I quote you from earlier:

"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."

Again, in Lundeen, a weak measurement followed by a strong measurement allowed him to directly measure the wave function of a photon.

You seem to be avoiding Lundeen which says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.
 
  • #85
matrixrising said:
You seem to be avoiding Lundeen which says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Where am I avoiding him? He performs a single weak measurement of the wave function. You seem to have the impression that a single measurement of the wave function gives you the wave function. This is not the case. As Vaidman's statement above shows, it is even the case that a single weak measurement does not give you any information about the actual value of the quantity you just measured. I do not disagree with Lundeen. He performed a measurement of the wave function of the single particle. This single measurement just does not contain any information (well, the weak part). Only a huge number of repeated experiments does. Lundeen does a huge number of repeated experiments and gets the wave function.

If you disagree with that, please tell me, where exactly Vaidman is wrong, if possible using some peer reviewed evidence.
 
  • #86
Wrong again,

I never said a single weak measurement gives you the wave function. I said a weak measurement followed by a strong measurement reduces the disturbance and gives you the direct measurement of a single particle's wave function. From Lundeen.

How the experiment works - the apparatus for measuring the wavefunction:

1. Produce a collection of photons possessing identical spatial wavefunctions by passing photons through an optical fiber.
2. Weakly measure the transverse position by inducing a small polarization rotation at a particular position, x.
3. Strongly measure the transverse momentum by using a Fourier Transform lens and selecting only those photons with momentum p=0.
4. Measure the average polarization rotation of these selected photons. This is proportional to the real part of the wavefunction at x.
5. Measure the average rotation of the polarization in the circular basis. (i.e. difference in the number of photons that have left-hand circular polarization and right-hand circular polarization). This is proportional to the imaginary part of the wavefunction at x.
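The proportionality these steps rely on can be checked in a few lines. The sketch below (my own discretized toy model; the state, grid size and normalization are invented for illustration) computes the weak value of the projector |x><x| post-selected on p = 0, which is <p=0|x><x|psi>/<p=0|psi>; since <p=0|x> is the same constant for every x, the result is proportional to psi(x), real and imaginary parts included.

```python
import numpy as np

# Discretized 1-D toy model: weak value of pi_x = |x><x| with post-selection
# on p = 0. The particular state psi below is an arbitrary example.
n = 64
x = np.arange(n)
psi = np.exp(-(x - 32) ** 2 / 50.0) * np.exp(1j * 0.3 * x)  # some complex state
psi /= np.linalg.norm(psi)

p0 = np.full(n, 1 / np.sqrt(n), dtype=complex)  # |p=0>: flat in the x basis

# Weak value (pi_x)_w = <p0|x><x|psi> / <p0|psi> at every x:
weak_values = np.conj(p0) * psi / np.vdot(p0, psi)

# The weak values equal psi up to one overall complex constant:
ratio = weak_values / psi
print(np.allclose(ratio, ratio[0]))  # proportional at every x
```

The real part of each weak value shows up as the pointer's position shift and the imaginary part as its momentum shift, which is why steps 4 and 5 read off the real and imaginary parts of the wavefunction separately.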


You seem to be debating something that nobody has claimed. Again:

Weakly measuring the projector |x><x| followed by a strong measurement with result p=0 results in a weak value proportional to the wavefunction.
 
  • #87
matrixrising said:
Wrong again,

I never said a single weak measurement gives you the wave function. I said a weak measurement followed by a strong measurement reduces the disturbance and gives you the direct measurement of a single particle's wave function.

Ehm, it reduces the disturbance compared to two strong measurements.

And yes, it is a direct weak measurement and it gives you one weak measurement result. I do not disagree with that. My question is simple: Do you think that this SINGLE weak measurement gives you actually some information about the wave function? If so, please tell me where Vaidman is wrong.

If you think it does not, then yes, we agree. They do a single measurement, but the single weak result does not correspond to any physical quantity. Lundeen himself is careful enough to acknowledge this point. He says "The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x.". The average is. The single result is not.
 
Last edited:
  • #88
Neither Vaidman nor Lundeen said a single weak measurement gives you a direct measurement of a single particle's wave function. Like I said earlier, I'm not sure what you're debating.

Lundeen said a weak measurement followed by a strong measurement reduces disturbance, which results in a weak value proportional to the wave function. One more time:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


This is the direct measurement of a single particle's wave function.
 
  • #89
matrixrising said:
Neither Vaidman nor Lundeen said a single weak measurement gives you a direct measurement of a single particle's wave function. Like I said earlier, I'm not sure what you're debating.

I am debating your claim
"There was a one to one correspondence with the spatial wave function of a single photon and an ensemble of photons." and your claim that this proves something about underlying realities which go beyond statistical information.

In order to show that, Lundeen would have needed to actually get the full wave function of single photons by measurements on a single photon only. He never even intended to do that. He just wants to do a direct (nontomographic) measurement which actually gives you the wave function in the ensemble average over many weak measurements, which is indeed an important achievement.

matrixrising said:
This is the direct measurement of a single particle's wave function.

But how is this connected to your above claim? Where is the correspondence with an underlying reality - whatever that may be? The very point of weak measurements is that this correspondence does not exist on the single measurement level. The wave functions measured are inherently ensemble averaged quantities. This experiment does not contain any well hidden information about every single photon. Lundeen explicitly acknowledges this at the end of the paper when he says: " In this sense, the method provides a simple and unambiguous operational definition of the quantum state: it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable."
To Lundeen the wave function is related to the average, not to the single result.
 
  • #90
Of course it supports what I'm saying. You said:

To Lundeen the wave function is related to the average, not to the single result.

Of course it's related to a single result. How can you have an average if the single results are not proportional to the average?

If I give you the average PPG for Lebron James, his individual results will be proportional to the average. This is why Lundeen says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wave-
function of the single particle.


Bohm 2 gave a good example:

To understand what weak measurement is, the following analogy from everyday life is useful. Assume that you want to measure the weight of a sheet of paper. But the problem is that your measurement apparatus (weighing scale) is not precise enough to measure the weight of such a light object as a sheet of paper. In this sense, the measurement of a single sheet of paper is - weak.

Now you do a trick. Instead of weighing one sheet of paper, you weigh a thousand of them, which is heavy enough to see the result of weighing. Then you divide this result by 1000, and get a number which you call - the weak value. Clearly, this "weak value" is nothing but the average weight of your set of a thousand sheets of paper.

But still, you want to know the weight of a SINGLE sheet of paper. So does that average value help? Well, it depends:

1) If all sheets of paper have the same weight, then the average weight is equal to the weight of a single sheet, in which case you have also measured the true weight of the sheet.

2) If the sheets have only approximately equal weights, then you can say that you have at least approximately measured the weight of a single sheet.

3) But if the weights of different sheets are not even approximately equal, then you have not done anything - you still don't have a clue what the weight of a single sheet is.

All of the sheets (particle wave functions) were identical. The weak value of a single photon corresponds to the average. In other words, I can look at Lebron James' average PPG and then go back and compare that average to individual games throughout the year, and they should correspond to one another.
 
  • #91
matrixrising said:
To Lundeen the wave function is related to the average, not to the single result.

Of course it's related to a single result. How can you have an average if the single results are not proportional to the average?

Do you know what a variance is? A huge variance compared to the mean does exactly this. This is basic first semester stuff.

matrixrising said:
If I give you the average PPG for Lebron James, his individual results will be proportional to the average.

This is a strong measurement. Not a weak one. In a weak measurement you would (to construct the analogy) also get results like -27 points in a game which clearly cannot have any sensible meaning.

matrixrising said:
All of the sheets (particle wave functions) were identical. The weak value of a single photon corresponds to the average. In other words, I can look at Lebron James' average PPG and then go back and compare that average to individual games throughout the year, and they should correspond to one another.

No! The important thing about weak values is that this is exactly not the case. That is a common fallacy. You consider a measurement with small variance, while weak measurements have huge variance. The thing you look for is called an element of reality. Vaidman himself said: "In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

Consider Vaidman's case of a spin 1/2 particle (which can have spin values of +1/2 and -1/2) which can yield a weak measurement spin value of 100 in a single measurement. How is that related to the average?

edit: To clarify further, let me cite Vaidman again:
"The weak value is obtained from statistical analysis of the readings of the measuring devices of the measurements on an ensemble of identical quantum systems. But it is different conceptually from the standard definition of expectation value which is a mathematical concept defined from the statistical analysis of the ideal measurements of the variable A all of which yield one of the eigenvalues ai."
 
Last edited:
  • #92
Again, apples and oranges. You quoted:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

This isn't Lundeen from 2011. In this case the value of the shift is determined by a stream of photons with identical wave functions. The weak values in the case of Lundeen correspond to the average. Here's more:

At the centre of the direct measurement method is a reduction of the disturbance induced by the first measurement. Consider the measurement of an arbitrary variable A. In general, measurement can be seen as the coupling between an apparatus and a physical system that results in the translation of a pointer. The pointer position indicates the result of a measurement. In a technique known as 'weak measurement', the coupling strength is reduced and this correspondingly reduces the disturbance created by the measurement [9-18]. This strategy also compromises measurement precision, but this can be regained by averaging. The average of the weak measurement is simply the expectation value <Ψ|A|Ψ>, indicated by an average position shift of the pointer proportional to this amount.

This is the ball game. A little more:

The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Finally the kicker:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.


Again, measurements of a single photon correspond to the average. This way you get a direct measurement of a single particle's wave function.

You reduce the disturbance and you get an average of the weak value that's proportional to the wave function of a single particle.
 
  • #93
matrixrising said:
Again, apples and oranges. You quoted:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

This isn't Lundeen from 2011. In this case the value of the shift is determined by a stream of photons with identical wave functions.

Apples and oranges? Weak values and weak values. The physics of weak values does not change overnight.

matrixrising said:
The weak values in the case of Lundeen correspond to the average.

At the centre of the direct measurement method is a reduction of the disturbance induced by the first measurement. Consider the measurement of an arbitrary variable A. In general, measurement can be seen as the coupling between an apparatus and a physical system that results in the translation of a pointer. The pointer position indicates the result of a measurement. In a technique known as 'weak measurement', the coupling strength is reduced and this correspondingly reduces the disturbance created by the measurement [9-18]. This strategy also compromises measurement precision, but this can be regained by averaging. The average of the weak measurement is simply the expectation value <Ψ|A|Ψ>, indicated by an average position shift of the pointer proportional to this amount.

Says who? Lundeen does not. The average weak values correspond trivially to the average. The single weak value of a single measurement clearly does not correspond to any element of reality.

matrixrising said:
This is the ball game. A little more:
The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Still, all about averaged values. Nothing about single weak measurement results.

matrixrising said:
Finally the kicker:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.

Wait, do you have the impression that a single photon wave function is the wave function of a single realization? It is the wave function of identically prepared states containing one photon each.

matrixrising said:
Again, measurements of a single photon correspond to the average. This way you get a direct measurement of a single particle's wave function.

Ehm, as the single photon wave function is an ensemble average, this is trivial, no? The single weak value measured on a single realization, however, does not correspond to the average. It is usually even far off and far away from reasonable values.

matrixrising said:
You reduce the disturbance and you get an average of the weak value that's proportional to the wave function of a single particle.

Yes, the average. Sure.

For the last time: tell me where Vaidman is wrong. His results are not just valid on Mondays or for years up to 2010. They are pretty general. It is pretty well known that identifying single weak measurement results, without averaging, with elements of reality is a fallacy.

We can discuss further if you have a valid objection to Vaidman's position, but I will not waste any further time explaining the basics if you do not even have the intention to understand them.
 
  • #94
What? You said:

For the last time: tell me where Vaidman is wrong. His results are not just valid on Mondays or for years up to 2010. They are pretty general. It is pretty well known that identifying single weak measurement results, without averaging, with elements of reality is a fallacy.

Wrong about what? What are you talking about?

What does Vaidman have to do with the experiment carried out by Lundeen?

Here's more from Lundeen:

The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition [1,2]. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule [3]. At present, the wavefunction is determined through tomographic methods [4-8], which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement [9-18], so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method.


Again:

We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method.


You seem to dodge Lundeen like Superman dodges bullets. It's almost like you just stick your head in the sand and deny, deny, deny regardless of the facts. It's like the guy said at the conference where ensemble interpretations were accepted by 3% of the attendees.

However, you implicitly make statements that quantum mechanics can't achieve certain things – even though it can.

How many times does Lundeen have to say he measured the wave function of a single particle? Lundeen:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.


In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


You keep talking about everything but Lundeen. Show me where Lundeen said he didn't directly measure the wave function of a single photon.
 
  • #95
Lundeen may be wrong. He is contradicted by Lundeen. :)

http://arxiv.org/abs/1112.3575

"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

"In contrast, we introduce a method to measure ψ of an ensemble directly."
 
Last edited:
  • #96
atyy said:
"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

And obviously so:

bhobba said:
If we observe a state with an apparatus that gives 0 if it's not in that state and 1 if it is, then the quantum formalism tells us that, since states can be a superposition of those two outcomes, it may be in a state that sometimes gives 0 and sometimes 1. To determine it is in that state you need to carry out the observation a sufficiently large number of times for the null result to be below your level of confidence - you can never be sure - all you can do is simply make the chances of being wrong arbitrarily small, i.e. zero for all practical purposes.

There is no 'argument' about it - if QM is correct YOU CAN'T DO IT - it's a simple, almost trivial, result of its basic axioms.

Thanks
Bill
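The "arbitrarily small, never zero" arithmetic here is one line. As a sketch (the numbers q and alpha are made up): if a rival state would give the "wrong" outcome 0 with probability q per trial, then N consecutive 1-outcomes leave it a residual likelihood of (1 - q)^N, which you can push below any threshold but never to zero.

```python
import math

q = 0.1        # hypothetical probability of outcome 0 for the rival state
alpha = 1e-6   # residual chance of being wrong that we are willing to accept

# Smallest N of consecutive 1-outcomes with (1 - q)^N < alpha:
N = math.ceil(math.log(alpha) / math.log(1 - q))
print(N, (1 - q) ** N)  # N = 132 here; the residual is tiny but never exactly 0
```

Any finite run of measurements only shrinks the alternative's likelihood; it never rules it out, which is bhobba's "zero for all practical purposes".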
 
  • #97
matrixrising said:
Wrong about what? What are you talking about?

What does Vaidman have to do with the experiment carried out by Lundeen?

You claim that a single weak measurement result is meaningful in Lundeen's experiment. Vaidman says that a single weak measurement result is never meaningful. It is not hard to see the problem here.

matrixrising said:
You keep talking about everything but Lundeen. Show me where Lundeen said he didn't directly measure the wave function of a single photon.

Oh, I do not deny that Lundeen did that. I just say that your former claim shows that you do not know what these terms mean. There are 3 or 4 terms here that need to be treated with caution:

1) (not that much of a deal) measured: means weakly measured.
2) wave function: has been defined by Lundeen in his paper: "the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable"
3) direct: means 'not by tomography'/'not by max likelihood reconstruction'. It does not mean something like a single shot measurement.
4) single photon: Can be interpreted in two correct ways here: First, as an ensemble of identically prepared single particle realizations. In this case, measurement means getting the full and accurate description of the ensemble. Second, Lundeen performs a weak measurement of the wave function on every single particle realization. This single result is meaningless on its own (see Vaidman) as it does not correspond to an element of reality, but measurements with meaningless results are of course still measurements. Version 1 is the more probable definition (see atyy's last post).

Can you tell me where Lundeen writes something that supports your position? So far I have not seen anything.

edit: You also seem to be implying that the ensemble interpretation says that qm cannot be applied to a single system or particle. This is of course also wrong. It may well be applied to a single system or particle, and predict the probability that that single system will be found to have a given value of one of its properties on repeated measurements. See the wikipedia entry on the ensemble interpretation or any modern article on it for more details.
 
Last edited:
  • #98
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when the detector at one slit detects the particle, the entire wave function collapses; does the collapse travel at the speed of light or instantaneously between the detectors at both slits? What experiments akin to this have been done to determine whether it's instantaneous or travels at the speed of light? Or can no experiment be done to determine it, and why?
 
  • #99
kye said:
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when the detector at one slit detects the particle, the entire wave function collapses; does the collapse travel at the speed of light or instantaneously between the detectors at both slits? What experiments akin to this have been done to determine whether it's instantaneous or travels at the speed of light? Or can no experiment be done to determine it, and why?

These days collapse is often associated with decoherence - it explains APPARENT collapse. I believe it happens VERY VERY quickly but can't recall the exact time scales off the top of my head. I do know it has been measured, but they had to arrange for it to be slower than usual to do it.

Undoubtedly a google search would yield more concrete figures.

Thanks
Bill
 
  • #100
bhobba said:
These days collapse is often associated with decoherence - it explains APPARENT collapse. I believe it happens VERY VERY quickly but can't recall the exact time scales off the top of my head. I do know it has been measured, but they had to arrange for it to be slower than usual to do it.

Undoubtedly a google search would yield more concrete figures.

Thanks
Bill

Decoherence is not the same as collapse, because decoherence only produces a mixed state and the Born rule isn't invoked; collapse is additional to decoherence... (don't you agree?) I think true collapse is related to the term "dynamical collapse".
 
  • #101
kye said:
Decoherence is not the same as collapse, because decoherence only produces a mixed state and the Born rule isn't invoked; collapse is additional to decoherence... (don't you agree?) I think true collapse is related to the term "dynamical collapse".

Notice I said APPARENT collapse.

As von Neumann first proved, actual collapse is totally nebulous since it can be placed anywhere. This means that, without the constraint of apparent collapse, it's an unanswerable question.

Thanks
Bill
 
  • #102
bhobba said:
Notice I said APPARENT collapse.

As von Neumann first proved, actual collapse is totally nebulous since it can be placed anywhere. This means that, without the constraint of apparent collapse, it's an unanswerable question.

Thanks
Bill

Oh, so dynamical collapse can't be tested because when the particle is detected at one slit there is nothing at the other slit, and the state vector only manifests classically through classical instruments... so this means we can't test whether the wave function (should it be there) collapses instantaneously or within the limit of c, and there is no theorem to test it?
 
  • #103
kye said:
Oh, so dynamical collapse can't be tested because when the particle is detected at one slit there is nothing at the other slit, and the state vector only manifests classically through classical instruments... so this means we can't test whether the wave function (should it be there) collapses instantaneously or within the limit of c, and there is no theorem to test it?

You can't do it because you don't know where or when it occurred - this is the von Neumann regress that led to that utterly weird idea of consciousness causing collapse.

Thanks
Bill
 
  • #104
kye said:
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when the detector at one slit detects the particle, the entire wave function collapses; does the collapse travel at the speed of light or instantaneously between the detectors at both slits? What experiments akin to this have been done to determine whether it's instantaneous or travels at the speed of light? Or can no experiment be done to determine it, and why?

In the Copenhagen (or shut-up-and-calculate) interpretation, collapse is instantaneous.
http://arxiv.org/abs/quant-ph/9906034

There are physical collapse theories, but these make predictions that are different from quantum mechanics, and can be tested in principle. As far as we know, all observations to date are consistent with quantum mechanics (i.e. instantaneous collapse).
 
  • #105
atyy said:
In the Copenhagen (or shut-up-and-calculate) interpretation, collapse is instantaneous.
http://arxiv.org/abs/quant-ph/9906034

There are physical collapse theories, but these make predictions that are different from quantum mechanics, and can be tested in principle. As far as we know, all observations to date are consistent with quantum mechanics (ie. instantaneous collapse).

Does this violate special relativity, or does it not, in the same way quantum entanglement can't be used to send information faster than light? So the wave function, if it is really there, can't be used to send information FTL...

One problem with an actual wave function in Heisenberg's formalism rather than Bohm's is how the particle in the double slit transforms into a wave at emission, before reaching the slits - how does it know whether to change back into a wave (as if predicting the slits in front of it)? But then, by not using the concepts of particles and waves, perhaps who knows, "wavecles" really have this ability - so what line of argument do you have that shows this possibility to be untenable?
 
