Geiger counters and measurement

In summary, the atom is in a superposition of states ##a(t)N + b(t)D##, where ##a(0) = 1## and ##b(0) = 0##. The state at ##t = 100## is the result of the particle evolving until ##t = 100## and the Geiger counter not having clicked.
  • #71
PeterDonis said:
You can, of course, write things in terms of a QM wave function with coefficients ##A(t)## and ##B(t)## in front of the "non-decayed" and "decayed" terms, where ##B(t)## is determined by the radioactive decay law and ##A(t)## is determined by normalization (the squared norm of the wave function as a whole must be 1). But this doesn't tell you anything new as far as probabilities go that the radioactive decay law doesn't already tell you. And if you're not picking any particular QM interpretation, you have no reason to even bother writing down the wave function in the first place since you're not assigning it any physical meaning and it doesn't add any ability to predict probabilities that you don't already have.
Yes, I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##. In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).

I disagree that if I am not picking a particular interpretation, there is no reason to do this. The reason is to see if my understanding of the QM formalism is correct. This knowledge will be useful for when I consider more complicated situations, where I do not have prior knowledge of the probabilities or a suitable classical approximation.
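
Concretely, here is a minimal sketch of what I have in mind, taking the exponential decay law with decay constant ##\lambda## as a given input rather than deriving it: $$|\psi(t)\rangle = A(t)\,|\text{not decayed}\rangle + B(t)\,|\text{decayed}\rangle, \qquad |B(t)|^2 = 1 - e^{-\lambda t}, \quad |A(t)|^2 = e^{-\lambda t},$$ with ##B(t)## fixed by the decay law and ##A(t)## by normalization, so that ##|A(t)|^2 + |B(t)|^2 = 1## for all ##t##.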
 
  • #72
jeeves said:
Sure. As I said in post #64, I just want to know how to apply the math appropriately to correctly predict the empirically observed outcomes. We make predictions in QM by constructing a wave function and reading off the desired probabilities. I am asking only how to construct the wave function appropriately.
Usually you would be dealing with some specific microscopic phenomenon. Most introductory texts deal with the hydrogen atom and the prediction of its spectrum. That's a classic solution of the Schrödinger equation for the energy eigenstates (wavefunctions).

But QM is not just about wavefunctions. If you want to try some genuine QM, there's an accessible and insightful treatment at undergraduate level here:

http://physics.mq.edu.au/~jcresser/Phys304/Handouts/QuantumPhysicsNotes.pdf
 
  • #73
I found the following quote in the paper:
Peres, Asher (1989). Quantum limited detectors for weak classical signals. Physical Review D, 39(10), 2943–2950.

He writes of quantum Zeno and continuous measurements that:
This problem is peculiar to quantum systems. It disappears in the semiclassical limit, where eigenvalues become extremely dense. From the quantum point of view, classical measurements are always fuzzy. This is why a watched pot may boil, after all: the observer watching it is unable to resolve the energy levels of the pot. Any hypothetical device which could resolve these energy levels would also radically alter the behavior of the pot. Likewise, the mere presence of a Geiger counter does not prevent a radioactive nucleus from decaying. The Geiger counter does not probe the energy levels of the nucleus (it interacts with decay products whose Hamiltonian has a continuous spectrum). As the preceding calculations show, peculiar quantum effects, such as the Zeno "paradox" occur only when individual levels are resolved (or almost resolved).

This explanation is consistent with the ones given earlier by Nugatory (and others).

What does the parenthetical "it interacts with decay products whose Hamiltonian has a continuous spectrum" contribute to the explanation? Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur? The Hamiltonian of the decay product is only relevant after the decay happens, and once decay happens, we no longer have to worry about a Zeno effect.
 
  • #74
jeeves said:
Yes, I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##. In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).
With just two coefficients, you are dealing with a two-level system. There is no way that the solution can have any resemblance to the decay of a radioactive atom. You cannot just assume coefficients. How would you specify the Hamiltonian? Have you ever done some simple exercises in QM?

jeeves said:
What does the parenthetical "it interacts with decay products whose Hamiltonian has a continuous spectrum" contribute to the explanation? Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur? The Hamiltonian of the decay product is only relevant after the decay happens, and once decay happens, we no longer have to worry about a Zeno effect.
Do you know Fermi's Golden Rule? Have you looked at its derivation? Although the Born rule is usually stated as relating to "measurements", Fermi's Golden Rule is routinely applied in situations where it doesn't make sense to talk about measurements or observations, for example in the calculation of nuclear reaction rates in the interior of stars. And for the derivation of Fermi's Golden Rule it is irrelevant when (or if at all) a measurement is made. "Measurement" and "wave function collapse" still linger prominently in textbooks, but IMHO they are not actually a crucial part of the formalism. John Bell, in his essay "Against Measurement", railed against treating "measurement" as a fundamental process. I agree with him that measurements should be explained in terms of something more fundamental.
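
For reference, the standard statement of the rule: under a perturbation ##H'##, the transition rate from an initial state ##|i\rangle## into a continuum of final states ##|f\rangle## is $$\Gamma_{i\to f} = \frac{2\pi}{\hbar}\,|\langle f|H'|i\rangle|^2\,\rho(E_f),$$ where ##\rho(E_f)## is the density of final states. Note that a continuum of final states is built into the formula; this is exactly where the continuous spectrum of the decay products enters.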
 
  • #75
jeeves said:
Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur?
Because no classical measurement can ever resolve the individual energy levels of a system whose spectrum is (effectively) continuous. At least that is the argument Asher Peres gives in your quote:
From the quantum point of view, classical measurements are always fuzzy. This is why a watched pot may boil, after all: the observer watching it is unable to resolve the energy levels of the pot.
...
As the preceding calculations show, peculiar quantum effects, such as the Zeno "paradox" occur only when individual levels are resolved (or almost resolved).

For the more detailed explanation, he refers to "the preceding calculations".
 
  • #76
WernerQH said:
With just two coefficients, you are dealing with a two-level system. There is no way that the solution can have any resemblance to the decay of a radioactive atom. You cannot just assume coefficients. How would you specify the Hamiltonian? Have you ever done some simple exercises in QM?
I am merely repeating the analysis in post #12 of this thread. Do you disagree with that analysis?
gentzen said:
For the more detailed explanation, he refers to "the preceding calculations".
The preceding calculations do not seem to have a direct bearing on this particular example. The only relevant comment is:
One can even consider a passive detector, such as a Geiger counter waiting for the decay of a nucleus, but this situation does not fit at all with our definition of a measurement. This setup is best described as a single metastable system with several decay channels.
 
  • #77
jeeves said:
I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##.
I described how to do that in what you quoted, including how to determine ##B(t)## and ##A(t)##.

jeeves said:
In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).
As far as I can tell, your post #59 is saying that ##A(t)## and ##B(t)## get determined by the method I described in what you quoted (the radioactive decay law for ##B(t)## and normalization for ##A(t)##) until the cat is observed to die; at that point we deduce that the atom has decayed and the state collapses to ##A = 0##, ##B = 1## (the "atom decayed, cat dead" state). This is correct (as long as you are careful not to assign any physical meaning to "the state collapses" and to use it as a mathematical method only, to tell you what state you will now use for future predictions).
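
If it helps, here is a minimal numerical sketch of that bookkeeping (the decay constant and times are made-up numbers, purely for illustration):

```python
import numpy as np

LAM = 0.01  # hypothetical decay constant in 1/s, for illustration only

def coefficients(t):
    """|B(t)|^2 from the radioactive decay law; |A(t)|^2 from normalization."""
    b_sq = 1.0 - np.exp(-LAM * t)        # probability the atom has decayed
    a_sq = 1.0 - b_sq                    # squared norm of the state must be 1
    return np.sqrt(a_sq), np.sqrt(b_sq)  # overall phases are irrelevant here

for t in (0.0, 50.0, 100.0):
    A, B = coefficients(t)
    print(f"t = {t:5.1f} s: P(not decayed) = {A**2:.3f}, P(decayed) = {B**2:.3f}")

# The moment the cat is observed to be dead, switch to the collapsed state
# for all future predictions (pure bookkeeping; no physical claim intended):
A, B = 0.0, 1.0
```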

jeeves said:
The reason is to see if my understanding of the QM formalism is correct.
The QM formalism in this case is so simple that, as I said before, it doesn't really add anything to your knowledge--including to your knowledge of how to work with the QM formalism for more complicated cases.

The best simple experiment I know of to exercise your QM formalism-fu is an experiment on a pair of entangled particles to test for violations of the Bell inequalities.
 
  • #78
PeterDonis said:
As far as I can tell, your post #59 is saying that ##A(t)## and ##B(t)## get determined by the method I described in what you quoted (the radioactive decay law for ##B(t)## and normalization for ##A(t)##) until the cat is observed to die; at that point we deduce that the atom has decayed and the state collapses to ##A = 0##, ##B = 1## (the "atom decayed, cat dead" state). This is correct (as long as you are careful not to assign any physical meaning to "the state collapses" and to use it as a mathematical method only, to tell you what state you will now use for future predictions).
Great, thank you. This completely answers my original question. I will look into the Bell inequality violations; thank you for that also.

My only remaining worry is about that comment of Peres. As far as I can tell, he seems to be arguing that:
  1. The Geiger counter does not probe the energy levels of the nucleus, only the energy of the decay product after the nucleus has decayed.
  2. The decay product has continuous spectrum, and Zeno effects only apply in the presence of a discrete spectrum.
  3. Therefore, there can't be a quantum Zeno effect.
But I don't see why we need step two. It seems valid to say that:
  1. The Geiger counter does not probe the energy levels of the nucleus, only the energy of the decay product after the nucleus has decayed.
  2. Since (for all practical purposes) the Geiger counter does not entangle with the nucleus before decay, the counter cannot prevent decay, and there can be no quantum Zeno effect.
So the continuity of the spectrum seems irrelevant.
 
  • #79
jeeves said:
My only remaining worry is about that comment of Peres.
I don't think the Peres comment that was quoted really applies to the Geiger counter case. Or, for that matter, the cat case we have been discussing.

In the case of the counter and the cat, observation of those systems is only an indirect proxy for what we are interested in, namely, whether or not some particular atom has decayed. And because of the way those scenarios are set up, there is no meaningful "quantum Zeno" interaction between the counter, or the cat, and the atom we are interested in, until the atom decays. And that is true regardless of what the atom's energy levels are, whether they are continuous or discrete, how closely spaced they are, etc.

The kind of thing Peres is talking about is something different; he is talking about something like comparing measuring an atom in some unstable/metastable state directly vs. observing whether a cat is alive or dead. In both cases, you get something like a "binary" result (not decayed/decayed, alive/dead), but in the case of the atom, if you are measuring it directly (rather than relying on an indirect proxy like a Geiger counter), your measurement can involve an interaction (say probing the atom with a laser) that does have a "quantum Zeno" effect on the atom. But that is possible only if the states of the atom you are trying to distinguish are "spaced far enough apart", so to speak, that your measurement can reliably tell one from the other (or more precisely can reliably collapse the atom into one or the other).

In the case of observing a cat to see whether it is alive or dead, no such "quantum Zeno" effect is possible: you can't keep a cat alive just by constantly watching it. That is because, unlike a single atom, a cat has an enormous number of possible states, which are "spaced very close together", so to speak. What we refer to as "alive" and "dead" are not single states of the cat but huge subspaces of the cat's state space, and our observations cannot reliably force the cat into just one single state. We can't "collapse" the cat into some single desired state in its "alive" state space (which is what we would have to do to have a "quantum Zeno" effect on the cat) by observing it. In fact we can't do that by any means we have at our disposal now or in the foreseeable future.

To put this another way: probing a single atom with a probe laser can have a significant effect on its dynamics, but looking at a cat and seeing that it's alive does not have any significant effect on its dynamics. That's why we can do quantum Zeno experiments with atoms but not with cats.
 
  • #80
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?

They establish the paradox in a thought experiment involving a hollow sphere of inward-facing detectors surrounding the atom in question. They set the radius of the sphere to be very large, and define a measurement as a rapid contraction of the sphere, such that if the particle has decayed (survived), the products will be detected (not be detected) during the contraction, and hence one of the two possible energy levels of the atom will be registered. A correlation is therefore established between detector and energy of the atom even if no decay product was detected, and without any ambiguity of continuous measurement. They claim that, using only orthodox quantum theory, the choice of sequences of measurements would affect the decay statistics even though no direct interaction takes place.
 
  • #81
Morbert said:
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?
Even though I disagree with vanhees71 that this entire thread should have been in the Quantum Interpretations and Foundations subforum, the abstract of that paper feels suspiciously like a typical interpretation paradox:
... Gedanken experiments are outlined which illustrate the key features of the paradox, and its implications for the realist interpretation are discussed.

I know that the abstract also says: "It is demonstrated that collapse of state-vector is not a requirement for the paradox, which is independent of interpretation of quantum theory." But that statement too has more to do with interpretation than with plain "calculate and be happy" quantum mechanics, from my POV.
 
  • #82
Morbert said:
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?

They establish the paradox in a thought experiment involving a hollow sphere of inward-facing detectors surrounding the atom in question. They set the radius of the sphere to be very large, and define a measurement as a rapid contraction of the sphere, such that if the particle has decayed (survived), the products will be detected (not be detected) during the contraction, and hence one of the two possible energy levels of the atom will be registered. A correlation is therefore established between detector and energy of the atom even if no decay product was detected, and without any ambiguity of continuous measurement. They claim that, using only orthodox quantum theory, the choice of sequences of measurements would affect the decay statistics even though no direct interaction takes place.
I couldn't find a free version of the reference. I did find a similar paper by the same two authors:

https://www.fi.muni.cz/usr/buzek/zaujimave/home.pdf
A Conceptual Analysis of Quantum Zeno; Paradox, Measurement, and Experiment

I don't see where they use QM to make a testable prediction that the decay rate is in any way affected by the presence (or lack thereof) of Geiger counters (measurement devices). Given these papers were written 25+ years ago, I would think we would have heard about it if this novel prediction had made a splash. On the other hand, a more recent analysis of the Zeno effect via indirect measurement finds there is no such effect (contradicting their claim).

https://arxiv.org/abs/quant-ph/0406191
"We study the quantum Zeno effect in the case of indirect measurement, where the detector does not interact directly with the unstable system. Expanding on the model of Koshino and Shimizu [Phys. Rev. Lett., 92, 030401, (2004)] we consider a realistic Hamiltonian for the detector with a finite bandwidth. We also take explicitly into account the position, the dimensions and the uncertainty in the measurement of the detector. Our results show that the quantum Zeno effect is not expected to occur, except for the unphysical case where the detector and the unstable system overlap."

A quantum system (such as a radioactive isotope) does not change its statistical decay rate in any way due to the presence or absence of an indirect measurement system designed to detect a decay product. Interestingly, it does, however, react to gravitational effects (time dilation).
 
  • #83
The Home and Whitaker paper references a 1980 paper by Peres which gives a time-evolution that is "valid for small times": $$|\left(\phi,e^{-iHt}\phi\right)|^2 \approx 1-(\Delta H)^2t^2+\cdots$$ They use this in their derivation of the Zeno paradox, which is peculiar to me. When I use the more general time evolution of an exponential decay, everything works out fine.
 
  • #84
Morbert said:
The Home and Whitaker paper references a 1980 paper by Peres which gives a time-evolution that is "valid for small times": $$|\left(\phi,e^{-iHt}\phi\right)|^2 \approx 1-(\Delta H)^2t^2+\cdots$$ They use this in their derivation of the Zeno paradox, which is peculiar to me. When I use the more general time evolution of an exponential decay, everything works out fine.
Yes, the exponential distribution is memoryless and cannot produce a Zeno effect. The effect relies on the deviations from exponential decay at short times. Quoting Wikipedia:
Unstable quantum systems are predicted to exhibit a short-time deviation from the exponential decay law. This universal phenomenon has led to the prediction that frequent measurements during this nonexponential period could inhibit decay of the system, one form of the quantum Zeno effect.
References are available in the article, and the calculation is done in equations (11) through (15) of the Home–Whitaker article you posted. I think it's also discussed somewhere in Sakurai, but my memory may be unreliable.
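
A quick numerical illustration of the difference, assuming for the sketch the generic short-time survival probability ##1-(\Delta H)^2t^2## versus a memoryless exponential (all constants are arbitrary illustrative values):

```python
import numpy as np

DH = 1.0    # energy spread (Delta H); arbitrary illustrative value
LAM = 1.0   # exponential decay constant; arbitrary illustrative value
T = 0.5     # total time over which measurements are repeated

def p_quadratic(t):
    """Generic short-time survival probability, 1 - (Delta H)^2 t^2."""
    return max(0.0, 1.0 - (DH * t) ** 2)

def p_exponential(t):
    """Memoryless exponential survival probability."""
    return float(np.exp(-LAM * t))

# n measurements at intervals T/n: each "survived" outcome resets the state,
# so the survival probabilities of the individual intervals multiply.
for n in (1, 10, 100, 1000):
    p_q = p_quadratic(T / n) ** n      # -> 1 as n grows: Zeno suppression
    p_e = p_exponential(T / n) ** n    # = exp(-LAM*T) for every n: no effect
    print(f"n = {n:5d}: quadratic {p_q:.4f}, exponential {p_e:.4f}")
```

The exponential column is the memorylessness in action; the quadratic column shows the suppression that frequent measurements produce during the non-exponential window.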
 
  • #85
jeeves said:
1. Yes, the exponential distribution is memoryless and cannot produce a Zeno effect.

2. The effect relies on the deviations from exponential decay at short times. Quoting Wikipedia: ...
1. Agreed.

2. I would not agree that there is such a deviation for radioactive decay. As far as I know, there isn't any indirect quantum measurement effect that would inhibit decay even in that regime ("short" times).

In fact, I am not sure that there is a hypothetical short-time regime (regardless of what Wikipedia says). I'm also not sure what *direct* quantum Zeno effects have been studied at any level regarding radioactive decay.
 
  • #86
I am wondering if the model for time-evolution used in the [Home + Whitaker] papers above just isn't correct for these negative-measurement thought experiments. Assuming, as before, that we have a system (##\phi##) and a passive detector (##\psi##) that evolve like so $$U(t)|\phi_0, \psi_0\rangle = \alpha(t)|\phi_s, \psi_s\rangle + \beta(t)|\phi_d, \psi_d\rangle$$ The probability that the atom has survived until some time ##T## is $$\langle\phi_0,\psi_0|U^\dagger(T)|\phi_s,\psi_s\rangle\langle\phi_s,\psi_s|U(T)|\phi_0,\psi_0\rangle = |\alpha(T)|^2$$ I can add an identity operator, evolved to some intermediate time ##t##, without changing the result, like so $$\langle\phi_0,\psi_0|I^\dagger(t)U^\dagger(T)|\phi_s,\psi_s\rangle\langle\phi_s,\psi_s|U(T)I(t)|\phi_0,\psi_0\rangle = |\alpha(T)|^2$$ But the identity operator can be projectively decomposed ##I = |\phi_s,\psi_s\rangle\langle\phi_s,\psi_s| + |\phi_d,\psi_d\rangle\langle\phi_d,\psi_d|##, which is exactly what we would do to compute probabilities for an intermediate measurement. This intermediate measurement therefore can't change the probability computed for survival of the atom at ##T##, so long as we have a valid ##U##.
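
As a sanity check on the algebra, here is a toy two-level sketch (random unitaries standing in for ##U##, not the Home-Whitaker dynamics). It highlights the step to watch: inserting the resolved identity keeps the interference cross terms between the two branches, whereas an actual intermediate measurement sums probabilities over outcomes, so the two agree only when the cross terms vanish (e.g. when ##U## never feeds the decayed branch back into the survived one):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n=2):
    """Random unitary via QR decomposition of a complex Gaussian matrix."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(a)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

U1 = random_unitary()                        # evolution from 0 to t
U2 = random_unitary()                        # evolution from t to T
psi0 = np.array([1.0, 0.0], dtype=complex)   # initial (survived) state
Ps = np.diag([1.0, 0.0]).astype(complex)     # projector onto survived branch
Pd = np.diag([0.0, 1.0]).astype(complex)     # projector onto decayed branch

# Inserting I = Ps + Pd at time t changes nothing: amplitudes still interfere.
amp = U2 @ (Ps + Pd) @ U1 @ psi0
p_identity = abs(amp[0]) ** 2

# An actual measurement at t decoheres the branches first, so probabilities
# (not amplitudes) are summed over the two possible outcomes.
p_measured = sum(abs((U2 @ P @ U1 @ psi0)[0]) ** 2 for P in (Ps, Pd))

print(p_identity, p_measured)  # equal only if the cross terms vanish
```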
 
  • #87
DrChinese said:
https://arxiv.org/abs/quant-ph/0406191
"We study the quantum Zeno effect in the case of indirect measurement, where the detector does not interact directly with the unstable system. Expanding on the model of Koshino and Shimizu [Phys. Rev. Lett., 92, 030401, (2004)] we consider a realistic Hamiltonian for the detector with a finite bandwidth. We also take explicitly into account the position, the dimensions and the uncertainty in the measurement of the detector. Our results show that the quantum Zeno effect is not expected to occur, except for the unphysical case where the detector and the unstable system overlap."

A quantum system (such as a radioactive isotope) does not change its statistical decay rate in any way due to the presence or absence of an indirect measurement system designed to detect a decay product. Interestingly, it does, however, react to gravitational effects (time dilation).
Interesting paper. Thanks!
 
  • #88
DrChinese said:
I would not agree that there is such a deviation for radioactive decay. As far as I know, there isn't any indirect quantum measurement effect that would inhibit decay even in that regime ("short" times).

In fact, I am not sure that there is a hypothetical short-time regime (regardless of what Wikipedia says). I'm also not sure what *direct* quantum Zeno effects have been studied at any level regarding radioactive decay.

There must be. An exact exponential decay law at all times is inconsistent with the laws of quantum mechanics. Refer to: Greenland, P. T. Seeking non-exponential decay. Nature 335, 298 (1988).

Further, there is indeed a program of theoretical predictions and attempted experimental tests of deviations from exponential decay in various contexts. For experimental evidence in quantum tunneling, see: Experimental evidence for non-exponential decay in quantum tunnelling, Nature 1997. I don't think experimental evidence has been found for radioactive decay specifically yet, but following the references in those papers (and citations of those papers) will lead you to theoretical work on the issue.

Morbert said:
Interesting paper. Thanks!

My inclination is to say that Home and Whitaker are simply wrong about their thought experiment. We have come to the conclusion in this thread that when the detector is not moving (e.g. it is a cat in a box), non-observation does not count as a "measurement" and will not produce a quantum Zeno effect, because the cat/detector does not directly probe the nucleus; it only interacts with the decay product after the decay.

If you accept that explanation, I am not sure how moving the detector around in space changes anything. Non-detection still won't count as a measurement because non-detection still doesn't produce an interaction with the nucleus (or anything else in that thought experiment).
 
  • #89
jeeves said:
1. An exact exponential decay law at all times is inconsistent with the laws of quantum mechanics. Refer to: Greenland, P. T. Seeking non-exponential decay. Nature 335, 298 (1988).

1. Here is a link to your reference:

https://www.nature.com/articles/335298a0

However, it's not really a good reference for your point, as I question whether Greenland's analysis is generally accepted (it is not well cited). When Greenland compared his prediction to actual relevant experiments, he found no support:

"Neither isotope revealed any deviation from exponential decay, and nor has any other test."

His actual prediction was: "In fact the decay of an isolated quantum state can never be exponential." So, there's that.

My point in all this is that deviation from exponential decay really has nothing to do with whether or not a Geiger counter type measurement can induce a quantum Zeno effect in a radioactive sample (as the OP seems to suggest). It cannot.

2. I could not find a free link to your second reference. But I found another from a similar author group that might be of interest:

https://arxiv.org/abs/quant-ph/0104035
Observation of the Quantum Zeno and Anti-Zeno effects in an unstable system
"We report the first observation of the Quantum Zeno and Anti-Zeno effects in an unstable system. Cold sodium atoms are trapped in a far-detuned standing wave of light that is accelerated for a controlled duration. For a large acceleration the atoms can escape the trapping potential via tunneling. Initially the number of trapped atoms shows strong non-exponential decay features, evolving into the characteristic exponential decay behavior. We repeatedly measure the number of atoms remaining trapped during the initial period of non-exponential decay. Depending on the frequency of measurements we observe a decay that is suppressed or enhanced as compared to the unperturbed system."

This effect results from direct "measurement", rather than something indirect (like a Geiger counter).
 
  • #90
DrChinese said:
1. Here is a link to your reference:

https://www.nature.com/articles/335298a0

However, it's not really a good reference for your point, as I question whether Greenland's analysis is generally accepted (it is not well cited). When Greenland compared his prediction to actual relevant experiments, he found no support:

"Neither isotope revealed any deviation from exponential decay, and nor has any other test."

His actual prediction was: "In fact the decay of an isolated quantum state can never be exponential." So, there's that.

Here is a link to the second reference: https://web.archive.org/web/2010033...chargement/Optique_Quantique/Raizen_decay.pdf

The Greenland article is a short expository article, and I would not expect it to be highly cited.

Note that the Khalfin paper cited in the paper I just linked (reference 1), predicting corrections to exponential decay, has 635 citations (according to Google Scholar). This is also discussed in reference 3 (708 citations) and in some others. I believe the observation that the decay cannot be exactly exponential is even made in standard graduate textbooks; the Nature article cites Ballentine's book. I conclude that deviations from exponential decay are generally accepted by the physics community.

I agree that direct experimental evidence of these deviations does not seem to exist (to my knowledge). This appears to be a limitation of our experimental abilities (specifically probing short enough time scales), not of our theoretical understanding. I would reconsider my view if there were a paper that claims to access the time scales where quantum Zeno-permitting deviations are predicted for radioactive decay and finds no effect. However, my understanding is that no such paper exists, and my response is then simply that absence of evidence is not evidence of absence.

I agree these deviations are irrelevant to the Geiger counter example.
 
  • #91
It's indeed a mathematical fact that the exponential-decay law is an approximation. It is derived from first-order perturbation theory, neglecting the reaction between the decay products, and leads to Breit-Wigner distributions for the transition amplitudes in the energy representation, which corresponds to the exponential decay law in the time domain. It's precisely this neglect of the reaction between the decay products that leads to the exponential decay law, i.e., it neglects the possibility of going back to the original state after the decay. This is a good approximation in many cases, but not, e.g., for cases where the decay is "close to threshold", as demonstrated in the papers cited above. I think the Nature article nicely summarizes this state of affairs.
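
Schematically, in units with ##\hbar = 1##: a Breit-Wigner amplitude in the energy representation, $$f(E) \propto \frac{1}{E - E_0 + i\Gamma/2},$$ Fourier-transforms to a survival amplitude ##a(t) \propto e^{-iE_0 t}\,e^{-\Gamma t/2}##, i.e., ##|a(t)|^2 = e^{-\Gamma t}##. An exact Lorentzian, however, would require the spectrum to extend over all energies; for a Hamiltonian bounded from below it can only be an approximation, which is another way of seeing that exact exponential decay at all times is impossible.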
 
  • #92
jeeves said:
We have come to the conclusion in this thread that when the detector is not moving (e.g. it is a cat in a box), non-observation does not count as a "measurement" and will not produce a quantum Zeno effect, because the cat/detector does not directly probe the nucleus; it only interacts with the decay product after the decay.
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

[*] By "good quantum theory" I mean suitable dynamics (including the dynamics that entangle the measured system and the measurement device) and a suitable initial state.
 
  • #93
Morbert said:
But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Why do you assume that such a theory does not already exist? As one interacts with the real world, life gets complicated and theories more complicated.
I am also at a loss as to why one would not expect repeated measurement to render the Breit-Wigner result insufficient, as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
 
  • #94
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

You may find Section X of the following paper useful: https://arxiv.org/abs/quant-ph/0611067

To better understand the nature of continuous measurements, we will now consider in detail an example of how a continuous measurement of position arises in a fundamental physical system: a single atom interacting with light. Again, to obtain weak measurements, we do not make projective measurements directly on the atom, but rather we allow the atom to become entangled with an auxiliary quantum system—in this case, the electromagnetic field—and then make projective measurements on the auxiliary system (in this case, using a photodetector). It turns out that this one level of separation between the system and the projective measurement is the key to the structure of the formalism. Adding more elements to the chain of quantum-measurement devices does not change the fundamental structure that we present here.

The authors say that measuring the EM field is only a "weak measurement" of the state of the atom, so it won't lead to a quantum Zeno effect. I guess this is essentially the "you're measuring the decay product, not the atom" claim being detailed with more math.

What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
 
  • #95
Morbert said:
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.
Yes, and this is indeed what QT does since its discovery in 1925 ;-). The Zeno effect is due to interactions of the system, including those with the measurement devices and the "environment". Measurement devices are also nothing special; they consist of the same building blocks of matter as the observed object and obey no special "rules" other than the rules described by QT. What else should they be, and how else should they behave?
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

[*] By "good quantum theory" I mean suitable dynamics (including the dynamics that entangle the measured system and the measurement device) and a suitable initial state.
It's just the standard dynamics described in any valid textbook of quantum theory.
 
  • #96
jeeves said:
What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
I think the fuzziness they are referring to here is fuzziness in the moment the particle decays, as opposed to fuzziness in the energy of the two-level atom. I.e., instead of

"A detector clicking at time ##t## implies the particle decayed at time ##t'##"

We say

"a detector clicking at time ##t## implies the particle probably decayed at time ##t'\pm\Delta t##"

where the 'probably' gets weaker as ##\Delta t## is made smaller. At the end of post #56 I remarked that I did not think a continuous measurement (taking ##\Delta t \rightarrow 0##) could be normalised. They have introduced a fuzziness (weakness of measurement as ##\Delta t\rightarrow 0##) so that a norm-preserving time-evolution can be obtained. This would presumably not satisfy Home+Whitaker, since they derive a Zeno effect from non-continuous, discrete sequences of measurements.
hutchphd said:
Why do you assume that such a theory does not already exist? As one interacts with the real world, life gets complicated and theories more complicated.
I am also at a loss as to why one would not expect repeated measurement to render the Breit-Wigner result insufficient, as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
I wouldn't. I am instead saying that papers like the Whitaker one above are not using a correct quantum theory, and hence they are predicting statistics contingent on a sequence of indirect measurements when there should be no such contingency.
vanhees71 said:
Yes, and this is indeed what QT does since its discovery in 1925 ;-).

It's just the standard dynamics described in any valid textbook of quantum theory.
To be clearer and more formal, by a quantum theory of the system I meant dynamics ##U(t)## and an initial state ##\rho_0## such that if ##\Pi_s## is the "detector has not clicked" projector, then $$\mathrm{Tr}\left[\Pi_s(T)\rho_0\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\Pi_s(T/2)\rho_0\Pi^\dagger_s(T/2)\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\dots\Pi_s(T/n)\rho_0\Pi^\dagger_s(T/n)\dots\Pi^\dagger_s(T)\right]$$ Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts from you, DrChinese, et al., is that their dynamics are not actually correct.
 
  • #97
The time evolution is generated by the Hamiltonian of the system, not by some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
 
  • #98
vanhees71 said:
The time evolution is generated by the Hamiltonian of the system, not by some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
I'm using the convention ##\Pi_s(t) = U^{-1}(t)\Pi_sU(t)##. And yes, I agree the dynamics have to include the degrees of freedom of the system being entangled with the atom.
 
  • #99
Morbert said:
Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts from you, DrChinese, et al., is that their dynamics are not actually correct.
I've done some more reading and I think the majority of what has been said in this thread so far is not totally correct (including the analysis I gave a few pages ago).

I believe a correct analysis of a decaying atom with a continuous photodetector is given in "Quantum Zeno effect with general measurements" by Koshino and Shimizu, Physics Reports, 2005. It has hundreds of citations, so I conclude that it is "generally accepted physics." An arXiv version is available.

Section 5 contains the main argument. They show that for a continuous photodetector with small enough response time (much smaller than the initial time period where the atom decays non-exponentially) there is indeed a Zeno effect. The reason we do not see Zeno effects in day-to-day experiments is that the response times of typical photodetectors are many orders of magnitude larger than what would be necessary to get these effects. Their argument includes "indirect measurement," and they show how to extend it to non-continuous, discrete measurements at small time intervals.

I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
 
  • #100
jeeves said:
I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
The Koshino paper is interesting, particularly in the way it dissolves the distinction between a direct and indirect measurement, so we might indeed see a Zeno effect in the Home-Whitaker thought experiment involving the hollow sphere of detectors. The Koshino paper shows this would not in fact be paradoxical because each contraction of the hollow sphere of detectors perturbs the atom, since the atom and detector are coupled via the photon field. Different sequences of indirect measurements = different dynamics = different decay statistics.
 
