kahoomann
One way in which Heisenberg originally argued for the uncertainty principle is by using an imaginary microscope as a measuring device.[2] He imagines an experimenter trying to measure the position and momentum of an electron by shooting a photon at it.
If the photon has a short wavelength, and therefore a large momentum, the position can be measured accurately. But the photon will be scattered in a random direction, transferring a large and uncertain amount of momentum to the electron. If the photon has a long wavelength and low momentum, the collision will not disturb the electron's momentum very much, but the scattering will reveal its position only vaguely.
If a large aperture is used for the microscope, the electron's location can be well resolved (see Rayleigh criterion); but by the principle of conservation of momentum, the transverse momentum of the incoming photon and hence the new momentum of the electron will be poorly resolved. If a small aperture is used, the accuracy of the two resolutions is the other way around.
The trade-offs imply that no matter what photon wavelength and aperture size are used, the product of the uncertainty in measured position and measured momentum is greater than or equal to a lower bound, which is up to a small numerical factor equal to Planck's constant.[4] Heisenberg did not care to formulate the uncertainty principle as an exact bound, and preferred to use it as a heuristic quantitative statement, correct up to small numerical factors.
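The trade-off described above can be illustrated numerically. The sketch below uses the standard heuristic estimates for the microscope thought experiment: position resolution Δx ≈ λ/sin(ε) from the Rayleigh criterion, and an uncertain transverse momentum kick Δp ≈ (h/λ)·sin(ε) from the scattered photon. The function name and the specific wavelengths and angles are illustrative choices, not anything from the original argument; by construction of these heuristic formulas, the product Δx·Δp always comes out equal to Planck's constant, independent of the wavelength and aperture chosen.

```python
import math

H = 6.62607015e-34  # Planck's constant, J*s

def microscope_uncertainties(wavelength, half_angle):
    """Heuristic Heisenberg-microscope estimates (order-of-magnitude only).

    Position resolution (Rayleigh criterion): dx ~ lambda / sin(eps)
    Photon momentum kick on the electron:     dp ~ (h / lambda) * sin(eps)
    """
    dx = wavelength / math.sin(half_angle)
    dp = (H / wavelength) * math.sin(half_angle)
    return dx, dp

# Sweep over wavelengths (gamma ray .. X-ray .. UV) and aperture half-angles:
for lam in (1e-12, 1e-10, 1e-8):
    for eps in (0.1, 0.5, 1.0):  # radians
        dx, dp = microscope_uncertainties(lam, eps)
        print(f"lambda={lam:.0e} m, eps={eps}: dx*dp = {dx * dp:.3e} J*s")
```

Every printed product equals h, showing why no choice of photon or aperture escapes the bound: shrinking Δx (shorter wavelength or wider aperture) inflates Δp by exactly the compensating factor.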
Can we really use this disturbance theory to explain the Heisenberg uncertainty principle?
Does the Heisenberg uncertainty principle have anything to do with disturbance theory?
For example, if we measure the temperature of hot water, its temperature would change due to the measurement itself.