evangoor
I am currently taking a quantum physics lab. The results from my most recent lab are troubling me.
A quick description of the lab
The setup includes an incandescent light bulb that feeds light into a monochromator. The monochromator consists of a couple of mirrors and a diffraction grating; the grating is rotated by a dial to select a specific wavelength. The selected light then shines onto a photodiode with a variable potential applied across it. The stopping potential is measured by varying the voltage until the current drops to zero.
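For context, the stopping-potential condition described above follows the photoelectric relation e·V_s = h·f − φ. A minimal sketch of that relation (the work function value here is an assumption for illustration, not a property of the actual detector):

```python
# Photoelectric stopping potential: e * V_s = h * f - phi
H = 6.626e-34   # Planck constant, J*s
E = 1.602e-19   # elementary charge, C

def stopping_potential(freq_hz, work_function_ev=2.0):
    """Stopping potential in volts; the work function is a placeholder value."""
    return (H * freq_hz - work_function_ev * E) / E

# Example: green light at ~5.45e14 Hz (~550 nm)
print(round(stopping_potential(5.45e14), 3))
```

Plotting V_s against f for several wavelengths should give a straight line with slope h/e, which is what the lab is trying to verify.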
My problem
When we plotted stopping potential vs. dial position, our result was not linear. The dial position should be directly proportional to wavelength and hence have a linear relationship to frequency. However, our measurements showed a relationship of at least third order.
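To make the conversion chain concrete: with a linear dial-to-wavelength calibration, the frequency of the selected light follows from f = c/λ. A sketch of that chain (the calibration coefficients are made up, not from the actual instrument):

```python
# Dial -> wavelength -> frequency conversion chain (coefficients are hypothetical).
C = 2.998e8  # speed of light, m/s

def dial_to_wavelength_m(dial, a=300e-9, b=1e-9):
    """Assumed linear calibration: wavelength = a + b * dial."""
    return a + b * dial

def dial_to_frequency_hz(dial):
    return C / dial_to_wavelength_m(dial)

for d in (100, 200, 300):
    print(d, dial_to_frequency_hz(d))
```

Note that even with a perfectly linear dial-to-wavelength calibration, the step to frequency goes through c/λ, so equal dial steps do not produce equal frequency steps.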
An attempt at finding the source of the problem
I thought that if the diffraction grating wasn't perfectly flat, the relationship between dial position and wavelength wouldn't be perfectly linear. When setting up the experiment, we used a mercury lamp to establish a calibration curve so we could convert dial readings to wavelengths. To try to account for any distortion caused by the grating, I made a second-order fit of our calibration data and used that equation to calculate a frequency for each dial position. I then plotted potential vs. frequency and still did not get a linear relationship. At this point I can't think of a reasonable source for this kind of error. I suspect it has something to do with the detector, but unfortunately I cannot get a good description of its exact construction. I would appreciate any new insight you might be able to offer.
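The second-order calibration fit described above can be reproduced with a standard polynomial fit. The mercury line wavelengths below are the real prominent visible lines; the paired dial readings are invented placeholders standing in for the actual data:

```python
import numpy as np

# Prominent visible mercury emission lines (nm) - real values.
hg_wavelengths_nm = np.array([404.7, 435.8, 546.1, 577.0, 579.1])
# Dial readings paired with each line - hypothetical placeholder data.
dial_readings = np.array([105.0, 136.5, 247.2, 278.4, 280.5])

# Second-order fit: wavelength as a function of dial position.
coeffs = np.polyfit(dial_readings, hg_wavelengths_nm, 2)
cal = np.poly1d(coeffs)

# Convert an arbitrary dial reading to a frequency.
C = 2.998e8  # speed of light, m/s
dial = 200.0
wavelength_m = cal(dial) * 1e-9
frequency_hz = C / wavelength_m
print(f"dial {dial} -> {cal(dial):.1f} nm -> {frequency_hz:.3e} Hz")
```

With the real dial readings substituted in, `cal` plays the role of the calibration equation, and the resulting frequencies are what get plotted against the measured stopping potentials.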