In the Hanbury Brown and Twiss (HBT) family of experiments, the intensity signals from two detectors are multiplied together. What if, instead, we subtract the signals? In that case the fluctuations should add in quadrature (RMS addition) when the detectors are far apart, decreasing toward a null when the detectors are nearly coincident. This may not be optimal, but would it work at all?
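
A minimal sketch of the idea (not from the original question, and using Gaussian fluctuations as a stand-in for the true thermal-light intensity statistics): model the two detector signals as fluctuation records with an adjustable correlation coefficient `rho`, where `rho ~ 0` plays the role of widely separated detectors and `rho ~ 1` nearly coincident ones, and look at the RMS of the difference. Since Var(A - B) = Var(A) + Var(B) - 2 Cov(A, B), the difference RMS should fall from the quadrature sum toward zero as the correlation rises.

```python
# Hypothetical illustration: RMS of the difference of two partially
# correlated intensity-fluctuation records, as a proxy for subtracting
# the two detector signals in an HBT-style setup.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000
sigma = 1.0  # RMS fluctuation at each detector (arbitrary units)

for rho in (0.0, 0.5, 0.99):  # 0 = far apart, ~1 = nearly coincident
    # Two Gaussian records sharing a common component so that their
    # correlation coefficient is rho.
    common = rng.normal(0.0, sigma, n_samples)
    indep_a = rng.normal(0.0, sigma, n_samples)
    indep_b = rng.normal(0.0, sigma, n_samples)
    i_a = np.sqrt(rho) * common + np.sqrt(1 - rho) * indep_a
    i_b = np.sqrt(rho) * common + np.sqrt(1 - rho) * indep_b

    diff_rms = np.std(i_a - i_b)           # measured RMS of the difference
    quadrature_rms = np.sqrt(2) * sigma    # expected RMS if uncorrelated
    print(f"rho={rho:4.2f}  RMS(difference)={diff_rms:.3f}  "
          f"(uncorrelated prediction {quadrature_rms:.3f})")
```

Under these assumptions the printed RMS drops from about sqrt(2)·sigma at rho = 0 toward zero as rho approaches 1, which is the null described above; whether this is a practical way to extract the correlation, compared with multiplying the signals, is the open question.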