- #1
WarPhalange
I am doing a lab report, but this isn't a homework question per se...
I just wanted to know what the best way of calculating an error for a slope is. I am doing a Continuous Wave NMR lab and took some data with estimated uncertainties, and need to know how to propagate them into an error estimate of the slope the data forms.
Excel just gives me a slope by itself, but no error. The closest thing I could find was regression which was pointless because it didn't take into account my error estimates.
Then there was the option of drawing your own lines and eyeballing an error estimate from the maximum and minimum slopes you can fit through the data. Makes sense, but that's way too wishy-washy for me.
Besides that, I could say f/B = gamma (which is what the slope should be), compute that for each data point, take an average, and propagate the error from there... but can't Excel just do a damn uncertainty for me? Gah...
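The per-point averaging idea above can be formalized: if each point has an estimated y-uncertainty, a weighted least-squares fit through the origin gives both the slope and its propagated error in closed form (m = Σwxy/Σwx², σ_m = 1/√(Σwx²), with w = 1/σ²). Here is a minimal sketch in Python; the data values are hypothetical placeholders, not from the original post:

```python
import math

def slope_through_origin(x, y, sigma_y):
    # Weighted least-squares fit of y = m*x (no intercept), with
    # weights w_i = 1/sigma_i^2 taken from the estimated uncertainties.
    # Standard results: m = sum(w*x*y) / sum(w*x^2),
    #                   sigma_m = 1 / sqrt(sum(w*x^2))
    w = [1.0 / s**2 for s in sigma_y]
    s_xx = sum(wi * xi * xi for wi, xi in zip(w, x))
    s_xy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    m = s_xy / s_xx
    sigma_m = 1.0 / math.sqrt(s_xx)
    return m, sigma_m

# Hypothetical NMR-style data: field B vs. resonance frequency f,
# with estimated uncertainties on f (all values made up for illustration).
B = [0.5, 1.0, 1.5, 2.0]
f = [21.3, 42.6, 63.8, 85.2]
df = [0.2, 0.2, 0.3, 0.3]
gamma, dgamma = slope_through_origin(B, f, df)
print(f"slope = {gamma:.3f} +/- {dgamma:.3f}")
```

For what it's worth, Excel's LINEST function (with its stats argument set to TRUE) does return a standard error for the slope, though that error comes from the scatter of the residuals rather than from your own uncertainty estimates; the weighted formula above is what actually uses them.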