mPlummers
- Homework Statement
- I have a data point ##x_{true}##, known without error, and a function ##f(x_{true})=y_{true}##. I have to obtain a simulated measurement by adding a relative error to this data point (passing from ##x_{true}## to ##x_{measurement}##). The goal is to find the amount of relative error to add to ##x_{true}## so that the resulting measurement ##y_{measurement}## has 1% precision.
- Relevant Equations
- Relative error: ##(y_{measurement}-y_{true})/y_{true}##
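For reference, the formula above maps directly to Python; this is just a minimal sketch (the function name is mine):

```python
def relative_error(y_true, y_measurement):
    # Signed relative error of the measurement with respect to the true value.
    return (y_measurement - y_true) / y_true
```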
NOTE: this is a programming exercise (Python).
I started by adding a (for example) 10% relative error to ##x_{true}##, obtaining ##x_{measurement}##. Then I computed ##y_{measurement}##. To find the precision, I calculated ##(y_{measurement}-y_{true})/y_{true}##. If this is correct, what I want to do is try several relative-error values, compute the precision for each, and look for the one closest to 1% (see the sketch below).
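If that approach is right, the scan could look like the following minimal sketch. Note that ##f##, ##x_{true}##, and the range of candidate errors are placeholders I made up to keep it runnable (I used ##f(x)=x^2## as a stand-in for the actual function from the exercise):

```python
import numpy as np

def f(x):
    # Placeholder for the actual function given in the exercise.
    return x**2

x_true = 2.0          # the given data point (placeholder value)
y_true = f(x_true)

# Candidate relative errors to apply to x_true, e.g. from 0.1% to 10%.
candidates = np.linspace(0.001, 0.10, 1000)

precisions = []
for rel_err in candidates:
    x_measurement = x_true * (1 + rel_err)   # perturb x by the relative error
    y_measurement = f(x_measurement)
    # Magnitude of the relative error on y, i.e. the achieved precision.
    precisions.append(abs(y_measurement - y_true) / abs(y_true))

precisions = np.array(precisions)
best = candidates[np.argmin(np.abs(precisions - 0.01))]  # closest to 1%
print(f"relative error on x giving ~1% precision on y: {best:.4%}")
```

A finer grid (or a root finder such as `scipy.optimize.brentq` on `precision(rel_err) - 0.01`) would pin the value down more precisely than the linear scan.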