dichotomy
We have a basic numerical integration assignment: using Simpson's rule or the trapezium rule to estimate the integral of a function with no analytic antiderivative.
Logic suggests you should use as many "strips" as possible (i.e. as small a value of dx as possible) to estimate the value of the integral. However, a while ago I saw a graph showing how the error can actually start to increase as dx gets smaller, because floating-point rounding errors eventually dominate the methods' inherent truncation error. Has anyone seen/got anything like this (for a 32-bit computer)?
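You can reproduce that graph yourself without special hardware. Below is a sketch (my own example, not from any assignment) that runs the composite trapezium rule on ∫₀¹ eˣ dx = e − 1, but rounds every intermediate value to IEEE-754 single precision via Python's `struct` module to mimic 32-bit arithmetic. The truncation error of the trapezium rule shrinks like dx², while the rounding error of the long running sum tends to grow as the strip count rises, so printing the error against n should show it falling and then creeping back up.

```python
import math
import struct

def f32(x):
    """Round a Python float to the nearest IEEE-754 single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

def trapezium32(f, a, b, n):
    """Composite trapezium rule with every intermediate result rounded to
    32-bit, mimicking a machine that only has single precision."""
    h = f32((b - a) / n)
    total = f32(0.5 * (f32(f(a)) + f32(f(b))))
    for i in range(1, n):
        x = f32(a + i * h)
        total = f32(total + f32(f(x)))
    return f32(total * h)

if __name__ == "__main__":
    exact = math.e - 1.0  # integral of exp(x) over [0, 1]
    # Powers of two keep h = 1/n exactly representable in binary, so any
    # growth in the error at large n comes from the summation itself,
    # not from a sloppily rounded step size.
    for n in [2 ** k for k in range(2, 19, 2)]:
        approx = trapezium32(math.exp, 0.0, 1.0, n)
        print(f"n = {n:>7}   error = {abs(approx - exact):.3e}")
```

The usual rule of thumb follows from balancing the two error terms: with machine epsilon ε and truncation error of order dx², the total error is minimised around dx ≈ ε^(1/3), which for single precision (ε ≈ 1.2 × 10⁻⁷) is a few thousandths, i.e. only a few hundred strips. Beyond that, smaller strips buy you nothing and eventually hurt.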