Werg22
In a book of mine, the author sets out to prove that a Riemann sum on an interval [a, b] must converge by showing that for any two sums S_m and S_n (m > n) whose subdivisions have sufficiently small span,
|S_m - S_n| < e(b - a)
where e can be made arbitrarily small by taking the span small enough.
Now I understand why S_m has to be bounded, but I do not see an argument strong enough for convergence. Couldn't S_m keep taking on changing lower and higher values within some interval? That would still satisfy the inequality above... What am I missing?
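To make the condition concrete, here it is with the quantifiers written out; \varepsilon plays the role of the book's e, and the partition/mesh notation P_m, P_n is mine, not the book's:

\[
\forall \varepsilon > 0 \;\exists \delta > 0 : \quad
\text{mesh}(P_m) < \delta \ \text{and} \ \text{mesh}(P_n) < \delta
\;\Longrightarrow\; |S_m - S_n| < \varepsilon (b - a)
\]

I gather this is essentially a Cauchy-type criterion for the sums, but I still don't see how it rules out the oscillation described above.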