DivGradCurl
Problem
Newton's method gives an approximation to a root [tex] r [/tex] of the equation [tex] f(x)=0 [/tex]. From an initial approximation [tex] x_1 [/tex] we obtain successive approximations [tex] x_2 , x_3 , \ldots ,[/tex] where
[tex] x_{n+1} = x_n - \frac{f(x_n)}{f^{\prime}(x_n)} [/tex]
Use Taylor's Inequality with [tex] n=1 [/tex], [tex] a=x_n [/tex], and [tex] x=r [/tex] to show that if [tex] f^{\prime \prime} (x) [/tex] exists on an interval [tex] I [/tex] containing [tex] r [/tex], [tex] x_n [/tex], and [tex] x_{n+1} [/tex], and [tex] \left| f^{\prime \prime} (x) \right| \leq M [/tex], [tex] \left| f^{\prime} (x) \right| \geq K [/tex] for all [tex] x \in I [/tex], then
[tex] \left| x_{n+1} - r \right| \leq \frac{M}{2K}\left| x_n - r \right| ^2 [/tex]
[This means that if [tex] x_n [/tex] is accurate to [tex] d [/tex] decimal places, then [tex] x_{n+1} [/tex] is accurate to about [tex] 2d [/tex] decimal places. More precisely, if the error at stage [tex] n [/tex] is at most [tex] 10^{-m} [/tex], then the error at stage [tex] n+1 [/tex] is at most [tex] \left( \frac{M}{2K} \right) 10^{-2m} [/tex] .]
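As a quick sanity check of that doubling-of-digits remark, here is a small Python sketch (the example function [tex] f(x) = x^2 - 2 [/tex] and the starting point are my own choices, not part of the problem). The number of correct digits roughly doubles at each step:
[code]
from math import sqrt

def newton_step(f, fprime, x):
    """One Newton step: x - f(x)/f'(x)."""
    return x - f(x) / fprime(x)

# Example (my choice): f(x) = x^2 - 2, whose positive root is r = sqrt(2).
f = lambda x: x * x - 2
fprime = lambda x: 2 * x
r = sqrt(2)  # known root, used only to report the error

x = 1.5  # initial approximation x_1
for n in range(1, 6):
    print(f"x_{n} = {x:.15f}   error = {abs(x - r):.2e}")
    x = newton_step(f, fprime, x)
[/code]
The printed errors fall roughly like [tex] 10^{-2}, 10^{-3}, 10^{-6}, 10^{-12} [/tex] before hitting the limits of double precision, consistent with the claim that the error at stage [tex] n+1 [/tex] is on the order of the square of the error at stage [tex] n [/tex].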
Comments
I'm acquainted with Newton's method
[tex] x_{n+1} = x_n - \frac{f(x_n)}{f^{\prime}(x_n)} [/tex]
and Taylor's Inequality
[tex] \left| R_n (x) \right| \leq \frac{M}{\left( n+1 \right) !} \left| x-a \right| ^{n+1} \qquad \mbox{ for } \left| x-a \right| \leq d [/tex]
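Writing this out for the case the problem names ([tex] n=1 [/tex], [tex] a=x_n [/tex], [tex] x=r [/tex]), I believe it says
[tex] f(r) = f(x_n) + f^{\prime}(x_n)(r-x_n) + R_1 (r), \qquad \left| R_1 (r) \right| \leq \frac{M}{2} \left| r - x_n \right| ^2 [/tex]
since [tex] \left| f^{\prime \prime}(x) \right| \leq M [/tex] on [tex] I [/tex].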
However, it's not clear to me how to join those two concepts. Here is what I tried, without success:
If
[tex] \int _{x_n} ^{r} f^{\prime \prime} (t) \, dt \leq \int _{x_n} ^{r} M \, dt [/tex]
[tex] f^{\prime} (r) - f^{\prime} (x_n) \leq M (r-x_n) [/tex]
[tex] f^{\prime} (r) \leq f^{\prime} (x_n) + M (r-x_n) [/tex]
Then
[tex] \int _{x_n} ^{r} K \, dt \leq \int _{x_n} ^{r} f^{\prime} (t) \, dt \leq \int _{x_n} ^{r} \left[ f^{\prime} (x_n) + M (t-x_n) \right] \, dt [/tex]
[tex] K (r-x_n) \leq f (r) - f(x_n) \leq f^{\prime}(x_n)(r-x_n) + \frac{M}{2} (r-x_n) ^2 [/tex]
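On reflection, maybe the two concepts join like this (a sketch I haven't fully checked): since [tex] f(r)=0 [/tex], the [tex] n=1 [/tex] expansion above gives
[tex] 0 = f(x_n) + f^{\prime}(x_n)(r-x_n) + R_1 (r) [/tex]
Dividing through by [tex] f^{\prime}(x_n) [/tex] and comparing with the Newton step [tex] x_{n+1} = x_n - \frac{f(x_n)}{f^{\prime}(x_n)} [/tex] would give
[tex] x_{n+1} - r = \frac{R_1 (r)}{f^{\prime}(x_n)} [/tex]
and then the remainder bound together with [tex] \left| f^{\prime}(x_n) \right| \geq K [/tex] would yield exactly [tex] \left| x_{n+1} - r \right| \leq \frac{M}{2K} \left| x_n - r \right| ^2 [/tex].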
Does that last step look right? If not, I just need some tips; the integral approach above is probably the wrong direction.
Thank you