Is There a More Efficient Version of Newton's Method Using Higher Derivatives?

In summary, there are variations of Newton's method that use higher derivatives, such as Halley's method. Another approach interpolates the inverse of the function and evaluates it at zero to estimate the root. However, these methods may not always be worth the additional work compared to the basic Newton's method.
  • #1
defunc
Are there any variations of Newton's method, say ones that use higher derivatives?
 
  • #2
I don't know of any that are commonly used but it's not hard to extend the basic idea.

In "Newton' method", we approximate the function by its tangent line at a given point, and determine where that line crosses y= 0.

We could, as well, approximate the function by a parabola that (1) passes through the given point, (2) has the same slope there, and (3) has the same second derivative.

That is, to solve f(x)= 0, start with some given point [itex](x_0, f(x_0))[/itex], at which f(x) has derivative [itex]f'(x_0)[/itex] and second derivative [itex]f''(x_0)[/itex].

We can write a parabola through [itex](x_0, f(x_0))[/itex] as [itex]y= a(x- x_0)^2+ b(x- x_0)+ f(x_0)[/itex]. That has derivative, at [itex]x= x_0[/itex], [itex]y'= b[/itex] and second derivative [itex]y''= 2a[/itex], so the "approximating parabola" is [itex]f''(x_0)(x- x_0)^2/2+ f'(x_0)(x- x_0)+ f(x_0)[/itex]. Set that equal to 0 and solve for x to get the next x at which to approximate.
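A minimal sketch of that parabola-based step, for illustration only (the function names and the example equation are assumptions, not from the post; the quadratic root closer to the current iterate is taken as the next point):

[code]
import math

def parabola_newton(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Iterate by solving the local quadratic model
    f(x0) + f'(x0)(x - x0) + f''(x0)(x - x0)^2 / 2 = 0
    and taking the root closest to the current iterate."""
    x = x0
    for _ in range(max_iter):
        a, b, c = d2f(x) / 2.0, df(x), f(x)
        if abs(c) < tol:
            return x
        disc = b * b - 4.0 * a * c
        if a == 0 or disc < 0:
            step = -c / b          # fall back to the ordinary Newton step
        else:
            r = math.sqrt(disc)
            # two candidate steps; keep the one smaller in magnitude
            s1, s2 = (-b + r) / (2.0 * a), (-b - r) / (2.0 * a)
            step = s1 if abs(s1) < abs(s2) else s2
        x += step
    return x

# illustrative example: root of cos(x) - x near x0 = 1
root = parabola_newton(lambda x: math.cos(x) - x,
                       lambda x: -math.sin(x) - 1.0,
                       lambda x: -math.cos(x),
                       1.0)
[/code]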

But the fact is that some approximation methods are already so good that the improvement from a "better" approximation just isn't worth the additional work. Newton's method is one of those!
 
  • #3
As usually presented, Newton's method is intended for a single unknown variable.
Hamming gives an improved method, sensitive to the difference between the current approximation and the last.

There is an equivalent method for a system of equations, using the Jacobian in place of the first derivative.
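A minimal sketch of that multivariate version, assuming the caller supplies the vector function F and its Jacobian J (the names and the example system are illustrative); each step solves the linear system J(x) dx = -F(x) instead of dividing by f'(x):

[code]
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for a system F(x) = 0, with Jacobian J(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            break
        # solve J(x) dx = -F(x) rather than forming an inverse
        dx = np.linalg.solve(J(x), -fx)
        x = x + dx
    return x

# illustrative example: intersection of x^2 + y^2 = 4 with the line y = x
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [-1.0, 1.0]])
sol = newton_system(F, J, [1.0, 2.0])   # converges to (sqrt(2), sqrt(2))
[/code]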
 
  • #4
HallsofIvy said:
We could, as well, approximate the function by a parabola that (1) passes through the given point, (2) has the same slope there, and (3) has the same second derivative.

Isn't this method the basic idea behind Taylor polynomials?
 
  • #5
defunc said:
Are there any variations of Newton's method, say ones that use higher derivatives?

Yep, Halley's method uses second derivatives in addition to first derivatives. Where Newton's method converges quadratically, Halley's method converges cubically near a simple root.

http://en.wikipedia.org/wiki/Halley's_method
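For illustration, a minimal sketch of a Halley iteration (the function names and the example are assumptions, not from the thread), using the standard update x &lt;- x - 2 f f' / (2 f'^2 - f f''):

[code]
import math

def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's method: x <- x - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx, d2fx = df(x), d2f(x)
        x -= 2.0 * fx * dfx / (2.0 * dfx * dfx - fx * d2fx)
    return x

# illustrative example: cube root of 2 via f(x) = x^3 - 2
root = halley(lambda x: x**3 - 2.0,
              lambda x: 3.0 * x**2,
              lambda x: 6.0 * x,
              1.0)
[/code]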
 
  • #6
The following method is often used when evaluating the function is very time consuming. You first find a few starting values. E.g., if y(x) is your function, you find two points x1 and x2 such that y changes sign, and then you do, say, two bisection steps, ending up with four points. Then you do an interpolation using the four points, but not of y as a function of x; instead you interpolate x as a function of y. In that interpolating polynomial x(y) you simply put y equal to zero and obtain a very accurate estimate of the zero of y.

You then throw away the first point and construct a new interpolating polynomial using the last four points, insert y = 0 in there and then iterate this process.
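A minimal sketch of this sliding-window inverse interpolation, under the assumption that the caller already has four starting points bracketing a simple root where y(x) is monotone (the names and the example are illustrative); the Lagrange form of x(y) is evaluated at y = 0:

[code]
def inverse_interp_root(xs, ys):
    """Interpolate x as a polynomial in y through the given points
    and evaluate it at y = 0 to estimate the root of y(x)."""
    estimate = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = xi
        for j, yj in enumerate(ys):
            if j != i:
                term *= (0.0 - yj) / (yi - yj)
        estimate += term
    return estimate

def root_by_inverse_interpolation(f, xs, tol=1e-12, max_iter=20):
    """Repeatedly interpolate x(y), evaluate at y = 0, and slide the
    window of points forward, as described in the post above."""
    xs = list(xs)
    ys = [f(x) for x in xs]
    for _ in range(max_iter):
        x_new = inverse_interp_root(xs, ys)
        y_new = f(x_new)
        if abs(y_new) < tol:
            return x_new
        # throw away the oldest point, keep the latest four
        xs = xs[1:] + [x_new]
        ys = ys[1:] + [y_new]
    return xs[-1]

# illustrative example: root of x^3 - 2 from four bracketing points
f = lambda x: x**3 - 2.0
print(root_by_inverse_interpolation(f, [1.0, 1.2, 1.3, 1.5]))  # ~ 2**(1/3)
[/code]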
 

Related to Is There a More Efficient Version of Newton's Method Using Higher Derivatives?

What is Newton's method?

Newton's method is an iterative algorithm used to approximate the roots of a function. It is based on the idea of using tangent lines to approximate the roots of a function.
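For reference, a minimal sketch of the basic tangent-line iteration (illustrative names; it assumes f and f' are supplied and the derivative does not vanish near the root):

[code]
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Basic Newton iteration: x <- x - f(x) / f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

# illustrative example: square root of 2 via f(x) = x^2 - 2
print(newton(lambda x: x**2 - 2.0, lambda x: 2.0 * x, 1.0))
[/code]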

What are the variations of Newton's method?

Some variations of Newton's method include the modified Newton's method, the quasi-Newton method, and the damped Newton's method. These variations aim to improve the convergence and stability of the original Newton's method.

What is the modified Newton's method?

The modified Newton's method reduces the per-iteration cost by reusing a previously computed derivative (for example, keeping f'(x0) fixed) instead of recomputing it at every step. It is related to, but distinct from, the secant method, which replaces the tangent line with a secant line through the last two iterates and so avoids derivative evaluations entirely.
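A rough sketch of the secant variant mentioned above (illustrative, not taken from the thread):

[code]
import math

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: replace f'(x) in Newton's step with the slope
    of the secant line through the last two iterates."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1

# illustrative example: root of cos(x) - x starting from 0 and 1
print(secant(lambda x: math.cos(x) - x, 0.0, 1.0))
[/code]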

What is the quasi-Newton method?

Quasi-Newton methods are generalizations of Newton's method that replace the exact derivative information with an approximation that is updated as the iteration proceeds: the Jacobian when solving systems of equations (e.g., Broyden's method), or the Hessian when minimizing a function (e.g., BFGS). This can be much cheaper for problems with many variables.
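As a rough illustration of the quasi-Newton idea for a system of equations, here is a sketch of Broyden's method (the names, the initial Jacobian guess, and the example system are assumptions chosen for illustration):

[code]
import numpy as np

def broyden(F, x0, J0, tol=1e-10, max_iter=100):
    """Broyden's method: start from an initial Jacobian guess J0 and
    update it from the observed steps instead of recomputing it."""
    x = np.asarray(x0, dtype=float)
    J = np.asarray(J0, dtype=float)
    fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(fx) < tol:
            break
        dx = np.linalg.solve(J, -fx)
        x_new = x + dx
        fx_new = F(x_new)
        df = fx_new - fx
        # rank-one update so that the new J maps dx onto df
        J = J + np.outer(df - J @ dx, dx) / (dx @ dx)
        x, fx = x_new, fx_new
    return x

# illustrative example: the same circle/line system as earlier,
# seeded with the exact Jacobian at the starting point (1, 2)
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]])
J0 = np.array([[2.0, 4.0], [-1.0, 1.0]])
sol = broyden(F, [1.0, 2.0], J0)
[/code]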

What is the damped Newton's method?

The damped Newton's method scales each Newton step by a damping factor, often chosen by a line search, in order to improve the stability of the algorithm. This can prevent the overshooting and oscillation that may occur when the full Newton step is always taken.
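A minimal sketch of one common damping strategy, simple backtracking on |f| (the names and the example are illustrative assumptions):

[code]
import math

def damped_newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton iteration with backtracking: shrink the step until |f|
    actually decreases, to avoid overshooting."""
    x = x0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            break
        step = -fx / df(x)
        t = 1.0
        # halve the damping factor until the residual shrinks
        while abs(f(x + t * step)) >= abs(fx) and t > 1e-8:
            t *= 0.5
        x += t * step
        fx = f(x)
    return x

# illustrative example: f = arctan, where the undamped Newton step
# overshoots badly for starting points far from the root at 0
print(damped_newton(math.atan, lambda x: 1.0 / (1.0 + x * x), 3.0))
[/code]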
