Newton's Method for Optimization

In summary, the thread discusses the efficiency of Newton's method in high dimensions and whether it converges quickly to a minimum, maximum, or saddle point. The original poster cannot get the norm of their gradient below 12-16 and is considering the Fletcher-Reeves method instead. The replies note the challenges of multivariable optimization and the need for a good understanding of the function being optimized.
  • #1
brydustin
Just curious if Newton's method in high dimensions should always quickly converge to a min/max or saddle point. I can't seem to get the norm of my gradient below 12-16; so it's not "diverging", but it's not converging either. I want to avoid saddle points, so I'm using the Fletcher-Reeves method, but I figure that if I test it with Newton-Raphson it should at least converge to a saddle point quickly, right? (Assuming my initial starting point is "good" in some sense.)

thanks all
 
  • #2
Could you please give an example? What is the function for which you cannot get the norm of the gradient below 12-16?

Even in one dimension it is possible to pick a function and an initial value where the iterates just alternate between two x-values, but a slight change in the initial value will correct that.
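To make that alternation concrete (this is a standard textbook example, not the OP's function, which we haven't seen): Newton-Raphson root-finding on f(x) = x³ - 2x + 2 started at x = 0 bounces between 0 and 1 forever.

```python
def newton_root(f, fprime, x, steps):
    """Plain Newton-Raphson iteration for a root of f; returns the iterate history."""
    history = [x]
    for _ in range(steps):
        x = x - f(x) / fprime(x)
        history.append(x)
    return history

# Textbook 2-cycle example: f(x) = x^3 - 2x + 2, f'(x) = 3x^2 - 2.
# From x = 0: step to 0 - 2/(-2) = 1, then 1 - 1/1 = 0, and so on.
f = lambda x: x**3 - 2 * x + 2
fp = lambda x: 3 * x**2 - 2

print(newton_root(f, fp, 0.0, 6))  # alternates 0.0, 1.0, 0.0, 1.0, ...
```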
 
  • #3
The basic problem for any multivariable optimisation algorithm is the fact that (unlike the 1-D case) the optimum is usually at the bottom of a "curved valley" in N dimensions. Any method that only looks at the local gradient at a point tends to follow the steepest gradient, and the route to the optimum tends to be many short steps that "bounce off the sides" of the valley rather than "following the curve" along the bottom. A classic example of this for 2 variables (where it is fairly easy to draw pictures to see what is happening) is "Rosenbrock's banana function" (see Google).

To do multi-variable optimization efficiently, you usually need to know a lot about the function you are trying to optimize.
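A quick sketch of the "bouncing off the valley walls" behaviour described above (the step size and iteration count are illustrative choices, not tuned values): fixed-step steepest descent on Rosenbrock's banana function makes thousands of tiny steps and is still nowhere near the minimum at (1, 1).

```python
def rosenbrock(x, y):
    """Rosenbrock's banana function; global minimum at (1, 1)."""
    return (1 - x)**2 + 100 * (y - x**2)**2

def rosenbrock_grad(x, y):
    dfdx = -2 * (1 - x) - 400 * x * (y - x**2)
    dfdy = 200 * (y - x**2)
    return dfdx, dfdy

# Fixed-step steepest descent from the classic start (-1.2, 1.0).
x, y = -1.2, 1.0
step = 5e-4  # illustrative: much larger diverges, smaller is even slower
for _ in range(2000):
    gx, gy = rosenbrock_grad(x, y)
    x, y = x - step * gx, y - step * gy

# The function value drops, but after 2000 iterations we are still
# crawling along the curved valley, far from (1, 1).
print(x, y, rosenbrock(x, y))
```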
 

Related to Newton's Method for Optimization

What is Newton's Method for Optimization?

Newton's method for optimization is an iterative algorithm used to find the minimum or maximum value of a function. It is closely related to the Newton-Raphson root-finding method (in fact, it applies Newton-Raphson to the derivative of the function) and is commonly used in calculus and optimization problems.

How does Newton's Method for Optimization work?

The method starts with an initial guess for the optimal value and then uses a series of iterations to find a more accurate estimate. It involves finding the roots of the derivative (or gradient) of the function, which are the stationary points where the slope is zero. By repeatedly updating the current estimate, the algorithm converges to such a point.
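A minimal one-dimensional sketch of those iterations (the test function here is a made-up example): to minimise f, apply the Newton step x ← x − f′(x)/f″(x), which is exactly Newton-Raphson run on the derivative.

```python
def newton_minimize(fprime, fsecond, x, steps):
    """Newton's method for optimization: Newton-Raphson on the derivative."""
    for _ in range(steps):
        x = x - fprime(x) / fsecond(x)
    return x

# Illustrative example: f(x) = x - ln(x), minimised at x = 1.
fp = lambda x: 1 - 1 / x    # f'(x), zero at the minimiser
fpp = lambda x: 1 / x**2    # f''(x) > 0, so the critical point is a minimum

print(newton_minimize(fp, fpp, 0.5, 6))  # converges to 1.0
```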

What are the advantages of using Newton's Method for Optimization?

One of the main advantages of Newton's method is its speed of convergence: near a solution it converges quadratically, often reaching high accuracy in just a few iterations, which makes it more efficient than first-order methods. It is also versatile and can be applied to a wide range of twice-differentiable functions.
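To see that speed concretely (f(x) = cosh(x) is just a convenient convex test function, minimised at x = 0): the Newton step for it simplifies to x − tanh(x), and each iteration gains far more accuracy than the last.

```python
import math

# Minimise f(x) = cosh(x) with Newton's method:
# x <- x - f'(x)/f''(x) = x - sinh(x)/cosh(x) = x - tanh(x).
x = 1.0
errors = []
for _ in range(4):
    x = x - math.tanh(x)
    errors.append(abs(x))  # distance from the true minimiser x* = 0

print(errors)  # each error is dramatically smaller than the last
```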

What are the limitations of Newton's Method for Optimization?

Newton's method may fail to converge in some cases, such as when the initial guess is far from the optimum or when the function has multiple local minima or maxima; because it only seeks stationary points, it can also land on a saddle point rather than a minimum. It also requires the function to be twice differentiable and the Hessian to be computed and solved at each step, which can be expensive in high dimensions.
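One failure mode worth showing explicitly, since it is exactly the concern raised in the thread: Newton's method only seeks points where the gradient is zero, so on f(x, y) = x² − y² a pure Newton step jumps straight to the saddle at the origin. (An illustrative sketch with a hand-coded diagonal Hessian, not production code.)

```python
def newton_step(x, y):
    """One full Newton step for f(x, y) = x^2 - y^2.

    Gradient: (2x, -2y). Hessian: diag(2, -2), which is indefinite
    (one positive and one negative eigenvalue), the hallmark of a saddle.
    """
    gx, gy = 2 * x, -2 * y
    hxx, hyy = 2.0, -2.0  # diagonal Hessian, trivially "inverted"
    return x - gx / hxx, y - gy / hyy

# From any starting point, a single step lands exactly on the saddle:
print(newton_step(3.0, -7.5))  # -> (0.0, 0.0)
```

This is why saddle-avoiding variants (e.g. modified or trust-region Newton methods, or the conjugate-gradient approach mentioned in the thread) check or repair the Hessian's curvature before taking the step.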

How is Newton's Method for Optimization used in real-world applications?

Newton's method has many real-world applications, including in engineering, economics, and physics. It is commonly used in the design and optimization of structures, processes, and systems. It is also used in finance to optimize investment portfolios and in machine learning for training models.
