Abstract

There is growing interest in applying nonlinear optimization techniques to a broad range of real-world problems, driven by rapid advances in novel technologies and renewed attention to areas such as sustainable development. While many algorithms for nonlinear optimization have been developed, they differ widely in how they converge to a solution. Consequently, matching the function to be optimized with the right optimization algorithm is essential. To further explore this relationship, we studied two nonlinear optimization methods: the Nelder-Mead simplex method, which relies only on function evaluations, and the quasi-Newton method, which requires estimation of derivatives. These algorithms were used to find the global minima of the Rosenbrock, Booth, and Matyas functions from multiple starting points. The convergence paths for each starting point, for both optimization methods and all three functions, were visualized on contour plots. Although convergence to the global minimum was observed in every instance, our analysis indicated that the quasi-Newton method was consistently more efficient, requiring fewer iterations than the simplex method. This difference was especially pronounced for the Booth and Matyas functions, which required roughly 10-fold and 20-fold fewer iterations, respectively. Our analysis reinforces the need to carefully match the properties of the function being minimized with the performance characteristics of the optimization approach to obtain fast convergence to the global minimum.
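The paper's own code is not reproduced here, but the comparison it describes can be sketched with SciPy's standard implementations of both methods. The sketch below runs Nelder-Mead (derivative-free) and BFGS (a quasi-Newton method, with the gradient estimated by finite differences) on the three test functions and reports iteration counts; the starting point is illustrative, not one of the starting points used in the study.

```python
# Minimal sketch, assuming SciPy's Nelder-Mead and BFGS implementations
# stand in for the methods compared in the paper.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(p):
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2      # global minimum at (1, 1)

def booth(p):
    x, y = p
    return (x + 2*y - 7)**2 + (2*x + y - 5)**2   # global minimum at (1, 3)

def matyas(p):
    x, y = p
    return 0.26 * (x**2 + y**2) - 0.48 * x * y   # global minimum at (0, 0)

start = np.array([-3.0, 4.0])  # illustrative starting point, not from the paper

for name, f in [("Rosenbrock", rosenbrock), ("Booth", booth), ("Matyas", matyas)]:
    simplex = minimize(f, start, method="Nelder-Mead")  # function evaluations only
    qn = minimize(f, start, method="BFGS")              # quasi-Newton, finite-difference gradient
    print(f"{name:10s}  Nelder-Mead: {simplex.nit:4d} iterations   "
          f"BFGS: {qn.nit:4d} iterations")
```

On smooth quadratic-like surfaces such as Booth and Matyas, BFGS typically converges in a handful of iterations, consistent with the roughly 10-fold and 20-fold gaps the abstract reports.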
