Abstract
Many non-linear programming algorithms employ a univariate subprocedure to determine the step length at each multivariate iteration. In recent years much work has been directed toward the development of algorithms which will exhibit favorable convergence properties on well-behaved functions without requiring that the univariate algorithm perform a sequence of one-dimensional minimizations. In this paper a direct search method (the golden section search) is modified to search for acceptable rather than minimizing step lengths and then used as the univariate subprocedure for a generalized conjugate gradient algorithm. The resulting multivariate minimization method is tested on standard unconstrained test functions and a constrained industrial problem. The new method is found to be relatively insensitive to tuning parameters (insofar as success or failure is concerned). A comparison of the golden section acceptable-point search (GSAP) with other popular acceptable-point methods indicates that GSAP is a superior strategy for use with conjugate-directions-type algorithms and is also suitable for use with quasi-Newton methods. The comparisons are based on the equivalent function evaluations required to minimize multivariate test functions.
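The idea of an acceptable-point line search, as opposed to an exact one-dimensional minimization, can be illustrated with a minimal sketch. The interval-reduction scheme below is the standard golden section search; the acceptance test used is the Armijo sufficient-decrease condition, which is an assumption for illustration, since the abstract does not state the paper's exact acceptability criterion. The function name `gsap` and all parameter defaults are likewise hypothetical.

```python
def gsap(phi, dphi0, a=0.0, b=1.0, c1=1e-4, tol=1e-6, max_iter=50):
    """Golden-section search over [a, b] that returns the first step
    length t found to satisfy a sufficient-decrease test
        phi(t) <= phi(0) + c1 * t * dphi0
    instead of iterating to a full one-dimensional minimum.

    phi   : univariate function phi(t) = f(x + t*d) along direction d
    dphi0 : directional derivative phi'(0) (assumed negative, i.e.
            d is a descent direction)
    """
    golden = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    phi0 = phi(0.0)

    def acceptable(t):
        # Armijo sufficient-decrease test (assumed criterion).
        return phi(t) <= phi0 + c1 * t * dphi0

    # Two interior probe points of the golden-section scheme.
    x1 = b - golden * (b - a)
    x2 = a + golden * (b - a)
    f1, f2 = phi(x1), phi(x2)
    for _ in range(max_iter):
        # Stop as soon as an interior point is acceptable,
        # rather than continuing to shrink the interval.
        if acceptable(x1):
            return x1
        if acceptable(x2):
            return x2
        # Standard golden-section interval reduction, reusing
        # one function value per iteration.
        if f1 < f2:
            b, x2, f2 = x2, x1, f1
            x1 = b - golden * (b - a)
            f1 = phi(x1)
        else:
            a, x1, f1 = x1, x2, f2
            x2 = a + golden * (b - a)
            f2 = phi(x2)
        if b - a < tol:
            break
    return 0.5 * (a + b)
```

Because the search terminates at the first acceptable point, it typically spends far fewer function evaluations per multivariate iteration than an exact golden-section minimization, which is the trade-off the abstract describes.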