Abstract

In large-scale optimization, the standard BFGS method is impractical because of its memory requirements. The so-called limited-memory BFGS (L-BFGS) method is an adaptation of BFGS to large-scale settings. However, the standard BFGS method, and therefore the standard L-BFGS method, uses only gradient information and neglects function values. In this paper, we propose a new regularized L-BFGS method for solving large-scale unconstrained optimization problems in which additional information from function and gradient values is employed to approximate the curvature of the objective function. The proposed method utilizes a class of modified quasi-Newton equations in order to achieve higher-order accuracy in approximating the second-order curvature of the objective function. Under standard assumptions, we establish the global convergence of the new method. To obtain an efficient method for finding global minima of a continuously differentiable function, we also propose a hybrid algorithm that combines a genetic algorithm (GA) with the new regularized L-BFGS method. This combination drives the iterates toward a stationary point of the objective function with a higher chance of being a global minimizer. Numerical results demonstrate the efficiency and robustness of the proposed regularized L-BFGS method and its GA-hybridized variant in practice.
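The abstract does not reproduce the particular modified quasi-Newton equation the method builds on. For orientation only, here is a minimal sketch of the idea, assuming one widely used member of this class (a Zhang-Deng-Chen-type modified secant equation). The standard secant equation

\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k,
\]

satisfies \( s_k^\top y_k = s_k^\top \nabla^2 f(x_{k+1})\, s_k + O(\|s_k\|^3) \), whereas the modified equation

\[
B_{k+1} s_k = \bar{y}_k, \qquad \bar{y}_k = y_k + \frac{\vartheta_k}{s_k^\top s_k}\, s_k, \qquad \vartheta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^\top s_k,
\]

uses two function values in addition to the gradients and satisfies \( s_k^\top \bar{y}_k = s_k^\top \nabla^2 f(x_{k+1})\, s_k + O(\|s_k\|^4) \), one order more accurate along \( s_k \).

To show where such a modified difference vector would enter a limited-memory scheme, the following is a hypothetical Python sketch of the standard L-BFGS two-loop recursion fed with modified pairs. The function names are illustrative, and neither the paper's regularization nor its specific choice of modified equation is reproduced here:

```python
import numpy as np

def modified_y(s, y, f_old, f_new, g_old, g_new):
    """Zhang-Deng-Chen-type modified difference vector (illustrative choice):
    y_bar = y + (theta / s^T s) * s, with
    theta = 6 (f_k - f_{k+1}) + 3 (g_k + g_{k+1})^T s."""
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s

def lbfgs_direction(g, S, Y):
    """Standard L-BFGS two-loop recursion: returns -H g, where H is the
    implicit inverse-Hessian approximation built from the stored pairs
    (S[i], Y[i]), oldest first. Passing vectors built by modified_y as
    the Y[i] switches the scheme to the modified quasi-Newton equation."""
    if not S:
        return -g                      # no curvature pairs yet: steepest descent
    q = g.astype(float)                # astype makes a working copy
    rhos = [1.0 / (y @ s) for s, y in zip(S, Y)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = (s^T y / y^T y) I from the most recent pair.
    s, y = S[-1], Y[-1]
    r = ((s @ y) / (y @ y)) * q
    # Second loop: oldest pair to newest, reusing the stored alphas.
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r
```

In practice one would also safeguard the curvature condition, for example by skipping or damping pairs with \( s_k^\top \bar{y}_k \le 0 \), so that the implicit Hessian approximation stays positive definite.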


