Abstract

In this work, a new advanced active set limited memory BFGS (Broyden–Fletcher–Goldfarb–Shanno) algorithm, called AA-L-BFGS, is proposed for efficiently training weight-constrained neural networks. The proposed algorithm possesses the significant property of approximating the curvature of the error function with high-order accuracy by utilizing the theoretically advanced secant condition proposed by Livieris and Pintelas (Appl Math Comput 221:491–502, 2013). Moreover, the global convergence of the proposed algorithm is established provided that the line search satisfies the modified Armijo condition. The presented numerical experiments illustrate the efficiency of the proposed AA-L-BFGS, providing empirical evidence that it significantly accelerates the convergence of the training process.
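For context, the sketch below illustrates the general ingredients such a method combines: a limited memory BFGS two-loop recursion for the search direction, projection of the weights onto their box constraints (a simple stand-in for active set handling), and an Armijo-type backtracking line search. All function names, parameters, and constants are illustrative assumptions; the sketch uses a plain secant pair and a plain Armijo rule rather than the paper's advanced secant condition and modified Armijo condition, so it is not the authors' AA-L-BFGS.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns the quasi-Newton direction -H*grad."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: most recent curvature pair first
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian scaling gamma = s^T y / y^T y (1 if no pairs stored yet)
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1]) if s_list else 1.0
    r = gamma * q
    # Second loop: oldest curvature pair first
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r

def train_box_constrained(f, grad_f, w0, lower, upper, m=5, max_iter=200, tol=1e-6):
    """Projected L-BFGS sketch for minimizing f(w) subject to lower <= w <= upper."""
    w = np.clip(w0, lower, upper)
    s_list, y_list = [], []
    g = grad_f(w)
    for _ in range(max_iter):
        # Stop when the projected gradient is small
        if np.linalg.norm(np.clip(w - g, lower, upper) - w) < tol:
            break
        d = lbfgs_direction(g, s_list, y_list)
        # Backtracking on the projected path with a plain Armijo rule
        # (not the modified Armijo condition used in the paper)
        t, c, fw = 1.0, 1e-4, f(w)
        while True:
            w_new = np.clip(w + t * d, lower, upper)
            if f(w_new) <= fw + c * (g @ (w_new - w)) or t < 1e-12:
                break
            t *= 0.5
        g_new = grad_f(w_new)
        s, y = w_new - w, g_new - g
        # Keep only curvature pairs that preserve positive definiteness
        if s @ y > 1e-10:
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0); y_list.pop(0)
        w, g = w_new, g_new
    return w
```

In a weight-constrained network, f would be the training error as a function of the flattened weight vector and lower/upper the elementwise weight bounds; the limited memory m keeps storage linear in the number of weights.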
