Abstract

In machine learning, most models can be cast as unconstrained optimization problems, so solving unconstrained optimization for different classes of objective functions remains an active research topic. In this paper, we study a class of unconstrained optimization problems whose objective function is three times differentiable with a Lipschitz continuous third-order derivative. To handle such problems, we propose an accelerated regularized Chebyshev–Halley method based on the Accelerated Hybrid Proximal Extragradient (A-HPE) framework. We prove that the convergence complexity of the proposed method is $\mathcal{O}(\epsilon^{-1/5})$, which matches the lower iteration complexity bound for third-order tensor methods. Numerical experiments on objective functions arising in machine learning demonstrate the promising performance of the proposed method.
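
The abstract does not spell out the update rule, so as a point of reference the sketch below implements the classical (unregularized, unaccelerated) Chebyshev–Halley family for one-dimensional minimization, i.e., root-finding on $f'$ using $f'$, $f''$, and $f'''$. This is only the textbook iteration underlying the paper's method, not the authors' accelerated regularized variant; the function names, the parameter `alpha`, and the test objective are illustrative assumptions.

```python
import math

def chebyshev_halley_min(f1, f2, f3, x0, alpha=0.5, tol=1e-12, max_iter=50):
    """Classical Chebyshev-Halley family applied to minimization:
    find a root of f'(x) using f', f'', f'''.

    alpha = 0   -> Chebyshev's method
    alpha = 1/2 -> Halley's method
    alpha = 1   -> super-Halley method
    """
    x = x0
    for k in range(max_iter):
        g, dg, d2g = f1(x), f2(x), f3(x)   # f'(x), f''(x), f'''(x)
        if abs(g) < tol:                   # first-order stationarity reached
            return x, k
        u = g / dg                         # Newton step on f'
        L = g * d2g / dg**2                # curvature ratio (degree of nonlinearity)
        x = x - (1.0 + 0.5 * L / (1.0 - alpha * L)) * u
    return x, max_iter

# Illustrative objective: f(x) = exp(x) - 2x, with unique minimizer x* = ln 2.
f1 = lambda x: math.exp(x) - 2.0   # f'
f2 = lambda x: math.exp(x)         # f''
f3 = lambda x: math.exp(x)         # f'''

x_star, iters = chebyshev_halley_min(f1, f2, f3, x0=0.0)
print(f"x* = {x_star:.12f} (ln 2 = {math.log(2):.12f}) in {iters} iterations")
```

The paper's contribution, as summarized above, layers regularization and A-HPE-style acceleration on top of steps of this third-order type to obtain the $\mathcal{O}(\epsilon^{-1/5})$ complexity guarantee.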
