Abstract

In the development of algorithms for convex optimization problems, symmetry plays an important role in approximating solutions to various real-world problems. In this paper, based on a fixed-point algorithm with an inertial technique, we propose and study a new accelerated algorithm for solving a convex bilevel optimization problem in which the inner level is the minimization of the sum of smooth and nonsmooth convex functions and the outer level is the minimization of a smooth, strongly convex function over the solution set of the inner level. We then prove a strong convergence theorem for the proposed algorithm under suitable conditions. As an application, we apply the proposed algorithm as a machine learning algorithm for solving data classification problems. We also present numerical experiments showing that the proposed algorithm outperforms five other algorithms from the literature, namely BiG-SAM, iBiG-SAM, aiBiG-SAM, miBiG-SAM, and amiBiG-SAM.
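
To make the problem structure concrete, the following is a minimal sketch of a BiG-SAM-style iteration with an inertial (extrapolation) step on a toy instance; the inner objective, outer objective, step sizes, inertial parameter, and mixing sequence below are illustrative assumptions, not the authors' exact algorithm or parameter rules.

```python
import numpy as np

# Toy bilevel instance (assumed for illustration):
#   inner level:  min_x  f(x) + g(x)  with
#     f(x) = 0.5 * ||A x - b||^2   (smooth convex)
#     g(x) = lam * ||x||_1         (nonsmooth convex, prox = soft-thresholding)
#   outer level:  min  omega(x) = 0.5 * ||x||^2  over the inner solution set.

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1

L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
c = 1.0 / L                            # proximal-gradient step size for the inner level
s = 0.5                                # gradient step size for the outer level

def grad_f(x):
    return A.T @ (A @ x - b)

def prox_g(x, t):
    # Proximal operator of t * lam * ||.||_1 (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - lam * t, 0.0)

def grad_omega(x):
    return x                            # gradient of 0.5 * ||x||^2

x_prev = np.zeros(20)
x = np.zeros(20)
for k in range(1, 2001):
    theta = (k - 1) / (k + 2)           # illustrative inertial parameter
    alpha = 1.0 / (k + 1)               # mixing sequence alpha_k -> 0

    w = x + theta * (x - x_prev)        # inertial extrapolation
    y = prox_g(w - c * grad_f(w), c)    # forward-backward step on the inner problem
    z = w - s * grad_omega(w)           # gradient step on the outer objective
    x_prev, x = x, alpha * z + (1.0 - alpha) * y   # BiG-SAM-style convex combination

print("inner objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
print("outer objective:", 0.5 * np.linalg.norm(x) ** 2)
```

As the mixing weight alpha_k vanishes, the iteration is driven by the inner proximal-gradient step, while the outer gradient step steers the sequence toward the inner solution that minimizes the outer objective; the inertial term w is the acceleration device emphasized in the abstract.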
