Abstract
We consider a class of nonconvex regularized optimization problems that appear frequently in machine learning and data processing. Exploiting the structure of these problems, the iteratively reweighted algorithm was developed and applied to consensus optimization. In this paper, we propose accelerating this scheme by adding an inertial term at each iteration. The proposed algorithms inherit the advantages of classical decentralized algorithms: they can be implemented over a connected network in which agents communicate only with their neighbors and perform local computations. We also apply the diminishing-stepsize technique to the iteratively reweighted algorithm and consider its acceleration. In special cases, our algorithms reduce to existing decentralized schemes and also yield novel ones. Mathematically, we prove convergence of both algorithms under several assumptions on the objective functions. Under the Kurdyka–Łojasiewicz property, convergence rates are derived for the constant-stepsize case. Numerical results demonstrate the efficiency of the algorithms.
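To make the structure described above concrete, the following is a minimal sketch of one synchronous round of a decentralized inertial iteratively reweighted iteration. It is an illustration under assumptions, not the paper's exact scheme: the regularizer is taken to be an ℓ_p quasi-norm handled via a reweighted ℓ1 surrogate, the network is encoded by a doubly stochastic mixing matrix `W`, and the function and parameter names (`reweight`, `inertial_reweighted_step`, `alpha`, `beta`, `lam`) are hypothetical.

```python
import numpy as np

def reweight(x, p=0.5, eps=1e-6):
    # Weights of the reweighted-l1 surrogate for the l_p quasi-norm:
    # w_i = p / (|x_i| + eps)^(1 - p)  (eps guards against division by zero).
    return p / (np.abs(x) + eps) ** (1 - p)

def inertial_reweighted_step(X, X_prev, W, grads, alpha=0.1, beta=0.3, lam=0.05):
    """One synchronous round over all agents (illustrative sketch).

    X, X_prev : (n_agents, d) current and previous local iterates
    W         : (n_agents, n_agents) doubly stochastic mixing matrix
    grads     : callable mapping iterates -> (n_agents, d) local gradients
    alpha     : stepsize (could be made diminishing across iterations)
    beta      : inertial (extrapolation) parameter
    lam       : regularization strength
    """
    # Inertial extrapolation: momentum from the previous iterate.
    Y = X + beta * (X - X_prev)
    # Consensus (mixing) step: each agent averages with its neighbors.
    Z = W @ Y
    # Local gradient step on the smooth part of the objective.
    Z = Z - alpha * grads(Y)
    # Proximal step on the reweighted l1 surrogate: soft-thresholding
    # with per-coordinate thresholds alpha * lam * w.
    w = reweight(X)
    thresh = alpha * lam * w
    X_new = np.sign(Z) * np.maximum(np.abs(Z) - thresh, 0.0)
    return X_new, X
```

With a constant `alpha` this mirrors the constant-stepsize setting; replacing `alpha` by a decaying sequence gives the diminishing-stepsize variant, and setting `beta = 0` recovers the non-inertial reweighted scheme.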
More From: IEEE Transactions on Knowledge and Data Engineering