Abstract
We propose a novel steplength selection rule in proximal gradient methods for minimizing the sum of a differentiable function and an ℓ1-norm penalty term. The proposed rule modifies one of the classical Barzilai–Borwein steplengths, extending analogous results obtained in the context of gradient projection methods for constrained optimization. We analyse the spectral properties of the Barzilai–Borwein-like steplength when the differentiable part is quadratic, showing that its reciprocal lies in the spectrum of the submatrix of the Hessian corresponding to both the nonzero and the nonoptimal zero components of the current iterate, which allows for acceleration effects once the optimal zero components start to be identified. Furthermore, we insert the modified rule into a proximal gradient method with a nonmonotone line search, for which we prove global convergence towards a stationary point. Numerical experiments show the ability of the proposed rule to sweep the spectrum of the reduced Hessian on a series of ℓ1-regularized quadratic problems, as well as its effectiveness in recovering the ground truth in an ℓ1-regularized least squares problem arising in image restoration.
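To make the setting concrete, the sketch below shows a plain proximal gradient iteration for min f(x) + λ‖x‖₁ in which the steplength is updated by the classical BB1 rule α = sᵀs / sᵀy. This is a minimal illustration of the algorithmic framework the abstract refers to, not the paper's method: the modified Barzilai–Borwein rule and the nonmonotone line search analysed in the paper are omitted, and all names (`prox_grad_bb`, `soft_threshold`, the safeguards `alpha_min`/`alpha_max`) are ours.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(grad_f, x0, lam, n_iter=200, alpha0=1.0,
                 alpha_min=1e-8, alpha_max=1e8):
    """Proximal gradient iteration for min_x f(x) + lam * ||x||_1,
    with the classical BB1 steplength as a stand-in for the paper's
    modified rule; the nonmonotone line search is omitted."""
    x = x0.copy()
    g = grad_f(x)
    alpha = alpha0
    for _ in range(n_iter):
        # Forward (gradient) step followed by the l1 proximal step.
        x_new = soft_threshold(x - alpha * g, alpha * lam)
        g_new = grad_f(x_new)
        s = x_new - x            # iterate difference
        y = g_new - g            # gradient difference
        sy = s @ y
        if sy > 0:               # BB1 steplength: <s, s> / <s, y>
            alpha = np.clip((s @ s) / sy, alpha_min, alpha_max)
        x, g = x_new, g_new
    return x

# Hypothetical quadratic-plus-l1 test problem: f(x) = 0.5 * ||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
grad = lambda x: A.T @ (A @ x - b)
x_hat = prox_grad_bb(grad, np.zeros(100), lam=0.1)
print("nonzero components:", np.count_nonzero(x_hat))
```

In the quadratic case f(x) = ½‖Ax − b‖², the reciprocal of the BB1 steplength is a Rayleigh quotient of the Hessian AᵀA; the abstract's contribution is a modification of such a rule so that this reciprocal lies in the spectrum of the Hessian submatrix restricted to the nonzero and nonoptimal zero components of the current iterate.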