Abstract

FaRSA is a new method for minimizing the sum of a differentiable convex function and an ℓ1-norm regularizer. The main features of the method include: (i) an evolving set of indices corresponding to variables that are predicted to be nonzero at a solution; (ii) subproblems that only need to be solved in a reduced space to lessen per-iteration computational costs; (iii) conditions that determine how accurately each subproblem must be solved, which allow conjugate gradient or coordinate descent techniques to be employed; (iv) a computationally practical condition that dictates when the subspace explored by the current subproblem should be updated; and (v) a reduced proximal gradient step that ensures a sufficient decrease in the objective function when it is decided that the index set that holds the nonzero variables should be expanded. We proved global convergence of the method and demonstrated its performance on a set of model prediction problems with a MATLAB implementation. Here, we introduce an enhanced subproblem termination condition that allows us to prove that the iterates converge locally at a superlinear rate. Moreover, we present the details of our publicly available C implementation along with extensive numerical comparisons to other state-of-the-art solvers.
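To make feature (v) concrete, the following is a minimal sketch in C, not taken from the FaRSA code, of what a reduced proximal gradient step for an ℓ1-regularized objective f(x) + λ‖x‖₁ could look like: a soft-thresholding update applied only to the variables in the current index set, leaving all other components unchanged. The function and parameter names (soft_threshold, reduced_prox_grad_step, idx, alpha, lambda) are illustrative assumptions, not the solver's API.

```c
#include <stddef.h>

/* Soft-thresholding operator: the proximal mapping of t*|.|
   (names and interface here are illustrative, not FaRSA's). */
static double soft_threshold(double v, double t) {
    if (v >  t) return v - t;
    if (v < -t) return v + t;
    return 0.0;
}

/* Hypothetical reduced proximal gradient step: update only the
   variables whose indices appear in idx[0..m-1].
   x      : current iterate (length n)
   grad   : gradient of the smooth term f at x (length n)
   idx, m : indices defining the reduced space, and their count
   alpha  : step size
   lambda : l1 regularization weight                              */
void reduced_prox_grad_step(double *x, const double *grad,
                            const size_t *idx, size_t m,
                            double alpha, double lambda) {
    for (size_t k = 0; k < m; ++k) {
        size_t i = idx[k];
        /* Standard proximal gradient update restricted to index i:
           x_i <- S_{alpha*lambda}(x_i - alpha * grad_i).          */
        x[i] = soft_threshold(x[i] - alpha * grad[i], alpha * lambda);
    }
}
```

Restricting the update to the index set is what keeps the per-iteration cost proportional to the size of the reduced space rather than the full problem dimension, in the spirit of feature (ii).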
