Abstract

We consider a class of nonsmooth convex optimization problems in which the objective is a convex differentiable function regularized by the sum of the group reproducing kernel norm and the $$\ell_1$$-norm of the problem variables. This class of problems has many applications in variable selection, such as the group LASSO and the sparse group LASSO. In this paper, we propose a proximal Landweber Newton method for this class of convex optimization problems and carry out convergence and computational complexity analyses for the method. Theoretical analysis and numerical results show that the proposed algorithm is promising.
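
For concreteness, a representative member of this problem class is the sparse group LASSO. With the standard Euclidean choice of the group norm (the symbols $$f$$, $$\lambda_1$$, $$\lambda_2$$, and the group partition $$\mathcal{G}$$ below are illustrative notation, not taken from the paper), such a problem can be written as

$$\min_{x\in\mathbb{R}^n}\; f(x) \;+\; \lambda_1 \sum_{g\in\mathcal{G}} \|x_g\|_2 \;+\; \lambda_2 \|x\|_1,$$

where $$f$$ is the convex differentiable loss (for example a least-squares term $$\tfrac{1}{2}\|Ax-b\|_2^2$$), $$x_g$$ denotes the subvector of $$x$$ indexed by group $$g$$, and $$\lambda_1,\lambda_2>0$$ are regularization weights.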
