Abstract

Proximal algorithms are a popular class of methods for handling sparsity structure in datasets, owing to their low per-iteration cost and fast convergence. In this paper, we consider the composite framework of minimizing the sum of two convex functions, one of which is smooth with a Lipschitz-continuous gradient, while the other may be non-smooth. The use of such non-smooth functions as regularizers for identifying complex sparsity structures in datasets has been an active research direction in the recent past. We present a convergence analysis for the extragradient-based fixed-point method with an inertial component, on which a recently proposed accelerated proximal extragradient algorithm is built. In addition, extending the application areas of this algorithm, we apply it to solve (i) the logistic regression problem with complex ℓ1-based penalties, namely the overlapping group lasso and fused lasso frameworks, and (ii) a recently proposed structurally regularized learning problem for representation selection, whose objective combines a reconstruction error with structured regularizers: a group sparsity regularizer, a diversity regularizer, and a locality-sensitivity regularizer. Through extensive experiments on several publicly available real-world datasets, we demonstrate the efficacy of the inertial extragradient methods for solving the extended lasso and representation selection problems in machine learning.
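To make the composite framework concrete, the following is a minimal illustrative sketch of an inertial proximal-gradient iteration (FISTA-style momentum) applied to the lasso problem, where the smooth part is a least-squares loss and the non-smooth part is an ℓ1 penalty whose proximal operator is soft-thresholding. This is a generic sketch under those assumptions, not the accelerated proximal extragradient algorithm analyzed in the paper; all function names and parameters here are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad_lasso(A, b, lam, n_iter=500):
    """Illustrative inertial proximal-gradient sketch for
    min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1  (hypothetical interface)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # inertial extrapolation step
        grad = A.T @ (A @ y - b)                      # gradient of the smooth part at y
        # forward (gradient) step followed by backward (proximal) step
        x_prev, x = x, soft_threshold(y - grad / L, lam / L)
        t = t_next
    return x
```

The inertial extrapolation point `y` is where the gradient is evaluated; replacing the single gradient step with an extragradient correction at that point yields the extragradient-style variants discussed in the paper.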
