Abstract

This paper proposes a new group-sparsity-inducing regularizer to approximate the ℓ2,0 pseudo-norm. The regularizer is nonconvex and can be seen as a linearly involved generalized Moreau enhancement of the ℓ2,1-norm. Moreover, the overall convexity of the corresponding group-sparsity-regularized least squares problem can be achieved. The model can handle general group configurations, such as weighted group sparse problems, and can be solved through a proximal splitting algorithm. Among the applications, considering that the bias of convex regularizers may lead to incorrect classification results, especially for unbalanced training sets, we apply the proposed model to the (weighted) group sparse classification problem. The proposed classifier can exploit the label, similarity, and locality information of samples, and it suppresses the bias of convex-regularizer-based classifiers. Experimental results demonstrate that the proposed classifier improves on the performance of convex ℓ2,1-regularizer-based methods, especially when the training data set is unbalanced. This paper enhances the potential applicability and effectiveness of nonconvex regularizers within the framework of convex optimization.

Highlights

Academic Editor: Sorin-Mihai Grad. Received: 10 September 2021. Accepted: October 2021. Published: October 2021. Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. Licensee MDPI, Basel, Switzerland.

  • In recent decades, sparse reconstruction has become an active topic in many areas, such as signal processing, statistics, and machine learning [1]

  • We show in Proposition 2 that the generalized Moreau enhancement (GME) of ‖·‖w,2,1, i.e., (‖·‖w,2,1)B (see (11)), can bridge the gap between ℓw,2,1 and ℓ2,0

  • As a relatively simple but typical scenario for the application of the proposed idea in this paper, we introduce the main idea of weighted group sparse classification (WGSC)


Summary

Introduction

Most studies in applications replace the nonconvex regularizer ℓ2,0 with its tightest convex envelope ℓ2,1 [16] (or its weighted variants), and the following regularized least squares problem, known as the Group LASSO [14], has been proposed:

\[
\underset{x \in \mathbb{R}^{n}}{\mathrm{minimize}} \ \|y - Ax\|_{2}^{2} + \lambda \sum_{i=1}^{g} w_{i} \|x_{i}\|_{2},
\]

where x_i denotes the i-th group of coefficients and w_i its nonnegative weight. We propose a generalized weighted group sparse estimation model based on the linearly involved generalized-Moreau-enhanced (LiGME) approach [24], which uses a nonconvex regularizer while maintaining the overall convexity of the optimization problem. A preliminary short version of this paper was presented at a conference [25]
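The Group LASSO problem above can be solved by proximal gradient iterations, since the proximal operator of the weighted ℓ2,1 norm is a closed-form group soft-threshold. The sketch below is a minimal illustration of this baseline (not the paper's LiGME algorithm); all function names and the simple ISTA loop are our own illustrative choices.

```python
import numpy as np

def group_soft_threshold(x, groups, tau, w):
    """Proximal operator of the weighted l2,1 norm:
    each group x_i is shrunk toward zero by tau * w_i."""
    out = np.zeros_like(x)
    for i, idx in enumerate(groups):
        norm = np.linalg.norm(x[idx])
        if norm > tau * w[i]:
            out[idx] = (1.0 - tau * w[i] / norm) * x[idx]
    return out

def group_lasso_ista(A, y, groups, w, lam, n_iter=500):
    """Proximal-gradient (ISTA) sketch for
    minimize_x ||y - A x||_2^2 + lam * sum_i w_i ||x_i||_2."""
    L = 2.0 * np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)   # gradient of the quadratic term
        x = group_soft_threshold(x - grad / L, groups, lam / L, w)
    return x
```

Note that the group soft-threshold either zeroes an entire group or shrinks its norm, which is exactly the group-sparsity behavior the ℓ2,1 penalty induces; the paper's nonconvex GME regularizer is designed to reduce the systematic shrinkage bias visible in this convex baseline.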

Preliminaries
LiGME Model for Group Sparse Estimation
Proposed Algorithm for Group-Sparsity Based Classification
Experiments
Initialization
Conclusions