Abstract

This paper deals with supervised classification and feature selection in the context of high-dimensional data. A classical approach leads to an optimization problem minimizing the within-cluster sum of squares (ℓ2 norm) with an ℓ1 penalty in order to promote sparsity. It has been known for decades that the ℓ1 norm is more robust to outliers than the ℓ2 norm. In this paper, we address this issue with a new proximal splitting method for the minimization of a criterion that uses the ℓ1 norm both for the constraint and the loss function. Since this ℓ1 criterion is convex but not gradient Lipschitz, we advocate the use of a Douglas-Rachford minimization. Taking advantage of the particular form of the cost, and using a change of variable, we derive a new, efficient, tailored primal Douglas-Rachford splitting algorithm that is very effective on high-dimensional datasets. We also provide an efficient classifier in the projected space based on medoid modeling. Experiments on two biological datasets and a computer vision dataset show that our method significantly improves the results obtained with a quadratic loss function.

Highlights

  • In this paper we consider methods in which feature selection is embedded in a classification process [23, 25]

  • Let X be the nonzero m × d matrix whose m rows are the samples x1, . . . , xm belonging to the d-dimensional space of features

  • Computational time appears to grow linearly with the number of features d and quadratically with the number of samples m for both algorithms

Summary

Introduction

In this paper we consider methods in which feature selection is embedded in a classification process [23, 25]. We propose to minimize an ℓ1 norm both on the penalty term and on the loss function. In this case, the criterion is convex but not gradient Lipschitz. We therefore propose to use the Douglas-Rachford splitting method to minimize our criterion; this splitting has been used successfully in signal processing [4, 9, 10, 11, 20, 39, 41].
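To illustrate the kind of iteration involved, a generic Douglas-Rachford scheme alternates the proximal operators of the two convex terms. The sketch below is not the paper's tailored primal algorithm (which relies on a change of variable and the data matrix X); it solves a toy ℓ1 + ℓ1 problem, min_x ‖x − b‖₁ + λ‖x‖₁, chosen only because both proximal maps reduce to closed-form soft-thresholdings. The problem instance, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward 0 by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_l1(b, lam, gamma=1.0, n_iter=200):
    """Douglas-Rachford splitting for min_x ||x - b||_1 + lam * ||x||_1.

    Toy instance: prox of f(x) = ||x - b||_1 is a soft-threshold shifted
    by b, and prox of g(x) = lam * ||x||_1 is a plain soft-threshold.
    """
    y = np.zeros_like(b)
    for _ in range(n_iter):
        x = b + soft_threshold(y - b, gamma)          # x_k = prox_{gamma f}(y_k)
        z = soft_threshold(2.0 * x - y, gamma * lam)  # prox_{gamma g}(2 x_k - y_k)
        y = y + z - x                                  # reflected update of y
    return x

b = np.array([2.0, -3.0, 0.2])
x_star = douglas_rachford_l1(b, lam=0.5)
# For lam < 1 the minimizer of |x_i - b_i| + lam * |x_i| is x_i = b_i.
```

The update never differentiates either term, which is exactly why Douglas-Rachford suits a convex criterion that is not gradient Lipschitz.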

A robust framework
An equivalent formulation
Douglas-Rachford splitting
Classification using medoid
Application to real datasets
Findings
Conclusion
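The abstract and the section list mention a classifier in the projected space based on medoid modeling. A minimal nearest-medoid sketch is given below; the choice of ℓ1 distance and the nearest-medoid assignment rule are assumptions for illustration, not the paper's exact classifier, and `Z` stands for the (hypothetical) projected training data.

```python
import numpy as np

def class_medoids(Z, labels):
    """For each class, return the medoid: the training sample minimizing
    the summed l1 distance to the other samples of the same class."""
    medoids = {}
    for c in np.unique(labels):
        Zc = Z[labels == c]
        d = np.abs(Zc[:, None, :] - Zc[None, :, :]).sum(axis=2)  # pairwise l1
        medoids[c] = Zc[d.sum(axis=1).argmin()]
    return medoids

def predict(medoids, z):
    """Assign z to the class of the nearest medoid (l1 distance)."""
    return min(medoids, key=lambda c: np.abs(z - medoids[c]).sum())
```

Unlike a centroid, a medoid is always an actual training sample, which makes this rule consistent with the robustness-to-outliers motivation of the ℓ1 criterion.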
