Abstract

Local-learning-based feature selection has been successfully applied to high-dimensional data analysis. It uses class labels to define a margin for each data sample and selects the most discriminative features by maximizing the margins with respect to a feature weight vector. However, it requires that all data samples be labeled, which makes it unsuitable for semi-supervised learning, where only a handful of training samples are labeled while most are unlabeled. To address this issue, we herein propose a new semi-supervised local-learning-based feature selection method. The basic idea is to learn the class labels of unlabeled samples in a new feature subspace induced by the learned feature weights, and then use the learned class labels to define the margins for feature weight learning. By constructing and optimizing a unified objective function, the feature weights and class labels are learned simultaneously in an iterative algorithm. Experiments performed on benchmark data sets show the advantage of the proposed algorithm over state-of-the-art semi-supervised feature selection methods.
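The alternating scheme the abstract describes can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' exact formulation or objective function: it pseudo-labels unlabeled samples by nearest-neighbor assignment in the weighted feature space, then performs a Relief-style margin-based weight update (nearest-miss distance minus nearest-hit distance), and iterates. The function name, the 1-NN labeling rule, and the update rule are all assumptions made for illustration.

```python
import numpy as np

def semi_supervised_feature_weights(X, y, n_iter=10, lr=0.5):
    """Illustrative sketch of the alternating scheme in the abstract
    (hypothetical implementation, not the paper's exact method).

    X: (n, d) data matrix.
    y: length-n label array; -1 marks an unlabeled sample.
    Returns the learned feature weights and the learned labels.
    """
    n, d = X.shape
    w = np.ones(d)          # uniform initial feature weights
    labels = y.copy()
    for _ in range(n_iter):
        # Step 1: pseudo-label each unlabeled sample by its nearest
        # labeled neighbor, measured in the weighted feature space.
        for i in np.where(y == -1)[0]:
            cand = np.where(labels != -1)[0]
            cand = cand[cand != i]
            d2 = (((X[cand] - X[i]) * w) ** 2).sum(axis=1)
            labels[i] = labels[cand[np.argmin(d2)]]
        # Step 2: Relief-style margin update. Each sample's margin is
        # its weighted distance to the nearest miss (different class)
        # minus its weighted distance to the nearest hit (same class);
        # the gradient pushes weight toward features that enlarge it.
        grad = np.zeros(d)
        for i in range(n):
            others = np.arange(n) != i
            same = np.where(others & (labels == labels[i]))[0]
            diff = np.where(labels != labels[i])[0]
            if len(same) == 0 or len(diff) == 0:
                continue
            hit = same[np.argmin(np.abs(X[same] - X[i]).dot(w))]
            miss = diff[np.argmin(np.abs(X[diff] - X[i]).dot(w))]
            grad += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
        w = np.maximum(w + lr * grad / n, 0.0)  # keep weights nonnegative
        s = w.sum()
        if s > 0:
            w /= s                              # normalize to sum 1
    return w, labels
```

On a toy data set where only the first feature separates the classes, the loop assigns sensible pseudo-labels to the unlabeled points and concentrates weight on the discriminative feature, which is the qualitative behavior the abstract describes.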
