Abstract

Dimensionality reduction is often recommended for handling high-dimensional data before visualization and classification. Many families of dimensionality reduction methods have been developed, spanning the supervised and the unsupervised, the linear and the nonlinear, the global and the local. In this paper, a maximum nonparametric margin projection (MNMP) method is proposed to extract features from the original high-dimensional data. The proposed method gives nonparametric, local definitions of the traditional between-class scatter and within-class scatter, which removes the drawback that linear discriminant analysis (LDA) performs poorly on data with non-Gaussian distributions. From these redefined scatters, a nonparametric margin is derived that avoids the small sample size (SSS) problem, and this margin is then maximized to obtain a discriminant subspace. Finally, experiments were conducted on benchmark data sets including the Palmprint database, the AR face database, and the Yale face database, with performance comparisons against related feature extraction methods including LDA, nonparametric discriminant analysis (NDA), and local graph embedding based on the maximum margin criterion (LGE/MMC). The experimental results on these data sets validate that the proposed algorithm is effective and feasible.
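To make the margin-maximization idea concrete, the following is a minimal sketch of a maximum-margin-criterion projection as the abstract describes it: build between-class and within-class scatter matrices, then maximize tr(Wᵀ(Sb − Sw)W) by eigen-decomposition. For brevity this sketch uses the classical (parametric) scatters; the paper's contribution is to replace them with nonparametric, neighborhood-based definitions, which this sketch does not reproduce. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def max_margin_projection(X, y, n_components=2):
    """Sketch of a maximum-margin-criterion projection.

    Maximizes tr(W^T (Sb - Sw) W) via eigen-decomposition of Sb - Sw.
    Classical scatter matrices are used for simplicity; the paper's
    MNMP method instead defines Sb and Sw nonparametrically from
    local neighbor relations.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))  # between-class scatter
    Sw = np.zeros((d, d))  # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
        Sw += (Xc - mc).T @ (Xc - mc)
    # The margin criterion Sb - Sw needs no matrix inverse, so the
    # small-sample-size (SSS) problem of LDA does not arise.
    evals, evecs = np.linalg.eigh(Sb - Sw)
    order = np.argsort(evals)[::-1]  # largest eigenvalues first
    return evecs[:, order[:n_components]]  # projection matrix W

# Usage on toy two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
W = max_margin_projection(X, y, n_components=2)
Z = X @ W  # reduced-dimension features for visualization or classification
```

The design choice worth noting is that, unlike LDA's generalized eigenproblem on Sw⁻¹Sb, the difference criterion stays well-defined even when Sw is singular, which is exactly the SSS situation on small image databases.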

