Although many k-nearest neighbor approaches and variants exist, few of them exploit the information in both the whole feature space and its subspaces. To address this limitation, we propose a new classifier, the random subspace evidence classifier (RSEC). Specifically, RSEC first computes the local hyperplane distance to each class as evidence, not only in the whole feature space but also in randomly generated feature subspaces. Then, a basic belief assignment is derived from these distances for each class's evidence. Next, all pieces of evidence, represented as basic belief assignments, are pooled by Dempster's rule. Finally, RSEC assigns a class label to each test sample according to the combined belief assignment. Experiments on datasets from the UCI machine learning repository, artificial data, and a face image database show that the proposed approach yields lower average classification error than seven existing k-nearest neighbor approaches and variants. In addition, RSEC performs well on average on high-dimensional data and on the minority class of imbalanced data.
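To make the pipeline described above concrete, the following is a minimal sketch of the RSEC idea in Python. It assumes an HKNN-style local hyperplane (the affine hull of the k nearest same-class neighbors, with the distance obtained by least squares), an exponential mapping from distance to belief mass with the remainder assigned to the whole frame, and illustrative hyperparameter names (`k`, `n_subspaces`, `subspace_dim`, `alpha`, `gamma`); none of these details are specified in the abstract, so treat them as assumptions rather than the authors' exact formulation.

```python
import numpy as np

def local_hyperplane_distance(x, class_pts, k):
    """Distance from x to the affine hull of its k nearest points in class_pts (HKNN-style, assumed)."""
    d2 = np.sum((class_pts - x) ** 2, axis=1)
    nbrs = class_pts[np.argsort(d2)[:min(k, len(class_pts))]]
    center = nbrs.mean(axis=0)
    basis = (nbrs - center).T                       # columns span the local hyperplane
    coef, *_ = np.linalg.lstsq(basis, x - center, rcond=None)
    return np.linalg.norm(x - center - basis @ coef)

def bba_from_distances(dists, alpha=0.9, gamma=1.0):
    """Turn per-class distances into a BBA: small distance -> large singleton mass.
    Entries 0..C-1 are masses on singleton classes; the last entry is the
    ignorance mass on the whole frame Theta. The exponential form is an assumption."""
    m = alpha * np.exp(-gamma * np.asarray(dists) ** 2)
    if m.sum() >= 1.0:                              # keep a valid BBA if singleton masses overflow
        m = m / (m.sum() + 1e-12) * alpha
    return np.append(m, 1.0 - m.sum())

def dempster_combine(m1, m2):
    """Dempster's rule for BBAs whose focal sets are the singletons and Theta."""
    c = len(m1) - 1
    s1, t1, s2, t2 = m1[:c], m1[c], m2[:c], m2[c]
    singles = s1 * s2 + s1 * t2 + t1 * s2           # agreeing singletons, or singleton vs. ignorance
    theta = t1 * t2
    conflict = s1.sum() * s2.sum() - (s1 * s2).sum()  # mass assigned to disjoint singletons
    return np.append(singles, theta) / (1.0 - conflict)

def rsec_predict(X_train, y_train, x, k=5, n_subspaces=10, subspace_dim=None, rng=None):
    """Pool evidence from the whole space and random subspaces, then decide by the combined mass."""
    rng = np.random.default_rng(rng)
    classes = np.unique(y_train)
    n_feat = X_train.shape[1]
    subspace_dim = subspace_dim or max(1, n_feat // 2)
    # The whole feature space plus randomly generated feature subspaces.
    views = [np.arange(n_feat)] + [rng.choice(n_feat, subspace_dim, replace=False)
                                   for _ in range(n_subspaces)]
    combined = None
    for idx in views:
        dists = [local_hyperplane_distance(x[idx], X_train[y_train == c][:, idx], k)
                 for c in classes]
        m = bba_from_distances(dists)
        combined = m if combined is None else dempster_combine(combined, m)
    return classes[np.argmax(combined[:-1])]        # class with the largest combined singleton mass
```

Usage would follow the ordinary train/test pattern, e.g. `rsec_predict(X_train, y_train, x_test)` for each test sample; keeping the whole-space view as the first entry of `views` mirrors the abstract's point that evidence is gathered from both the full feature space and its subspaces.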