Abstract

Although many k-nearest neighbor approaches and variants exist, few consider how to exploit the information in both the whole feature space and its subspaces. To address this limitation, we propose a new classifier named the random subspace evidence classifier (RSEC). Specifically, RSEC first calculates the local hyperplane distance for each class as evidence, not only in the whole feature space but also in randomly generated feature subspaces. It then computes a basic belief assignment from these distances for the evidence of each class. Next, all the evidence, represented as basic belief assignments, is pooled together by Dempster's rule. Finally, RSEC assigns a class label to each test sample based on the combined belief assignment. Experiments on datasets from the UCI machine learning repository, artificial data, and a face image database show that the proposed approach yields a lower average classification error than seven existing k-nearest neighbor approaches and variants. In addition, RSEC performs well on average on high-dimensional data and on the minority class of imbalanced data.
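
The pipeline described above can be illustrated with a short Python sketch. This is not the authors' implementation and makes several simplifying assumptions: the local hyperplane distance is approximated by a least-squares projection onto the affine hull of the k nearest same-class neighbors, the basic belief assignment uses an exponential mass function m({c}) = exp(-gamma * d^2) with the remaining mass on the whole frame of discernment, and all names and parameters (local_hyperplane_distance, bba_from_distance, gamma, n_subspaces) are illustrative rather than taken from the paper.

    import numpy as np

    def local_hyperplane_distance(x, class_points, k=5):
        """Distance from x to the affine hull of its k nearest same-class neighbors."""
        d2 = np.sum((class_points - x) ** 2, axis=1)
        nn = class_points[np.argsort(d2)[:k]]           # k nearest neighbors within this class
        origin, basis = nn[0], (nn[1:] - nn[0]).T        # affine hull: origin + span(basis)
        coef, *_ = np.linalg.lstsq(basis, x - origin, rcond=None)
        proj = origin + basis @ coef                     # projection of x onto the hull
        return np.linalg.norm(x - proj)

    def bba_from_distance(d, n_classes, class_idx, gamma=1.0):
        """Simple BBA: mass on the singleton {class_idx}, remainder on the frame Theta."""
        m = np.zeros(n_classes + 1)                      # last entry = mass on Theta (ignorance)
        m[class_idx] = np.exp(-gamma * d ** 2)
        m[-1] = 1.0 - m[class_idx]
        return m

    def dempster_combine(m1, m2):
        """Dempster's rule for BBAs whose focal elements are singletons plus Theta."""
        n = len(m1) - 1
        m = np.zeros_like(m1)
        for c in range(n):
            m[c] = m1[c] * m2[c] + m1[c] * m2[-1] + m1[-1] * m2[c]
        m[-1] = m1[-1] * m2[-1]
        conflict = 1.0 - m.sum()                         # mass lost to conflicting singletons
        return m / (1.0 - conflict)                      # normalize as in Dempster's rule

    def rsec_predict(x, X, y, n_subspaces=10, subspace_dim=None, k=5, gamma=1.0, rng=None):
        """Classify x by combining evidence from the whole space and random subspaces."""
        rng = np.random.default_rng(rng)
        classes = np.unique(y)
        n_feat = X.shape[1]
        subspace_dim = subspace_dim or max(1, n_feat // 2)
        # The whole feature space plus randomly generated feature subspaces.
        views = [np.arange(n_feat)] + [rng.choice(n_feat, subspace_dim, replace=False)
                                       for _ in range(n_subspaces)]
        combined = None
        for feats in views:
            for ci, c in enumerate(classes):
                d = local_hyperplane_distance(x[feats], X[y == c][:, feats], k)
                m = bba_from_distance(d, len(classes), ci, gamma)
                combined = m if combined is None else dempster_combine(combined, m)
        return classes[np.argmax(combined[:-1])]         # class with the largest combined belief

In this sketch the exponential mass function plays the role of the distance-to-belief mapping; a smaller hyperplane distance in any view yields more mass on that class, and Dempster's rule accumulates agreement across the whole space and the subspaces before the final decision.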

