Abstract
The conventional k-nearest neighbor (KNN) rule is a simple yet effective classification method, but its performance degrades easily when the training set is small and contains outliers. To address this issue, a multi-average-based pseudo nearest neighbor (MAPNN) rule is proposed. In the proposed MAPNN rule, k(k − 1)/2 (k > 1) local mean vectors are obtained for each class by averaging pairs of points drawn from the k nearest neighbors within that class; k pseudo nearest neighbors are then chosen from the k(k − 1)/2 local mean vectors of every class to determine the category of a query point. The selected k pseudo nearest neighbors reduce the negative impact of outliers to some degree. Extensive experiments are carried out on twenty-one real numerical data sets and four artificial data sets, comparing MAPNN to five other KNN-based methods. The experimental results demonstrate that the proposed MAPNN is effective for classification tasks and achieves better results than the five related KNN-based classifiers in small-sample cases.
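The following is a minimal sketch of the procedure as described in the abstract, not the authors' implementation. The function name `mapnn_predict` is hypothetical, and the final scoring step (summing the distances of the k pseudo nearest neighbors per class and picking the class with the smallest sum) is an assumption, since the abstract does not specify how the pseudo nearest neighbors determine the category.

```python
import numpy as np
from itertools import combinations

def mapnn_predict(X_train, y_train, x_query, k=5):
    """Sketch of the MAPNN rule described in the abstract.

    For each class: take the k nearest training points to the query,
    build all k*(k-1)/2 pairwise averages (local mean vectors), keep the
    k closest of those local means as pseudo nearest neighbors, and score
    the class by the sum of their distances to the query (an assumed
    aggregation rule). The class with the smallest score is returned.
    """
    best_class, best_score = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # k nearest neighbors of the query within class c
        d = np.linalg.norm(Xc - x_query, axis=1)
        nn = Xc[np.argsort(d)[:min(k, len(Xc))]]
        # all pairwise averages -> k(k-1)/2 local mean vectors
        means = np.array([(a + b) / 2.0 for a, b in combinations(nn, 2)])
        if len(means) == 0:          # class contributed only one neighbor
            means = nn
        # k pseudo nearest neighbors among the local means
        dm = np.sort(np.linalg.norm(means - x_query, axis=1))[:k]
        score = dm.sum()             # assumed aggregation rule
        if score < best_score:
            best_class, best_score = c, score
    return best_class
```

Averaging pairs of within-class neighbors smooths out isolated outliers, which is consistent with the abstract's claim that the selected pseudo nearest neighbors reduce their negative impact.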