Abstract

The k-nearest neighbor (KNN) rule is a simple and effective nonparametric algorithm in pattern classification. However, it suffers from several problems, such as sensitivity to outliers and an inaccurate classification decision rule. To address these problems, the local mean-based k-nearest neighbor classifier (LMKNN) was proposed, which assigns the query sample the label of the class whose local mean vector is closest to it. The LMKNN classifier has been shown to achieve better classification performance and greater robustness to outliers than the classical KNN classifier. Nonetheless, the unreliable nearest neighbor selection rule and the single local mean vector strategy in LMKNN have a severely negative effect on its classification performance. Considering these problems, we propose a globally adaptive k-nearest neighbor classifier based on local mean optimization, which combines a globally adaptive nearest neighbor selection strategy with local mean optimization to obtain more convincing and reliable local mean vectors. Experimental results on twenty real-world datasets demonstrate that the proposed classifier achieves better classification performance and is less sensitive to the neighborhood size $k$ than other improved KNN-based classification methods.
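To make the baseline decision rule concrete, the following is a minimal sketch of the LMKNN classifier described above, assuming Euclidean distance and NumPy arrays; the function name `lmknn_predict` is illustrative and not taken from the paper.

```python
import numpy as np

def lmknn_predict(X_train, y_train, x_query, k):
    """LMKNN rule: for each class, find the k training samples
    nearest to the query, form their local mean vector, and assign
    the class whose local mean is closest to the query."""
    best_label, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # distances from the query to every sample of class c
        d = np.linalg.norm(Xc - x_query, axis=1)
        # k nearest neighbors within class c (all samples if fewer than k)
        idx = np.argsort(d)[:k]
        local_mean = Xc[idx].mean(axis=0)
        dist = np.linalg.norm(local_mean - x_query)
        if dist < best_dist:
            best_label, best_dist = c, dist
    return best_label
```

The proposed method replaces the fixed per-class neighbor selection here with a globally adaptive selection strategy and optimizes the resulting local mean vectors, rather than using a single raw local mean per class.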
