Abstract
The k-nearest neighbor (KNN) rule is a well-known non-parametric technique in statistical pattern classification, owing to its simplicity, intuitiveness, and effectiveness. In this paper, we first briefly review related work and analyze in detail the sensitivity of the KNN rule to the choice of the neighborhood size k. Motivated by this problem, a novel dual weighted voting scheme for KNN is developed. To overcome the sensitivity to the choice of k and to improve classification performance, the proposed classifier employs a dual weighted voting function to reduce the effect of outliers among the k nearest neighbors of each query object. To verify the superiority of the proposed classifier, experiments are conducted on one artificial data set and twelve real data sets, in comparison with other classifiers. The experimental results suggest that the proposed classifier is effective for classification tasks in many practical situations, owing to its satisfactory classification performance and robustness over a wide range of k.
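As background for the weighted-voting idea mentioned above, the following is a minimal sketch of a generic distance-weighted KNN classifier, in which far-away neighbors contribute less to the vote than close ones. It is an illustration only: the abstract does not specify the paper's dual weighting function, so the inverse-distance weights, the function name weighted_knn_predict, and the toy data below are assumptions, not the authors' method.

```python
import numpy as np


def weighted_knn_predict(X_train, y_train, x_query, k=5):
    """Classify x_query by a distance-weighted vote over its k nearest neighbors.

    Weights decay with distance so that distant neighbors (potential outliers)
    have less influence on the vote than nearby ones.
    """
    # Euclidean distances from the query object to every training object.
    dists = np.linalg.norm(X_train - x_query, axis=1)

    # Indices, distances, and labels of the k nearest neighbors.
    nn_idx = np.argsort(dists)[:k]
    nn_dists = dists[nn_idx]
    nn_labels = y_train[nn_idx]

    # Inverse-distance weights (one common choice, not the paper's dual scheme);
    # the small epsilon avoids division by zero for an exact match.
    weights = 1.0 / (nn_dists + 1e-12)

    # Accumulate the weighted vote per class and return the winning class.
    votes = {}
    for label, w in zip(nn_labels, weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)


if __name__ == "__main__":
    # Toy two-class example with Gaussian clusters.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    print(weighted_knn_predict(X, y, np.array([2.5, 2.5]), k=7))
```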