Abstract

Among the classification algorithms in machine learning, the KNN (k-nearest neighbor) algorithm is one of the most frequently used methods because of its simplicity and efficiency. Although KNN is effective in many situations, it has two shortcomings: the efficiency of the classifier is noticeably degraded by redundant dimensional features, and the classification accuracy is strongly influenced by the distribution of the training samples. In this paper, we propose a stepwise KNN algorithm based on kernel methods and attribute reduction that effectively tackles both of these problems. We measured the accuracy of the proposed algorithm and compared it with the basic KNN algorithm in experiments on four UCI datasets. The experimental results show that the stepwise KNN algorithm (denoted SWKNN) outperforms the original KNN algorithm by an average of 13.8% in accuracy.
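For reference, the baseline KNN classifier discussed above can be sketched as follows. This is a minimal illustration of standard k-nearest-neighbor majority voting with Euclidean distance, not the paper's SWKNN variant; the function and variable names are illustrative, not from the paper.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training samples under Euclidean distance (baseline KNN)."""
    # Sort all training points by distance to the query.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    # Take the labels of the k closest points and vote.
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

# Tiny usage example with two well-separated clusters.
X = [(0, 0), (0, 1), (1, 0), (5, 5), (6, 5), (5, 6)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (0.5, 0.5), k=3))  # prints "a"
print(knn_predict(X, y, (5.5, 5.5), k=3))  # prints "b"
```

Because every prediction scans all training points and all features, redundant dimensions and skewed training-sample distributions directly hurt this baseline, which is the motivation for the kernel-based, attribute-reduced stepwise variant proposed in the paper.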
