Abstract
The k-nearest neighbor (KNN) criterion has broad research and application value in pattern recognition, machine learning, data mining, and other fields, and the extensive study it has received has motivated a series of extensions, such as the k-nearest centroid neighbor (KNCN) and local mean-based k-nearest centroid neighbor (LMKNCN) rules. Some scholars have also proposed the kernel k-nearest neighbor (kernel-KNN) method by combining ideas from the support vector machine (SVM) with KNN. Compared with the traditional KNN method, these methods achieve better classification performance. However, LMKNCN classifies randomly distributed samples less accurately than regularly distributed ones, while the kernel method can effectively handle linearly inseparable problems by mapping them into a high-dimensional feature space. In this paper, the kernel method is combined with LMKNCN: a new method, kernel local mean-based k-nearest centroid neighbor (kernel-LMKNCN), is obtained by kernelizing the original LMKNCN rule. Experimental results show that the proposed kernel-LMKNCN method outperforms the traditional KNCN, LMKNCN, and kernel-KNN methods in classification accuracy.
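As background to the abstract, the LMKNCN rule it builds on can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: for each class, k centroid neighbors of the query are selected (each new neighbor is the candidate whose inclusion keeps the centroid of the chosen set closest to the query, the standard KNCN criterion), and the query is assigned to the class whose local mean of those neighbors is nearest. The function names and the Euclidean distance are assumptions for illustration; the paper's kernel-LMKNCN would replace this distance with one computed implicitly in a kernel-induced feature space.

```python
import numpy as np

def k_nearest_centroid_neighbors(x, X, k):
    """KNCN selection: greedily pick points so that the centroid of the
    chosen set stays as close as possible to the query x."""
    chosen, remaining = [], list(range(len(X)))
    while len(chosen) < k and remaining:
        best, best_d = None, np.inf
        for i in remaining:
            centroid = np.mean(X[chosen + [i]], axis=0)
            d = np.linalg.norm(x - centroid)
            if d < best_d:
                best, best_d = i, d
        chosen.append(best)
        remaining.remove(best)
    return chosen

def lmkncn_predict(x, X, y, k):
    """LMKNCN rule: assign x to the class whose local mean of its
    k centroid neighbors is nearest to x."""
    best_label, best_d = None, np.inf
    for label in np.unique(y):
        Xc = X[y == label]
        idx = k_nearest_centroid_neighbors(x, Xc, min(k, len(Xc)))
        local_mean = np.mean(Xc[idx], axis=0)
        d = np.linalg.norm(x - local_mean)
        if d < best_d:
            best_label, best_d = label, d
    return best_label
```

On two well-separated clusters, e.g. class 0 near the origin and class 1 near (5, 5), a query at (0.2, 0.2) is assigned to class 0 and one at (5.2, 5.2) to class 1.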