Abstract

The nearest neighbor (NN) rule is one of the simplest and most important methods in pattern recognition. In this paper, we propose a kernel difference-weighted k-nearest neighbor method (KDF-WKNN) for pattern classification. The proposed method formulates the weighted KNN rule as a constrained optimization problem, and we then propose an efficient solution for computing the weights of the different nearest neighbors. Unlike distance-weighted KNN, which assigns weights to the nearest neighbors according only to their distances from the unclassified sample, KDF-WKNN weights the nearest neighbors using both the norms and the correlations of the differences between the unclassified sample and its nearest neighbors. Our experimental results indicate that KDF-WKNN outperforms the original KNN and distance-weighted KNN, and is comparable to some state-of-the-art methods in terms of classification accuracy.
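To make the idea concrete, the following is a minimal sketch of a difference-weighted KNN classifier in the linear (non-kernel) case, under the assumption that the neighbor weights are obtained by minimizing the reconstruction error ||x − Σᵢ wᵢxᵢ||² subject to Σᵢ wᵢ = 1, which yields w ∝ G⁻¹1 for the Gram matrix Gᵢⱼ = (x − xᵢ)·(x − xⱼ) of difference vectors. The function name, the regularization term, and the per-class vote aggregation are illustrative assumptions, not the paper's exact formulation; the kernel variant would replace the inner products in G with kernel evaluations.

```python
import numpy as np

def dfwknn_predict(X_train, y_train, x, k=5, reg=1e-3):
    """Sketch of difference-weighted k-NN classification (linear case).

    Neighbor weights solve: min_w ||x - sum_i w_i x_i||^2  s.t.  sum_i w_i = 1,
    so that both the norms and the correlations of the difference vectors
    (x - x_i) influence the weights, unlike purely distance-based weighting.
    """
    # Find the k nearest neighbors by Euclidean distance.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]

    # Gram matrix of the difference vectors between x and its neighbors.
    D = x - X_train[idx]                      # shape (k, n_features)
    G = D @ D.T
    # Regularize for numerical stability (assumed choice, not from the paper).
    G = G + reg * (np.trace(G) / k + 1e-12) * np.eye(k)

    # Closed-form constrained solution: w proportional to G^{-1} 1.
    w = np.linalg.solve(G, np.ones(k))
    w = w / w.sum()                           # enforce sum-to-one constraint

    # Aggregate the weights per class and predict the highest-scoring class.
    classes = np.unique(y_train)
    scores = {c: w[y_train[idx] == c].sum() for c in classes}
    return max(scores, key=scores.get)
```

For example, with two well-separated clusters labeled 0 and 1, a query point near the first cluster would be assigned label 0 because its nearest neighbors (and hence all the weight mass) belong to that class.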
