Abstract

Among classic data mining algorithms, K-Nearest Neighbor (KNN)-based methods are effective and straightforward solutions for classification tasks. However, most KNN-based methods do not fully account for the differing influence of individual training samples on classification, which leads to performance decline. To address this issue, we propose a method named Attention-based Local Mean K-Nearest Centroid Neighbor Classifier (ALMKNCN), which bridges nearest centroid neighbor computation with the attention mechanism and thereby fully considers the influence of each training sample on a given query. Specifically, we first calculate the local centroids of each class with respect to the given query pattern. Then, ALMKNCN applies the attention mechanism to weight the pseudo-distance between the query sample and each class centroid. Finally, based on the attention coefficients, the distances between the query sample and the local mean vectors are weighted to predict the class of the query sample. Extensive experiments are carried out on real and synthetic data sets, comparing ALMKNCN with state-of-the-art KNN-based methods. The experimental results demonstrate that our proposed ALMKNCN outperforms the compared methods by large margins.
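The steps described above can be sketched in code. The following is a minimal, hypothetical NumPy illustration, not the authors' implementation: the greedy nearest-centroid-neighbor selection follows the standard NCN rule, while the exact attention form used by ALMKNCN is not given in the abstract, so a softmax over negative distances (with an assumed temperature `tau`) stands in for it.

```python
import numpy as np

def nearest_centroid_neighbors(x, X, k):
    """Greedy NCN selection: each new neighbor is the point whose
    inclusion minimizes the distance from the query x to the centroid
    of the selected set (the standard nearest centroid neighbor rule)."""
    chosen, remaining = [], list(range(len(X)))
    for _ in range(k):
        best, best_d = None, np.inf
        for i in remaining:
            d = np.linalg.norm(x - X[chosen + [i]].mean(axis=0))
            if d < best_d:
                best, best_d = i, d
        chosen.append(best)
        remaining.remove(best)
    return chosen

def almkncn_predict(x, X, y, k=3, tau=1.0):
    """Hypothetical ALMKNCN sketch: for each class, build local mean
    vectors from the first j nearest centroid neighbors (j = 1..k),
    weight their distances to the query with softmax-style attention
    coefficients (an assumption), and predict the class whose
    attention-weighted distance is smallest."""
    scores = {}
    for c in np.unique(y):
        Xc = X[y == c]
        idx = nearest_centroid_neighbors(x, Xc, min(k, len(Xc)))
        # Local mean vectors: centroids of the first j NCNs per class.
        means = np.array([Xc[idx[:j + 1]].mean(axis=0)
                          for j in range(len(idx))])
        d = np.linalg.norm(means - x, axis=1)
        w = np.exp(-d / tau)
        w /= w.sum()               # attention coefficients
        scores[c] = np.sum(w * d)  # attention-weighted pseudo-distance
    return min(scores, key=scores.get)
```

A quick usage example: with two well-separated clusters as training data, a query near the first cluster is assigned its label because the attention-weighted distance to that class's local mean vectors is smaller.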
