Abstract

The well-known k-Nearest Neighbor (kNN) classifier is a simple and flexible algorithm that has sparked wide interest in pattern classification. Despite its straightforward implementation, kNN is sensitive to noisy training samples and to variance in the data distribution. The local mean based k-nearest neighbor rule was developed to counter the negative effect of noisy training samples. In this article, the local mean rule is combined with general nearest neighbors, which are selected in a more generalized way. A new local mean based nearest neighbor classifier is proposed, termed Local Mean k-General Nearest Neighbor (LMkGNN). The proposed LMkGNN classifier computes a local mean vector from the general nearest neighbors of each class and classifies the test sample according to the distances between the test sample and these local mean vectors. Fifteen real-world datasets from the UCI machine learning repository are used to evaluate the classification performance of the proposed classifier. A performance comparison is also made with five benchmark classifiers (kNN, PNN, LMkNN, LMPNN and kGNN) in terms of classification accuracy. Experimental results demonstrate that the proposed LMkGNN classifier performs significantly well and obtains the best classification accuracy compared to the five competing classifiers.
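The abstract does not specify how the general nearest neighbors are selected, so the paper's exact LMkGNN procedure cannot be reproduced from it. The underlying local-mean decision rule it builds on (the LMkNN baseline) can, however, be sketched: for each class, take the k training samples of that class nearest to the test sample, average them into a local mean vector, and assign the class whose local mean is closest. A minimal sketch, assuming Euclidean distance; all function and variable names here are illustrative:

```python
import numpy as np

def local_mean_knn_predict(X_train, y_train, x_test, k=3):
    """Local mean based kNN rule (LMkNN baseline, not the full LMkGNN):
    for each class, average the k nearest training samples to x_test
    and assign the class whose local mean vector is closest."""
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]                 # training samples of class c
        d = np.linalg.norm(Xc - x_test, axis=1)    # distances to the test sample
        nn = Xc[np.argsort(d)[:k]]                 # k nearest neighbors within class c
        local_mean = nn.mean(axis=0)               # class-local mean vector
        dist = np.linalg.norm(x_test - local_mean)
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class
```

Because each class contributes a mean of several neighbors rather than individual votes, a single mislabeled or outlying training sample has less influence on the decision, which is the robustness property the local mean rule is designed for.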
