Abstract

Unlike the canonical k-Nearest Neighbor (kNN) classifier, which treats all neighbors equally, the Fuzzy k-Nearest Neighbor (FkNN) classifier assigns a weight to each of the k nearest neighbors based on its distance from the query point, using a fuzzy membership function. Although FkNN improves on the performance of kNN, it requires optimizing additional data-dependent parameters besides k. Furthermore, FkNN does not account for representative features of a data point that may be noisy or redundant and may carry no useful information for distinguishing a specific class. We address both of these issues in the current study by proposing a Parameter Independent Fuzzy class-specific Feature Weighted k-Nearest Neighbor (PIFW-kNN) classifier. PIFW-kNN formulates the choice of a suitable value of k and of a set of class-dependent optimal feature weights as a single-objective continuous non-convex optimization problem. We solve this problem with a highly competitive variant of Differential Evolution (DE), the Success-History based Adaptive DE (SHADE). Extensive experiments demonstrate the improved accuracy of PIFW-kNN compared to other state-of-the-art classifiers.

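The abstract does not spell out the exact formulation, but the following minimal sketch illustrates the general idea it describes: a fuzzy, distance-weighted kNN vote computed under a class-specific feature weighting, together with a fitness function that a SHADE-style optimiser could minimise over both k and the weights. The Keller-style membership exponent m, the weighted Euclidean distance with one weight vector per class, the encoding of k inside the continuous decision vector, and all function and parameter names are illustrative assumptions, not the authors' exact design.

```python
import numpy as np

def fuzzy_weighted_knn_predict(X_train, y_train, x_query, k=5, m=2.0,
                               class_feature_weights=None, eps=1e-12):
    """Sketch of a fuzzy, class-specific feature-weighted kNN vote.

    class_feature_weights: array of shape (n_classes, n_features); each row
    scales the features when measuring distance to training points for that
    class (one assumed realisation of the idea in the abstract).
    """
    classes = np.unique(y_train)
    if class_feature_weights is None:
        class_feature_weights = np.ones((len(classes), X_train.shape[1]))

    scores = np.zeros(len(classes))
    for ci, c in enumerate(classes):
        w = class_feature_weights[ci]
        # Weighted Euclidean distance of the query to every training point,
        # using the feature weights associated with class c.
        diffs = (X_train - x_query) * np.sqrt(w)
        dists = np.linalg.norm(diffs, axis=1)
        nn = np.argsort(dists)[:k]
        # Classical fuzzy-kNN weighting: closer neighbours contribute more,
        # via the exponent 2/(m-1); eps guards against division by zero.
        inv = 1.0 / (dists[nn] ** (2.0 / (m - 1.0)) + eps)
        memberships = (y_train[nn] == c).astype(float)
        scores[ci] = np.sum(memberships * inv) / np.sum(inv)

    return classes[np.argmax(scores)]

def fitness(theta, X, y, n_classes, n_features, n_folds=5):
    """Objective a SHADE-style optimiser could minimise: 1 - CV accuracy.

    theta packs a continuous surrogate for k followed by the flattened
    class-specific feature weights (an assumed encoding; the paper's exact
    encoding is not given in the abstract).
    """
    k = max(1, int(round(theta[0])))
    weights = np.clip(theta[1:], 0.0, None).reshape(n_classes, n_features)
    folds = np.array_split(np.random.default_rng(0).permutation(len(y)), n_folds)
    correct = 0
    for i in range(n_folds):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(n_folds) if j != i])
        for t in test_idx:
            pred = fuzzy_weighted_knn_predict(X[train_idx], y[train_idx], X[t],
                                              k=k, class_feature_weights=weights)
            correct += int(pred == y[t])
    return 1.0 - correct / len(y)
```

In this reading, SHADE searches the continuous vector theta, so a single run yields both the neighborhood size and the class-dependent feature weights without hand-tuning, which is consistent with the "parameter independent" claim of the abstract.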