Abstract
Unlike the canonical k-Nearest Neighbor (kNN) classifier, which treats all neighbors equally, the Fuzzy k-Nearest Neighbor (FkNN) classifier assigns a weight to each of the k nearest neighbors based on its distance from the query point, using a fuzzy membership function. Although FkNN improves upon the performance of kNN, it requires optimizing additional data-dependent parameters besides k. Furthermore, FkNN does not account for features of a data point that may be noisy or redundant and may carry no useful information for distinguishing a specific class. We attempt to address both of these issues in the current study by proposing a Parameter Independent Fuzzy class-specific Feature Weighted k-Nearest Neighbor (PIFW-kNN) classifier. PIFW-kNN formulates the choice of a suitable value of k and a set of class-dependent optimal feature weights as a single-objective continuous non-convex optimization problem. We solve this problem with a highly competitive variant of Differential Evolution (DE), called Success-History based Adaptive DE (SHADE). Extensive experiments demonstrate the improved accuracy of PIFW-kNN compared to other state-of-the-art classifiers.
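To make the idea of fuzzy, class-specific feature weighting concrete, the sketch below shows one plausible way a prediction could be computed once the weights are known. This is an illustrative sketch only, not the paper's implementation: the function name, the fuzzifier m, the weight matrix W, and the specific membership formula are assumptions, and the SHADE-based optimization of k and W described in the abstract is not reproduced here.

```python
# Illustrative sketch (assumed, not from the paper): fuzzy kNN prediction with
# class-specific feature weights. The names fuzzy_weighted_knn_predict, W, and m
# are hypothetical; the paper's SHADE-tuned parameters are not modeled here.
import numpy as np

def fuzzy_weighted_knn_predict(X_train, y_train, x_query, W, k=5, m=2.0):
    """Predict the class of x_query.

    X_train : (n, d) training points
    y_train : (n,) class labels
    x_query : (d,) query point
    W       : (C, d) non-negative, class-specific feature weights
    k       : number of neighbors
    m       : fuzzifier (> 1) controlling how fast influence decays with distance
    """
    classes = np.unique(y_train)
    memberships = np.zeros(len(classes))
    diff = X_train - x_query
    for ci, c in enumerate(classes):
        # Distances under the weighted metric associated with class c.
        d = np.sqrt(np.sum(W[ci] * diff**2, axis=1))
        # k nearest neighbors under this class-specific metric.
        nn = np.argsort(d)[:k]
        d_nn = np.maximum(d[nn], 1e-12)           # guard against zero distance
        inv = d_nn ** (-2.0 / (m - 1.0))          # fuzzy distance-based weights
        # Membership of the query in class c: distance-weighted fraction of
        # the k neighbors that belong to c (a common FkNN-style membership).
        memberships[ci] = np.sum(inv * (y_train[nn] == c)) / np.sum(inv)
    return classes[np.argmax(memberships)]

# Minimal usage with synthetic data (assumed shapes only).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    W = np.ones((2, 4))                           # uniform weights as a placeholder
    print(fuzzy_weighted_knn_predict(X, y, rng.normal(size=4), W, k=7))
```

In the actual method, the abstract indicates that k and the class-dependent weights (here the placeholder W) are not hand-set but obtained by minimizing a single objective with SHADE.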