Abstract

The nearest neighbor classifier (NNC) is a popular non-parametric classifier. It is simple, requires no design phase, and shows good performance. Three factors affect the efficiency and performance of NNC: (i) the memory required to store the training set, (ii) the time required to search for the nearest neighbor of a given test pattern, and (iii) the curse of dimensionality, which severely biases the classifier when the dimensionality of the data is high and the training samples are finite. In this paper we propose (i) a novel pattern synthesis technique that increases the density of patterns in the input feature space and thereby reduces the effect of the curse of dimensionality, (ii) a compact representation of the training set that reduces the memory requirement, (iii) a weak approximate nearest neighbor classifier with constant classification time, and (iv) an ensemble of approximate nearest neighbor classifiers whose individual decisions are combined by majority vote. The ensemble has a constant upper bound on classification time and, according to our empirical results, good classification accuracy. We compare our approaches empirically against other related classifiers.
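To make the ensemble idea concrete, the following is a minimal sketch of an ensemble of weak approximate nearest neighbor classifiers combined by majority vote. The abstract does not specify how the weak classifiers are built; here each member searches only a fixed-size random subset of the training set, which is an illustrative assumption rather than the paper's actual construction. It does, however, exhibit the property claimed in the abstract: classification time is bounded by a constant (n_members * subset_size distance computations) independent of the training set size.

    # Sketch only: ensemble of weak approximate 1-NN classifiers with
    # majority voting. The random-subset construction of each weak
    # member is an assumption, not the paper's method.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(0)

    def weak_ann_predict(X_train, y_train, x, subset_idx):
        # 1-NN prediction restricted to a fixed-size subset of the
        # training set, so the search cost is constant.
        d = np.linalg.norm(X_train[subset_idx] - x, axis=1)
        return y_train[subset_idx[np.argmin(d)]]

    def ensemble_predict(X_train, y_train, x, n_members=15, subset_size=50):
        # Majority vote over n_members weak approximate NN classifiers.
        votes = []
        for _ in range(n_members):
            subset_idx = rng.choice(len(X_train),
                                    size=min(subset_size, len(X_train)),
                                    replace=False)
            votes.append(weak_ann_predict(X_train, y_train, x, subset_idx))
        return Counter(votes).most_common(1)[0][0]

    # Toy usage: two Gaussian classes in 10 dimensions.
    X = np.vstack([rng.normal(0, 1, (100, 10)), rng.normal(3, 1, (100, 10))])
    y = np.array([0] * 100 + [1] * 100)
    print(ensemble_predict(X, y, rng.normal(3, 1, 10)))  # expected: 1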
