Abstract

In most pattern recognition (PR) applications, it is advantageous if the accuracy (or error rate) of the classifier can be evaluated or bounded before it is tested in a real-life setting. It is also well known that if the two class-conditional distributions have a large overlapping volume, the classification accuracy is poor; indeed, almost all the available work on the "overlapping of classes" deals with the two-class case. The reason is that if the classification accuracy is used as the criterion for evaluating a PR system, the points within the overlapping volume tend to lead to maximal misclassification. Unfortunately, computing the indices that quantify the overlapping volume is expensive. We therefore propose a strategy of using a prototype reduction scheme (PRS) to compute these indices approximately, but quickly. In this paper we demonstrate, first of all, that this is an extremely expedient proposition. Indeed, we show that by completely discarding the points not retained by the PRS (to the best of our knowledge, no reported scheme discards "irrelevant" sample (training) points while simultaneously attaining an almost-comparable accuracy), we obtain a reduced set of sample points from which the measures of the overlapping volume can be computed. The resulting values are comparable to those obtained with the original training set (i.e., the one containing all the data points), even though the computations required to obtain the prototypes and the corresponding measures are significantly smaller. The proposed method has been rigorously tested on artificial and real-life datasets, and the results are, in our opinion, quite impressive: sometimes faster by two orders of magnitude.
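The abstract does not name the specific PRS or overlap index used in the paper, so the following is only an illustrative sketch of the general pipeline it describes: a classic PRS (here, Hart's Condensed Nearest Neighbor rule, chosen as an assumption) is applied to discard points, and a standard two-class overlap measure (here, Fisher's discriminant ratio F1, also an assumption) is then computed on the reduced set instead of the full training set.

```python
import numpy as np

def cnn_reduce(X, y, rng_seed=0):
    """Condensed Nearest Neighbor (Hart, 1968), one common PRS.
    Keeps only the points needed to classify the rest correctly
    with a 1-NN rule over the retained 'store'."""
    rng = np.random.default_rng(rng_seed)
    order = rng.permutation(len(X))
    store = [order[0]]                      # seed the store with one point
    changed = True
    while changed:                          # repeat until no point is added
        changed = False
        for i in order:
            if i in store:
                continue
            # classify point i by its nearest neighbor in the store
            d = np.linalg.norm(X[np.array(store)] - X[i], axis=1)
            if y[store[int(np.argmin(d))]] != y[i]:
                store.append(i)             # misclassified: must be kept
                changed = True
    return np.array(store)

def fisher_ratio(X, y):
    """Fisher's discriminant ratio F1, a simple two-class overlap
    measure: max over features of (mu0 - mu1)^2 / (var0 + var1).
    Larger values indicate less class overlap."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12
    return float(np.max(num / den))

# Usage: compute the overlap measure on the full set and on the
# (much smaller) PRS-reduced set, as the abstract proposes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
keep = cnn_reduce(X, y)
f_full = fisher_ratio(X, y)
f_reduced = fisher_ratio(X[keep], y[keep])
```

Note that CNN preferentially retains border points, so a measure computed on the reduced set is an approximation, not an identity; the paper's claim is that such approximations remain comparable to the full-set values while being far cheaper to compute.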
