Abstract

Standard implementations of non-parametric classifiers have large computational requirements. Parzen classifiers use the distances from an unknown vector to all N prototype samples, and consequently exhibit O(N) behavior in both memory and time. We describe four techniques for expediting nearest neighbor methods: replacing the linear search with a new k-d tree method, which exhibits approximately O(N^(1/2)) behavior; employing an L∞ instead of an L2 distance metric; using variance-ordered features; and rejecting prototypes by evaluating distances in low-dimensional subspaces. We demonstrate that variance-ordered features yield significant efficiency gains over the same features linearly transformed to have uniform variance. We give results for a large OCR problem, but note that the techniques expedite recognition for arbitrary applications. Three of the four techniques preserve recognition accuracy.
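Three of the techniques named above combine naturally in a single search loop. The following is a minimal sketch, not the paper's implementation: features are examined in decreasing-variance order, distances use the L∞ metric (a running maximum of absolute coordinate differences), and a prototype is rejected as soon as its partial distance in a low-dimensional subspace already exceeds the best full distance found so far. The function name and array layout are illustrative assumptions.

```python
import numpy as np

def linf_partial_distance_search(prototypes, query):
    """Sketch of a nearest-prototype search using variance-ordered
    features, the L-infinity metric, and partial-distance rejection.
    `prototypes` is an (N, d) array; `query` is a length-d vector.
    Not the paper's implementation -- an illustrative assumption.
    """
    # Examine high-variance features first, so large coordinate
    # differences (and hence rejections) tend to occur early.
    order = np.argsort(prototypes.var(axis=0))[::-1]
    P = prototypes[:, order]
    q = query[order]

    best_dist = np.inf
    best_idx = -1
    for i, p in enumerate(P):
        d = 0.0
        rejected = False
        for j in range(len(q)):
            # L-infinity distance: running max of |coordinate diffs|.
            d = max(d, abs(p[j] - q[j]))
            if d >= best_dist:
                # Partial distance in the first j+1 dimensions already
                # exceeds the incumbent: reject this prototype early.
                rejected = True
                break
        if not rejected:
            best_dist, best_idx = d, i
    return best_idx, best_dist
```

Because the L∞ partial distance can only grow as more features are examined, the early rejection is exact: the returned prototype is the same one a full linear scan would find, only reached with fewer coordinate comparisons.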
