Abstract

Complex pattern recognition problems are usually solved not by a single classifier but by multiple classifiers. Typically, the classifiers are arranged hierarchically: the low-level classifier (LLC) produces not just a single best-guess decision for the class of the input pattern but a list of choices ranked according to their likelihood. The high-level classifier (HLC) then chooses from this set of classes using additional information that is not usually available to, or well represented in, a single LLC, such as knowledge of the context or the model. Training neural networks (NNs) as low-level classifiers has traditionally been performed independently of what the HLC may do. The traditional performance measure for evaluating classifiers is the classification counting function, which counts the number of correct classifications performed by the classifier. It is of course desired that the LLC produce the correct classification (by ranking the correct class as the top choice); moreover, if it cannot do so, it is preferred that the LLC rank the correct class as the second choice. A new cost function, which accounts for the correctness of class rankings, is presented. When this cost function is optimized, it achieves this desired ranking performance. The parameters of the new cost function are linked to statistical parameters of our proposed hierarchical model. Unfortunately, this cost function cannot be used directly to train neural networks because it is not differentiable, so we investigate differentiable approximations that are well suited to training NNs with the backpropagation algorithm. Initial simulation results show the superiority of this new error measure over the traditional mean square error measure in terms of both classification and ranking performance.
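The abstract does not give the form of the cost function or its differentiable approximation. The sketch below illustrates the general idea of such an approximation: the hard, non-differentiable rank of the correct class is replaced by a sum of sigmoids so that gradients exist for backpropagation. The function name `soft_rank` and the `steepness` parameter are assumptions for illustration, not the paper's notation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_rank(scores, correct, steepness=10.0):
    """Smooth estimate of the correct class's rank position (0 = top choice).

    The exact rank counts how many wrong classes outscore the correct one,
    which is a step function with no useful gradient. Replacing each hard
    comparison with a sigmoid yields a differentiable surrogate suitable
    for training with backpropagation. `steepness` controls how closely
    the sigmoid approximates the hard step.
    """
    # Score differences of every other class relative to the correct class;
    # positive entries mean a wrong class currently outranks the correct one.
    diffs = np.delete(scores - scores[correct], correct)
    return float(np.sum(sigmoid(steepness * diffs)))

# When the correct class (index 1) has the highest score, the soft rank
# is near 0 (top choice); when it is outscored, the soft rank grows by
# roughly 1 per wrong class ranked above it.
scores = np.array([0.1, 0.9, 0.2])
```

A rank-sensitive loss built on such a surrogate can weight rank positions unevenly, penalizing the LLC more for dropping the correct class to third place than to second, which matches the ranking preference described in the abstract.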
