Abstract

In this paper, we describe a Bayesian classification method that informatively combines diverse sources of information and multiple feature spaces for multiclass problems. The proposed method is based on recent advances in kernel approaches, where the integration of multiple object descriptors, or feature spaces, is achieved via kernel combination. Each kernel constructs a similarity metric between objects in a particular feature space; since this provides a common metric across modalities, an overall combination can be constructed. We follow a hierarchical Bayesian approach, which introduces prior distributions over the random variables, and we construct a Gibbs sampling Markov chain Monte Carlo (MCMC) solution that is naturally derived from the employed multinomial probit likelihood. The methodology is the basis for possible deterministic approximations, such as variational or maximum-a-posteriori estimators, and it is compared against well-known classifier combination methods on the classification of handwritten numerals. The results of the proposed method show a significant improvement over the best individual classifier and match the performance of the best multiple classifier combination, whilst reducing the computational requirements of combining classifiers and offering additional information on the significance of the contributing sources.
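The kernel-combination idea sketched in the abstract can be illustrated with a minimal example: each feature space yields its own kernel (similarity) matrix over the same set of objects, and a weighted sum of these matrices gives the composite kernel. This is a hedged sketch, not the paper's full method; the RBF kernel choice, the two hypothetical descriptor sets, and the fixed weights `beta` are illustrative assumptions (in the paper the combination weights are random variables inferred via Gibbs sampling).

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X_shape = rng.normal(size=(5, 3))    # hypothetical "shape" descriptors
X_texture = rng.normal(size=(5, 7))  # hypothetical "texture" descriptors

# Illustrative fixed combination weights (sum to 1); in the paper these
# would be inferred in the hierarchical Bayesian model
beta = np.array([0.6, 0.4])
K = beta[0] * rbf_kernel(X_shape) + beta[1] * rbf_kernel(X_texture)

# A convex combination of valid kernels is itself a valid (symmetric
# positive semi-definite) kernel
assert np.allclose(K, K.T)
```

Because each per-modality kernel maps its feature space onto a common object-by-object similarity scale, the weighted sum remains a valid kernel and can be plugged directly into any kernel classifier.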
