Abstract

Learning statistical models successfully is an essential yet challenging task in many pattern recognition and knowledge discovery applications. In particular, generative models such as finite and infinite mixture models have proven effective in terms of overall performance. In this paper, a robust framework based on expectation propagation (EP) inference is developed to learn inverted Beta-Liouville (IBL) mixture models, which are an appropriate choice for modeling positive data. Within the proposed EP learning method, the full posterior distribution is estimated accurately, and the model complexity and all related parameters are evaluated simultaneously in a single optimization scheme. Extensive experiments on challenging real-world applications, including facial expression recognition, automatic human action categorization, and hand gesture recognition, show the merit of our approach, which achieves better results than comparable techniques.
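To make the mixture-modeling setting concrete, the following is a minimal sketch of how posterior component probabilities (responsibilities) could be computed for a finite IBL mixture over positive data. The IBL parameterization used here (`alphas`, `alpha`, `beta`, `lam`) is an assumption based on the form commonly cited in the literature, not the paper's own definition, and this sketch does not implement the paper's EP inference, which additionally estimates the posterior and model complexity.

```python
import math

def ibl_logpdf(x, alphas, alpha, beta, lam):
    """Log-density of a D-dimensional inverted Beta-Liouville distribution.

    NOTE: this parameterization (alphas_1..D, alpha, beta, lambda) is an
    assumption following the commonly used IBL form; the paper's exact
    definition may differ.
    """
    s = sum(x)                 # sum of the positive coordinates
    a_sum = sum(alphas)
    lp = (math.lgamma(a_sum) + math.lgamma(alpha + beta)
          - math.lgamma(alpha) - math.lgamma(beta)
          + beta * math.log(lam))
    lp += sum((a - 1.0) * math.log(xd) - math.lgamma(a)
              for xd, a in zip(x, alphas))
    lp += (alpha - a_sum) * math.log(s)
    lp -= (alpha + beta) * math.log(lam + s)
    return lp

def responsibilities(x, weights, components):
    """Posterior component probabilities p(j | x) for a finite IBL mixture,
    normalized with a log-sum-exp trick for numerical stability."""
    logs = [math.log(w) + ibl_logpdf(x, *params)
            for w, params in zip(weights, components)]
    m = max(logs)
    unnorm = [math.exp(l - m) for l in logs]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

For example, `responsibilities([0.5, 1.2], [0.6, 0.4], [([2.0, 3.0], 4.0, 2.0, 1.0), ([1.0, 1.0], 2.0, 5.0, 3.0)])` returns a probability vector summing to one; the component parameter values here are illustrative only.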
