Abstract

Despite the widespread adoption of mixture models among researchers, obtaining good results involves several challenges. In this paper, we address two of them: parameter estimation and data representation. Expectation-Maximization (EM) is a widely used framework for parameter estimation; however, intractable calculations of the posterior distribution and of the parameter updates often complicate the process. Minorization-Maximization (MM) is an alternative framework that relaxes these complications and requirements. This paper adopts the MM framework for the Multinomial Nested Dirichlet Mixture (MNDM) in a hierarchical manner. The hierarchical nature of the MNDM is exploited through a Hierarchical Feature Learning (HFL) framework, where the data representation is produced by the well-known Spatial Pyramid Matching method. Moreover, the number of mixture components is determined by the Minimum Message Length (MML) criterion. This paper therefore presents an HFL framework for the data representation of the MNDM, whose parameter learning is based on the MM framework and whose model selection is based on MML. The two proposed improvements are validated on three visual datasets using recall and precision as performance metrics.
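To make the EM baseline the abstract refers to concrete, the following is a minimal sketch of EM for a plain multinomial mixture, a simpler relative of the MNDM. The toy data, variable names, and hyperparameters are illustrative assumptions, not the paper's method (which replaces EM with MM updates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N count vectors over V bins (a hypothetical stand-in for the
# Spatial Pyramid Matching histograms the paper uses as input).
N, V, K = 200, 6, 2
true_theta = np.array([[0.50, 0.30, 0.10, 0.05, 0.03, 0.02],
                       [0.02, 0.03, 0.05, 0.10, 0.30, 0.50]])
z = rng.integers(0, K, size=N)
X = np.array([rng.multinomial(50, true_theta[k]) for k in z])

# EM for a K-component multinomial mixture.
pi = np.full(K, 1.0 / K)                    # mixing weights
theta = rng.dirichlet(np.ones(V), size=K)   # per-component bin probabilities

for _ in range(100):
    # E-step: responsibilities from log-likelihoods (the multinomial
    # coefficient cancels across components, so counts suffice).
    log_r = np.log(pi) + X @ np.log(theta).T        # shape (N, K)
    log_r -= log_r.max(axis=1, keepdims=True)       # numerical stability
    r = np.exp(log_r)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: closed-form updates of weights and component parameters.
    pi = r.mean(axis=0)
    theta = (r.T @ X) + 1e-9                        # avoid log(0)
    theta /= theta.sum(axis=1, keepdims=True)
```

The posterior (E-step) is tractable here; for richer hierarchical priors such as the nested Dirichlet in the MNDM, these closed-form updates break down, which is the motivation for the MM surrogate bounds the paper adopts instead.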

