Abstract

The topic model based on latent Dirichlet allocation (LDA) relies on a prior over the topic proportions of multinomial words. The words in a document are modeled as a random mixture of latent topics whose proportions are drawn from a single Dirichlet prior. However, a single Dirichlet distribution may not sufficiently characterize the variation of topic proportions estimated from heterogeneous documents. To address this concern, we present a Dirichlet mixture allocation (DMA) model, which learns latent topics and their proportions for topic and document clustering using a prior based on a Dirichlet mixture model. Multiple Dirichlets pave the way to capturing the structure of latent variables when learning representations from real-world documents covering a variety of topics. This paper builds a new latent variable model and develops a variational Bayesian inference procedure to learn the model parameters, consisting of mixture weights, Dirichlet parameters, and word multinomials. Experiments on document representation show the merit of the proposed structural learning achieved by increasing the number of Dirichlets in a DMA topic model.
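The generative process implied by the abstract can be sketched as follows: pick one Dirichlet component from the mixture, draw the document's topic proportions from it, then draw each word's topic and the word itself from the corresponding multinomials. This is a minimal illustrative sketch, not the paper's implementation; the function and parameter names (`sample_dma_document`, `pi`, `alphas`, `beta`) are hypothetical.

```python
import numpy as np

def sample_dma_document(pi, alphas, beta, doc_len, rng=None):
    """Sketch of the DMA generative process (names are illustrative).

    pi     : (M,) mixture weights over M Dirichlet components
    alphas : (M, K) Dirichlet parameters, one row per component
    beta   : (K, V) word multinomials, one row per topic
    """
    rng = np.random.default_rng(rng)
    c = rng.choice(len(pi), p=pi)                 # pick a Dirichlet component
    theta = rng.dirichlet(alphas[c])              # document's topic proportions
    words = []
    for _ in range(doc_len):
        z = rng.choice(len(theta), p=theta)       # latent topic for this word
        w = rng.choice(beta.shape[1], p=beta[z])  # word from the topic's multinomial
        words.append(w)
    return c, theta, words
```

With M = 1 this reduces to standard LDA; increasing M lets the prior place probability mass on several distinct regions of the topic simplex, which is the structural flexibility the abstract argues for.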
