Abstract

Topic models are widely used across machine learning and statistics. Among them, the dynamic topic model (DTM) is the most popular time-series topic model for learning dynamic representations of text corpora. A major challenge is that posterior inference in DTM is complex and computationally expensive, and even a small change to the model requires re-deriving the inference procedure. As a result, DTM is difficult to adapt and generalize, which limits its practical use. In this paper, we introduce a new method for constructing DTM based on variational autoencoders and factor graphs. The model reparameterizes the variational lower bound to obtain a lower-bound estimator that can be optimized directly with standard stochastic gradient descent. At the same time, the optimization process is simplified by integrating a dynamic factor graph into the state space, yielding a better model. The experimental dataset is a corpus of journal papers from DBLP that focuses on natural language processing and spans twenty-five years (1984–2009). Experimental results against several state-of-the-art baselines indicate that the proposed method is effective and feasible.
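To make the reparameterization idea concrete, the sketch below shows the standard trick for a diagonal-Gaussian variational posterior: randomness is isolated in an auxiliary noise variable, so a Monte Carlo estimate of the variational lower bound becomes a differentiable function of the variational parameters and can be optimized by stochastic gradient descent. This is a minimal illustration of the general technique, not the paper's implementation; the function names and the `log_joint` callable are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_sigma):
    """Draw z = mu + sigma * eps with eps ~ N(0, I).

    Because the randomness lives only in eps, the sample z is a
    differentiable function of the variational parameters
    (mu, log_sigma), which is what allows the lower-bound estimator
    to be optimized directly with SGD.
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

def elbo_estimate(mu, log_sigma, log_joint, n_samples=8):
    """Monte Carlo estimate of the variational lower bound
    E_q[log p(x, z) - log q(z)] for a diagonal-Gaussian q.

    `log_joint` is a hypothetical callable returning log p(x, z)
    for a fixed observation x; it stands in for whatever generative
    model (here, a dynamic topic model) is being trained.
    """
    total = 0.0
    for _ in range(n_samples):
        z = reparameterize(mu, log_sigma)
        # log q(z | mu, sigma) for a diagonal Gaussian
        log_q = -0.5 * np.sum(
            ((z - mu) / np.exp(log_sigma)) ** 2
            + 2.0 * log_sigma
            + np.log(2.0 * np.pi)
        )
        total += log_joint(z) - log_q
    return total / n_samples
```

In an autodiff framework the same estimator would be differentiated with respect to `mu` and `log_sigma` and fed to an SGD optimizer; the numpy version above only illustrates why the estimator is well defined once sampling is reparameterized.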
