Abstract

Most topic models are constructed under the assumption that documents follow a multinomial distribution. The Poisson distribution is an alternative for describing the probability of count data; for topic modelling, it describes the number of occurrences of a word in documents of fixed length. The Poisson distribution has been successfully applied in text classification, but its application to topic modelling is not well documented, specifically in the context of a generative probabilistic model. Furthermore, the few Poisson topic models in the literature are admixture models, which assume that a document is generated from a mixture of topics. In this study, we focus on short text, for which many studies have shown that the simpler assumption of a mixture model is a better fit. With mixture models, as opposed to admixture models, the generative assumption is that a document is generated from a single topic. One topic model that makes this one-topic-per-document assumption is the Dirichlet-multinomial mixture model. The main contributions of this work are a new Gamma-Poisson mixture model and a collapsed Gibbs sampler for the model. The benefit of the collapsed Gibbs sampler derivation is that the model can automatically select the number of topics contained in the corpus. The results show that the Gamma-Poisson mixture model performs better than the Dirichlet-multinomial mixture model at selecting the number of topics in labelled corpora. Furthermore, the Gamma-Poisson mixture produces better topic coherence scores than the Dirichlet-multinomial mixture model, making it a viable option for the challenging task of topic modelling of short text.
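
To make the generative assumptions concrete, the sketch below (Python/NumPy) is a minimal, hypothetical illustration of a Gamma-Poisson mixture: Gamma-distributed word rates for each topic, a single topic drawn per document, and independent Poisson counts for every vocabulary word. The hyperparameter names and values (`a`, `b`, `alpha`) and the exact parameterization are assumptions for illustration, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative corpus dimensions (not taken from the paper)
K, V, D = 5, 1000, 200            # topics, vocabulary size, documents
alpha = np.ones(K)                # symmetric Dirichlet prior on topic proportions (assumed)
a, b = 0.1, 1.0                   # Gamma shape and rate hyperparameters (assumed)

# Per-topic Poisson rates: one Gamma draw per (topic, word) pair
lam = rng.gamma(shape=a, scale=1.0 / b, size=(K, V))

# Mixture weights over topics
theta = rng.dirichlet(alpha)

docs, topics = [], []
for _ in range(D):
    z = rng.choice(K, p=theta)     # one topic per document (mixture, not admixture)
    counts = rng.poisson(lam[z])   # independent Poisson count for each vocabulary word
    topics.append(z)
    docs.append(counts)

X = np.vstack(docs)                # D x V document-term count matrix
```

Under this parameterization, a document's expected length is the sum of its topic's word rates, which connects to the abstract's framing of word counts in documents of fixed length.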

Highlights

  • Topic modelling is a text mining technique used to uncover latent topics in large collections of documents. The Latent Dirichlet allocation (LDA) [1] model is the state-of-the-art topic model

  • The contributions of this work are as follows: (1) We propose a new topic model for short text, the Gamma-Poisson mixture (GPM) topic model, which has not been applied in the literature before. This model is based on the Poisson distribution, and we show that it is able to produce topics with improved coherence scores when compared to GSDMM [6]

  • In light of the success of the Dirichlet-multinomial mixture model (DMM) on short text, the new model that we propose is a modification of DMM


Summary

Introduction

Topic modelling is a text mining technique used to uncover latent topics in large collections of documents. The Latent Dirichlet allocation (LDA) [1] model is the state-of-the-art topic model. Owing to the increasing popularity of microblogging websites, social media platforms, and online shopping (which involves product reviews), text that is significantly shorter has become increasingly relevant. Such sources of text potentially hold valuable information that can be useful in many applications, such as event tracking [2], interest profiling [3] and product recommendation [4]. DMM is inherently a mixture model; it assumes that each document contains only a single topic, which is a seemingly more sensible assumption for short text.
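
The abstract notes that the collapsed Gibbs sampler allows the model to select the number of topics automatically. The usual mechanism, used by GSDMM [6] and presumably analogous in the proposed model, is to start with more clusters than needed and let clusters that lose all their documents stay empty. The sketch below illustrates this with the Dirichlet-multinomial mixture conditional of Yin and Wang [6]; the Gamma-Poisson model in the paper would substitute its own collapsed conditional for the cluster score, so treat this as a sketch of the mechanism, not of the proposed sampler.

```python
import numpy as np

def gsdmm_gibbs(docs, V, K_max=30, alpha=0.1, beta=0.1, n_iters=30, seed=0):
    """Collapsed Gibbs sampling for a Dirichlet-multinomial mixture (GSDMM-style).

    docs : list of lists of word ids; V : vocabulary size.
    Starts with K_max clusters; clusters that empty out are rarely revived,
    so the number of occupied clusters is selected automatically.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    z = rng.integers(K_max, size=D)          # initial cluster assignment per document
    m = np.zeros(K_max)                      # documents per cluster
    n = np.zeros(K_max)                      # total word tokens per cluster
    nw = np.zeros((K_max, V))                # per-word counts per cluster
    for d, doc in enumerate(docs):
        m[z[d]] += 1
        n[z[d]] += len(doc)
        for w in doc:
            nw[z[d], w] += 1

    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            k = z[d]
            # Remove document d from its current cluster
            m[k] -= 1
            n[k] -= len(doc)
            for w in doc:
                nw[k, w] -= 1

            # Log-score of assigning d to each cluster (Yin & Wang, 2014)
            words, counts = np.unique(doc, return_counts=True)
            log_p = np.log(m + alpha)
            for k2 in range(K_max):
                num = 0.0
                for w, c in zip(words, counts):
                    num += np.sum(np.log(nw[k2, w] + beta + np.arange(c)))
                den = np.sum(np.log(n[k2] + V * beta + np.arange(len(doc))))
                log_p[k2] += num - den

            p = np.exp(log_p - log_p.max())
            k_new = rng.choice(K_max, p=p / p.sum())

            # Add document d to the sampled cluster
            z[d] = k_new
            m[k_new] += 1
            n[k_new] += len(doc)
            for w in doc:
                nw[k_new, w] += 1

    return z, int(np.sum(m > 0))             # assignments and number of occupied clusters
```

Because an emptied cluster retains only the small prior mass (m_k + alpha with m_k = 0), documents rarely move back into it; after a few sweeps, far fewer than K_max clusters remain occupied, and the number of occupied clusters is read off as the selected number of topics.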
