Abstract

Owing to the sparseness, lack of focus, and limited semantic information of short texts, existing studies have improved topic representation mainly in two ways: adjusting the structure of the topic model to increase word co-occurrence, and incorporating word embeddings to enrich semantic information. In this paper, we review existing topic representation methods for short text from these two perspectives, select DMM, BTM, and LF-DMM as representative models, and compare the quality of the topic representations they produce. Because traditional topic models do not incorporate word embeddings, and a single short text often lacks contextual information due to its limited length, we combine the traditional models with word embeddings and use two real-world datasets to evaluate the quality of the resulting topic representations on a classification task. Although LF-DMM incorporates word embeddings, it performs poorly on short text, whereas DMM and BTM integrated with word embeddings improve greatly.
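As a concrete illustration of the fusion strategy described above, the sketch below concatenates document-topic proportions with averaged word embeddings and feeds the result to a classifier. It is a minimal sketch, not the paper's implementation: DMM and BTM have no standard scikit-learn implementation, so scikit-learn's LDA stands in as the topic model here, and the corpus, labels, and dimensions are toy values chosen for illustration. A DMM or BTM document-topic matrix would plug into the same pipeline in place of `theta`.

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy short-text corpus with hypothetical labels (0 = travel, 1 = finance).
docs = ["cheap flights to london", "stock market rallies today",
        "book hotel near airport", "shares fall on weak earnings"]
labels = [0, 1, 0, 1]

# Document-topic proportions from a stand-in topic model (LDA here;
# a DMM/BTM doc-topic matrix would be used instead in the paper's setting).
counts = CountVectorizer().fit_transform(docs)
theta = LatentDirichletAllocation(n_components=2,
                                  random_state=0).fit_transform(counts)

# Averaged word embeddings per document, trained on the same corpus.
tokens = [d.split() for d in docs]
w2v = Word2Vec(tokens, vector_size=50, min_count=1, seed=0)
emb = np.array([np.mean([w2v.wv[t] for t in doc], axis=0) for doc in tokens])

# Concatenate the two views and evaluate via a classification task.
features = np.hstack([theta, emb])
clf = LogisticRegression(max_iter=1000).fit(features, labels)
print(clf.predict(features))
```

In practice the embeddings would be pre-trained on a large external corpus rather than the short texts themselves, which is precisely how word embeddings compensate for the missing context in a single short document.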
