Abstract

Chinese poetry generation has long been a challenging task in natural language processing because of the unique literary and aesthetic qualities of poetry. In most cases, the content of a poem is topic-related: specific thoughts or emotions are expressed with respect to a given topic. However, topic information is rarely taken into consideration in current poetry generation models. In this article, we propose a topic-enhanced Chinese poetry generation model, TPoet, which integrates a topic model into a Transformer-based auto-regressive text generation model. By feeding topic information into the input layer and a heterogeneous attention mechanism, TPoet implicitly learns the latent topic distribution. In addition, by attaching multiple identifiers such as segment, rhyme, and tone, the model explicitly learns the constraints on the generated poems. Extensive experiments show that TPoet generates higher-quality poems than current state-of-the-art models and systems, with significantly improved topic consistency and diversity.
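The abstract describes feeding topic information and segment/rhyme/tone identifiers into the model's input layer. A minimal sketch of how such an input layer could combine these signals is shown below; all names, vocabulary sizes, and dimensions here are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch: sum a token embedding with identifier and topic
# embeddings (BERT-style additive embeddings) to form the input vector
# that a Transformer decoder would consume. Toy tables, not TPoet's code.
import random

random.seed(0)
DIM = 8  # toy embedding dimension

def make_table(vocab_size, dim=DIM):
    """A toy embedding table: one random vector per id."""
    return [[random.uniform(-1, 1) for _ in range(dim)]
            for _ in range(vocab_size)]

token_emb = make_table(100)   # character vocabulary (toy size)
segment_emb = make_table(4)   # which line of the poem the token is in
rhyme_emb = make_table(2)     # rhyming position or not
tone_emb = make_table(3)      # level tone / oblique tone / unconstrained
topic_emb = make_table(10)    # topic id produced by the topic model

def input_vector(token_id, segment_id, rhyme_id, tone_id, topic_id):
    """Element-wise sum of the token embedding and the identifier/topic
    embeddings; each position of a poem gets one such vector."""
    rows = (token_emb[token_id], segment_emb[segment_id],
            rhyme_emb[rhyme_id], tone_emb[tone_id], topic_emb[topic_id])
    return [sum(vals) for vals in zip(*rows)]

vec = input_vector(token_id=42, segment_id=1, rhyme_id=1,
                   tone_id=0, topic_id=3)
```

Summing (rather than concatenating) the embeddings keeps the model width fixed while still letting the decoder condition on topic and form constraints at every position.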
