Abstract

Many existing Chinese poetry generation models are based on recurrent neural network language models, and the poems they generate have serious defects in rhythm and fluency. This paper proposes a method of generating Chinese poetry with the Unified Language Model Pre-training for Natural Language Understanding and Generation (UniLM). The model is based on the Transformer network. UniLM not only has the same strong natural language understanding ability as BERT but also has natural language generation ability; it is a pre-trained language model that can both understand and generate. This paper uses UniLM to realize the automatic generation of Chinese poetry. Experiments on a Chinese poetry data set show that the UniLM-based generation method outperforms previously proposed approaches, with significant improvements in fluency, rhythm, and poetic quality.
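The key idea behind UniLM's combination of understanding and generation is that one Transformer can serve as a bidirectional encoder or a sequence-to-sequence generator simply by changing its self-attention mask: source tokens attend to each other bidirectionally, while target tokens attend to the full source and only to earlier target positions. A minimal sketch of constructing such a seq2seq attention mask (the function name and shapes are illustrative, not from the paper):

```python
import numpy as np

def seq2seq_attention_mask(src_len: int, tgt_len: int) -> np.ndarray:
    """Build a UniLM-style seq2seq mask: entry [i, j] = 1 means
    position i may attend to position j, 0 means it is blocked."""
    total = src_len + tgt_len
    mask = np.zeros((total, total), dtype=np.int64)
    # Source tokens attend bidirectionally within the source segment only.
    mask[:src_len, :src_len] = 1
    # Each target token attends to the whole source plus its own left
    # context (causal over the target segment).
    for i in range(src_len, total):
        mask[i, : i + 1] = 1
    return mask

# Example: a 3-token source (e.g. a poem title) and a 2-token target.
m = seq2seq_attention_mask(3, 2)
print(m)
```

During fine-tuning for poetry generation, the title would occupy the source segment and the poem the target segment, so the same pre-trained weights are reused with only the mask changed.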

