Abstract

Question generation is a promising and important area in natural language processing. Given a passage of text, a question generation model can automatically generate a variety of questions, which facilitate various downstream tasks. Chinese question generation is a specific sub-area of question generation. Owing to the characteristics of the Chinese language, many existing methods are not suitable for this task, and their generated questions often have incorrect word order or invalid expressions. To address this challenging problem, we propose a conditional pre-trained attention model, termed A Lite BERT Conditional Question Generation (ALBERT-CQG), for Chinese question generation. By introducing the general background knowledge of a pre-trained model and the conditional information of the given answers, the model is able to generate more valid expressions. To our knowledge, we are the first to apply a conditional pre-trained attention model to Chinese question generation. Experimental results on two well-known benchmark datasets for Chinese question answering show that ALBERT-CQG outperforms its recent peers.
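To make the answer-conditioning idea concrete, the sketch below shows one plausible way to feed a given answer alongside the passage into a Chinese ALBERT encoder using the Hugging Face transformers library. The checkpoint name and the pair-encoding scheme are illustrative assumptions, not the paper's exact implementation; the decoding step that actually emits the question is omitted.

```python
# Minimal sketch: answer-conditioned encoding for Chinese question
# generation. The checkpoint name is an assumption for illustration;
# ALBERT-CQG's actual architecture may differ.
import torch
from transformers import BertTokenizerFast, AlbertModel

# Chinese ALBERT checkpoints are typically paired with a BERT-style
# tokenizer rather than the SentencePiece AlbertTokenizer.
MODEL_NAME = "voidful/albert_chinese_base"  # assumed checkpoint
tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
encoder = AlbertModel.from_pretrained(MODEL_NAME)

passage = "阿尔伯特·爱因斯坦于1879年出生在德国乌尔姆。"
answer = "1879年"

# Condition the encoder on the answer by packing it with the passage as
# a sentence pair: [CLS] answer [SEP] passage [SEP]. The token-type ids
# then mark which positions carry the conditional (answer) information.
inputs = tokenizer(answer, passage, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**inputs)

# Contextual representations over which a question decoder would attend.
hidden_states = outputs.last_hidden_state  # shape: (1, seq_len, hidden)
print(hidden_states.shape)
```

Packing the answer as the first segment is one common way to inject conditional information into a pre-trained encoder; alternatives include marking the answer span inside the passage with special tokens.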
