Abstract

Efficiently extracting valuable information from massive data has become a central research goal in the era of Big Data, and text summarization technology has developed continuously to meet this demand. Recent work has also shown that Transformer-based pre-trained language models achieve great success on a variety of NLP tasks. Addressing the problem of Chinese news summary generation and the application of the Transformer architecture to Chinese text, this paper proposes CNsum, a Chinese news headline generation model based on the Transformer architecture, and evaluates it on Chinese datasets such as THUCNews. Experimental results show that CNsum achieves higher ROUGE, BLEU, and BERTScore scores than the baseline models, verifying the model's superior performance.

Keywords: Abstractive summarization, Pre-trained language model, Seq2Seq, Chinese news headlines
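The abstract reports ROUGE among the evaluation metrics. As an illustrative sketch (not the paper's actual evaluation code), ROUGE-N compares n-gram overlap between a candidate summary and a reference; for Chinese text, tokenization is often done at the character level. The example text below is hypothetical.

```python
from collections import Counter

def rouge_n(reference, candidate, n=1):
    """Compute ROUGE-N recall, precision, and F1 between two token lists."""
    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    ref_ngrams = ngrams(reference, n)
    cand_ngrams = ngrams(candidate, n)
    # Clipped overlap: each n-gram is matched at most as many times
    # as it occurs in the reference.
    overlap = sum((ref_ngrams & cand_ngrams).values())
    recall = overlap / max(sum(ref_ngrams.values()), 1)
    precision = overlap / max(sum(cand_ngrams.values()), 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-12)
    return recall, precision, f1

# Character-level tokenization of a hypothetical reference/candidate pair.
reference = list("北京举办冬季奥运会")
candidate = list("北京举办奥运会")
r, p, f = rouge_n(reference, candidate, n=1)
```

Here all seven candidate characters appear in the nine-character reference, so recall is 7/9 and precision is 1.0; BLEU and BERTScore would be computed analogously over the same candidate/reference pairs.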
