Abstract
News text classification has long been a central and challenging focus of text classification research. With the maturation of deep learning and the onset of the information-explosion era, traditional text classification methods can no longer meet the demand for fast, efficient, and accurate news classification. Neural networks based on bidirectional encoder representations from transformers (BERT) are well suited to news text classification. However, the random masking strategy adopted by the BERT model does not incorporate knowledge of the Chinese language, which lowers classification accuracy. To address this problem, this paper proposes a news topic classification method based on the ERNIE model with a multi-stage fused knowledge-masking strategy. On the Chinese news text classification task, this model, which fuses Chinese linguistic knowledge, is compared against the BERT, BERT-CNN, and ConvBERT models. The results show that the ERNIE model outperforms the other models on news text classification.

Keywords: Natural language processing; Text classification; Deep learning; Knowledge enhancement; ERNIE model; BERT model
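The difference between BERT's random masking and ERNIE's knowledge masking can be illustrated with a minimal sketch. The example sentence, its entity spans, and the function names are illustrative assumptions, not the paper's implementation: BERT masks individual tokens independently, so a multi-character Chinese entity may be only partially hidden, while ERNIE-style knowledge masking hides an entire entity span at once, forcing the model to recover it from context.

```python
import random

random.seed(0)
MASK = "[MASK]"

def random_token_mask(tokens, ratio=0.15):
    # BERT-style masking: each masked position is chosen independently,
    # so an entity such as 哈尔滨 (Harbin) can be masked only in part.
    out = list(tokens)
    n = max(1, int(len(tokens) * ratio))
    for i in random.sample(range(len(tokens)), n):
        out[i] = MASK
    return out

def entity_level_mask(tokens, entities):
    # ERNIE-style knowledge masking (simplified): pick one annotated
    # entity span and mask every token in it, so the model must use
    # world knowledge and context to reconstruct the whole entity.
    out = list(tokens)
    start, end = random.choice(entities)  # (start, end), end exclusive
    for i in range(start, end):
        out[i] = MASK
    return out

tokens = list("哈尔滨是黑龙江的省会")      # "Harbin is the capital of Heilongjiang"
entities = [(0, 3), (4, 7)]               # hypothetical spans for 哈尔滨 and 黑龙江
masked = entity_level_mask(tokens, entities)
```

With entity-level masking, the masked positions always form one contiguous entity span, whereas random token masking scatters masks across the sentence.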