Abstract
This paper introduces an enhanced BERT-DPCNN model for Chinese news text classification. The model addresses a common challenge in existing classifiers: balancing accuracy against computational efficiency on large-scale, high-dimensional text data. To tackle this issue, the proposed model combines BERT's pre-trained language representations with DPCNN's efficient convolutional structure, capturing both deep semantic information and key local features of the text. In addition, the zebra optimization algorithm (ZOA) is incorporated to tune the model's hyperparameters dynamically, overcoming the limitations of manual tuning in traditional models. By automatically optimizing hyperparameters such as the batch size, learning rate, and number of filters, ZOA significantly improves classification performance. Experimental results on the THUCNews Chinese news dataset show that the ZOA-BERT-DPCNN model outperforms traditional methods, verifying its effectiveness for news text classification and demonstrating its potential to further improve classification performance.
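The hyperparameter search described above can be sketched as follows. This is a simplified, illustrative ZOA-style loop (candidates move toward the current best, the "pioneer zebra"), not the authors' implementation; the `toy_loss` function, the bounds, and all parameter names are assumptions standing in for a real validation loss computed by training the BERT-DPCNN model.

```python
import random

def zoa_search(fitness, bounds, pop_size=10, iters=50, seed=0):
    """Simplified zebra-style optimizer (minimization).
    `bounds` is a list of (low, high) per hyperparameter."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [fitness(x) for x in pop]
    best = min(range(pop_size), key=lambda i: scores[i])
    for _ in range(iters):
        pioneer = pop[best][:]  # current best candidate
        for i in range(pop_size):
            cand = []
            for d, (lo, hi) in enumerate(bounds):
                r, I = rng.random(), rng.choice((1, 2))
                # foraging move toward the pioneer, clipped to the bounds
                v = pop[i][d] + r * (pioneer[d] - I * pop[i][d])
                cand.append(min(max(v, lo), hi))
            s = fitness(cand)
            if s < scores[i]:  # greedy acceptance
                pop[i], scores[i] = cand, s
        best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy stand-in for validation loss over (learning_rate, batch_size, n_filters);
# a real run would train the BERT-DPCNN model and return its validation loss.
def toy_loss(x):
    lr, bs, nf = x
    return (lr - 2e-5) ** 2 * 1e8 + (bs - 32) ** 2 / 100 + (nf - 250) ** 2 / 1e4

best_hp, best_score = zoa_search(
    toy_loss, [(1e-6, 1e-4), (8, 128), (50, 500)]
)
```

Because acceptance is greedy, the best score is non-increasing over iterations, which makes the search robust even with a small population; in practice the batch size and filter count would be rounded to integers before each training run.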
Journal: Concurrency and Computation: Practice and Experience