Abstract

With the explosive growth of electronic data, text has become one of the most important information carriers and the most common form in which data is presented. Text classification touches many fields, including artificial intelligence and pattern recognition, and therefore carries both academic significance and commercial value. How to make a computer accurately locate the effective information in text and classify it automatically has become an active research topic. In this study, we propose the BERTGACN model, which combines a large-scale pre-trained model with a double-tower Graph Neural Network (GNN) structure for text classification. BERTGACN constructs a heterogeneous graph in which both documents and words are represented as nodes. By jointly training the Bidirectional Encoder Representations from Transformers (BERT) module, the Graph Convolutional Network (GCN) module, and the Graph Attention Network (GAT) module, BERTGACN effectively learns the structural information of the graph and the associations between nodes, which enhances its text classification ability. Experiments show that BERTGACN outperforms previous models on a wide range of text classification tasks.
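The abstract only names the components, so the following is a minimal sketch of one way a BERT + GCN + GAT combination over a document-word graph could be wired together, assuming PyTorch, HuggingFace transformers, and torch_geometric. The class name, hidden sizes, the interpolation weight lam, and the averaging of the two graph branches are illustrative assumptions (loosely following BertGCN-style fusion), not the paper's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel
from torch_geometric.nn import GCNConv, GATConv


class BertGraphClassifier(nn.Module):
    """Illustrative BERT + GCN/GAT text classifier (not the paper's code).

    Document nodes are initialized from BERT [CLS] embeddings; word nodes
    carry precomputed features (e.g. zeros or word vectors). Two graph
    branches -- a GCN and a GAT -- propagate over the document-word graph,
    and their logits are interpolated with the BERT logits.
    """

    def __init__(self, num_classes, hidden=256, heads=4, lam=0.7):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        dim = self.bert.config.hidden_size
        self.bert_head = nn.Linear(dim, num_classes)
        self.gcn1 = GCNConv(dim, hidden)
        self.gcn2 = GCNConv(hidden, num_classes)
        self.gat1 = GATConv(dim, hidden // heads, heads=heads)
        self.gat2 = GATConv(hidden, num_classes, heads=1)
        self.lam = lam  # assumed weight between graph and BERT predictions

    def forward(self, input_ids, attention_mask, node_feats, edge_index, doc_idx):
        # BERT encodes the raw documents; [CLS] pooling gives doc vectors.
        cls = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state[:, 0]
        bert_logits = self.bert_head(cls)

        # Refresh document-node features with the current BERT output.
        x = node_feats.clone()
        x[doc_idx] = cls

        # Double-tower graph encoding: a GCN branch and a GAT branch.
        g = self.gcn2(F.relu(self.gcn1(x, edge_index)), edge_index)
        a = self.gat2(F.elu(self.gat1(x, edge_index)), edge_index)
        graph_logits = (g[doc_idx] + a[doc_idx]) / 2

        # Interpolate the graph and BERT predictions for the documents.
        return self.lam * graph_logits + (1 - self.lam) * bert_logits
```

In this sketch, joint training happens naturally: one cross-entropy loss on the fused logits back-propagates through both graph towers and the BERT encoder at once. How the paper actually fuses the three modules may differ.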
