Abstract
Text classification is an important research area in natural language processing. Traditional CNN and RNN models handle multi-label classification of long texts inefficiently because of their limitations with long-range temporal dependencies and spatial displacement. This paper therefore proposes a long-text classification model with an improved Transformer attention mechanism and combines it with the LDA topic classification algorithm to achieve multi-label classification of document-length texts. The paper first reviews existing industry approaches to multi-label classification of long texts; compared with traditional techniques such as truncation and pooling, the proposed combination of an LDA topic model with an improved Transformer-XL text classification model extracts fine-grained text features and thus classifies texts more accurately. Finally, comparative experiments show that the proposed solution yields significant improvements in precision (P), recall (R), and F1 score over traditional classification methods on long-text multi-label classification tasks.
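To make the described pipeline concrete, the following is a minimal sketch (not the authors' implementation) of how per-document LDA topic distributions can be fused with transformer-based document embeddings and fed to a multi-label classifier. The helper transformer_embed is a hypothetical placeholder standing in for a long-text encoder such as Transformer-XL, and all parameter values are illustrative assumptions.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

def transformer_embed(docs):
    # Hypothetical stand-in: return a (n_docs, dim) array of document
    # embeddings from a long-text encoder such as Transformer-XL.
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(docs), 128))  # placeholder vectors

def build_features(docs, n_topics=20):
    # Bag-of-words counts feed the LDA topic model.
    vectorizer = CountVectorizer(max_features=5000, stop_words="english")
    counts = vectorizer.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    topic_dist = lda.fit_transform(counts)   # per-document topic mixture
    emb = transformer_embed(docs)            # per-document dense embedding
    return np.hstack([topic_dist, emb])      # fused feature vector

def train_multilabel(docs, labels):
    # labels: binary indicator matrix of shape (n_docs, n_labels)
    X = build_features(docs)
    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
    clf.fit(X, labels)
    return clf

In this sketch the topic mixture supplies coarse, document-level thematic signals while the dense embedding carries fine-grained contextual features; any multi-label classifier could replace the one-vs-rest logistic regression used here for illustration.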