Abstract
Multi-Label Text Classification (MLTC) is one of the most important research topics in natural language processing. Although Deep Learning (DL) models have been widely applied to MLTC, several drawbacks remain. First, traditional DL models use all the words in a document to construct the embedding vector, even though many of these words do not affect the classification result. Second, the labels in MLTC carry specific semantics, yet traditional DL models ignore fine-grained matching signals between words and labels. Third, traditional DL models have difficulty handling the data imbalance common in MLTC datasets. In addition, during training, small errors in a given epoch may be amplified as the number of iterations increases, resulting in classification errors. To address these problems, an MLTC model integrating Label Attention and Historical Attention (LAHA) is proposed. First, a word filter selects important words based on the cosine similarity between words and labels. Next, Document Self-Attention (DSA) and Label Attention (LA) are computed, and two co-attention networks are constructed: DSA-attended LA co-attention (LA-co) and LA-attended DSA co-attention (DSA-co). Then, the fine-grained matching signals between words and labels are integrated through the adaptive fusion of LA-co and DSA-co. Finally, Historical Attention is integrated into LAHA, which both avoids misclassification caused by minor errors in a single epoch and reduces overfitting to high-frequency labels. Extensive comparative experiments on four benchmark datasets demonstrate that LAHA outperforms state-of-the-art baseline models and effectively mitigates the data imbalance issue in MLTC datasets. Our code is available at https://github.com/sgysgywaityou/LAHA.
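To make the pipeline concrete, the sketch below illustrates two of the steps described above in PyTorch: the cosine-similarity word filter and the adaptive fusion of the two co-attention outputs. All names, shapes, and the threshold value are illustrative assumptions rather than the authors' implementation; see the linked repository for the actual code.

```python
import torch
import torch.nn.functional as F

def filter_words(word_emb: torch.Tensor, label_emb: torch.Tensor,
                 threshold: float = 0.3):
    """Keep only words whose best cosine similarity to any label exceeds
    a threshold. word_emb: (seq_len, d); label_emb: (num_labels, d).
    The threshold of 0.3 is a placeholder assumption."""
    # Pairwise cosine similarity between every word and every label;
    # broadcasting yields a (seq_len, num_labels) similarity matrix.
    sim = F.cosine_similarity(word_emb.unsqueeze(1),
                              label_emb.unsqueeze(0), dim=-1)
    keep = sim.max(dim=1).values > threshold  # (seq_len,) boolean mask
    return word_emb[keep], keep

def adaptive_fusion(la_co: torch.Tensor, dsa_co: torch.Tensor,
                    gate: torch.nn.Linear) -> torch.Tensor:
    """One plausible reading of the adaptive fusion of LA-co and DSA-co:
    a learned sigmoid gate interpolates between the two signals."""
    g = torch.sigmoid(gate(torch.cat([la_co, dsa_co], dim=-1)))
    return g * la_co + (1 - g) * dsa_co

# Minimal usage with random tensors standing in for real embeddings.
words = torch.randn(50, 300)    # 50 words, 300-d embeddings
labels = torch.randn(10, 300)   # 10 labels
kept, mask = filter_words(words, labels)
gate = torch.nn.Linear(2 * 300, 300)
fused = adaptive_fusion(torch.randn(10, 300), torch.randn(10, 300), gate)
```

Historical Attention could analogously be sketched as an exponential moving average of attention weights across training epochs, which would smooth out the per-epoch errors the abstract describes; that reading is likewise an assumption.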