Abstract

Multi-label text classification (MLTC) is a fundamental yet challenging task in natural language processing. Existing MLTC models mostly learn text representations and label correlations separately, ignoring the instance-level correlations that are crucial for classification. To address this, we propose a new multi-label contrastive learning model that captures instance-level correlations for the MLTC task. Specifically, we first learn label representations by applying a Graph Convolutional Network (GCN) to label co-occurrence graphs. We then learn text representations that take label correlations into consideration. Through an attention mechanism, instance-level correlations are established. To better exploit label correlations, we propose a new contrastive learning model, guided by a new learning objective, to further refine the label representations. Finally, we apply a k-NN mechanism that identifies the k nearest neighbors of a given text for the final prediction. Extensive experiments on benchmark multi-label datasets demonstrate the effectiveness of our approach.
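As a rough illustration of the k-NN prediction step mentioned in the abstract, the sketch below retrieves the k nearest training texts in embedding space and averages their multi-label vectors into per-label scores. The function name, the cosine-similarity choice, and the toy data are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def knn_predict(query_emb, train_embs, train_labels, k=3):
    """Score labels for a query text by averaging the label vectors
    of its k nearest training neighbors (a hypothetical sketch)."""
    # Cosine similarity between the query and every training embedding.
    q = query_emb / np.linalg.norm(query_emb)
    t = train_embs / np.linalg.norm(train_embs, axis=1, keepdims=True)
    sims = t @ q
    # Indices of the k most similar training instances.
    top_k = np.argsort(sims)[::-1][:k]
    # Average the neighbors' binary label vectors into per-label scores.
    return train_labels[top_k].mean(axis=0)

# Toy data: 4 training texts with 8-dim embeddings and 3 candidate labels.
rng = np.random.default_rng(0)
train_embs = rng.normal(size=(4, 8))
train_labels = np.array([[1, 0, 1],
                         [1, 0, 0],
                         [0, 1, 0],
                         [0, 1, 1]], dtype=float)
scores = knn_predict(train_embs[0], train_embs, train_labels, k=2)
print(scores)  # one score in [0, 1] per label
```

In practice the text embeddings would come from the label-aware encoder the abstract describes, and the averaged scores would be thresholded to produce the final label set.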
