Abstract

Text classification is a fundamental problem in Natural Language Processing (NLP), and a variety of approaches have been proposed alongside the development of deep learning. Despite recent progress, most existing approaches treat multi-class text classification in a flat way, assuming that text labels are semantically independent. This assumption does not always hold in realistic settings, since hierarchical or within-layer dependencies usually exist in the latent label space, especially for larger label sets. In this paper, we propose a label clustering algorithm to exploit the underlying structure of label relations and express the stacked concept relationships as a two-layer label space. We then present two neural network structures to capture inter-layer and intra-layer label relations. The first model, HSNN, organizes a group of local classifiers hierarchically according to the exploited label space, while the second model, LSNN, takes advantage of text representations at different granularity levels and bidirectional inference with recurrent connections to make predictions. Finally, we evaluate our methods on three benchmark datasets. The results empirically demonstrate that both models can leverage the exploited label relations to improve text classification performance.
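As a rough illustration of what a two-layer label space looks like, the sketch below groups fine-grained labels under coarse parent clusters and routes a prediction through the hierarchy. This is a minimal sketch under assumptions, not the paper's algorithm: the actual clustering procedure and the HSNN/LSNN architectures are not reproduced here, and KMeans over hypothetical label embeddings is a stand-in for the proposed label clustering.

```python
# Hypothetical sketch: build a two-layer label space by clustering label
# embeddings, then predict hierarchically (coarse cluster first, then a
# fine label within it). KMeans is an assumed stand-in for the paper's
# label clustering algorithm; all names here are illustrative.
import numpy as np
from sklearn.cluster import KMeans


def build_two_layer_label_space(label_embeddings: np.ndarray, n_clusters: int):
    """Group fine-grained labels into coarse parent clusters."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    parents = km.fit_predict(label_embeddings)  # parent cluster id per label
    children = {c: np.where(parents == c)[0].tolist() for c in range(n_clusters)}
    return parents, children


def hierarchical_predict(coarse_scores: np.ndarray,
                         fine_scores: np.ndarray,
                         children: dict) -> int:
    """Pick the best coarse cluster, then the best fine label inside it."""
    cluster = int(np.argmax(coarse_scores))
    members = children[cluster]
    return members[int(np.argmax(fine_scores[members]))]


# Toy usage: 20 labels with random 50-d embeddings, grouped into 4 clusters.
rng = np.random.default_rng(0)
emb = rng.normal(size=(20, 50))
parents, children = build_two_layer_label_space(emb, n_clusters=4)
label = hierarchical_predict(rng.normal(size=4), rng.normal(size=20), children)
print(children, label)
```

In a full system, the coarse and fine scores would come from the learned classifiers rather than random vectors; the point of the sketch is only the two-layer routing structure.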
