Abstract

Text classification has attracted increasing attention with the growth of textual data on the Internet. Deep neural networks have made great progress in domains where a large amount of labeled training data is available. However, providing sufficient labeled data is time-consuming and labor-intensive, posing substantial obstacles to extending learned models to new domains or tasks. In this paper, we investigate the transfer capability of capsule networks for text classification. Capsule networks can capture intrinsic spatial part-whole relationships that constitute domain-invariant knowledge, bridging the knowledge gap between the source and target domains (or tasks). We propose an iterative adaptation strategy for cross-domain text classification that adapts the source domain to the target domain. We also design a fast training method with capsule compression and class-guided routing to make the capsule network computationally more efficient for cross-domain text classification. We first evaluate the capsule network on six benchmark datasets for generic text classification; it outperforms the compared models on four of the six datasets, suggesting the effectiveness of capsule networks for text classification. More importantly, we demonstrate the transfer capability of the proposed cross-domain capsule network (TL-Capsule) by applying it to two transfer learning applications: single-label to multi-label text classification and cross-domain sentiment classification. The experimental results show that capsule networks consistently and substantially outperform the compared methods on both tasks. To the best of our knowledge, this is the first work that empirically investigates the transfer capability of capsule networks for text modeling.
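
For readers unfamiliar with capsule routing, the following is a minimal, hypothetical sketch of the standard dynamic routing-by-agreement procedure (Sabour et al., 2017) that capsule networks build on. The abstract does not detail the paper's capsule compression or class-guided routing, so those are not reproduced here; all function names, shapes, and iteration counts below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of dynamic routing-by-agreement between two capsule
# layers (NumPy). This is generic capsule routing, NOT the paper's
# class-guided routing, whose details are not given in the abstract.
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squash nonlinearity: scales vector length into [0, 1) while preserving direction."""
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iters=3):
    """u_hat: prediction vectors, shape (num_in, num_out, dim_out)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))  # routing logits, refined each iteration
    for _ in range(num_iters):
        # Coupling coefficients: softmax over output capsules for each input capsule
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum per output capsule
        v = squash(s)                            # output capsule vectors
        b += (u_hat * v[None, ...]).sum(axis=-1) # agreement strengthens coupling
    return v

# Example: route 6 input capsules to 3 class capsules of dimension 8
v = dynamic_routing(np.random.randn(6, 3, 8))
print(v.shape)  # (3, 8)
```

The agreement step increases the coupling between an input capsule and whichever output capsule its prediction aligns with; this iterative assignment is the mechanism usually credited with capturing part-whole structure.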
