Abstract
Recently, attention-based neural networks (NNs) have been widely used for aspect-level sentiment classification (ASC). Most neural models focus on incorporating the aspect representation into attention; however, the position information of each aspect has not been well studied. Furthermore, existing ASC datasets are relatively small owing to labor-intensive labeling, which largely limits the performance of NNs. In this paper, we propose a position-aware hierarchical transfer (PAHT) model that captures position information at multiple levels and enhances ASC performance by transferring hierarchical knowledge from a resource-rich sentence-level sentiment classification (SSC) dataset. We first present aspect-based positional attention at the word and segment levels to capture information more salient to a given aspect. To compensate for the limited data for ASC, we devise three sampling strategies to select related instances from the large-scale SSC dataset for pre-training, and we transfer the learned knowledge into ASC at four levels: embedding, word, segment, and classifier. Extensive experiments on four benchmark datasets demonstrate that the proposed model is effective in improving ASC performance; in particular, it outperforms state-of-the-art approaches in accuracy on all the datasets considered.
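As a rough illustration of the word-level positional attention the abstract describes, the sketch below combines a relative-position weight with aspect-conditioned attention over contextual word states. The function name, the linear 1 - d/n distance decay, and all tensor shapes are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def position_aware_attention(word_states, aspect_vec, positions, aspect_pos):
    """Hypothetical sketch of aspect-based positional attention.

    word_states: (seq_len, hidden) contextual word representations
    aspect_vec:  (hidden,)         pooled aspect representation
    positions:   (seq_len,)        word indices within the sentence
    aspect_pos:  int               index of the aspect term
    """
    # Relative-distance weight: words nearer the aspect get higher weight.
    # The linear 1 - d/seq_len decay is an assumption for illustration.
    dist = (positions - aspect_pos).abs().float()
    pos_weight = 1.0 - dist / word_states.size(0)

    # Aspect-conditioned attention scores, modulated by the position weights.
    scores = word_states @ aspect_vec               # (seq_len,)
    alpha = F.softmax(scores * pos_weight, dim=0)   # attention distribution

    # Weighted sum yields an aspect-specific sentence representation.
    return alpha @ word_states                      # (hidden,)

# Example usage with random inputs:
h = torch.randn(10, 64)                 # 10 words, hidden size 64
a = torch.randn(64)                     # aspect representation
out = position_aware_attention(h, a, torch.arange(10), aspect_pos=3)
```

The same weighting idea would apply at the segment level by treating pooled segment representations, rather than individual words, as the attended units.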