Abstract
Multi-domain sentiment classification is a challenging topic in natural language processing, in which data from multiple domains are used to improve classification performance. Recently, attention-based neural networks have been shown to perform strongly on this task. In the present study, we propose a collaborative attention neural network (CAN). In our approach, a self-attention module and a domain attention module work together: the hidden states generated by the self-attention module are fed into both the domain sub-module and the sentiment sub-module of the domain attention module. In contrast to other attention-based networks, we use two types of attention modules to handle the auxiliary task and the main sentiment classification task. The experimental results showed that CAN outperforms other state-of-the-art sentiment classification approaches in terms of overall accuracy on both English (Amazon) and Chinese (JD) multi-domain sentiment analysis data sets.
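For readers who want a concrete picture of the collaborative structure described above, the following is a minimal PyTorch sketch. The class name CollaborativeAttentionSketch, the dimensions, and the attention-pooling scheme for the two sub-modules are illustrative assumptions, not the authors' exact architecture; it only shows shared self-attention hidden states feeding an auxiliary (domain) head and a main (sentiment) head.

```python
# Minimal sketch of the collaborative attention idea (assumed details,
# not the paper's exact model): a self-attention module produces shared
# hidden states, which two attention-pooling sub-modules read for the
# auxiliary domain task and the main sentiment task.
import torch
import torch.nn as nn


class CollaborativeAttentionSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_heads=4,
                 num_domains=4, num_sentiments=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Self-attention module: produces hidden states shared by both tasks.
        self.self_attn = nn.MultiheadAttention(embed_dim, num_heads,
                                               batch_first=True)
        # Domain attention module: learned query vectors for attention
        # pooling in the domain and sentiment sub-modules.
        self.domain_query = nn.Parameter(torch.randn(embed_dim))
        self.senti_query = nn.Parameter(torch.randn(embed_dim))
        self.domain_head = nn.Linear(embed_dim, num_domains)    # auxiliary task
        self.senti_head = nn.Linear(embed_dim, num_sentiments)  # main task

    def _attend(self, query, hidden):
        # Attention pooling: weight each token's hidden state by its
        # similarity to a learned query vector, then sum over tokens.
        scores = hidden @ query                     # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)
        return (weights.unsqueeze(-1) * hidden).sum(dim=1)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        hidden, _ = self.self_attn(x, x, x)         # shared hidden states
        domain_logits = self.domain_head(self._attend(self.domain_query, hidden))
        senti_logits = self.senti_head(self._attend(self.senti_query, hidden))
        return domain_logits, senti_logits


# Usage: both heads are trained jointly, e.g. with a weighted sum of the
# auxiliary (domain) and main (sentiment) cross-entropy losses.
model = CollaborativeAttentionSketch(vocab_size=10000)
tokens = torch.randint(0, 10000, (8, 32))           # dummy batch of token ids
domain_logits, senti_logits = model(tokens)
```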