Abstract

Training data in a specific domain are often insufficient for text sentiment classification. Cross-domain sentiment classification (CDSC) extends the application scope of transfer learning to text-based social media and effectively addresses the scarcity of labeled data in specific domains. This paper therefore proposes a CDSC method based on parameter transferring and an attention sharing mechanism (PTASM); the architecture comprises a source domain network (SDN) and a target domain network (TDN). First, hierarchical attention networks are constructed with pre-trained language models, such as global vectors for word representation (GloVe) and bidirectional encoder representations from transformers (BERT). Parameter transferring mechanisms at the word and sentence levels are introduced for model transfer. Then, parameter transfer and fine-tuning techniques transfer network parameters from the SDN to the TDN. Moreover, sentiment attention serves as a bridge for sentiment transfer across domains. Finally, word- and sentence-level attention mechanisms are introduced, and sentiment attention is shared at both levels across domains. Extensive experiments show that the PTASM-BERT method achieves state-of-the-art results on Amazon review cross-domain datasets.
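The word- and sentence-level parameter transfer described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the parameter names, dictionary representation, and the `transfer_parameters` helper are all assumptions made for illustration, with scalar values standing in for weight tensors.

```python
# Hypothetical sketch of the parameter-transfer step: word- and
# sentence-level parameters trained in the source domain network (SDN)
# initialize the target domain network (TDN), which is then fine-tuned.
# All parameter names below are illustrative, not from the paper.

def transfer_parameters(sdn_params, tdn_params, levels=("word", "sentence")):
    """Copy SDN parameters at the given levels into a copy of the TDN."""
    transferred = dict(tdn_params)  # keep TDN-specific parameters
    for name, value in sdn_params.items():
        if any(name.startswith(level) for level in levels):
            transferred[name] = value  # initialize TDN from SDN
    return transferred

# Toy parameter dictionaries (scalars stand in for weight tensors).
sdn = {"word.attention": 0.9, "sentence.attention": 0.7, "classifier": 0.5}
tdn = {"word.attention": 0.0, "sentence.attention": 0.0, "classifier": 0.1}

tdn = transfer_parameters(sdn, tdn)
# Word- and sentence-level attention now come from the source domain,
# while the classifier head stays target-specific for fine-tuning.
```

In this sketch only the attention parameters cross domains; the remaining target-specific parameters would be fine-tuned on target-domain data, mirroring the transfer-then-fine-tune procedure the abstract describes.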
