Abstract

Cross-domain sentiment classification transfers knowledge from a source domain to a target domain that lacks supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections between domains by manually extracting domain-invariant features. However, such methods adapt poorly when bridging different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaptation to improve the adaptability of cross-domain sentiment classification. It combines the learned content of the source domain with the topic information of the target domain to improve the domain adaptability of the model. Because information is unevenly distributed in the combined data, we apply a two-layer adaptive attention mechanism for classification. A shallow attention layer weighs the important features of the combined data. Inspired by active learning, we propose a deep domain adaptation layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods, and that T-LBERT delivers stable classification performance across multiple metrics.
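The shallow attention layer described above can be illustrated with a minimal sketch: token-level features from the source-domain content and target-domain topic information are stacked, and a learned query scores each row so that important features receive higher weight in the pooled representation. All names, dimensions, and the single-query attention form here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def shallow_attention(features, query):
    # features: (n, dim) combined feature rows; query: (dim,) learned vector (hypothetical)
    weights = softmax(features @ query)   # per-row importance, sums to 1
    return weights @ features, weights    # attention-pooled vector

rng = np.random.default_rng(0)
source_content = rng.normal(size=(8, 16))   # e.g. encoder output for a source review
target_topic = rng.normal(size=(4, 16))     # e.g. topic features of the target domain
combined = np.vstack([source_content, target_topic])  # unbalanced combined data
query = rng.normal(size=16)

pooled, weights = shallow_attention(combined, query)
print(pooled.shape)  # (16,)
```

The pooled vector would then feed the deeper domain adaptation layer; the key point is only that attention re-weights the combined, unevenly informative rows before classification.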
