Abstract

Aspect-Based Sentiment Classification (ABSC) models often suffer from a lack of training data in some domains. To exploit abundant data from another domain, this work extends the state-of-the-art LCR-Rot-hop++ model, a neural network with a rotatory attention mechanism, to a cross-domain setting. More specifically, we propose a Domain-Independent Word Selector (DIWS) model that is used in combination with the LCR-Rot-hop++ model (DIWS-LCR-Rot-hop++). DIWS-LCR-Rot-hop++ uses attention weights from a domain classification task to determine whether a word is domain-specific or domain-independent, and discards domain-specific words when training and testing the LCR-Rot-hop++ model for cross-domain ABSC. Overall, our results confirm that DIWS-LCR-Rot-hop++ outperforms the original LCR-Rot-hop++ model in a cross-domain setting, provided we impose an optimal domain-dependent attention threshold for deciding whether a word is domain-specific or domain-independent. For a target domain that is highly similar to the source domain, we find that imposing a moderate restriction on classifying words as domain-independent yields the best performance. In contrast, a dissimilar target domain requires a strict restriction that classifies only a small proportion of words as domain-independent. We also observe information loss that deteriorates the performance of DIWS-LCR-Rot-hop++ when an excessive number of words are categorized as domain-specific and discarded.
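To make the word-selection step concrete, the sketch below illustrates one plausible reading of the thresholding idea: per-token attention weights from the auxiliary domain classification task are compared against a threshold, and highly attended (domain-specific) tokens are dropped before the sentence is passed to LCR-Rot-hop++. The function name, the normalization of the weights, and the direction of the threshold are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def select_domain_independent_words(tokens, domain_attention, threshold):
    """Hypothetical sketch of DIWS-style word selection.

    tokens           : word tokens of a sentence
    domain_attention : per-token attention weights from the domain
                       classification task (assumed non-negative)
    threshold        : attention value above which a token is treated as
                       domain-specific; a stricter (lower) threshold keeps
                       fewer tokens as domain-independent
    """
    keep_mask = np.asarray(domain_attention) < threshold
    # Tokens with high domain-classification attention are discarded as
    # domain-specific; the remaining tokens would be fed to LCR-Rot-hop++.
    return [tok for tok, keep in zip(tokens, keep_mask) if keep]

# Example: with a moderate threshold, only the strongly domain-indicative
# token ("battery") is removed.
tokens = ["the", "battery", "lasts", "long"]
attention = [0.05, 0.70, 0.15, 0.10]
print(select_domain_independent_words(tokens, attention, threshold=0.5))
# -> ['the', 'lasts', 'long']
```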
