Abstract

The purpose of cross-domain opinion classification is to leverage useful information acquired from a source domain to train a classifier for opinion classification in a target domain that has a large amount of unlabeled data. An opinion classifier trained on one domain usually performs poorly when applied directly to another, and annotating data for every domain is a laborious and costly process. Most existing approaches focus on identifying domain-invariant features. Unfortunately, they fail to properly capture the context within sentences and make poor use of the unlabeled data. To address this issue, we propose an aspect-based attention model for cross-domain opinion classification. By incorporating knowledge of both aspects and sentences, the proposed model provides a mechanism for better transferring opinion information across domains. We introduce two learning networks: the first recognizes features shared between domains, while the second extracts information from aspects by using shared words as a bridge. We leverage BERT and a bidirectional gated recurrent unit to obtain a deep understanding of the text and its deep-level semantic information. Further, a joint attention learning mechanism is applied across these two learning modules so that aspects and sentences jointly shape the resulting opinion representation. In addition, we introduce a gradient reversal layer to learn domain-invariant features. Comprehensive experiments on the Amazon multi-domain product datasets demonstrate the effectiveness and significance of the proposed model over state-of-the-art techniques.
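The gradient reversal layer mentioned above is a standard component of domain-adversarial training: it acts as the identity during the forward pass, but negates (and scales) gradients during backpropagation so that the feature extractor is pushed toward domain-invariant representations. A minimal NumPy sketch of this behavior is shown below; the class name and the scaling factor `lam` are illustrative assumptions, not the paper's actual implementation, which would rely on an autograd framework.

```python
import numpy as np

class GradientReversal:
    """Sketch of a gradient reversal layer (GRL).

    Forward: identity, features flow to the domain classifier unchanged.
    Backward: gradients are multiplied by -lam, so minimizing the domain
    classifier's loss *maximizes* domain confusion in the feature extractor.
    """

    def __init__(self, lam: float = 1.0):
        self.lam = lam  # reversal strength (hypothetical hyperparameter name)

    def forward(self, x: np.ndarray) -> np.ndarray:
        # Identity in the forward pass.
        return x

    def backward(self, grad_output: np.ndarray) -> np.ndarray:
        # Reverse and scale the incoming gradient.
        return -self.lam * grad_output


# Usage sketch: features pass through unchanged, gradients come back reversed.
grl = GradientReversal(lam=0.5)
feats = np.array([1.0, -2.0, 3.0])
out = grl.forward(feats)          # identical to feats
grad = grl.backward(np.ones(3))   # [-0.5, -0.5, -0.5]
```

In a full model, this layer sits between the shared feature extractor and the domain discriminator, so the extractor learns features that the discriminator cannot use to tell the source and target domains apart.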

