Abstract

Though great progress has been made on the Aspect-Based Sentiment Analysis (ABSA) task, most previous work focuses on English, and there are few efforts on other languages, mainly due to the lack of training data. In this paper, we propose an approach to the Cross-Lingual Aspect Sentiment Classification (CLASC) task, which leverages the rich resources in one language (the source language) for aspect sentiment classification in an under-resourced language (the target language). Specifically, we first build a bilingual lexicon for the domain-specific training data to translate the aspect categories annotated in the source-language corpus, and then translate sentences from the source language to the target language via Machine Translation (MT) tools. However, most MT systems are general-purpose, which unavoidably introduces translation ambiguities that degrade the performance of CLASC. In this context, we propose a novel approach called Reinforced Transformer with Cross-Lingual Distillation (RTCLD), combined with target-sensitive adversarial learning, to minimize the undesirable effects of translation ambiguities in sentence translation. We conduct experiments on different language combinations, treating English as the source language and Chinese, Russian, and Spanish as target languages. The experimental results show that our proposed approach outperforms state-of-the-art methods on the different target languages.
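
The data-preparation step described above is not spelled out further in this summary; the snippet below is only a minimal sketch of it under stated assumptions: a hypothetical `translate` callable wrapping any general-purpose MT system, and an `aspect_lexicon` dictionary standing in for the bilingual lexicon. It illustrates how translated sentences, mapped aspect categories, and unchanged polarity labels would be assembled into a target-language training set.

```python
# Minimal sketch of the corpus-translation step (hypothetical helpers; not the authors' code).
# Assumes each source-language example is (sentence, aspect_category, polarity).
from typing import Callable, Dict, List, Tuple

Example = Tuple[str, str, str]  # (sentence, aspect_category, polarity)

def build_target_corpus(
    source_corpus: List[Example],
    aspect_lexicon: Dict[str, str],   # source aspect category -> target-language category
    translate: Callable[[str], str],  # wrapper around any general-purpose MT system
) -> List[Example]:
    """Translate sentences with MT and map aspect categories with the bilingual lexicon.

    Polarity labels are language-independent, so they are copied unchanged.
    """
    target_corpus = []
    for sentence, aspect, polarity in source_corpus:
        target_sentence = translate(sentence)               # may introduce translation ambiguities
        target_aspect = aspect_lexicon.get(aspect, aspect)  # fall back to the source label if unmapped
        target_corpus.append((target_sentence, target_aspect, polarity))
    return target_corpus
```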

Highlights

  • Aspect Sentiment Classification (ASC) aims to identify fine-grained polarity towards a specific aspect category

  • BERT (Devlin et al. [19], 2018): the source-language training set is translated into the target language, and BERT is fine-tuned on the translated dataset for CLASC (see the sketch after this list)

  • The results show that mBERT performs cross-lingual generalization well and achieves performance almost comparable to Machine Translation (MT)-based methods, which indicates that multilingual representations are effective for cross-lingual tasks
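
The BERT highlight above corresponds to a standard translate-then-fine-tune pipeline. The sketch below shows one plausible way to fine-tune multilingual BERT on translated (sentence, aspect category) pairs with the Hugging Face `transformers` library; it is not the authors' code. The model name, label map, and hyperparameters are assumptions, and `target_corpus` is assumed to have the shape produced by the translation sketch above.

```python
# Hedged sketch of the MT-then-fine-tune baseline (not the authors' code).
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"  # assumption: mBERT, as in the highlights
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

def encode_batch(sentences, aspects, labels):
    """Encode (sentence, aspect category) pairs; the aspect is the second segment."""
    enc = tokenizer(sentences, aspects, padding=True, truncation=True, return_tensors="pt")
    enc["labels"] = torch.tensor(labels)
    return enc

def fine_tune(target_corpus, label_map, epochs=3, lr=2e-5, batch_size=16):
    """Fine-tune on translated examples, e.g. label_map = {"negative": 0, "neutral": 1, "positive": 2}."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    loader = DataLoader(
        target_corpus, batch_size=batch_size, shuffle=True,
        collate_fn=lambda batch: encode_batch(
            [s for s, _, _ in batch],
            [a for _, a, _ in batch],
            [label_map[p] for _, _, p in batch]))
    model.train()
    for _ in range(epochs):
        for batch in loader:
            optimizer.zero_grad()
            loss = model(**batch).loss
            loss.backward()
            optimizer.step()
```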


Summary

Introduction

Aspect Sentiment Classification (ASC) aims to identify fine-grained polarity towards a specific aspect category (i.e., aspect). When training data are obtained through machine translation, redundant tokens are introduced, which influence the distribution of word representations, and sentiment polarity does not necessarily hold in the translated context because of imperfect natural language generation. These disadvantages seriously degrade the performance of CLASC, because the sentiment polarity of an aspect is related to specific words or phrases rather than the whole sentence. Most pre-trained cross-lingual language models rely on large-scale, task-unrelated parallel corpora, and their general-purpose representations are far from satisfactory for the downstream task. An innovative approach called RTCLD is proposed for the CLASC task, which distills aspect sentiment knowledge from the source language to model aspect-aware representations in the target language.
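
RTCLD itself is only named in this summary, so the following is an illustrative sketch of the generic cross-lingual distillation mechanism it builds on, not the authors' implementation: a source-language teacher provides soft sentiment predictions that supervise a target-language student trained on translated sentences, mixed with the hard polarity labels. Function and argument names are assumptions.

```python
# Illustrative sketch of cross-lingual knowledge distillation (not the authors' RTCLD code).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      gold_labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Combine a soft-label KL term (teacher -> student) with hard-label cross-entropy.

    student_logits: student predictions on translated target-language sentences.
    teacher_logits: teacher predictions on the original source-language sentences.
    gold_labels:    sentiment polarity labels, which are language-independent.
    """
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, gold_labels)
    return alpha * kd + (1.0 - alpha) * ce
```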

Cross-Lingual Sentiment Classification
Cross-Lingual Aspect-Level Sentiment Classification
Proposed Method
Target-Sensitive Adversarial Learning
Optimization Strategy
Experimental Settings
Baselines
Experimental Results
Ablation Studies
Error Analysis
Conclusions