Abstract

Text sentiment style transfer aims to identify the sentiment words in a sentence and transfer them into a target sentiment style while preserving the sentence's original content. However, previous works have not achieved satisfactory performance on the text sentiment style transfer task, especially for non-parallel text. In this paper, a novel framework with an attention mechanism and an embedding-perturbed encoder is proposed to improve the performance of non-parallel text sentiment style transfer. First, a reverse attention mechanism is adopted to disentangle the sentiment style information from the latent representation. Then, an embedding-perturbed encoder is designed to add adjustable noise in the embedding space, making the latent representation more semantically informative. Finally, an attention mechanism is introduced to assign different weights to generated words during decoding, so that the model can focus on high-weight words and enhance the quality of sentiment style transfer. Experiments on the Yelp and IMDB corpora demonstrate that the proposed framework outperforms previous works in terms of sentiment style transfer accuracy, content preservation, and language fluency.
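The abstract does not specify the encoder architecture or the form of the perturbation; the following minimal sketch illustrates one plausible reading of the embedding-perturbed encoder, assuming a GRU encoder and Gaussian noise with an adjustable scale. All class names, hyperparameters, and the choice of noise distribution are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class PerturbedEmbeddingEncoder(nn.Module):
    """Hypothetical sketch: word embeddings receive adjustable Gaussian noise
    before being fed to a GRU encoder, yielding a perturbed latent representation."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, noise_scale=0.1):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.noise_scale = noise_scale  # adjustable perturbation strength

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        if self.training:
            # Add adjustable noise in the embedding space (training only).
            noise = torch.randn_like(embedded) * self.noise_scale
            embedded = embedded + noise
        outputs, hidden = self.encoder(embedded)      # latent representation
        return outputs, hidden

# Example usage with toy dimensions.
encoder = PerturbedEmbeddingEncoder(vocab_size=10000, embed_dim=128, hidden_dim=256)
tokens = torch.randint(0, 10000, (4, 20))             # batch of 4 sentences, 20 tokens each
outputs, hidden = encoder(tokens)
```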
