Abstract

Discourse Parsing and Sentiment Analysis are two fundamental tasks in Natural Language Processing that have been shown to be mutually beneficial. In this work, we design and compare two Neural-Based models for jointly learning both tasks. In the proposed approach, we first create a vector representation for all the text segments in the input sentence. Next, we apply three different Recursive Neural Net models: one for discourse structure prediction, one for discourse relation prediction, and one for sentiment analysis. Finally, we combine these Neural Nets in two different joint models: Multi-tasking and Pre-training. Our results on two standard corpora indicate that both methods yield improvements on each task, but Multi-tasking has a bigger impact than Pre-training. Specifically for Discourse Parsing, we see improvements in the prediction of the set of contrastive relations.
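The joint architecture described above (span vectors, a shared recursive composition, and task-specific classifiers trained jointly) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual model: the dimension `d`, the number of relation classes, and the head matrices `W_rel` and `W_sent` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # span-embedding dimension (illustrative assumption)

# Shared recursive composition: parent = tanh(W [left; right] + b).
# In the Multi-tasking setting, both tasks share these parameters.
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

# Task-specific softmax heads (hypothetical sizes: 4 coarse
# discourse relations, 2 sentiment classes).
W_rel = rng.standard_normal((4, d)) * 0.1
W_sent = rng.standard_normal((2, d)) * 0.1

def compose(left, right):
    """Compose two child span vectors into a parent vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy vectors standing in for two Discourse Units.
left, right = rng.standard_normal(d), rng.standard_normal(d)
parent = compose(left, right)

rel_probs = softmax(W_rel @ parent)    # discourse-relation distribution
sent_probs = softmax(W_sent @ parent)  # sentiment distribution
```

The point of the sketch is the parameter sharing: both heads read the same composed representation, so gradients from the sentiment loss also shape the features used for relation prediction, and vice versa.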

Highlights

  • This paper focuses on studying two fundamental NLP tasks, Discourse Parsing and Sentiment Analysis

  • It has been suggested that the information extracted from Discourse Trees can help with Sentiment Analysis (Bhatia et al., 2015) and, likewise, that knowing the sentiment of two pieces of text might help with the identification of discourse relationships between them (Lazaridou et al., 2013)

  • We find that the improvement of the Multi-tasking system in Relation prediction is mainly for the Contrastive set of relations, which confirms our hypothesis that knowing the sentiment of two text spans can help narrow down the choice of discourse relation that holds between them


Summary

Introduction

This paper focuses on studying two fundamental NLP tasks, Discourse Parsing and Sentiment Analysis. For example, in the movie review excerpt shown, the phrase “There are slow and repetitive parts” has a negative sentiment; when it is combined with the positive phrase “but it has just enough spice to keep it interesting”, the result is an overall positive sentence. In this sentence, the Discourse Unit “There are slow and repetitive parts,” holds a “Contrast” relationship with “but it has just enough spice to keep it interesting.” We find that the improvement of the Multi-tasking system in Relation prediction is mainly for the Contrastive set of relations, which confirms our hypothesis that knowing the sentiment of two text spans can help narrow down the choice of discourse relation that holds between them.

Previous Work
Corpora
Proposed Joint Model
Learning Text Embeddings
Neural Net Models
Joining Neural Nets
Training and Evaluating the Models
Comparison With Previous Work
Findings
Conclusion
Full Text
