Abstract

Sentiment analysis is one of the most significant tasks in natural language processing, yet it remains difficult for a machine to understand a person's feelings and opinions about a topic. Many approaches have been introduced in the recent past for analyzing sentiment in long texts; however, these approaches fail to handle short texts, such as Twitter data, efficiently. Recent advances in pre-trained contextualized embeddings, such as Bidirectional Encoder Representations from Transformers (BERT), show far greater accuracy than traditional embeddings. In this paper, we develop a novel architecture that fine-tunes BERT using a Bidirectional Long Short-Term Memory (Bi-LSTM) model, incorporating a task-specific layer on top of BERT. Our model extracts sentiment from short texts, especially Twitter data. Extensive experiments show the superiority of our model over state-of-the-art models on the sentiment analysis task across several gold-standard datasets.
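
For concreteness, the sketch below (PyTorch with the Hugging Face transformers library) illustrates the general shape of a BERT encoder topped with a Bi-LSTM task-specific layer, as the abstract describes. The checkpoint name, hidden size, dropout rate, number of classes, and the pooling of the Bi-LSTM states are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a BERT + Bi-LSTM sentiment classifier.
# Hyperparameters below are assumptions, not the paper's values.
import torch
import torch.nn as nn
from transformers import BertModel


class BertBiLSTMClassifier(nn.Module):
    def __init__(self, num_classes=3, lstm_hidden=128, dropout=0.1):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Bi-LSTM runs over BERT's per-token hidden states.
        self.lstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.dropout = nn.Dropout(dropout)
        # Task-specific layer: classify from the concatenated
        # final forward and backward LSTM states.
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        sequence_output = outputs.last_hidden_state  # (batch, seq_len, hidden)
        _, (h_n, _) = self.lstm(sequence_output)
        # h_n: (2, batch, lstm_hidden); concatenate both directions.
        h = torch.cat((h_n[0], h_n[1]), dim=-1)
        return self.classifier(self.dropout(h))
```

In a fine-tuning setup of this kind, the BERT weights and the Bi-LSTM/classifier weights are typically trained jointly with a standard cross-entropy loss over the sentiment classes.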
