Abstract
Sentiment analysis is one of the significant tasks in natural language processing. However, it is difficult for a machine to understand a person's feelings and opinions about a topic. Many approaches have been introduced in the recent past for analyzing sentiment in long texts, but these approaches fail to handle short texts, such as Twitter data, efficiently. Recent advances in pre-trained contextualized embeddings, such as Bidirectional Encoder Representations from Transformers (BERT), show far greater accuracy than traditional embeddings. In this paper, we develop a novel architecture that fine-tunes BERT using a Bidirectional Long Short-Term Memory (Bi-LSTM) model, incorporating a task-specific layer on top of BERT. Our model extracts sentiment from short texts, especially Twitter data. Extensive experiments show the superiority of our model over state-of-the-art models on the sentiment analysis task across several gold-standard datasets.
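The abstract describes a Bi-LSTM task-specific layer placed on top of BERT's contextual embeddings. A minimal sketch of such a head is shown below in PyTorch; the encoder itself is left abstract (random tensors stand in for BERT's last hidden states), and all layer sizes and the class name `BertBiLSTMHead` are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class BertBiLSTMHead(nn.Module):
    """Hypothetical sketch: a Bi-LSTM classification head over contextual
    token embeddings (e.g. BERT's last hidden states). In practice the
    embeddings would come from a pre-trained BERT encoder that is
    fine-tuned jointly with this head."""

    def __init__(self, hidden_dim=768, lstm_dim=128, num_classes=2):
        super().__init__()
        # Bidirectional LSTM reads the token embeddings in both directions.
        self.bilstm = nn.LSTM(hidden_dim, lstm_dim,
                              batch_first=True, bidirectional=True)
        # Linear layer maps the concatenated directions to class logits.
        self.classifier = nn.Linear(2 * lstm_dim, num_classes)

    def forward(self, token_embeddings):
        # token_embeddings: (batch, seq_len, hidden_dim)
        out, _ = self.bilstm(token_embeddings)
        # Use the representation at the final time step as a simple pooling.
        pooled = out[:, -1, :]
        return self.classifier(pooled)

# Dummy embeddings stand in for BERT output: batch of 4, 16 tokens each.
model = BertBiLSTMHead()
dummy = torch.randn(4, 16, 768)
logits = model(dummy)
print(logits.shape)  # torch.Size([4, 2])
```

The pooling here (last time step) is one simple choice; a [CLS]-position vector or mean pooling over tokens would serve equally well in this sketch.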