Abstract

Aspect-Based Sentiment Analysis (ABSA) has become a trending research domain due to its ability to transform lives as well as the technical challenges it involves. In this paper, a unique set of rules has been formulated to extract aspect-opinion phrases, which reduces the average sentence length by 84% and the complexity of the text by 50%. A modified, rank-based version of Term Frequency-Inverse Document Frequency (TF-IDF) has been proposed to identify significant aspects. An innovative word representation technique has been applied for aspect categorization, capturing both the local and the global context of a word. For sentiment classification, a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has been applied, as it captures long-term dependencies and avoids the overhead of training a model from scratch. However, BERT has drawbacks: its efficiency drops quadratically as sequence length grows, and its input is limited to 512 tokens. The proposed methodology mitigates these drawbacks of a typical BERT classifier, increasing its efficiency and improving its accuracy by 8%. Furthermore, it yields better performance and efficiency than other state-of-the-art methods. These assertions have been established through extensive analysis on the movie reviews and SentiHood data sets.
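To give a rough sense of ranking candidate aspects by term weight, the sketch below scores terms from a handful of hypothetical extracted aspect-opinion phrases using plain TF-IDF and sorts them by average weight. It is a minimal illustration only: the phrases, the scikit-learn usage, and the averaging step are assumptions, not the paper's modified rank-based TF-IDF.

# Minimal sketch, assuming scikit-learn and hypothetical extracted
# aspect-opinion phrases; plain TF-IDF ranking, not the paper's
# modified rank-based variant.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

phrases = [
    "battery life short",            # hypothetical aspect-opinion phrases
    "screen resolution excellent",
    "battery drains quickly",
]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(phrases)        # phrase-by-term weight matrix

# Average each term's TF-IDF weight over all phrases and rank descending.
scores = np.asarray(tfidf.mean(axis=0)).ravel()
terms = vectorizer.get_feature_names_out()
ranked = sorted(zip(terms, scores), key=lambda kv: kv[1], reverse=True)
print(ranked[:5])                                # top candidate aspect terms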
