Abstract

Aspect-based sentiment analysis (ABSA) has attracted considerable research attention in recent years because it breaks text down into aspects and identifies the sentiment polarity expressed towards each specific target. Motivated by recent work that casts ABSA as a sentence-pair classification task, this article discusses how to further fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model and reports results on the SentiHood dataset. In contrast to previous work, this article assumes that sentiment depends on every single word in a sentence and shows how to modify the feed-forward network in BERT to create self-attention between words. The proposed model achieves improvements on some subtasks, most notably aspect category detection.
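The abstract refers to self-attention between words. As background, the following is a minimal sketch of scaled dot-product self-attention over word vectors, the mechanism underlying BERT; the dimensions, weight matrices, and random inputs are illustrative assumptions, not the article's actual configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model) word representations.
    # Project into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Pairwise word-to-word compatibility scores, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Each row is a distribution over all words in the sentence.
    weights = softmax(scores, axis=-1)
    # Each output vector is a weighted mix of every word's value vector.
    return weights @ V, weights

# Toy example: 4 "words" with 8-dimensional embeddings (arbitrary sizes).
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_k))
Wk = rng.normal(size=(d_model, d_k))
Wv = rng.normal(size=(d_model, d_k))

out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape)   # (4, 8): one contextualized vector per word
print(attn.shape)  # (4, 4): attention weight from each word to every word
```

Each attention row sums to 1, so every word's output is influenced by every other word in the sentence, which is the property the article exploits.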
