Abstract

To provide explainable and accurate aspect-term extraction and the corresponding aspect-sentiment detection, it is often useful to take external domain-specific knowledge into account. In this work, we propose a knowledge-enabled BERT language representation model for aspect-based sentiment analysis. Specifically, our approach leverages additional information from a sentiment knowledge graph by injecting sentiment domain knowledge into the language representation model, so that the embedding vectors of entities in the sentiment knowledge graph and of words in the text lie in a consistent vector space. In addition, by incorporating external domain knowledge to compensate for limited training data, the model achieves better performance even with a small amount of training data. As a result, our model provides explainable and detailed results for aspect-based sentiment analysis. Experimental results demonstrate the effectiveness of the proposed method, showing that knowledge-enabled BERT is an excellent choice for solving aspect-based sentiment analysis problems.
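The knowledge-injection idea above can be illustrated with a minimal sketch: token embeddings and knowledge-graph entity embeddings share one vector space, and tokens matched to entities are fused with the corresponding entity vectors. This is not the paper's implementation; the vocabulary, dimensions, and the mixing weight `alpha` are hypothetical assumptions for demonstration only.

```python
# Illustrative sketch (assumed, not the paper's actual method): fusing
# sentiment knowledge-graph entity embeddings into token embeddings that
# live in the same vector space.
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # shared embedding dimension (assumed)

# Token embeddings for a short review, e.g. "the battery life is great".
tokens = ["the", "battery", "life", "is", "great"]
token_emb = {t: rng.normal(size=dim) for t in tokens}

# Entity embeddings from a toy sentiment knowledge graph, assumed to be
# already aligned to the token embedding space (e.g. via a learned projection).
kg_entity_emb = {"battery": rng.normal(size=dim), "great": rng.normal(size=dim)}

def inject_knowledge(tokens, token_emb, kg_entity_emb, alpha=0.5):
    """Mix each token vector with its matched KG entity vector, if any."""
    fused = []
    for t in tokens:
        vec = token_emb[t]
        if t in kg_entity_emb:
            # Linear interpolation keeps both representations in one space.
            vec = (1 - alpha) * vec + alpha * kg_entity_emb[t]
        fused.append(vec)
    return np.stack(fused)

fused = inject_knowledge(tokens, token_emb, kg_entity_emb)
print(fused.shape)  # one knowledge-enriched vector per token
```

In this sketch, only aspect- and sentiment-bearing tokens ("battery", "great") are altered by the knowledge graph; the remaining tokens keep their original embeddings, mirroring how external knowledge supplements rather than replaces the language representation.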
