Abstract

With the rapid development of the Internet, the amount of text on the network has grown rapidly, and the demand for text classification technology, especially aspect-level sentiment classification, is increasing day by day. Traditional methods rely mainly on time-consuming feature engineering and, because of their context-independent nature, ignore the rich contextual information in text, which greatly reduces performance on Natural Language Processing (NLP) tasks. Bidirectional Encoder Representations from Transformers (BERT), a pre-trained language model, set new records on eleven NLP tasks and has become a new baseline model for text classification. Although BERT has been widely applied to other NLP tasks, it is rarely used for aspect-level sentiment classification. We use the BERT output, which contains rich contextual information, as the input of an optimized deep neural network (DNN), and use the DNN for further classification to obtain better performance on aspect-level sentiment classification. We conducted comparative experiments on three public datasets. Compared with other recent baseline models, our model achieves better performance on aspect-level sentiment classification.
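As a rough illustration of the architecture the abstract describes (this is a minimal sketch, not the authors' exact "optimized DNN"), BERT can encode the sentence together with its aspect term as a sentence pair, and a small feed-forward head can classify the pooled output into sentiment polarities. The model name, hidden size, class count, and label order below are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertDnnClassifier(nn.Module):
    def __init__(self, num_classes=3, hidden_size=256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Small DNN head on top of BERT's pooled representation
        # (768-dim for bert-base); sizes here are assumptions.
        self.classifier = nn.Sequential(
            nn.Linear(self.bert.config.hidden_size, hidden_size),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(hidden_size, num_classes),
        )

    def forward(self, input_ids, attention_mask, token_type_ids):
        outputs = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids)
        # pooler_output is the [CLS] representation after BERT's pooling layer.
        return self.classifier(outputs.pooler_output)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertDnnClassifier()
# Sentence-pair encoding: the review text and the aspect term ("service").
enc = tokenizer("The food was great but the service was slow.", "service",
                return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"], enc["token_type_ids"])
probs = logits.softmax(dim=-1)  # e.g. negative / neutral / positive (assumed order)

Encoding the aspect as the second segment of a sentence pair is one common way to make the classification aspect-specific; the paper's actual input construction may differ.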
