Abstract

In the early days of natural language processing, Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) were the dominant models for language tasks, but the Transformer changed that. The Bidirectional Encoder Representations from Transformers (BERT) model, built on the Transformer architecture, pushed the performance of NLP models to unprecedented heights. To identify the sentiment classification model that performs best on e-commerce reviews, we fine-tune BERT on mobile e-commerce review data and then feed its output as embeddings into other deep learning models (such as CNN and RNN). Finally, we compare the training performance of several deep learning models, namely BERT, BERT-RNN, and BERT-CNN. Experimental results show that the BERT-CNN model performs best on binary sentiment classification of e-commerce review text.
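The abstract describes feeding fine-tuned BERT representations as embeddings into a CNN for binary sentiment classification. Below is a minimal sketch of such a BERT-CNN architecture in PyTorch, not the authors' implementation; the checkpoint name `bert-base-chinese`, kernel sizes, and filter count are illustrative assumptions rather than the paper's reported configuration.

```python
# Sketch of a BERT-CNN sentiment classifier (assumed hyperparameters, not the paper's).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertCNN(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", num_filters=128,
                 kernel_sizes=(2, 3, 4), num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for BERT-base
        # One 1-D convolution per kernel size, applied over the token dimension.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from BERT: (batch, seq_len, hidden).
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        x = out.last_hidden_state.transpose(1, 2)  # (batch, hidden, seq_len)
        # Convolve, apply ReLU, then max-pool over time for each kernel size.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))  # (batch, num_classes)

# Usage example with two hypothetical review sentences.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
batch = tokenizer(["物流很快，质量不错", "屏幕有划痕，很失望"],
                  padding=True, return_tensors="pt")
model = BertCNN()
logits = model(batch["input_ids"], batch["attention_mask"])  # shape (2, 2)
```

A BERT-RNN variant would replace the convolution-and-pooling stage with a recurrent layer (e.g. an LSTM or GRU) over the same token embeddings before the final linear classifier.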

