Abstract

Despite manufacturers' great efforts to achieve customer satisfaction and improve their performance, social media opinion mining remains a major challenge. Current opinion mining approaches require sophisticated feature engineering and rely on syntactic word embeddings that do not consider the semantic interaction between an aspect term and its opinionated features, which degrades the performance of most opinion mining tasks, especially those designed for smart manufacturing. Research on intelligent aspect-level opinion mining (AOM) follows the rapid proliferation of user-generated data on social media for industrial manufacturing purposes. Google's pre-trained language model, Bidirectional Encoder Representations from Transformers (BERT), outperforms existing methods on eleven natural language processing (NLP) tasks, which makes it the standard approach for semantic text representation. In this paper, we introduce a novel deep learning model for fine-grained aspect-based opinion mining, named FGAOM. First, we train the BERT model on three specific domain corpora for domain adaptation, then use the adjusted BERT as an embedding layer for the concurrent extraction of local and global context features. Next, we propose Multi-Head Self-Attention (MHSA) to effectively fuse internal semantic text representations, and take advantage of convolutional layers to model the interaction between an aspect term and its surrounding sentiment features. Finally, the performance of the proposed model is evaluated via extensive experiments on three public datasets. Results show that the proposed model outperforms recent state-of-the-art models.

Highlights

  • The continuous proliferation of digital social media and the increased popularity of e-commerce technologies have enlarged the amount of user-generated multimodal data produced daily

  • Opinion mining techniques can be primarily classified into three levels depending on the granularity adopted for data processing

  • A novel model is proposed for aspect-level opinion mining (AOM) based on an adapted Bidirectional Encoder Representations from Transformers (BERT) language model, pre-trained on relevant domain knowledge corpora, together with a novel mechanism to discriminate between local and global context features


Summary

INTRODUCTION

The continuous proliferation of digital social media and the increased popularity of e-commerce technologies have enlarged the amount of user-generated multimodal data produced daily. Han et al. [15] applied attention mechanisms on top of a Bi-GRU for multi-task AOM, using pre-trained weights for online drug reviews. These studies rely on a restricted window size for embedding and cannot exploit semantic information either locally or globally. In [16], Yang et al. adopted BERT to generate an embedding vector for each input sequence and applied a CNN and Bi-GRU to learn and extract relevant sentiment patterns in product reviews; the reviews were assigned attention scores with a self-attention (SA) mechanism. However, the trained model cannot detect any polarity other than positive or negative and does not address aspect-level classification. A novel model is proposed for AOM based on an adapted BERT language model, pre-trained on relevant domain knowledge corpora, together with a novel mechanism to discriminate between local and global context features. The concatenation of local and global features can be formulated as shown in equation (7).
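The fusion step described above (concatenating local convolutional context features with global multi-head self-attention features over the same token embeddings) can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the function names, the moving-average stand-in for the convolutional layer, and the random projection weights are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads, rng):
    """Scaled dot-product self-attention, d_model split evenly across heads.
    X: (seq_len, d_model) token embeddings (e.g. a BERT output)."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    # Illustrative random projections; a real model learns these weights.
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(num_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_head)   # (seq_len, seq_len)
        heads.append(softmax(scores) @ V[:, s])          # (seq_len, d_head)
    return np.concatenate(heads, axis=-1)                # (seq_len, d_model)

def fuse_local_global(H, kernel=3, num_heads=4, rng=None):
    """Concatenate local windowed features with global attention features,
    in the spirit of the paper's equation (7)."""
    if rng is None:
        rng = np.random.default_rng(0)
    pad = kernel // 2
    Hp = np.pad(H, ((pad, pad), (0, 0)))
    # Local context: a simple moving-window mean stands in for a 1-D convolution.
    local = np.stack([Hp[i:i + kernel].mean(axis=0) for i in range(H.shape[0])])
    global_ = multi_head_self_attention(H, num_heads, rng)
    return np.concatenate([local, global_], axis=-1)     # (seq_len, 2 * d_model)
```

For a sequence of 5 tokens with 8-dimensional embeddings, the fused output has shape (5, 16): each token representation carries both its local window summary and its globally attended context.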

FINE-TUNING LAYER
OUTPUT LAYER
MODEL TRAINING
EXPERIMENTS
Findings
CONCLUSION AND FUTURE WORK