Abstract

In the current digital era, evaluating the quality of people's lives and their happiness index is closely related to the expressions and opinions they share on the social media platform Twitter. Measuring population welfare goes beyond monetary aspects, focusing more on subjective well-being, and sentiment analysis helps evaluate people's perceptions of aspects of happiness. Aspect-based sentiment analysis (ABSA) effectively identifies sentiments toward predetermined aspects. Previous studies have used Word-to-Vector (Word2Vec) and long short-term memory (LSTM) methods, with or without an attention mechanism (AM), to solve ABSA cases. However, Word2Vec produces static embeddings and therefore cannot capture the context-dependent meaning of words in a sentence. This study addresses that limitation with bidirectional encoder representations from transformers (BERT), which has the advantage of bidirectional training. Bayesian optimization is used as a hyperparameter-tuning technique to find the best combination of parameters during the training process. Here we show that BERT-LSTM-AM outperforms the Word2Vec-LSTM-AM model in predicting aspects and sentiments. Furthermore, we found that BERT is the best state-of-the-art embedding technique for representing words in a sentence. Our results demonstrate how BERT as an embedding technique can significantly improve model performance over Word2Vec.
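The attention mechanism (AM) mentioned above can be illustrated with a minimal NumPy sketch: attention scores over the LSTM's per-token hidden states are normalized with a softmax and used to form a weighted context vector. The dimensions, random values, and the single attention vector `w` here are toy assumptions for illustration, not the paper's trained model (which would feed BERT embeddings into an LSTM first):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    # hidden_states: (seq_len, hidden_dim) LSTM outputs, one row per token
    # w: (hidden_dim,) attention vector (learned in a real model)
    scores = hidden_states @ w          # one relevance score per token
    weights = softmax(scores)           # attention distribution over tokens
    context = weights @ hidden_states   # weighted sum -> (hidden_dim,) context vector
    return context, weights

# Toy example: 5 tokens, hidden size 8 (random stand-ins for real LSTM states)
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
w = rng.normal(size=8)
context, weights = attention_pool(H, w)
print(weights.sum())  # attention weights form a distribution and sum to 1
```

The context vector would then feed a classification head that predicts the aspect and its sentiment polarity; the attention weights indicate which tokens the model focused on.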
