Abstract

Sentiment analysis has become increasingly popular due to information and opinion overload, especially in social networks. With the growth of efficient, high-accuracy artificial intelligence methods, it is easier to obtain satisfactory results in opinion mining. Usually, more complex structures and more sophisticated methods give better results, but it is harder to understand why a particular result was returned. Many of these methods are black-box models, which reduces trust in the system. To address this problem, explainable methods have been developed. This paper presents a systematic survey of explainable artificial intelligence methods used in sentiment analysis. We analyze and classify existing approaches to Explainable Sentiment Analysis (XSA) and indicate trends and challenges in this area.
