Abstract

One challenge social media platforms face today is hate speech. Hence, automatic hate speech detection has been researched increasingly in recent years, in particular with the rise of deep learning. A problem of these models is their vulnerability to undesirable bias in the training data. We investigate the impact of political bias on hate speech classification by constructing three politically biased data sets (left-wing, right-wing, politically neutral) and comparing the performance of classifiers trained on them. We show that (1) political bias impairs the performance of hate speech classifiers and (2) an explainable machine learning model can help to visualize such bias within the training data. The results show that political bias in training data affects hate speech classification and can become a serious issue.
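
The experimental setup described above can be illustrated with a minimal sketch: one classifier is trained per politically biased corpus and all three are evaluated on the same held-out test set. The data, the model choice (TF-IDF plus logistic regression), and all names below are hypothetical placeholders, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): train one classifier per
# politically biased training set and compare F1 on a shared test set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

# Hypothetical corpora: each entry is (comment text, hate label 0/1).
train_sets = {
    "left-wing":  [("placeholder left-leaning comment", 0), ("placeholder hateful comment", 1)],
    "right-wing": [("placeholder right-leaning comment", 0), ("placeholder hateful comment", 1)],
    "neutral":    [("placeholder neutral comment", 0), ("placeholder hateful comment", 1)],
}
test_texts = ["placeholder held-out comment", "placeholder hateful held-out comment"]
test_labels = [0, 1]

for name, data in train_sets.items():
    texts, labels = zip(*data)
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(texts, labels)
    print(name, "F1:", f1_score(test_labels, clf.predict(test_texts)))
```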

Highlights

  • Social media platforms, such as Twitter and Facebook, have grown increasingly popular in recent years

  • The experiment shows that the politically biased classifiers perform worse than the politically neutral one, and that political bias in training data can impair hate speech detection (RQ1)

  • Concerning RQ2, we show that explainable machine learning (ML) models can help to identify and visualize political bias in training data
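
Below is a sketch of how an explainable ML model can surface such bias. The excerpt does not name the explainability technique used; LIME is assumed here purely for illustration, and the training data is a hypothetical placeholder.

```python
# Sketch (assumed setup, not the paper's code): use LIME to show which
# tokens push a trained classifier toward the "hate" label.
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data; a real study would use the biased corpora.
texts = ["placeholder neutral political comment", "placeholder hateful comment"] * 10
labels = [0, 1] * 10
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

explainer = LimeTextExplainer(class_names=["no hate", "hate"])
explanation = explainer.explain_instance(
    "placeholder hateful comment",  # comment to explain
    clf.predict_proba,              # probability function of the pipeline
    num_features=5,
)
# Tokens with large positive weights drive the "hate" prediction; if
# politically loaded tokens dominate, the classifier has picked up the
# political bias present in its training data.
print(explanation.as_list())
```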

Introduction

Social media platforms, such as Twitter and Facebook, have grown increasingly popular in recent years. One reason is their promise of free speech, which obviously has its drawbacks. With the rise of social media, hate speech has spread on these platforms as well (Duggan, 2017). Due to the enormous volume of posts and comments produced by billions of users every day, it is impossible to monitor these platforms manually. Advances in machine learning (ML) show that this technology can help to detect hate speech, although currently with limited accuracy (Davidson et al., 2017; Schmidt and Wiegand, 2017).
