Abstract

As a case study, this paper investigates gender bias in neural machine translation using Naver Papago and Google Translate. Despite the remarkable progress in machine translation systems, challenging problems remain, especially gender bias in machine translation and language processing itself. This phenomenon threatens the fairness of a translation system and risks amplifying existing biases. To evaluate gender bias in the two systems and to compare them against each other, this paper constructs English-Korean and Korean-English test sets from a lexicon of adjectives. The qualitative examination revealed that both Naver Papago and Google Translate are significantly prone to gender-biased translation, and that biased results against women are much more frequent. Naver Papago also appears to be more abusive toward women, whereas Google Translate seems more balanced, using gender-neutral vocabulary and structure. Moreover, both systems exhibit a tendency toward male defaults, typically associated with unbalanced gender distribution or stereotypes.
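The lexicon-based evaluation described above can be sketched as follows. This is a minimal illustration only: the template, adjectives, marker sets, and sample "translations" are hypothetical stand-ins for the paper's actual materials, and no translation API (Papago or Google Translate) is called here.

```python
# Minimal sketch of a template-based gender-bias probe for Korean->English
# translation. The adjectives, template, and sample "translations" below are
# hypothetical stand-ins; in the study itself, outputs would come from the
# Naver Papago and Google Translate systems, which are not called here.

# "그 사람" ("that person") is gender-neutral in Korean, so the English
# translation reveals any male/female default the system falls back on.
def make_source(adjective_ko: str) -> str:
    return f"그 사람은 {adjective_ko}."

MALE_MARKERS = {"he", "him", "his"}
FEMALE_MARKERS = {"she", "her", "hers"}

def detect_gender(english: str) -> str:
    """Classify a translation as male-default, female-default, or neutral."""
    tokens = {t.strip(".,!?").lower() for t in english.split()}
    if tokens & MALE_MARKERS:
        return "male"
    if tokens & FEMALE_MARKERS:
        return "female"
    return "neutral"

def bias_counts(translations):
    """Tally gender defaults over a batch of translated sentences."""
    counts = {"male": 0, "female": 0, "neutral": 0}
    for t in translations:
        counts[detect_gender(t)] += 1
    return counts

# A gender-neutral source sentence built from a hypothetical lexicon entry.
print(make_source("친절하다"))  # 그 사람은 친절하다. ("That person is kind.")

# Hypothetical system outputs for four gender-neutral source sentences.
sample_outputs = [
    "He is ambitious.",
    "She is emotional.",
    "He is assertive.",
    "They are kind.",
]
print(bias_counts(sample_outputs))
```

Counting male, female, and neutral defaults over such a batch is one simple way to quantify the skew the abstract reports, since every source sentence is gender-neutral by construction.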
