Abstract

As a case study, this paper investigates gender bias in neural machine translation with Naver Papago and Google Translate. Despite the remarkable progress in machine translation systems, challenging tasks remain, especially gender bias in machine translation and language processing itself. This phenomenon threatens the fairness of a translation system and runs the risk of amplifying existing biases. Accordingly, this study examines and compares gender bias in these two neural machine translation systems. For the evaluation, the paper constructs English-Korean and Korean-English test sets using a lexicon of adjectives. Upon qualitative examination, the findings reveal that both Naver Papago and Google Translate are significantly prone to gender-biased translation, and that biased results against women are much more frequent. Naver Papago also appears more abusive toward women, whereas Google Translate seems more balanced, using gender-neutral vocabulary and structure. Moreover, both Naver Papago and Google Translate exhibit a tendency toward male defaults, typically associated with unbalanced gender distributions or stereotypes.
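The abstract only sketches the evaluation setup, but the core idea (pairing a gender-neutral Korean subject with adjectives from a lexicon and scoring the gendered pronouns each engine produces) can be illustrated with a minimal sketch. This is not the paper's actual test set: the adjectives, the template sentence, and the translate() stub below are hypothetical placeholders, and no real Papago or Google Translate API is assumed.

```python
from collections import Counter
import re

# Hypothetical adjective lexicon: English gloss -> Korean adjective stem (all take "-hada").
ADJECTIVES = {"kind": "친절", "smart": "똑똑", "competent": "유능", "brave": "용감"}

def make_korean_probes(lexicon):
    """Build Korean source sentences with the gender-neutral subject '그 사람' ('that person')."""
    return {gloss: f"그 사람은 {stem}하다." for gloss, stem in lexicon.items()}

def translate(sentence, engine):
    """Placeholder for a call to Papago or Google Translate; not a real API binding."""
    raise NotImplementedError("plug in the MT engine under study here")

def pronoun_counts(english_sentences):
    """Count masculine vs. feminine pronouns chosen for the gender-neutral subject."""
    counts = Counter(male=0, female=0)
    for sent in english_sentences:
        tokens = re.findall(r"[a-z']+", sent.lower())
        counts["male"] += sum(t in {"he", "him", "his"} for t in tokens)
        counts["female"] += sum(t in {"she", "her", "hers"} for t in tokens)
    return counts
```

Feeding the same probe set to both engines and comparing the resulting male/female counts would be one way to quantify the male-default tendency described above.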
