Abstract

Machine translation services are among the most widely used Artificial Intelligence (AI) services today, but the public's trust in them is not guaranteed, since they have been shown to exhibit issues such as bias. In this work, we focus on the behavior of machine translators with respect to gender bias as well as their accuracy. We have created a first-of-its-kind virtual environment, called VEGA, in which the user can interactively explore translation services and compare their trust ratings through different visualizations.
