Abstract

Understanding machine translation (MT) quality is becoming increasingly important as MT usage continues to rise in the translation industry. However, the acceptance of MT output based on its performance, and ultimately how acceptable the translators actually are, has received relatively little attention so far. MT plays a vital role in cross-lingual information retrieval (CLIR) systems, whose retrieval efficiency is directly proportional to the translation accuracy of the queries. The varied meanings of words, sentences carrying multiple interpretations, and differing grammatical structures across languages all contribute to the complexity of the MT task. The lack of structural constraints and the presence of ambiguity compound these complications further, especially in the case of web queries. The objective of this work is to assess the accuracy of free online translators in translating Hindi web queries. The accuracy of the translators has been evaluated using several metrics, namely BLEU, NIST, METEOR, hLepor, chrF, and GLEU. Our findings indicate that translation accuracy is higher for longer queries than for shorter ones. Overall, Google Translate performs the best while Systran performs the worst, with a 42.06% performance difference between the two. The present work aims to help researchers further evaluate and analyze MT systems, especially in the context of web query translation, ultimately leading to improved translation quality and retrieval accuracy in CLIR.
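To make the evaluation setup concrete, the sketch below shows how a single translated query could be scored against a human reference with most of the metrics listed above, using the NLTK and sacrebleu Python libraries. This is a minimal illustration rather than the authors' evaluation pipeline: the example query strings, the BLEU smoothing choice, and the reduced n-gram order for NIST are assumptions made here for short web queries, and hLepor is omitted because it has no equally standard off-the-shelf implementation.

import sacrebleu
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction
from nltk.translate.gleu_score import sentence_gleu
from nltk.translate.meteor_score import meteor_score  # needs nltk.download('wordnet')
from nltk.translate.nist_score import sentence_nist

# Hypothetical example pair: a human reference translation of a Hindi web
# query and the output of one of the online translators under evaluation.
reference = "latest indian railway ticket reservation rules"
hypothesis = "new rules for indian railway ticket booking"

ref_tokens, hyp_tokens = reference.split(), hypothesis.split()

# BLEU is smoothed because short queries rarely contain higher-order n-gram matches.
bleu = sentence_bleu([ref_tokens], hyp_tokens,
                     smoothing_function=SmoothingFunction().method1)
# NIST with a reduced n-gram order, again because web queries are short.
nist = sentence_nist([ref_tokens], hyp_tokens, n=2)
gleu = sentence_gleu([ref_tokens], hyp_tokens)
meteor = meteor_score([ref_tokens], hyp_tokens)
# sacrebleu's chrF works on raw strings and returns a score on a 0-100 scale.
chrf = sacrebleu.sentence_chrf(hypothesis, [reference]).score

print(f"BLEU={bleu:.3f}  NIST={nist:.3f}  GLEU={gleu:.3f}  "
      f"METEOR={meteor:.3f}  chrF={chrf:.2f}")

In practice, per-query scores of this kind would be aggregated over the full query set for each translator, which is where system-level differences such as the reported gap between Google Translate and Systran would appear.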
