Abstract

Language translation is one of the most useful applications of Natural Language Processing: machine translation allows users who are more comfortable in their local language to understand, study, or search content produced in any language of the world. Neural machine translation (NMT) is a recently developed approach to machine translation that can benefit from dense word representations, known as word embeddings. In this paper, we study the effect of three pre-trained word embeddings, GloVe, Word2Vec, and FastText (for English and Hindi), on English-Hindi neural machine translation systems. The performance of the models has been evaluated using the BLEU metric, and the proposed models have also been compared with Google Translate.
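Since the abstract states that the models are evaluated with the BLEU metric, the following is a minimal illustrative sketch of sentence-level BLEU (modified n-gram precision with a brevity penalty, plus simple add-one smoothing). This is not the exact scorer used in the paper, only a hedged example of how the metric is computed:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Minimal sentence-level BLEU: geometric mean of modified n-gram
    precisions (n = 1..max_n) times a brevity penalty. Zero overlaps
    are add-one smoothed so short sentences do not zero out the score."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if overlap == 0:
            overlap, total = 1, total + 1  # add-one smoothing (assumption)
        precisions.append(overlap / total)
    # Brevity penalty: penalize candidates shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

hyp = "the cat sat on the mat".split()
ref = "the cat is on the mat".split()
score = bleu(hyp, ref)
```

In practice, corpus-level BLEU (aggregating n-gram counts over the whole test set, as in standard toolkits) is reported rather than averaged sentence scores; the sketch above only shows the core computation.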
