Abstract

Neural machine translation (NMT) is a fast-evolving MT paradigm that has shown good results for several language pairs, particularly when large training data are available. In this paper, we develop a neural machine translation system for the Sanskrit-to-Malayalam language pair, exploiting an attention-based mechanism in particular. Word sense disambiguation (WSD) is the task of resolving ambiguity in text so that the machine can infer the proper sense of a given word. Sequential deep learning approaches such as the recurrent neural network (RNN), gated recurrent unit (GRU), long short-term memory (LSTM), and bi-directional LSTM (BLSTM) were used to analyze the tagged data. By adding morphological elements and evolutionary word sense disambiguation, the proposed joint character-word embedding-based NMT model achieves a BLEU score of 38.58, higher than the other models.
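The abstract's attention-based mechanism can be illustrated with a minimal sketch of additive (Bahdanau-style) attention, the kind commonly used in RNN/GRU/LSTM encoder-decoder NMT systems. This is a generic illustration, not the paper's actual model; all dimensions, weight names, and values below are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    """Additive (Bahdanau-style) attention sketch.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, h) encoder hidden states for T source tokens
    W_dec, W_enc, v: learned projection parameters (here random stand-ins)
    Returns attention weights over source positions and the context vector.
    """
    # Score each source position against the decoder state.
    scores = np.tanh(encoder_states @ W_enc + decoder_state @ W_dec) @ v  # (T,)
    weights = softmax(scores)            # attention distribution over source
    context = weights @ encoder_states   # weighted sum of encoder states, (h,)
    return weights, context

# Toy dimensions and random parameters (hypothetical, for illustration only).
rng = np.random.default_rng(0)
T, h, d, a = 5, 8, 8, 16  # source length, encoder dim, decoder dim, attention dim
enc = rng.normal(size=(T, h))
dec = rng.normal(size=(d,))
W_enc = rng.normal(size=(h, a))
W_dec = rng.normal(size=(d, a))
v = rng.normal(size=(a,))

weights, context = additive_attention(dec, enc, W_dec, W_enc, v)
```

The attention weights form a probability distribution over source positions, and the context vector feeds the decoder at each target step, letting the model focus on different source words while generating the translation.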
