Abstract

China’s foreign exchanges are becoming increasingly frequent, and current political news, as recent or ongoing factual reporting on national political life, plays an important role in information transmission. However, current political news contains a large number of proper nouns as well as long, complex sentences, so translations produced by traditional machine translation tend to have low accuracy and poor usability. To address this, this paper proposes an Attention-based translation model for Chinese current political news. It uses the classic Long Short-Term Memory (LSTM) model and introduces the Attention mechanism to improve the traditional Encoder-Decoder framework. Through training on parallel corpora, constraints are established for the proper nouns of current political news, thereby improving overall translation accuracy. Experiments show that the proposed model achieves higher accuracy than a conventional neural machine translation (NMT) baseline.
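The sketch below illustrates the kind of architecture the abstract describes: an LSTM encoder, an LSTM decoder, and an attention layer that lets the decoder attend over all encoder states instead of a single fixed context vector. It is a minimal illustration only, assuming PyTorch, dot-product (Luong-style) attention, and arbitrary embedding/hidden sizes and token ids; it is not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """LSTM encoder: maps a source sentence to a sequence of hidden states."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                                    # src: (batch, src_len)
        outputs, state = self.lstm(self.embed(src))            # outputs: (batch, src_len, hid)
        return outputs, state

class AttentionDecoder(nn.Module):
    """LSTM decoder with dot-product attention over the encoder outputs."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def forward(self, tgt_tok, state, enc_outputs):
        # tgt_tok: (batch, 1) -- the previously generated target token.
        dec_out, state = self.lstm(self.embed(tgt_tok), state)           # (batch, 1, hid)
        # Attention scores over encoder time steps, normalized to weights.
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))         # (batch, 1, src_len)
        weights = F.softmax(scores, dim=-1)
        # Context vector: attention-weighted sum of encoder states.
        context = torch.bmm(weights, enc_outputs)                        # (batch, 1, hid)
        logits = self.out(torch.cat([dec_out, context], dim=-1))         # (batch, 1, vocab)
        return logits, state, weights

if __name__ == "__main__":
    # Hypothetical vocabulary sizes and a random batch, just to show the shapes.
    enc = Encoder(vocab_size=8000)
    dec = AttentionDecoder(vocab_size=6000)
    src = torch.randint(0, 8000, (4, 20))
    enc_out, state = enc(src)
    tok = torch.zeros(4, 1, dtype=torch.long)      # assumed <sos> token id of 0
    logits, state, attn = dec(tok, state, enc_out)
    print(logits.shape)                            # torch.Size([4, 1, 6000])
```

The attention weights produced at each decoding step indicate which source positions the decoder is focusing on; in the paper's setting, the proper-noun constraints learned from the parallel corpus would additionally bias which target tokens are emitted, a detail not shown in this sketch.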
