Abstract

Neural machine translation (NMT) has become the dominant paradigm in modern machine translation, achieving state-of-the-art performance on large-scale parallel corpora. Real-world applications demand high-quality translation of domain-specific text, yet the performance of general-purpose NMT models drops when they are applied to a specific domain. To alleviate this issue, this paper presents a novel machine translation method that combines a model fusion algorithm with log-linear interpolation. The method improves the performance of the in-domain translation model while preserving, or even improving, the performance of the out-of-domain translation model. Extensive experiments on the public United Nations corpus show that the proposed model reaches bilingual evaluation understudy (BLEU) scores of 30.27 on the in-domain corpus and 43.17 on the out-of-domain corpus, an improvement over existing methods.
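The log-linear interpolation mentioned above can be illustrated with a minimal sketch: two models' next-token distributions are combined in log space with an interpolation weight and renormalized over the vocabulary. The function name, the weight `lam`, and the toy distributions below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def log_linear_interpolate(logp_general, logp_domain, lam=0.5):
    """Log-linearly interpolate two log-probability vectors:
    log p ∝ (1 - lam) * logp_general + lam * logp_domain,
    renormalized over the vocabulary. (Illustrative sketch,
    not the paper's exact fusion algorithm.)"""
    combined = (1.0 - lam) * logp_general + lam * logp_domain
    # Renormalize with a max-shift for numerical stability.
    combined -= np.max(combined)
    probs = np.exp(combined)
    return np.log(probs / probs.sum())

# Toy 4-token vocabulary: a general model and an in-domain model.
logp_general = np.log(np.array([0.4, 0.3, 0.2, 0.1]))
logp_domain = np.log(np.array([0.1, 0.2, 0.3, 0.4]))
logp_fused = log_linear_interpolate(logp_general, logp_domain, lam=0.5)
```

With `lam = 0.5` this is the (renormalized) geometric mean of the two distributions, so tokens favored by both models dominate the fused distribution.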
