Abstract

Machine translation aims to break language barriers that prevent communication and limit access to information. Deaf people face substantial language barriers in their daily lives, including in access to digital and spoken information, yet very few digital resources exist for sign language processing. In this article, we present a transfer-based machine translation system that translates Korean into Korean Sign Language (KSL) glosses, composed mainly of (1) dictionary-based lexical transfer and (2) hybrid syntactic transfer based on a data-driven model. In particular, we formulate the complicated word reordering problems in syntactic transfer as multi-class classification tasks and propose “syntactically guided” data-driven syntactic transfer. The core of our study is a neural classification model that reorders order-important constituent pairs, built around a reordering task newly designed for Korean-to-KSL translation. Experimental results on news transcript data show that the proposed system achieves a BLEU score of 0.512 and a RIBES score of 0.425, significantly improving on the baseline system.
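
To make the classification framing concrete, the sketch below shows one plausible way to treat constituent-pair reordering as multi-class classification with a small neural scorer. It is an illustrative sketch only, not the authors' model: the class inventory (KEEP, SWAP, ...), the constituent embeddings, and the network sizes are assumptions introduced here for exposition.

    # Minimal sketch (illustrative, not the paper's architecture): score a
    # reordering class for an ordered pair of source-side constituents.
    import torch
    import torch.nn as nn

    # Hypothetical class inventory for how a constituent pair is reordered
    # when mapping a Korean parse to a KSL gloss sequence.
    REORDER_CLASSES = ["KEEP", "SWAP", "INSERT_BEFORE", "INSERT_AFTER"]

    class PairReorderClassifier(nn.Module):
        """Multi-class classifier over an ordered pair of constituent embeddings."""
        def __init__(self, embed_dim: int = 128, hidden_dim: int = 256,
                     num_classes: int = len(REORDER_CLASSES)):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(2 * embed_dim, hidden_dim),
                nn.ReLU(),
                nn.Dropout(0.1),
                nn.Linear(hidden_dim, num_classes),
            )

        def forward(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
            # left, right: (batch, embed_dim) embeddings of the two constituents
            # returns unnormalized scores over the reordering classes
            return self.mlp(torch.cat([left, right], dim=-1))

    if __name__ == "__main__":
        model = PairReorderClassifier()
        left = torch.randn(4, 128)    # e.g., embeddings of verb-phrase constituents
        right = torch.randn(4, 128)   # e.g., embeddings of object constituents
        logits = model(left, right)
        gold = torch.tensor([0, 1, 1, 0])           # hypothetical gold reorder labels
        loss = nn.CrossEntropyLoss()(logits, gold)  # standard multi-class training loss
        print(logits.argmax(dim=-1), loss.item())

In such a setup, the predicted class for each order-important pair would guide how constituents are rearranged during syntactic transfer before gloss generation; the actual features and decision procedure used in the paper may differ.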
