Abstract
Sign language can be used to facilitate communication with and between deaf or hard-of-hearing (Deaf/HH) people. With the advent of video streaming applications on smart TVs and mobile devices, it is now possible to use sign language to communicate over worldwide networks. In this article, we develop a prototype assistive device for real-time speech-to-sign translation. The proposed device aims to enable Deaf/HH people to access and understand material delivered in mobile streaming videos through pipelined and parallel processing for real-time translation, and through eye-tracking-based user-satisfaction detection that supports dynamic learning to improve speech-to-sign translation. We conduct two experiments to evaluate the performance and usability of the proposed assistive device. Nine deaf people participated in these experiments. Our real-time performance evaluation shows that the addition of viewer-attention-based feedback reduced translation error rates by 16% (per the sign error rate [SER] metric) and increased translation accuracy by 5.4% (per the bilingual evaluation understudy [BLEU] metric) compared to a non-real-time baseline system without these features. The usability study results indicate that our assistive device was pleasant and satisfying for deaf users and may contribute to greater engagement of deaf people in day-to-day activities.