Abstract

Deaf and hard-of-hearing people communicate using sign language, but hearing people who do not understand sign language often struggle to communicate with them. This problem has prompted many researchers to study sign language translation. However, there is a lack of systematic literature reviews (SLRs) compiling work on this topic. This paper therefore provides a thorough literature review of previous studies on vision-based sign-language-to-text translation. The review follows PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), and two primary databases, Web of Science and Scopus, were searched for the relevant articles and resources included in it. Based on the outcome of the review, primary studies on sign language translation systems used self-generated datasets more often than public datasets; static signs were studied more than dynamic signs; and, for the type of recognition, alphabet signs were studied more than digit, word, or sentence signs. In addition, most studies used digital cameras rather than Microsoft Kinect or a webcam, and the most frequently used classification method was the Convolutional Neural Network (CNN). The study is intended to guide readers and researchers toward future research and knowledge enhancement in the field of sign language recognition.
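For orientation, the sketch below illustrates the kind of CNN classifier that the reviewed studies most commonly apply to static sign images such as fingerspelled alphabets. It is a minimal illustration, not a model from any specific reviewed paper: the use of PyTorch, the class name SignCNN, the 64x64 grayscale input size, and the 26-class alphabet output are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class SignCNN(nn.Module):
    """Minimal CNN for static sign (e.g. fingerspelled alphabet) image
    classification. Hypothetical sketch, not from a reviewed study."""

    def __init__(self, num_classes: int = 26):  # 26 assumes an alphabet task
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # grayscale input
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),                  # one logit per sign
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of four 64x64 grayscale hand images -> class logits.
model = SignCNN(num_classes=26)
logits = model(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 26])
```

Dynamic signs, by contrast, require modeling motion over time, which is why studies of them typically extend such a classifier with a temporal component rather than classifying single frames.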
