Abstract

Hand gestures help individuals communicate in daily life, and sign languages rely on hand motions extensively; they are the primary means of communication for people with hearing impairments. Pattern recognition and computer vision, implemented in image processing-based software systems, can be used to interpret and translate Arabic Sign Language (ArSL) for deaf and hearing-impaired people in the Arab world, a task known as Arabic Sign Language Recognition (ArSLR). We propose an image-based ArSLR system that uses gesture recognition to help people with hearing disabilities engage with the world around them. This work presents the design and development of an automated system that translates Arabic sign language into Arabic text; it also extracts the stems of Arabic words and distinguishes related terms. In the proposed approach, alphabet letters are captured as hand gestures. After image acquisition, the hand region is isolated from the background. The third stage extracts the characteristic features of the hand-sign pattern. The fourth stage classifies the hand gesture with a powerful classification algorithm. Finally, the recognized sign is interpreted and translated into the corresponding Arabic character.
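
The abstract describes a staged pipeline: image acquisition, hand isolation, feature extraction, classification, and mapping to an Arabic character. The sketch below illustrates one way such a pipeline could be wired together; OpenCV, HSV skin-colour thresholding, HOG descriptors, and a linear SVM are assumptions made for illustration, since the abstract does not name the specific isolation, feature-extraction, or classification methods used in the paper.

```python
# Minimal sketch of the ArSLR pipeline outlined in the abstract.
# Assumed components (not specified in the paper): OpenCV for image handling,
# HSV skin-colour thresholding for hand isolation, HOG features, linear SVM.
# Thresholds and the label set are illustrative placeholders.

import cv2
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical label set: class index -> Arabic letter (truncated here).
ARABIC_LETTERS = ["ا", "ب", "ت", "ث"]

def isolate_hand(image_bgr):
    """Stage 2: segment the hand region with a simple HSV skin mask (assumed)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (25, 180, 255))  # placeholder skin range
    return cv2.bitwise_and(image_bgr, image_bgr, mask=mask)

def extract_features(hand_bgr):
    """Stage 3: compute a fixed-length HOG descriptor of the isolated hand (assumed)."""
    gray = cv2.cvtColor(hand_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (64, 64))
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)
    return hog.compute(gray).ravel()

def train_classifier(images, labels):
    """Stage 4: fit a classifier on feature vectors of labelled sign images."""
    X = np.array([extract_features(isolate_hand(img)) for img in images])
    clf = LinearSVC()
    clf.fit(X, labels)
    return clf

def translate_sign(clf, image_bgr):
    """Stage 5: map the predicted class to its Arabic character."""
    features = extract_features(isolate_hand(image_bgr)).reshape(1, -1)
    return ARABIC_LETTERS[int(clf.predict(features)[0])]
```

In practice, the classifier would be trained on a labelled dataset of ArSL alphabet images and then applied frame by frame to new captures; the abstract's "powerful classification algorithm" could equally be a different model, so the linear SVM here is only a stand-in.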
