Abstract

Sign language uses gestures instead of speech sounds to communicate, but hearing people rarely learn it to interact with the deaf community. Many sign language recognition systems have been developed recently, yet most of them run on desktop or laptop computers, which is impractical because of their weight and size. This paper presents a prototype of real-time static Indonesian sign language recognition on an Android smartphone, so it can be used anywhere and at any time. The YCrCb color space combined with skin color detection is used to remove the background and produce a segmented image. Contour detection with the convex hull algorithm localizes and stores the hand region. The convexity defect algorithm then extracts the hand gesture's features using radial lines from the center of the palm to the fingertips. The hand gestures that form sign alphabets are classified with a back-propagation neural network to determine the corresponding letter. The system's performance is evaluated by recognizing several variations of hand gesture poses for the Indonesian sign language alphabet. The results show that the system can detect the position of the user's hand and recognize the alphabet sign from the user's hand gesture, reaching a 91.66% success rate in real-time testing on Android devices.
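The following is a minimal sketch of the segmentation and feature-extraction stages outlined above, written with OpenCV in Python purely for illustration (the paper's prototype targets Android). The skin-color thresholds, the use of the contour centroid as the palm center, and the distance-based features are assumptions for the sketch, not the paper's exact parameters.

```python
import cv2
import numpy as np

# Commonly used example YCrCb skin thresholds (assumed, not from the paper).
LOWER_SKIN = np.array([0, 133, 77], dtype=np.uint8)
UPPER_SKIN = np.array([255, 173, 127], dtype=np.uint8)

def extract_hand_features(frame_bgr):
    # 1) Convert to YCrCb and keep only skin-colored pixels (background removal).
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, LOWER_SKIN, UPPER_SKIN)

    # 2) Take the largest skin-colored contour as the hand region.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)

    # 3) Convex hull and convexity defects around the hand contour.
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)
    if defects is None:
        return None

    # 4) Use the contour centroid as an approximate palm center and measure
    #    distances from it to each defect's start point (near a fingertip)
    #    and far point (between fingers) as simple gesture features.
    m = cv2.moments(hand)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    features = []
    for s, e, f, _depth in defects[:, 0, :]:
        tip = hand[s][0]
        far = hand[f][0]
        features.append(np.hypot(tip[0] - cx, tip[1] - cy))
        features.append(np.hypot(far[0] - cx, far[1] - cy))
    return np.array(features, dtype=np.float32)
```

In the system described above, a feature vector of this kind is fed to a back-propagation neural network that outputs the recognized alphabet letter; that classification step is not shown in this sketch.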
