Abstract
Hand sign language recognition is a fundamental step toward overcoming the communication barrier between deaf-mute and hearing people, and an active problem in computer vision. In this paper, a hand sign language recognition framework based on an Artificial Neural Network (ANN) is proposed for the Bangla alphabet. First, the input image is normalized and the skin area is segmented using the YCbCr values corresponding to human skin color. The extracted area, i.e., the hand sign area, is converted into a binary image, and gaps in the binary hand sign area are filled through morphological operations. The boundary edge of the hand sign area is then extracted with the Canny edge detector, yielding the hand sign region of interest (ROI). Finally, features are extracted from the hand sign ROI using the Freeman Chain Code (FCC), and an ANN is trained to classify the hand sign images. The proposed method is evaluated on a variety of hand sign images, and the results demonstrate its efficiency and effectiveness.
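Two of the pipeline steps above, YCbCr skin segmentation and Freeman Chain Code extraction, can be sketched compactly. This is a minimal illustration, not the paper's implementation: the Cb/Cr thresholds are common literature values (the abstract does not give the paper's exact ranges), the boundary is traced with Moore-neighbor tracing, and the 8-direction Freeman numbering (0 = east, counted counter-clockwise) is assumed.

```python
import numpy as np

# Assumed skin thresholds in YCbCr space (typical literature values,
# not necessarily the ranges used in the paper).
CB_RANGE = (77, 127)
CR_RANGE = (133, 173)

def rgb_to_ycbcr(rgb):
    """Convert an RGB array (0-255) to YCbCr via the ITU-R BT.601 matrix."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb):
    """Binary mask of pixels whose Cb/Cr fall inside the skin ranges."""
    ycbcr = rgb_to_ycbcr(rgb.astype(np.float64))
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= CB_RANGE[0]) & (cb <= CB_RANGE[1]) &
            (cr >= CR_RANGE[0]) & (cr <= CR_RANGE[1]))

# 8-neighborhood in clockwise visual order (row increases downward).
CLOCKWISE = [(-1, 0), (-1, 1), (0, 1), (1, 1),
             (1, 0), (1, -1), (0, -1), (-1, -1)]
# Freeman code for each (drow, dcol) step: 0=E, 1=NE, ..., 7=SE.
FREEMAN = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
           (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}

def freeman_chain_code(mask):
    """Freeman Chain Code of a single blob's outer boundary
    (Moore-neighbor tracing with a backtrack pixel)."""
    rows, cols = mask.shape
    start = None
    for r in range(rows):            # first foreground pixel in raster order
        for c in range(cols):
            if mask[r, c]:
                start = (r, c)
                break
        if start is not None:
            break
    if start is None:
        return []
    code = []
    p = start
    back = (start[0], start[1] - 1)  # west neighbor is background by raster order
    while True:
        rel = (back[0] - p[0], back[1] - p[1])
        i = CLOCKWISE.index(rel)
        for k in range(1, 9):        # scan clockwise starting after backtrack
            dr, dc = CLOCKWISE[(i + k) % 8]
            nr, nc = p[0] + dr, p[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and mask[nr, nc]:
                code.append(FREEMAN[(dr, dc)])
                back = (p[0] + CLOCKWISE[(i + k - 1) % 8][0],
                        p[1] + CLOCKWISE[(i + k - 1) % 8][1])
                p = (nr, nc)
                break
        if p == start:               # simple stopping criterion for a demo blob
            return code
```

For a filled 3x3 square the trace visits the eight boundary pixels and returns `[0, 0, 6, 6, 4, 4, 2, 2]`: two steps east, two south, two west, two north, closing the loop. A real pipeline would apply the morphological fill and Canny step between these two functions.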