Abstract

In recent years, a large number of computer-aided applications have been developed to assist people with disabilities, improving communication between the hearing and the hearing-impaired communities. An intelligent sign-alphabet recognizer can serve as an aiding agent that translates signs into words (and even sentences) and vice versa. Achieving this goal requires several steps, the first and most difficult of which is recognizing sign-language alphabets from hand-gesture images. In this paper, we propose a system that recognizes American Sign Language (ASL) alphabets from hand gestures with an average accuracy of 93.23%. Classification is performed with fuzzy c-means clustering on lower-dimensional data obtained from Principal Component Analysis (PCA) of Gabor representations of the hand-gesture images. Out of the top 20 principal components (PCs), the best combination of PCs is determined by finding the best fuzzy clustering for the corresponding PCs of the training data. The best result is obtained from the combination of the fourth through seventh principal components.
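The pipeline the abstract describes, Gabor filtering of the gesture images, PCA for dimensionality reduction, and fuzzy c-means clustering on a chosen slice of principal components, can be sketched in NumPy as follows. All concrete parameters here (filter-bank settings, image size, number of clusters) are illustrative assumptions, not the paper's values:

```python
import numpy as np


def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=8.0):
    """Real part of a Gabor filter (filter-bank settings assumed;
    the abstract does not give the paper's actual parameters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)


def gabor_features(img, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Convolve the image with a small 4-orientation Gabor bank (via FFT)
    and flatten the responses into one feature vector."""
    feats = []
    for theta in thetas:
        k = gabor_kernel(theta=theta)
        pad = np.zeros_like(img, dtype=float)
        pad[:k.shape[0], :k.shape[1]] = k
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
        feats.append(resp.ravel())
    return np.concatenate(feats)


def pca_project(X, n_components):
    """Project the mean-centred rows of X onto the top principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T


def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns the membership matrix U
    (each row sums to 1) and the cluster centres."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers


# Toy run on random arrays standing in for hand-gesture images.
rng = np.random.default_rng(1)
imgs = rng.random((12, 32, 32))
X = np.stack([gabor_features(im) for im in imgs])
Z = pca_project(X, 7)[:, 3:7]        # fourth-to-seventh PCs, as in the abstract
U, centers = fuzzy_c_means(Z, c=3)   # cluster count chosen arbitrarily here
```

A new gesture would then be assigned to the cluster in which its membership value is highest; the abstract's selection of PCs 4–7 corresponds to sweeping such combinations on the training data and keeping the one that yields the best clustering.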
