Abstract

Communication is needed to interact, socialize, and connect with the environment and with other people. Communication generally relies on spoken or written words; however, some people cannot communicate verbally because of physical limitations such as deafness or speech impairment. They typically rely on nonverbal communication based on body movements, commonly referred to as sign language, in which gestures are used to spell or express words. Because not everyone understands the sign language used by deaf and speech-impaired people, a system or tool is needed to bridge communication between them and hearing people. One possible solution is to use computer technology to recognize sign language automatically. This work presents the design of an automatic sign-language translator that classifies input images into the classes A to E, "I", "You", and "I Love You" using a Convolutional Neural Network (CNN) architecture; with this method the classification accuracy reaches 99.82%.
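The abstract does not describe the network layers or training setup, so as an illustration only, the sketch below shows a small Keras-style CNN that classifies hand-gesture images into the eight classes mentioned (A to E, "I", "You", "I Love You"). The 64x64 input size, layer sizes, and optimizer are assumptions for the sketch, not the configuration reported by the authors.

```python
# Minimal sketch of an 8-class sign-language image classifier.
# Input size, layer widths, and training settings are assumptions,
# not the authors' reported architecture.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 8  # A, B, C, D, E, I, You, I Love You

model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 3)),          # assumed RGB input size
    layers.Conv2D(32, 3, activation="relu"),  # low-level feature extraction
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),  # higher-level features
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                      # regularization against overfitting
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Training would then proceed with labeled gesture images, e.g.:
# model.fit(train_images, train_labels, epochs=20, validation_data=(val_images, val_labels))
```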
