American Sign Language (ASL) is an important means of communication within the deaf community in North America and is primarily used by people with hearing or speech impairments. Deaf people face difficulties in schools and other institutions, which consist primarily of hearing people, and they often feel misunderstood by those who do not know sign language, including family members. Over the last two decades, researchers have proposed automatic sign language recognition systems to facilitate the learning of sign language, and computer scientists now apply artificial intelligence to develop systems capable of reducing the communication gap between hearing and deaf people. In this paper, we propose a siamese convolutional neural network for American Sign Language alphabet recognition. The siamese architecture mitigates the high interclass similarity and high intraclass variation of the alphabet signs. The results show that the proposed method outperforms state-of-the-art systems.
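To illustrate the siamese idea mentioned above, the following is a minimal sketch, not the paper's actual architecture: two inputs pass through the same shared-weight feature extractor, and their embeddings are compared by a distance, so that similar signs map close together while dissimilar signs map far apart. All shapes, weights, and function names here are illustrative assumptions (a toy linear projection stands in for the convolutional branches).

```python
import numpy as np

# Toy stand-in for the shared CNN branch: a fixed random linear
# projection followed by a nonlinearity. Both inputs use the SAME
# weights W -- that weight sharing is what makes the network "siamese".
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 16))  # hypothetical input dim 64, embedding dim 16

def embed(x):
    """Shared-weight branch: project an input vector into embedding space."""
    return np.tanh(x @ W)

def distance(x1, x2):
    """Euclidean distance between the two branch embeddings.
    Small distance -> the pair is judged to be the same letter."""
    return float(np.linalg.norm(embed(x1) - embed(x2)))

a = rng.standard_normal(64)
b = a + 0.01 * rng.standard_normal(64)   # near-duplicate of a (same class)
c = rng.standard_normal(64)              # unrelated sample (different class)

assert distance(a, b) < distance(a, c)   # the similar pair lies closer
```

In a trained system the projection would be a learned CNN and the distance would feed a pairwise loss (e.g. a contrastive objective), which is what lets the embedding pull together intraclass variations of one letter while pushing apart visually similar letters.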