Abstract

In this study, we investigated the effects of mastering multiple scripts on handwritten character recognition by means of computational simulations. In particular, we trained a set of deep neural networks on two datasets of handwritten characters: the HODA dataset, a collection of images of handwritten Persian digits, and the MNIST dataset, which contains handwritten Latin digits. We simulated monolingual individuals (networks trained on a single dataset) as well as bilingual individuals (networks trained on both datasets), and compared their performance on a recognition task under different noisy conditions. Our results show that bilingual networks outperform monolingual networks in handwritten digit recognition, suggesting that mastering multiple languages may facilitate knowledge transfer across similar domains.

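The abstract does not report implementation details, but the setup it describes can be sketched in a few lines. The sketch below is illustrative only: it assumes a small Keras CNN, additive Gaussian pixel noise as the "noisy condition", and a hypothetical load_hoda() helper for the HODA Persian digits (Keras ships no loader for HODA); the paper's actual architecture, noise model, and training regime may differ.

```python
# Minimal sketch of the monolingual vs. bilingual comparison described above.
# Assumptions (not from the paper): small CNN, Gaussian noise, 3 training epochs,
# and a hypothetical load_hoda() returning 28x28 grayscale Persian digits.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(num_classes=10):
    # Small convolutional classifier; the paper's exact architecture is not given here.
    return models.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

def add_noise(x, std):
    # Additive Gaussian pixel noise, clipped back to the valid [0, 1] range.
    return np.clip(x + np.random.normal(0.0, std, x.shape), 0.0, 1.0)

# Latin digits (MNIST) are available through Keras.
(x_mnist, y_mnist), (x_mnist_test, y_mnist_test) = tf.keras.datasets.mnist.load_data()
x_mnist = x_mnist[..., None] / 255.0
x_mnist_test = x_mnist_test[..., None] / 255.0

# Hypothetical helper for the Persian digits (no standard Keras loader exists):
# (x_hoda, y_hoda), (x_hoda_test, y_hoda_test) = load_hoda()

# "Monolingual" network: trained on a single script.
mono = build_cnn()
mono.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
             metrics=["accuracy"])
mono.fit(x_mnist, y_mnist, epochs=3, batch_size=128, verbose=0)

# "Bilingual" network: one possible setup is to pool both scripts under
# shared digit labels, e.g.:
# x_both = np.concatenate([x_mnist, x_hoda]); y_both = np.concatenate([y_mnist, y_hoda])
# bi = build_cnn(); bi.compile(...); bi.fit(x_both, y_both, ...)

# Evaluate recognition under increasing noise levels.
for std in (0.0, 0.2, 0.4):
    _, acc = mono.evaluate(add_noise(x_mnist_test, std), y_mnist_test, verbose=0)
    print(f"noise std={std:.1f}  monolingual accuracy={acc:.3f}")
```

In this setup, the bilingual network would be evaluated on the same noisy test sets as the monolingual ones, so any accuracy gap can be attributed to training on both scripts rather than to the noise procedure.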