Abstract

In the framework of Alzheimer’s disease prediction systems, it is widely agreed that handwriting is among the first skills to be affected by the onset of the disease. In the large majority of cases, such systems consider information on the dynamics of the handwriting process, derived directly from online handwriting samples. These features, however, cannot capture alterations in the shape, size, and thickness of the handwritten traits, which may result from the impairment of motor control caused by neurodegenerative disorders. Following this line of thought, in a previous study we combined shape and dynamic information by generating synthetic color images from online handwriting samples, where the color of each elementary trait encodes, in the three RGB channels, the dynamic information associated with that trait. We then exploited the ability of Deep Neural Networks to automatically extract features from raw images, following the Transfer Learning approach. The results obtained with this approach did not show significant improvements over those obtained using dynamic information alone, probably because approximating the original traits with straight lines of predefined thickness loses information about their actual shape and thickness. Building on these considerations, the purpose of our study is to verify whether automatically extracting features directly from offline handwriting images, thus preserving the original shape of the handwritten trace, could provide better results. Again, we exploited the ability of Deep Neural Networks to automatically extract features from raw images. The preliminary experimental results confirmed the effectiveness of the proposed approach.
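As an illustration of the color-encoding step described above, the sketch below renders each elementary trait as a line segment whose RGB color carries three dynamic features. The choice of pressure, velocity, and acceleration as the three channels, and the min-max normalization into 0-255, are assumptions made for the example, not the exact recipe of the original study.

```python
# Minimal sketch: encode per-trait dynamics as RGB colors of the drawn trace.
# Assumes each online sample is a sequence of (x, y, pressure, velocity,
# acceleration) tuples; these feature names are illustrative assumptions.
import numpy as np
from PIL import Image, ImageDraw

def to_uint8(values):
    """Min-max normalize one dynamic feature into the 0-255 range of a channel."""
    values = np.asarray(values, dtype=float)
    span = np.ptp(values)
    if span == 0:
        return np.zeros(len(values), dtype=np.uint8)
    return ((values - values.min()) / span * 255).astype(np.uint8)

def render_colored_trace(points, size=(256, 256), width=2):
    """Draw each elementary trait as a segment whose color encodes its dynamics."""
    xs, ys, pressure, velocity, accel = (
        np.asarray(c, dtype=float) for c in zip(*points)
    )
    # Scale pen coordinates into the image frame.
    xs = (xs - xs.min()) / max(np.ptp(xs), 1e-9) * (size[0] - 1)
    ys = (ys - ys.min()) / max(np.ptp(ys), 1e-9) * (size[1] - 1)
    r, g, b = to_uint8(pressure), to_uint8(velocity), to_uint8(accel)
    img = Image.new("RGB", size, "white")
    draw = ImageDraw.Draw(img)
    for i in range(len(xs) - 1):
        color = (int(r[i]), int(g[i]), int(b[i]))
        draw.line([(xs[i], ys[i]), (xs[i + 1], ys[i + 1])], fill=color, width=width)
    return img
```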
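The transfer-learning step can likewise be sketched: a network pretrained on ImageNet is used as a fixed feature extractor for the (offline or synthetic) handwriting images, and the resulting vectors are passed to a conventional classifier. The ResNet-18 backbone and the SVM mentioned in the closing comment are illustrative assumptions; the abstract does not name the specific network or classifier used.

```python
# Sketch of Transfer Learning as a fixed feature extractor (assumed backbone:
# torchvision's ImageNet-pretrained ResNet-18).
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.IMAGENET1K_V1
backbone = models.resnet18(weights=weights)
backbone.fc = torch.nn.Identity()  # drop the ImageNet head, keep 512-d features
backbone.eval()

preprocess = weights.transforms()  # resize/crop/normalize as the weights expect

@torch.no_grad()
def extract_features(image_path):
    """Map one handwriting image to a fixed-length deep feature vector."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)  # shape (1, 3, 224, 224)
    return backbone(batch).squeeze(0)     # shape (512,)

# The vectors can then be fed to any standard classifier, e.g.:
# from sklearn.svm import SVC; SVC().fit(feature_matrix, labels)
```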
