Abstract

Artificial Neural Networks (ANNs) have achieved unprecedented success in information-processing tasks ranging from image recognition to time series prediction. This success can largely be attributed to the availability of large training datasets and the increased complexity of the models. Unfortunately, for some applications only a limited number of samples is available for training. Fewer training samples increase the risk of over-fitting and poor generalization, especially in highly complex models. Moreover, complex models with a large number of trainable parameters require more energy to train and optimize than simpler ones. In this paper, to the best of our knowledge, we propose the first use of ANNs for Early-Stage Alzheimer's Disease classification (ES-AD) from handwriting (HW). We propose Reservoir Computing (RC), a framework for building Recurrent Neural Networks (RNNs) that simplifies training by optimizing only the output layer, and apply it both numerically and experimentally. We also evaluate Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) models for comparison. For a fairer comparison, we consider not only the accuracies but also the energy costs incurred to obtain them, in order to assess the accuracy-efficiency trade-off. Our numerical and experimental results show that RC yields a classification accuracy of 85%, which is 3% lower than that of the BiLSTM and 2% higher than that of the CNN, at a lower training cost and a significantly lower inference cost. We hope that our findings highlight the importance of examining the accuracy-efficiency trade-off of various models within the community, in order to reduce the overall environmental impact of ANN training.
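
For readers unfamiliar with reservoir computing, the sketch below illustrates the readout-only training that the abstract refers to: input sequences are driven through a fixed, randomly initialized recurrent reservoir, and only a linear readout is fitted. The reservoir size, spectral-radius scaling, ridge regularization, and classification from the final reservoir state are illustrative assumptions, not the paper's actual configuration.

    import numpy as np

    # Minimal echo-state-network style sketch (assumed setup, not the paper's).
    rng = np.random.default_rng(0)

    n_in, n_res, n_out = 3, 200, 2  # input features, reservoir units, classes (hypothetical)
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    # Scale the fixed recurrent weights so the spectral radius is below 1.
    W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

    def reservoir_states(u):
        # u: (T, n_in) input sequence; returns the (T, n_res) reservoir trajectory.
        x = np.zeros(n_res)
        states = []
        for u_t in u:
            x = np.tanh(W_in @ u_t + W_res @ x)
            states.append(x.copy())
        return np.array(states)

    def train_readout(sequences, labels, ridge=1e-3):
        # Ridge regression on the final reservoir state of each sequence.
        # The reservoir weights above stay fixed, so this readout is the only trained layer.
        X = np.vstack([reservoir_states(u)[-1] for u in sequences])
        Y = np.eye(n_out)[np.asarray(labels)]
        return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

    def predict(W_out, u):
        return int(np.argmax(reservoir_states(u)[-1] @ W_out))

Because training reduces to a single linear solve rather than back-propagation through time, the training and inference costs stay low relative to BiLSTM or CNN models, which is the efficiency argument made in the abstract.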
