Abstract
The continual development of artificial neural networks (ANNs) on von Neumann hardware poses a threat to potential advances in artificial intelligence (AI). At current rates, the power consumption of such models will rival that of large cities or even exceed global power production. It is therefore imperative to transition to a more power-efficient computing architecture to make way for future advances in AI. The development of neuromorphic hardware has paved the way for a new class of neural networks, spiking neural networks (SNNs), to serve as a viable alternative for machine learning tasks on such brain-like hardware. The primary motive of this research is to contribute to the search for viable applications of SNNs in their current infancy. The project addresses whether SNNs can outperform ANNs at image classification of handwritten digits on the MNIST dataset. Despite training on non-optimal von Neumann hardware, statistically significant evidence suggests that the SNN outperformed the ANN in terms of mean accuracy on the task. In light of these findings, the paper suggests that further research should focus on translating older ANN models into SNN counterparts that outperform them on their tasks, and on finding applications in which SNNs outperform ANNs even on von Neumann hardware. Both directions will narrow the gap between ANN and SNN performance and, in doing so, make a transition to more power-efficient and uninhibited AI advancement a reality.
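To illustrate what a claim of statistical significance on mean accuracy could involve (the abstract does not specify the test used), the sketch below compares per-run test accuracies of the two models with a two-sample Welch's t-test. The accuracy values are hypothetical placeholders, not results from the paper.

```python
# Illustrative sketch only: hypothetical per-run test accuracies (not the paper's data).
import numpy as np
from scipy import stats

snn_acc = np.array([0.981, 0.979, 0.983, 0.980, 0.982])  # hypothetical SNN runs
ann_acc = np.array([0.976, 0.978, 0.975, 0.977, 0.974])  # hypothetical ANN runs

# Welch's t-test compares the two means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(snn_acc, ann_acc, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# A p-value below the chosen significance level (e.g. 0.05) would support the
# claim that the difference in mean accuracy is statistically significant.
```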