Abstract
In recent years, spiking neural networks (SNNs) have attracted much attention due to their low energy consumption and have achieved remarkable results in vision and information-processing tasks. However, the application of SNNs in natural language processing (NLP) remains relatively limited. Given that current popular large-scale language models rely on enormous computational power and energy, it is of great practical importance to explore SNN-based approaches that implement NLP tasks in a more energy-efficient way. This paper investigates a method for converting LSTM networks into SNNs and compares the performance of the original LSTM network with the converted SNN on a text classification task. The paper experimentally compares the accuracy of different LSTM-based models using different methods on the same datasets. The experimental results show that the converted SNN achieves performance similar to the original LSTM network with significantly lower power consumption on text classification tasks across multiple datasets.
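The abstract does not detail the conversion procedure itself. As a rough illustration of the rate-coding principle that underlies many ANN-to-SNN conversion schemes (not necessarily the method used in this paper), the sketch below approximates an analog activation value by the firing rate of an integrate-and-fire neuron; the function name, threshold, and timestep count are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: this is the generic rate-coding idea behind
# ANN-to-SNN conversion, not the paper's specific LSTM-to-SNN procedure.
# An analog activation a in [0, 1] is approximated by the firing rate of
# an integrate-and-fire neuron simulated for a fixed number of timesteps.

def rate_code(activation: float, timesteps: int = 100, threshold: float = 1.0) -> np.ndarray:
    """Encode an analog activation as a binary spike train using an
    integrate-and-fire neuron with a soft reset (subtract threshold)."""
    membrane = 0.0
    spikes = np.zeros(timesteps, dtype=np.int8)
    for t in range(timesteps):
        membrane += activation          # constant input current
        if membrane >= threshold:       # fire when the threshold is crossed
            spikes[t] = 1
            membrane -= threshold       # soft reset keeps the residual charge
    return spikes

if __name__ == "__main__":
    a = 0.37                            # hypothetical analog gate activation
    train = rate_code(a, timesteps=200)
    print(f"analog value: {a:.2f}, spike rate: {train.mean():.2f}")
    # As the number of timesteps grows, the spike rate converges to the
    # analog value, trading latency for energy-efficient binary events.
```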