Abstract

Complicated temporal patterns can provide important information for accurate time series forecasting. Existing long short-term memory (LSTM) models with attention mechanisms have achieved significant performance. However, these efforts have not resolved the exponential decay of long-term memory in LSTM, a longstanding open problem inherent to recurrent architectures. This problem is a bottleneck that restricts the performance of existing studies. Recently, spiking neural networks (SNNs) trained via the surrogate gradient (SG) method have shown high efficiency in capturing temporal patterns, offering a way to address this issue. However, in a concept-drift environment, the time-varying data distribution makes it impossible to pre-set the variance of the standard SG method. In this paper, we propose a novel LSTM with an embedded adaptive and hybrid spiking (AHS) module, combined with two attention mechanisms (called HSN-LSTM), to resolve the above-mentioned problems. First, we show through theoretical analysis that the AHS module retains long-term memory. Moreover, our smooth SG method avoids pre-setting the variance and is therefore insensitive to the above scenarios. In addition, we use the negative log-likelihood function to adjust the attention score, alleviating the negative impact of concept drift. Experimental results show that HSN-LSTM outperforms state-of-the-art models on several multivariate time series datasets.
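The abstract does not give the paper's exact AHS formulation, but the generic surrogate-gradient idea it builds on can be sketched as follows: the forward pass uses a non-differentiable Heaviside spike, while the backward pass substitutes a smooth approximation whose derivative stands in for the step's gradient. This is a minimal illustrative sketch; the function names, the sigmoid surrogate, and the `alpha` sharpness parameter are assumptions, not the paper's method.

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: Heaviside step, neuron fires when the membrane
    potential v crosses the threshold (non-differentiable)."""
    return (v >= threshold).astype(np.float64)

def spike_surrogate_grad(v, threshold=1.0, alpha=2.0):
    """Backward pass: derivative of a smooth sigmoid surrogate
    sigmoid(alpha * (v - threshold)), used in place of the step's
    (zero almost everywhere) true gradient. alpha is an assumed
    sharpness hyperparameter, analogous to the variance the standard
    SG method must pre-set."""
    s = 1.0 / (1.0 + np.exp(-alpha * (v - threshold)))
    return alpha * s * (1.0 - s)

v = np.array([0.2, 0.9, 1.1, 2.0])   # example membrane potentials
spikes = spike_forward(v)            # -> [0., 0., 1., 1.]
grads = spike_surrogate_grad(v)      # largest near the threshold
```

The surrogate's gradient peaks at the threshold and decays away from it, so learning signals concentrate on neurons close to firing; the paper's contribution, per the abstract, is making this surrogate adaptive so that its width need not be fixed in advance under concept drift.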
