Abstract
As a human-cortex-inspired computing model, hierarchical temporal memory (HTM) has shown great promise in sequence learning and has been applied to various time-series applications. HTM uses a combination of columns and neurons to learn the temporal patterns within a sequence. However, the conventional HTM model compresses the input into just two column states, active and inactive, and uses a fixed learning strategy. This simplicity limits the representational capacity of HTM and ignores the impact of active columns on learning the temporal context. To address these issues, we propose a new HTM algorithm based on activation intensity. By introducing column activation intensity, more useful and fine-grained information from the input is retained for sequence learning. Furthermore, a self-adaptive nonlinear learning strategy is proposed in which synaptic connections are dynamically adjusted according to the activation intensity of columns. Extensive experiments are carried out on two real-world time-series datasets. Compared with the conventional HTM and LSTM models, our method achieves higher accuracy with lower time overhead.
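The two ideas in the abstract, graded column activation and an intensity-dependent synaptic update, can be illustrated with a minimal sketch. All names, shapes, thresholds, and the quadratic update rule below are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

# Illustrative sketch only: the specific parameters and nonlinear update
# rule are assumptions, not taken from the paper.

rng = np.random.default_rng(0)

N_COLUMNS, INPUT_DIM, TOP_K = 16, 32, 4
CONNECT_THRESH = 0.5   # permanence threshold for a "connected" synapse
BASE_RATE = 0.05       # base permanence increment

# Potential synapses: permanence values in [0, 1)
permanence = rng.random((N_COLUMNS, INPUT_DIM))

def activation_intensity(x):
    """Graded activation per column instead of a binary active/inactive state."""
    connected = permanence >= CONNECT_THRESH
    overlap = (connected * x).sum(axis=1).astype(float)
    total = connected.sum(axis=1)
    # Intensity in [0, 1]: fraction of a column's connected synapses that fire.
    return np.divide(overlap, total, out=np.zeros_like(overlap), where=total > 0)

def learn(x):
    """Self-adaptive update: more strongly activated columns adapt faster."""
    intensity = activation_intensity(x)
    winners = np.argsort(intensity)[-TOP_K:]      # top-k columns by intensity
    for c in winners:
        rate = BASE_RATE * intensity[c] ** 2      # nonlinear in intensity (assumed)
        delta = np.where(x > 0, rate, -rate)      # reinforce active bits, decay others
        permanence[c] = np.clip(permanence[c] + delta, 0.0, 1.0)
    return intensity, winners

x = (rng.random(INPUT_DIM) > 0.5).astype(int)     # a random binary input
intensity, winners = learn(x)
```

The key contrast with conventional HTM is that `intensity` carries a real value per column rather than a single active/inactive bit, and the permanence update rate depends on that value rather than being fixed.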