Abstract
Brain-inspired Hyperdimensional (HD) computing is a promising solution for energy-efficient classification. HD computing emulates cognitive tasks by operating on long, high-dimensional vectors instead of the numeric values used in contemporary processors. However, existing HD computing algorithms lack controllability over the training iterations, which often results in slow training or divergence. In this work, we propose AdaptHD, an adaptive learning approach based on HD computing that addresses these training issues. AdaptHD introduces the notion of a learning rate to HD computing and proposes two adaptive training approaches: iteration-dependent and data-dependent. In the iteration-dependent approach, AdaptHD uses a large learning rate to speed up training in the first iterations and then adaptively reduces the learning rate depending on the slope of the error rate. In the data-dependent approach, AdaptHD changes the learning rate for each data point depending on how far off that point was misclassified. Our evaluations on a wide range of classification applications show that AdaptHD achieves a 6.9× speedup and a 6.3× improvement in energy efficiency during training compared to the state-of-the-art HD computing algorithm.
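To make the two adaptive strategies concrete, the sketch below shows one way an AdaptHD-style retraining loop could combine an iteration-dependent and a data-dependent learning rate. It assumes real-valued encoded hypervectors, cosine similarity, and a simple decay schedule; the function names, thresholds, and update rules are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal sketch of HD retraining with adaptive learning rates.
# Assumptions: samples are already encoded into D-dimensional hypervectors,
# similarity is cosine, and the decay schedule/thresholds are hypothetical.
import numpy as np

def cosine_sim(class_hvs, x):
    # Similarity of one encoded sample x to every class hypervector.
    norms = np.linalg.norm(class_hvs, axis=1) * np.linalg.norm(x) + 1e-12
    return class_hvs @ x / norms

def adapt_hd_train(X_enc, y, n_classes, iters=30, lr_max=8.0, lr_min=1.0):
    """X_enc: (n_samples, D) encoded hypervectors; y: integer class labels."""
    D = X_enc.shape[1]
    class_hvs = np.zeros((n_classes, D))
    for x, label in zip(X_enc, y):          # initial single-pass bundling
        class_hvs[label] += x

    lr, prev_err = lr_max, 1.0
    for _ in range(iters):
        errors = 0
        for x, label in zip(X_enc, y):
            sims = cosine_sim(class_hvs, x)
            pred = int(np.argmax(sims))
            if pred != label:
                errors += 1
                # Data-dependent term: scale the update by how far off the
                # misclassification was (gap between the two similarity scores).
                gap = sims[pred] - sims[label]
                class_hvs[label] += lr * gap * x
                class_hvs[pred]  -= lr * gap * x
        err = errors / len(y)
        # Iteration-dependent term: shrink the learning rate once the
        # error-rate slope flattens (hypothetical decay rule).
        if prev_err - err < 0.01:
            lr = max(lr_min, lr * 0.5)
        prev_err = err
        if errors == 0:
            break
    return class_hvs
```

A call such as `adapt_hd_train(X_enc, y, n_classes=10)` would return the retrained class hypervectors; starting with a large learning rate accelerates early convergence, while the data-dependent scaling keeps updates small for borderline misclassifications, matching the behavior the abstract describes.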