Abstract

Hyperdimensional computing (HDC) is an emerging computational framework that takes inspiration from attributes of neuronal circuits such as hyperdimensionality, fully distributed holographic representation, and (pseudo)randomness. When employed for machine learning tasks such as learning and classification, HDC involves the manipulation and comparison of large patterns within memory. Moreover, a key attribute of HDC is its robustness to the imperfections of the computational substrates on which it is implemented. It is therefore particularly amenable to emerging non-von Neumann paradigms such as in-memory computing, where the physical attributes of nanoscale memristive devices are exploited to perform computation in place. Here, we present a complete in-memory HDC system that achieves a near-optimum trade-off between design complexity and classification accuracy on three prototypical HDC-related learning tasks, namely, language classification, news classification, and hand-gesture recognition from electromyography signals. Accuracies comparable to software implementations are demonstrated experimentally using 760,000 phase-change memory devices performing analog in-memory computing.

Highlights

  • When designing biological computing systems, nature decided to trade accuracy for efficiency

  • Hyperdimensional computing is a brain-inspired computational framework that is well-suited for the emerging computational paradigm of in-memory computing

  • Due to the inherent robustness of hyperdimensional computing (HDC) to errors, its mathematical operations can be approximated to suit hardware implementation, and analog in-memory computing can be used, without significantly degrading output accuracy


Summary

INTRODUCTION

When designing biological computing systems, nature decided to trade accuracy for efficiency. HDC begins by representing symbols with i.i.d. hypervectors, which are combined by nearly i.i.d.-preserving operations, namely binding, bundling, and permutation, and stored in associative memories to be recalled, matched, decomposed, or reasoned about. This chain implies that a failure in a component of a hypervector is not "contagious", and it forms a computational framework that is intrinsically robust to defects, variations, and noise. A carbon nanotube field-effect transistor-based logic layer was integrated with ReRAMs, improving efficiency further18. These architectures resulted in limited applications, such as a single language-recognition task or a restricted binary-classification version of the same task; their evaluation is based on simulations and compact models derived from small prototypes with only 256 ReRAM cells, or on a small 32-bit datapath for hypervector manipulations that incurs three orders of magnitude higher latency overhead. We map all operations of HDC either in-memory or near-memory, and demonstrate their integrated functionality for three distinct applications that are well suited for HDC.
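The operations named above can be made concrete with a minimal software sketch. The following Python snippet is illustrative only (dense binary hypervectors, with a dimensionality, helper names, and a three-item bundling example chosen for this sketch; it is not the paper's phase-change-memory implementation): binding as component-wise XOR, bundling as a component-wise majority vote, permutation as a cyclic shift, and associative memory search as a nearest-match comparison by normalized Hamming similarity.

```python
# Minimal sketch of binary HDC primitives; dimension and names are illustrative,
# not the paper's hardware implementation.
import numpy as np

D = 10000  # hypervector dimensionality (illustrative choice)
rng = np.random.default_rng(0)

def random_hv():
    """Draw an i.i.d. random binary hypervector."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Binding: component-wise XOR; the result is dissimilar to both inputs."""
    return a ^ b

def bundle(hvs):
    """Bundling: component-wise majority vote; the result stays similar to each input."""
    return (np.sum(hvs, axis=0) > len(hvs) / 2).astype(np.uint8)

def permute(a, n=1):
    """Permutation: cyclic shift, used e.g. to encode position in a sequence."""
    return np.roll(a, n)

def hamming_sim(a, b):
    """Normalized similarity used for associative memory search."""
    return 1.0 - np.count_nonzero(a != b) / D

# Associative memory idea: a bundled prototype is recalled by nearest match.
x, y, z = random_hv(), random_hv(), random_hv()
prototype = bundle([x, y, z])
# The prototype remains far closer to its constituents than to an unrelated vector,
# which is what makes component failures non-"contagious".
assert hamming_sim(prototype, x) > hamming_sim(prototype, random_hv())
```

Because similarity degrades gracefully with flipped components, a moderate number of device-level bit errors shifts these similarity scores only slightly, which is the robustness property that in-memory HDC exploits.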

THE CONCEPT OF IN-MEMORY HDC
THE ASSOCIATIVE MEMORY SEARCH MODULE
THE N-GRAM ENCODING MODULE
CONCLUSION
COMPETING FINANCIAL INTERESTS
METHODS
Language classification
News classification
Findings
Experiments on associative memory search
