Abstract

This paper comprehensively analyses the dual integration of an approximate random weight generator (ARWG) and computation-in-memory for event-based neuromorphic computing. The ARWG generates approximate random weights and performs multiply-accumulate (MAC) operations for reservoir computing (RC) and random-weight spiking neural networks (SNNs). Because it exploits device variation to generate random weights, the ARWG requires no random number generators (RNGs). Because RC and random-weight SNNs tolerate approximate randomness, the ARWG only needs to generate approximately random weights, so no error-correcting code is required to make the randomness exact. Moreover, the ARWG has a read port for MAC operations. In this paper, the randomness of the weights generated by the proposed ARWG is evaluated using Hamming distance and Hamming weight. The results reveal that the randomness required for the ARWG is much lower than that for physically unclonable functions and RNGs, and thus the proposed ARWG achieves high recognition accuracy.
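The Hamming-weight and Hamming-distance evaluation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's method: the ARWG is modeled here as a uniform bit source (an assumption — the real generator derives its bits from device variation in the memory array), and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_weights(n_words: int, word_bits: int) -> np.ndarray:
    # Hypothetical stand-in for ARWG output: device variation is modeled
    # as uniform random bits purely for illustration.
    return rng.integers(0, 2, size=(n_words, word_bits), dtype=np.uint8)

def hamming_weight(bits: np.ndarray) -> float:
    # Fraction of 1s across all bits; ideal randomness gives ~0.5.
    return float(bits.mean())

def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
    # Fraction of differing bits between two weight words; ideal is ~0.5.
    return float(np.mean(a != b))

weights = generate_weights(n_words=64, word_bits=128)
hw = hamming_weight(weights)
hd = float(np.mean([hamming_distance(weights[i], weights[j])
                    for i in range(len(weights))
                    for j in range(i + 1, len(weights))]))
print(f"mean Hamming weight:   {hw:.3f}")
print(f"mean Hamming distance: {hd:.3f}")
```

Values near 0.5 for both metrics indicate balanced, pairwise-uncorrelated bits; the paper's point is that RC and random-weight SNNs remain accurate even when these metrics deviate from the ideal more than a PUF or RNG could tolerate.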
