Abstract

Edge computing is a promising solution for relaxing the burden imposed on the network infrastructure by the increasing amount of data produced by smart devices; however, it requires reconfigurable ultra-low-power computing architectures. RRAM devices combined with material implication (IMPLY) logic are a promising option for the development of low-power reconfigurable logic-in-memory (LiM) hardware. Nevertheless, traditional approaches suffer from several issues introduced by the circuit topology and by device non-idealities. Recently, SIMPLY, a smart LiM architecture based on IMPLY logic, has been proposed and shown to solve the common issues of traditional architectures. Here, we use a physics-based RRAM compact model calibrated on three RRAM technologies to further analyze the performance of SIMPLY under typical operating conditions, i.e., when logic operations are repeatedly executed on the same group of devices. The results show that, compared with the conventional IMPLY architecture, SIMPLY spares on average more than 40% of the high-voltage pulses, even for complex operations such as the 1-bit half adder. We also show how SIMPLY can implement the set of operations required by Binarized Neural Networks (BNNs) and benchmark its performance against other memristor-based BNN in-memory accelerators from the literature. The results suggest that our approach is more than two orders of magnitude more efficient than the state-of-the-art reconfigurable in-memory computing approach and could potentially reach the performance of specialized analog BNN hardware accelerators with appropriate device-circuit co-design strategies.
