Abstract

Benefiting from their area and power efficiency, memristors enable neural-network-based analog-to-digital converters (ADCs) that break through the limitations of conventional ADCs. Although several memristive ADC (mADC) architectures have been proposed recently, research is still at an early stage: most work is simulation-only and requires numerous target labels to train the synapse weights. In this paper, we propose a pipelined Hopfield neural network mADC architecture and experimentally demonstrate its capability for self-adaptive weight tuning. The proposed training algorithm is an unsupervised method derived from the random weight change (RWC) algorithm, modified to reduce the complexity of the error-feedback circuit and make it more hardware friendly. The synapse matrix can be mapped onto a 1T1R crossbar array. For an 8-bit two-stage pipelined mADC, the proposed architecture achieves a 7.69 fJ/conv FOM, 7.90 ENOB, 0.1 LSB INL, and 0.1 LSB DNL in simulation. The experimental performance reaches 1.56 pJ/conv FOM, 7.59 ENOB, 0.21 LSB INL, and 0.29 LSB DNL, limited mainly by the comparator's switching time.
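For readers unfamiliar with RWC, the sketch below illustrates the basic random weight change rule that the paper's unsupervised method builds on: every step, all weights are perturbed by a small fixed magnitude with random signs; while the error keeps decreasing, the same perturbation direction is reapplied, otherwise a new random direction is drawn. This is a generic illustration of the classic RWC rule, not the authors' modified hardware variant; the function names and parameters are illustrative only.

```python
import numpy as np

def rwc_train(loss, w, delta=0.01, steps=1000, rng=None):
    """Basic random weight change (RWC) training sketch.

    loss  : callable returning a scalar error for a weight vector
    w     : initial weight array (updated in place conceptually)
    delta : fixed perturbation magnitude applied to every weight
    """
    rng = np.random.default_rng(rng)
    # Initial random perturbation direction: each weight gets +delta or -delta.
    d = rng.choice([-delta, delta], size=w.shape)
    for _ in range(steps):
        e_prev = loss(w)
        w = w + d                # always apply the perturbation
        if loss(w) >= e_prev:    # error did not improve:
            # draw a fresh random direction for the next step
            d = rng.choice([-delta, delta], size=w.shape)
        # otherwise keep the same direction (it is helping)
    return w
```

The appeal for memristive hardware is that the rule needs only a global scalar error comparison (did the error go down or not?) rather than per-weight gradients, which keeps the error-feedback circuit simple.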
