Abstract

As simple convolution computations are intensively and iteratively performed to extract features from input images, cross-point arrays with resistive random access memory (RRAM) serving as kernel weights can accelerate the relevant mathematical operations in hardware. However, actual RRAM devices exhibit variability and unexpected permanent failures arising from the filamentary switching mechanism, which degrade recognition performance. This study investigates, using MATLAB, the impact of faults on feature extraction in a conventional kernel structure, where two adjacent columns in the array represent a single weight. First, the fault types of HfOx-based multilevel RRAM are categorized. The results reveal that the unidirectional fault of RRAM primarily worsens image-recognition accuracy, because subtracting negative weights from positive ones is crucial for identifying image edges through convolution operations. Therefore, we exploit a kernel structure in which a single column dedicated to negative weights is placed next to a matrix of positive weights. In addition, we reduce the precision of the negative weights while quantizing the positive weights to higher bits. By mitigating subtraction errors with this kernel structure and hybrid precision, we improve fault tolerance and minimize accuracy degradation.
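The differential weight mapping and hybrid-precision quantization described above can be summarized with a small numerical sketch. The following Python/NumPy snippet is illustrative only: the kernel values, the bit-widths (4-bit positive, 2-bit negative), and the stuck-at fault location are assumptions for demonstration, not parameters taken from the study.

```python
import numpy as np

def quantize(w, bits):
    """Uniformly quantize non-negative weights in [0, 1] to the given bit precision."""
    levels = 2 ** bits - 1
    return np.round(np.clip(w, 0, 1) * levels) / levels

# Hypothetical 3x3 edge-detection kernel, split into non-negative positive and
# negative parts so each part can be stored as an RRAM conductance map.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float) / 8.0
g_pos = np.maximum(kernel, 0)    # positive-weight columns
g_neg = np.maximum(-kernel, 0)   # negative-weight column(s)

# Hybrid precision: higher-bit positive weights, lower-bit negative weights
# (the 4/2-bit split here is an assumption, not a value from the paper).
g_pos_q = quantize(g_pos, bits=4)
g_neg_q = quantize(g_neg, bits=2)

# Inject a hypothetical stuck-at-low fault into one negative-weight cell,
# so the subtraction of that negative weight is lost during convolution.
g_neg_faulty = g_neg_q.copy()
g_neg_faulty[0, 0] = 0.0

def conv2d(img, k):
    """Naive valid-mode 2-D cross-correlation, for illustration only."""
    H, W = img.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

img = np.random.rand(8, 8)
# The effective weight is the difference of the two conductance maps: W = G+ - G-.
ideal = conv2d(img, g_pos_q - g_neg_q)
faulty = conv2d(img, g_pos_q - g_neg_faulty)
print("mean absolute error due to fault:", np.abs(ideal - faulty).mean())
```

In this sketch, a single lost negative weight shifts every output pixel that the faulty cell contributes to, which is consistent with the observation that errors in the subtraction path are what primarily degrade edge extraction.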
