Abstract

In recent years, the increasing variety of nonlinear activation functions (NAFs) in deep neural networks (DNNs) has led to higher computational demands, yet their hardware implementation faces challenges such as a lack of flexibility, high hardware cost, and limited accuracy. This paper proposes a highly flexible and low-cost hardware solution for implementing activation functions that overcomes these issues. Based on the piecewise linear (PWL) approximation method, our approach supports NAFs with different accuracy configurations through a customized implementation strategy, meeting the requirements of different application scenarios. We investigate the symmetry of activation functions and incorporate curve translation preprocessing and data quantization to significantly reduce hardware storage costs. The modular hardware architecture proposed in this study supports NAFs at multiple accuracy levels, making it suitable for deep learning accelerators across various scenarios: it avoids the need to design dedicated hardware circuits for the activation function layer and enhances circuit design efficiency. The proposed hardware architecture is validated on the Xilinx XC7Z010 development board. The experimental results show that the average absolute error (AAE) is reduced by about 35.6% at a clock frequency of 312.5 MHz. Additionally, after replacing the activation layer functions of DNNs under the PyTorch framework, the maximum model accuracy loss is −0.684%.
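To make the PWL idea concrete, the following is a minimal software sketch (not the paper's hardware design) of approximating the sigmoid with a uniform-segment PWL table, exploiting the symmetry sigmoid(−x) = 1 − sigmoid(x) so that only the positive half of the curve needs to be stored. The segment count and input range here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pwl_approx(x, breakpoints, func):
    """Piecewise linear approximation of a symmetric activation.

    Only the positive half-curve is tabulated; negative inputs are
    recovered via sigmoid's symmetry sigmoid(-x) = 1 - sigmoid(x),
    which halves the storage needed for breakpoints.
    """
    ax = np.abs(x)
    # np.interp linearly interpolates between breakpoints and
    # saturates at the endpoint values outside the tabulated range.
    y = np.interp(ax, breakpoints, func(breakpoints))
    return np.where(x >= 0, y, 1.0 - y)

# Illustrative configuration: 16 uniform segments on [0, 8].
bp = np.linspace(0.0, 8.0, 17)
xs = np.linspace(-8.0, 8.0, 10001)
aae = np.mean(np.abs(pwl_approx(xs, bp, sigmoid) - sigmoid(xs)))
```

In a hardware realization, the interpolation step would instead select a stored slope/intercept pair per segment and compute one multiply-add; increasing the segment count trades storage for a lower AAE.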
