Abstract

The development of lightweight networks has made neural networks efficient enough to be applied widely across tasks. Lightweight networks are the natural choice for deployment on hardware such as edge devices and mobile phones, but their accuracy has long lagged far behind that of state-of-the-art networks. In this article, we present a simple yet effective activation function, called WReLU, that significantly improves the performance of lightweight networks by adding a residual spatial condition. Moreover, we use a strategy that determines which convolutional layers switch to the new activation function. We perform experiments on the ImageNet 2012 classification dataset on CPU, GPU, and edge devices. The experiments demonstrate that WReLU significantly improves classification accuracy, while our strategy keeps the additional parameters and multiply-accumulate operations (MACs) in check. Our method improves the accuracy of SqueezeNet and SqueezeNext by more than 5% without adding substantial parameters or computation. For lightweight networks with larger parameter counts, such as MobileNet and ShuffleNet, there is also a significant improvement. Additionally, the inference speed of most lightweight networks using our WReLU strategy is almost the same as that of the baseline models across platforms. Our approach not only preserves the practicality of lightweight networks but also improves their performance.
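
The abstract does not give the exact form of WReLU, so the following is only a minimal PyTorch sketch. It assumes the "residual spatial condition" is a learned per-channel (depthwise) convolution whose output is added to the standard ReLU response, in the spirit of spatially conditioned activations such as FReLU; the module name `WReLU`, the kernel size, and the additive form are illustrative assumptions, not the paper's confirmed formulation.

```python
import torch
import torch.nn as nn

class WReLU(nn.Module):
    """Sketch of a ReLU with a residual spatial condition (assumed form).

    The spatial condition is modeled here as a depthwise 3x3 convolution
    followed by batch norm; its output is added to ReLU(x) as a residual.
    The paper's exact formulation may differ.
    """
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise conv: one spatial filter per channel, so the extra
        # parameter and MAC cost stays small for lightweight networks.
        self.spatial = nn.Conv2d(
            channels, channels, kernel_size,
            padding=kernel_size // 2, groups=channels, bias=False,
        )
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU response plus a learned, spatially conditioned residual.
        return torch.relu(x) + self.bn(self.spatial(x))

# Usage: swap the activation of a selected convolutional layer for WReLU.
x = torch.randn(1, 64, 32, 32)
act = WReLU(64)
print(act(x).shape)  # torch.Size([1, 64, 32, 32])
```

Applying such a module only to selected layers, rather than everywhere, matches the abstract's strategy of trading a small parameter/MAC overhead for accuracy where it helps most.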
