Abstract

Convolutional neural networks (CNNs) have achieved excellent feature extraction capabilities in remotely sensed hyperspectral image (HSI) classification, owing to their ability to learn representative spatial and spectral features. However, conventional computers struggle to classify HSIs quickly enough for practical use in many applications, mainly because of the large number of calculations and parameters required by deep learning-based methods. Although several weight quantization methods have achieved remarkable results in network compression, their acceleration effect remains limited, because the literature has not yet fully explored the acceleration potential offered by network weight quantization. In this article, a new step activation quantization method is proposed to constrain the inputs of the CNN's layers so that the data can be represented by low-bit integers. As a result, floating-point operations can be replaced with integer operations, greatly accelerating the forward (inference) step of the network. Specifically, nonlinear uniform quantization is adopted in this work to restrain the input of the CNN in the forward pass of the step activation quantization layer, and two surrogate functions (constant and tanh-like) are used in the backpropagation step to avoid gradient vanishing and noise. Our newly proposed step activation quantization acceleration method is applied to a CNN for HSI classification on two well-known benchmark data sets, and the experimental results demonstrate that the proposed method is very effective in terms of both memory savings and computation acceleration, with only a slight decrease in classification accuracy.
Specifically, our method reduces memory requirements by 13.6× and obtains around a 10× speedup with respect to the original real-valued network.
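The core idea of the abstract, quantizing layer inputs to low-bit integer codes in the forward pass while using a smooth surrogate gradient in the backward pass, can be sketched as follows. This is an illustrative example only, not the authors' implementation: the bit width, clipping range, and the choice of a constant (straight-through) surrogate gradient are assumptions for the sketch.

```python
import numpy as np

def quantize_activation(x, bits=4, x_max=1.0):
    """Forward pass of a step activation quantization layer (sketch).

    Clips activations to [0, x_max] and maps them uniformly onto
    2**bits - 1 integer levels, so subsequent layers can operate on
    low-bit integer codes instead of floating-point values.
    Returns both the integer codes and the real values they represent.
    """
    levels = 2 ** bits - 1
    x_clipped = np.clip(x, 0.0, x_max)
    codes = np.round(x_clipped / x_max * levels).astype(np.int32)
    dequant = codes.astype(np.float64) * x_max / levels
    return codes, dequant

def surrogate_grad(x, x_max=1.0):
    """Backward pass: a constant (straight-through) surrogate gradient.

    The true step function has zero gradient almost everywhere, so the
    gradient is passed through unchanged inside the clipping range and
    zeroed outside it, avoiding gradient vanishing. A tanh-like surrogate
    would instead taper smoothly near the range boundaries.
    """
    return ((x >= 0.0) & (x <= x_max)).astype(np.float64)
```

For example, with 4 bits a value of 0.26 is mapped to code 4 out of 15 levels, representing roughly 0.267; inputs outside [0, 1] receive zero gradient under the constant surrogate.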
