Abstract
Deploying modern detection models for infrared target detection in robot vision systems is difficult due to their heavy computational burden. To alleviate this, a simple but efficient channel pruning method is proposed for model acceleration. Specifically, a soft-gated module combined with batch normalization (SGBN) is designed as a standalone layer to substitute for the standard batch normalization (BN) layer during training. Conversion between SGBN and BN is straightforward, and the training overhead introduced by the replacement is almost negligible. By controlling the sparsity of the scaling factors in SGBN, unimportant channels with small outputs are blocked automatically and globally while the model trains. Removing these redundant channels no longer requires fine-tuning, which significantly speeds up the pruning process. Experiments on pruning different detection models on an infrared dataset show the effectiveness of our method. For example, the parameters and FLOPs of a pruned CenterNet are reduced by 72.70% and 40.20%, respectively, without accuracy loss, and CPU inference is 12.01 ms faster. Extended studies on a classification task also demonstrate the method's potential to transfer to other applications.
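The core idea described above, per-channel scaling factors whose sparsity gates channels off during training, can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the hard threshold, and the toy data are all assumptions made only to show how a gated scaling factor blocks a channel.

```python
# Hypothetical sketch of SGBN-style soft channel gating (names and threshold
# are illustrative assumptions, not the paper's actual design). Each channel
# carries a BN-like scaling factor gamma; channels whose |gamma| falls below
# a threshold are gated to zero, so "pruning" happens in the forward pass.

def soft_gate(gammas, threshold):
    """Return per-channel gate values: 0.0 blocks the channel, else keep gamma."""
    return [g if abs(g) >= threshold else 0.0 for g in gammas]

def apply_gated_bn(channel_outputs, gammas, threshold):
    """Scale each channel's (already normalized) output by its gated gamma."""
    gates = soft_gate(gammas, threshold)
    return [[gate * x for x in channel]
            for gate, channel in zip(gates, channel_outputs)]

# Example: three channels; the middle one has a near-zero scaling factor.
outputs = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
gammas = [0.8, 0.01, -0.5]
pruned = apply_gated_bn(outputs, gammas, threshold=0.05)
# The middle channel's output is zeroed, marking it safe to remove
# from the network without fine-tuning.
```

Because a blocked channel contributes nothing to downstream layers, physically deleting it after training leaves the network's outputs unchanged, which is why no fine-tuning step is needed.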