Abstract

Deep convolutional neural networks have achieved great success in image classification, and feature extractors learned from CNNs have been adopted in many computer vision tasks. The global pooling layer plays an important role in these networks. We find that, with the increasingly common combination of Batch Normalization and ReLU layers, the input feature maps of global pooling become sparse, which makes the original global pooling inefficient. In this paper, we propose AlphaMEX Global Pool, a novel end-to-end trainable global pooling operator for convolutional neural networks. A nonlinear, smooth log-mean-exp function, called AlphaMEX, is designed to extract features more effectively. Compared to the original global pooling layer, the proposed method improves classification accuracy without adding layers or many redundant parameters. Experimental results on CIFAR-10, CIFAR-100, SVHN and ImageNet demonstrate the effectiveness of the proposed method. AlphaMEX-ResNet outperforms the original ResNet-110 by 8.3% on CIFAR-10+, and the top-1 error rate of AlphaMEX-DenseNet (k = 12) reaches 5.03%, outperforming the original DenseNet (k = 12) by 4.0%.
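The abstract does not spell out the exact parameterization of the AlphaMEX operator, but a smooth log-mean-exp pooling with a learnable sharpness parameter can be sketched as below. The module name `AlphaMEXGlobalPool`, the parameter `alpha_init`, and the specific scaling are illustrative assumptions rather than the authors' implementation; the intent is only to show how a trainable log-mean-exp reduction can replace global average pooling at the end of a CNN.

```python
import math
import torch
import torch.nn as nn


class AlphaMEXGlobalPool(nn.Module):
    """Sketch of a smooth log-mean-exp global pooling layer with learnable alpha.

    As alpha -> 0 the operator approaches global average pooling; as alpha
    grows it approaches global max pooling. This is an assumed form, not the
    paper's exact parameterization.
    """

    def __init__(self, alpha_init=1.0):
        super().__init__()
        # alpha is trained end-to-end together with the network weights
        self.alpha = nn.Parameter(torch.tensor(float(alpha_init)))

    def forward(self, x):
        # x: (batch, channels, H, W) feature maps from the last conv block
        b, c, h, w = x.shape
        x = x.reshape(b, c, h * w)
        # (1/alpha) * log( mean_i exp(alpha * x_i) ) over spatial positions
        return (torch.logsumexp(self.alpha * x, dim=2) - math.log(h * w)) / self.alpha
```

In a ResNet- or DenseNet-style classifier, such a module would take the place of the usual `nn.AdaptiveAvgPool2d(1)` head, producing one pooled value per channel before the final fully connected layer.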
