Abstract

The past few years have witnessed the rapid development of regularization methods for deep learning models such as fully-connected deep neural networks (DNNs) and convolutional neural networks (CNNs). Most previous methods drop features from the input data or hidden layers, such as Dropout, Cutout, and DropBlock; DropConnect instead drops connections between fully-connected layers. By randomly discarding features or connections, these methods relieve overfitting and improve the performance of neural networks. In this paper, we propose a novel regularization method, DropFilterR, for training CNNs. The basic idea of DropFilterR is to relax the weight-sharing rule of CNNs by randomly dropping elements of the convolution filters. Specifically, we drop different filter elements as the filter slides over the input feature maps. Moreover, a random drop rate can be applied to further increase the randomness of the method. We also derive, from theoretical analysis, a suitable way to accelerate the computation of DropFilterR. Experimental results on several widely used image databases, including MNIST, CIFAR-10, and Pascal VOC 2012, show that DropFilterR improves performance on image classification tasks.
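No implementation accompanies this abstract, but the described mechanism can be sketched. Below is a minimal, naive PyTorch sketch of a DropFilterR-style convolution based only on the description above: an independent binary mask is sampled over the filter elements at every sliding-window position, so weight sharing across positions is stochastically relaxed. The function name `dropfilter_r_conv2d` and the `drop_rate` parameter are illustrative assumptions, not the authors' code, and the paper's accelerated computation is not reproduced here.

```python
import torch
import torch.nn.functional as F

def dropfilter_r_conv2d(x, weight, drop_rate=0.3, training=True):
    """Naive sketch of a DropFilterR-style convolution (stride 1, no padding).

    For every spatial position of the sliding window, an independent
    Bernoulli mask is applied to the filter elements, so the effective
    filter differs from position to position.
    x:      (N, C_in, H, W) input feature maps
    weight: (C_out, C_in, kH, kW) convolution filters
    """
    if not training or drop_rate == 0.0:
        return F.conv2d(x, weight)

    N, C_in, H, W = x.shape
    C_out, _, kH, kW = weight.shape
    out_h, out_w = H - kH + 1, W - kW + 1

    # Extract all sliding-window patches: (N, C_in*kH*kW, L) with L = out_h*out_w.
    patches = F.unfold(x, kernel_size=(kH, kW))
    L = patches.shape[-1]

    # Independent keep/drop mask per output channel, filter element, and
    # window position; scaled by 1/(1 - drop_rate) so the expected
    # activation matches the undropped convolution.
    keep = 1.0 - drop_rate
    mask = torch.rand(C_out, C_in * kH * kW, L, device=x.device) < keep
    masked_w = weight.view(C_out, -1, 1) * mask / keep  # (C_out, C_in*kH*kW, L)

    # Position-dependent dot product: contract over the patch dimension.
    out = torch.einsum('npl,opl->nol', patches, masked_w.to(x.dtype))
    return out.view(N, C_out, out_h, out_w)
```

This naive form materializes one mask per window position and is therefore memory-hungry; the abstract's theoretically motivated acceleration is not attempted in this sketch. A randomized drop rate, as the abstract mentions, could be modeled by sampling `drop_rate` itself per call.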
