Abstract

Dropout has been introduced as a powerful regularization technique to prevent overfitting in deep neural networks, particularly deep convolutional neural networks (DCNNs). Recently, various extensions of dropout have been designed to avoid overfitting and to enhance the generalization capability of deep neural networks. These include spectral dropout, which improves generalization and avoids overfitting by eliminating noisy and weak Fourier-domain coefficients of the network activations. Although spectral dropout has shown strong results when applied in different convolutional neural network layers, its effect on the pooling layer has not been thoroughly explored. Moreover, the pooling process plays a crucial role in DCNNs: it reduces the dimensionality of the processed data, which decreases computational cost, helps avoid overfitting, and enhances the generalization capability of the network. For this reason, we focus on the pooling layer and propose a novel pooling method that applies the spectral dropout technique within the pooling region, in order to avoid overfitting and to enhance the generalization ability of DCNNs. Experimental results on several image benchmarks show that the proposed technique outperforms existing pooling methods in classification performance and is effective at improving the generalization ability of DCNNs. Moreover, we show that the proposed technique, combined with other regularization methods such as batch normalization, is competitive with existing methods in classification performance.

Keywords: Deep Neural Networks, Convolutional Neural Networks, Pooling methods, Generalization ability, Regularization methods, Spectral dropout
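To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how spectral dropout could be applied inside a pooling region: each window is transformed to the Fourier domain, coefficients with small magnitude are zeroed, and average pooling is then applied to the filtered window. The function name `spectral_dropout_pool` and the `keep_ratio` parameter are illustrative assumptions, not from the paper.

```python
import numpy as np

def spectral_dropout_pool(x, pool=2, keep_ratio=0.7, training=True):
    """Hypothetical sketch: spectral dropout inside each pooling region,
    followed by average pooling.

    x          : 2-D feature map of shape (H, W), H and W divisible by `pool`
    pool       : pooling window size
    keep_ratio : fraction of Fourier coefficients (by magnitude) to keep;
                 the remaining "noisy and weak" coefficients are zeroed
    training   : if False, behaves as plain average pooling (as at test time)
    """
    H, W = x.shape
    out = np.zeros((H // pool, W // pool))
    for i in range(0, H, pool):
        for j in range(0, W, pool):
            region = x[i:i + pool, j:j + pool]
            if training:
                # Transform the pooling region to the frequency domain.
                F = np.fft.fft2(region)
                mags = np.abs(F).ravel()
                # Keep only the k strongest coefficients; zero the rest.
                k = max(1, int(np.ceil(keep_ratio * mags.size)))
                thresh = np.sort(mags)[::-1][k - 1]
                F[np.abs(F) < thresh] = 0.0
                # Back to the spatial domain (imaginary residue is numerical noise).
                region = np.real(np.fft.ifft2(F))
            # Standard average pooling over the (filtered) region.
            out[i // pool, j // pool] = region.mean()
    return out
```

With `training=False` (or `keep_ratio=1.0`) this reduces to ordinary average pooling; smaller `keep_ratio` values discard more frequency content per window, which is the regularizing effect the paper attributes to spectral dropout.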
