Abstract

With the increasing size of convolutional neural networks (CNNs), overfitting has become the main bottleneck to further improving network performance. Weight regularization methods have been proposed to address overfitting, and they perform satisfactorily. However, these methods cannot be applied to all networks and are often not flexible enough across the different phases of training and testing; this article therefore proposes a multiscale conditional (MSC) regularization method. MSC divides intermediate features into different scales and then generates new data for each scale separately. The new data are generated using information from two conditions: 1) each sample's features and 2) each layer's pattern. Finally, a self-identity structure is proposed to supplement the features with the generated data. MSC can therefore adaptively and efficiently generate much finer, individualized data, making the regularization more flexible. Furthermore, through the proposed self-identity structure, MSC is more general and can be applied to all kinds of networks. Experimental results on benchmark datasets show that the proposed MSC regularization method achieves the best performance across all tested networks.
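The abstract describes three ingredients: a per-scale split of intermediate features, new data generated under a per-sample and a per-layer condition, and a self-identity branch that supplements (rather than replaces) the features. The following is only a minimal sketch of that idea in NumPy; the function name, the choice of statistics used for each condition, and the `strength` parameter are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def msc_regularize(features, num_scales=2, strength=0.1, rng=None):
    """Hedged sketch of an MSC-style regularization step.

    features: array of shape (batch, channels). Channels are split into
    `num_scales` groups; for each group, new data are generated from
    per-sample statistics (condition 1) and a layer-level statistic
    (condition 2, here simply the group mean -- an assumption), then
    added back through a self-identity (residual) branch.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    chunks = np.array_split(features, num_scales, axis=1)  # per-scale groups
    out = []
    for chunk in chunks:
        # condition 1: each sample's own feature statistics
        mu = chunk.mean(axis=1, keepdims=True)
        sigma = chunk.std(axis=1, keepdims=True)
        # condition 2: a layer-level pattern (illustrative choice)
        layer_stat = chunk.mean()
        noise = rng.standard_normal(chunk.shape)
        generated = mu + sigma * noise + strength * layer_stat
        # self-identity structure: supplement, do not replace, the features
        out.append(chunk + strength * generated)
    return np.concatenate(out, axis=1)
```

Because the generated data are added through the identity branch, the module degenerates gracefully (output stays close to the input for small `strength`), which is one way such a structure can be dropped into arbitrary networks.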
