Abstract

Batch Normalization (BN) is a commonly employed regularization technique for deep neural networks. It combines normalization with a learnable affine transformation to accelerate training. The normalization step forces the distribution of layer inputs toward the standard normal distribution within each mini-batch, mitigating the internal covariate shift problem, while the affine transformation preserves the network's ability to perform non-linear feature transformations. However, the effectiveness of BN can be limited for small mini-batch sizes, since batch statistics estimated from too few samples are inaccurate and unreliable. To address this issue, we present Noise-Assisted Batch Normalization (NABN), a variant of BN. The proposed method adds random noise to the mean and variance computed from the mini-batch during normalization, enhancing the diversity of these statistics. We evaluate NABN on image classification with CIFAR-10, retinal OCT, and chest X-ray datasets using various convolutional network architectures, including ResNet-20, ResNet-32, ResNet-44, and ResNet-50. Furthermore, experimental results demonstrate the superiority of our proposed approach over traditional BN for medical image segmentation using U-Net, as evaluated on the MSD liver dataset. Code is available at https://github.com/ROSENty/NABN.git.
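The core idea can be sketched in a few lines. The following is a minimal, hypothetical NumPy illustration of a training-mode forward pass: batch statistics are computed per channel, perturbed with random noise, and then used for normalization followed by the affine transformation. The function name, the Gaussian noise, and its additive/multiplicative form are assumptions for illustration; the paper's exact noise scheme may differ.

```python
import numpy as np

def nabn_forward(x, gamma, beta, noise_std=0.1, eps=1e-5, rng=None):
    """Illustrative Noise-Assisted Batch Normalization (training mode).

    x: input of shape (N, C, H, W); gamma, beta: per-channel affine
    parameters of shape (C,). This is a sketch, not the authors' code.
    """
    rng = np.random.default_rng() if rng is None else rng
    mean = x.mean(axis=(0, 2, 3))   # per-channel batch mean
    var = x.var(axis=(0, 2, 3))     # per-channel batch variance
    # Inject random noise into the batch statistics to diversify them
    # (assumed Gaussian; noise_std = 0 recovers standard BN).
    mean = mean + rng.normal(0.0, noise_std, mean.shape) * np.sqrt(var + eps)
    var = var * np.abs(1.0 + rng.normal(0.0, noise_std, var.shape))
    # Normalize with the perturbed statistics.
    x_hat = (x - mean[None, :, None, None]) / np.sqrt(var[None, :, None, None] + eps)
    # Learnable affine transformation preserves representational capacity.
    return x_hat * gamma[None, :, None, None] + beta[None, :, None, None]
```

At inference time one would instead normalize with running averages of the (unperturbed) statistics, exactly as in standard BN.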
