Abstract

Self-normalizing neural networks (SNNs) regulate activation and gradient flow through activation functions with the self-normalization property. Because SNNs do not rely on statistics computed from minibatches, they are better suited to data parallelism, kernel fusion, and emerging architectures such as ReRAM-based accelerators. However, existing SNNs have mainly demonstrated their effectiveness on toy datasets and fall short in accuracy on large-scale tasks such as ImageNet: they lack the normalization strength, regularization, and expressive power required for wider, deeper models and larger-scale tasks. To strengthen normalization, this paper introduces a comprehensive and practical definition of the self-normalization property in terms of the stability and attractiveness of statistical fixed points. The definition is comprehensive because it jointly considers all the fixed points used in prior studies: the first and second moments of the forward activations and the expected Frobenius norm of the backward gradients. It is practical because we derive, from a theoretical analysis of the forward and backward signals, analytical equations for assessing the stability and attractiveness of each fixed point. Applying this definition to a meta activation function inspired by prior research yields a stronger self-normalizing activation function, the "bi-scaled exponential linear unit with backward standardized" (bSELU-BSTD). We provide both theoretical and empirical evidence that it is superior to existing self-normalizing activation functions. To enhance regularization and expressive power, we further propose scaled-Mixup and channel-wise scale & shift. With these three techniques, our approach achieves 75.23% top-1 accuracy on ImageNet with Conv MobileNet V1, surpassing existing self-normalizing activation functions. To the best of our knowledge, this is the first SNN to reach accuracy comparable to batch normalization on ImageNet.
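To make the fixed-point view of self-normalization concrete, the sketch below simulates a deep fully connected network and tracks activation statistics through depth. The abstract does not give the bSELU-BSTD parameters, so this is a minimal illustration using the standard SELU constants from Klambauer et al. (2017) with LeCun-normal weight scaling; it shows the mean and variance of the activations being drawn toward the (0, 1) fixed point without any minibatch normalization.

```python
import numpy as np

# Standard SELU constants (Klambauer et al., 2017). The paper's bSELU-BSTD
# uses different scalings that are not specified in the abstract.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1))

rng = np.random.default_rng(0)
width, depth, batch = 512, 32, 4096

# Inputs drawn from N(0, 1); LeCun-normal weights (std = 1/sqrt(fan_in))
# keep the pre-activation variance near 1 so the fixed point can attract.
a = rng.standard_normal((batch, width))
for layer in range(depth):
    w = rng.standard_normal((width, width)) / np.sqrt(width)
    a = selu(a @ w)
    if layer % 8 == 0 or layer == depth - 1:
        print(f"layer {layer:2d}: mean={a.mean():+.3f}  var={a.var():.3f}")
```

Running this prints per-layer means near 0 and variances near 1 even at depth 32, which is the forward half of the self-normalization property; the paper's definition additionally requires stability and attractiveness of the expected Frobenius norm of the backward gradients.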
