Abstract

The fast execution speed and energy efficiency of analog hardware make it a strong contender for deploying deep learning models at the edge. However, analog noise perturbs the weights of deployed models, degrading their performance despite the inherent noise-resistant characteristics of deep learning models. This work investigates the effect of the popular batch normalization layer (BatchNorm) on the noise-resistant ability of deep learning models. The systematic study was carried out by first training different models, with and without BatchNorm layers, on the CIFAR10 and CIFAR100 datasets. Analog noise was then injected into the weights of the resulting models, and their performance on the test dataset was measured and compared. The results show that the presence of the BatchNorm layer negatively impacts the noise-resistant property of deep learning models: ResNet44 and VGG16 models with BatchNorm layers trained on the CIFAR10 dataset achieve average normalized inference accuracies of 41.32% and 10.75%, respectively, compared with 91.95% and 93.80% for the same ResNet44 and VGG16 models without BatchNorm layers. Furthermore, the impact grows with the number of BatchNorm layers.
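The noise-injection protocol summarized above can be sketched in code. The following is a minimal illustration, not the authors' implementation: PyTorch, the range-scaled additive Gaussian noise model (a common way to model analog device noise), and the names `eta` and `n_runs` are all assumptions introduced here for clarity.

```python
# Minimal sketch of weight-level analog noise injection and normalized
# inference accuracy. Assumptions: PyTorch; Gaussian noise with std scaled
# by each layer's weight range; `eta` and `n_runs` are hypothetical names.
import copy
import torch

def inject_weight_noise(model, eta=0.1):
    """Return a copy of `model` whose weights carry additive Gaussian noise.

    The noise std for each parameter tensor is `eta` times that tensor's
    value range, one common model of analog hardware noise (an assumption,
    not necessarily the paper's exact noise model).
    """
    noisy = copy.deepcopy(model)
    with torch.no_grad():
        for param in noisy.parameters():
            w_range = param.max() - param.min()
            param.add_(torch.randn_like(param) * eta * w_range)
    return noisy

def normalized_accuracy(model, loader, eta, n_runs=10, device="cpu"):
    """Average noisy test accuracy divided by noise-free accuracy, in %."""
    def accuracy(m):
        m.eval().to(device)
        correct, total = 0, 0
        with torch.no_grad():
            for x, y in loader:
                pred = m(x.to(device)).argmax(dim=1)
                correct += (pred == y.to(device)).sum().item()
                total += y.numel()
        return correct / total

    baseline = accuracy(model)  # clean (noise-free) accuracy
    noisy_acc = sum(accuracy(inject_weight_noise(model, eta))
                    for _ in range(n_runs)) / n_runs
    return 100.0 * noisy_acc / baseline
```

Under this reading, a model whose accuracy is unchanged by noise scores 100%, so the reported 41.32% (ResNet44 with BatchNorm) versus 91.95% (without) would reflect how much of the clean accuracy survives weight perturbation.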

