Abstract

Deep learning (DL) has achieved remarkable success across domains such as computer vision, natural language processing, and speech recognition. However, training and inference for DL models typically demand substantial computational resources and storage, which poses a significant challenge in the Internet of Things (IoT) domain. This study makes a theoretical contribution to the field of lightweight DL by proposing L-Net, a lightweight convolutional neural network designed specifically for low-compute devices. L-Net addresses the challenges associated with disparities in channel interaction and with vanishing gradients. To further improve network performance, we introduce the residual enhanced channel attention (R-ECA) module, which combines a bypass mechanism derived from simplified residual learning with the cross-channel interaction of the attention mechanism. In addition, we replace the rectified linear unit (ReLU) activation with the exponential linear unit (ELU) to strengthen the network's nonlinear expressive capability and accelerate training. To assess its efficacy, we conducted object recognition experiments comparing the accuracy and prediction stability of L-Net with well-known models such as AlexNet, VGG11, SqueezeNet, ResNet, and MobileNet. On the CIFAR-10 dataset and our custom dataset of apple tree leaf diseases, the experimental results demonstrate that L-Net, despite its relatively small number of parameters, performs exceptionally well in terms of mean Average Precision (mAP), achieving 0.906. Furthermore, on our custom dataset, L-Net maintains consistent performance across dataset splits with different ratios, outperforming the majority of the compared models.
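The following minimal PyTorch sketch illustrates the two design ideas named above: ECA-style cross-channel attention (global average pooling followed by a 1-D convolution over the channel dimension) wrapped in an identity bypass, and ELU used in place of ReLU. The block layout and hyperparameters (e.g., the 1-D kernel size) are illustrative assumptions, not the L-Net reference implementation.

```python
import torch
import torch.nn as nn


class RECA(nn.Module):
    """Sketch of a residual enhanced channel attention (R-ECA) block.

    Illustrative interpretation only: ECA-style cross-channel attention
    combined with an identity bypass that adds the re-weighted features
    back onto the input.
    """

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # 1-D convolution over the channel axis models local cross-channel interaction.
        self.conv = nn.Conv1d(1, 1, kernel_size=kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) -> channel descriptor (N, 1, C)
        y = self.pool(x).squeeze(-1).transpose(-1, -2)
        y = self.sigmoid(self.conv(y))                 # channel weights (N, 1, C)
        w = y.transpose(-1, -2).unsqueeze(-1)          # reshape to (N, C, 1, 1)
        # Bypass mechanism: attended features are added to the identity path.
        return x + x * w


class ConvELU(nn.Module):
    """Convolution block using ELU instead of ReLU, followed by R-ECA."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ELU(inplace=True),
            RECA(out_ch),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


if __name__ == "__main__":
    # Smoke test with a CIFAR-10 sized input (3 x 32 x 32).
    x = torch.randn(2, 3, 32, 32)
    print(ConvELU(3, 16)(x).shape)  # torch.Size([2, 16, 32, 32])
```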
