Abstract

Insects are the most abundant class of animals on Earth. In agriculture, many insects are regarded as pests because they feed on and damage crops. However, the sheer number of insect species makes it challenging to identify a pest and determine the appropriate control measure. An artificial intelligence system that recognizes pest images with a convolutional neural network (CNN) is expected to overcome this challenge, so that control can be targeted precisely to each insect type, saving cost and time. Such a system requires an efficient model with few parameters yet robust detection, so that it can run on mobile devices such as smartphones or drones. It also requires a training procedure that maximizes optimization quality, because insects appear at arbitrary positions, scales, and intensities, with or without occlusion. The IP102 dataset is used as the training benchmark. In this study, we propose an efficient training framework that optimizes the small MobileNetV2 model using a dynamic learning rate, CutMix augmentation, layer freezing, and sparse regularization. Combining these methods during training achieved the highest accuracy of 0.7132. The best hyperparameters are the AdamW optimizer with an initial learning rate of 0.0001 and a dropout of 0.2 on the linear layer. The resulting model outperforms several baseline models that use far more parameters.
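To make the reported configuration concrete, the following is a minimal PyTorch sketch of the described setup. The abstract specifies MobileNetV2, AdamW with an initial learning rate of 0.0001, and dropout of 0.2 on the linear layer; the pretrained-weights variant, the number of frozen layers, and the cosine schedule used as the "dynamic learning rate" are assumptions not stated in the abstract. IP102 contains 102 insect pest classes.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 102  # the IP102 benchmark has 102 insect pest classes

# ImageNet-pretrained backbone (weights variant is an assumption).
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)

# Layer freezing: keep early feature-extractor blocks fixed
# (which blocks are frozen is an assumption).
for param in model.features[:10].parameters():
    param.requires_grad = False

# Classifier head with dropout of 0.2 before the linear layer, as reported.
model.classifier = nn.Sequential(
    nn.Dropout(p=0.2),
    nn.Linear(model.last_channel, NUM_CLASSES),
)

# AdamW with the reported initial learning rate of 0.0001.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# Dynamic learning rate: a cosine annealing schedule is one common choice;
# the paper's exact schedule is not specified in the abstract.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)
```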

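The abstract also names CutMix augmentation and sparse regularization. The sketch below shows one common way to combine them in a training step: CutMix pastes a random patch from a shuffled batch and mixes the two labels proportionally, while the sparse term is taken here to be an L1 penalty on trainable weights. The Beta(1, 1) mixing distribution and the penalty weight are assumptions; the paper's exact formulation may differ.

```python
import torch
import torch.nn.functional as F

L1_WEIGHT = 1e-5  # sparse-penalty strength: an assumed value

def cutmix(images, labels, alpha=1.0):
    """CutMix: paste a random patch from a shuffled batch, mix the labels."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    idx = torch.randperm(images.size(0))
    H, W = images.shape[-2:]
    cut_h, cut_w = int(H * (1 - lam) ** 0.5), int(W * (1 - lam) ** 0.5)
    cy, cx = torch.randint(H, (1,)).item(), torch.randint(W, (1,)).item()
    y1, y2 = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, H)
    x1, x2 = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, W)
    images[:, :, y1:y2, x1:x2] = images[idx, :, y1:y2, x1:x2]
    lam = 1 - (y2 - y1) * (x2 - x1) / (H * W)  # adjust for the clipped patch
    return images, labels, labels[idx], lam

def train_step(model, optimizer, images, labels):
    images, y_a, y_b, lam = cutmix(images.clone(), labels)
    logits = model(images)
    # Mixed cross-entropy over the two labels sharing the composited image.
    loss = (lam * F.cross_entropy(logits, y_a)
            + (1 - lam) * F.cross_entropy(logits, y_b))
    # Sparse regularization: L1 penalty on trainable weights (one common form).
    loss = loss + L1_WEIGHT * sum(
        p.abs().sum() for p in model.parameters() if p.requires_grad
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```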