Abstract

This paper proposes a method for recognizing the origin of Saposhnikovia divaricata using an improved IResNet model for computer vision-based classification. First, we created a small-sample dataset and applied data augmentation techniques to enhance its diversity. We then introduced a hierarchical residual connection block in the early stages of the original model to expand the receptive field of the neural network and strengthen multi-scale feature extraction. In the later stages of the model, depthwise separable convolutions replaced conventional convolutions to further reduce computational cost. The experimental results demonstrate that the improved network achieved a 5.03% accuracy improvement over the original model while also significantly reducing the number of parameters. We compared the accuracy of the proposed model with several classical convolutional neural network models, including ResNet50, ResNeSt50, Res2Net50, RepVGG_B0, and ConvNeXt_T. The proposed model achieved an accuracy of 93.72%, outperforming ResNet50 (86.68%), ResNeSt50 (89.38%), Res2Net50 (91.83%), RepVGG_B0 (88.68%), and ConvNeXt_T (92.18%). In addition to achieving the highest accuracy among the compared models, the proposed model ran at a frame rate of 158.9 fps with an inference time of only 6.29 ms. The proposed methodology reduces potential errors caused by manual observation, effectively improving the recognition of Saposhnikovia divaricata based on existing data. Furthermore, the findings of this study provide a valuable reference for future efforts to develop lightweight models in this area.
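The parameter savings claimed for the depthwise separable convolutions can be illustrated with a quick parameter count: a standard k×k convolution is factored into a per-channel depthwise k×k convolution followed by a 1×1 pointwise convolution. This is a minimal sketch; the channel and kernel sizes below are hypothetical, not taken from the paper's architecture:

```python
def standard_conv_params(c_in, c_out, k):
    # Standard convolution: one k x k filter per (input, output) channel pair.
    return k * k * c_in * c_out

def depthwise_separable_params(c_in, c_out, k):
    # Depthwise stage: one k x k filter per input channel,
    # followed by a 1 x 1 pointwise convolution that mixes channels.
    return k * k * c_in + c_in * c_out

# Hypothetical layer: 256 -> 256 channels, 3x3 kernel (biases ignored).
std = standard_conv_params(256, 256, 3)        # 589,824 weights
sep = depthwise_separable_params(256, 256, 3)  # 67,840 weights
print(f"standard: {std}, separable: {sep}, ratio: {std / sep:.1f}x")
```

For this hypothetical 3×3 layer the factorization cuts the weight count by roughly 8.7×, which is consistent with the paper's stated goal of reducing model size and inference time in the later stages of the network.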
