Abstract

Microscopic examination is commonly employed to assess the vitality of edible fungal mycelium. However, this method becomes time-intensive when evaluating large volumes of hyphae samples, highlighting the need for an accurate, automatic determination method. The main challenge of mycelium detection lies in multi-scale target detection under varying magnifications. In this study, microscopic images of 10 edible fungus strains, captured at different magnifications and with different stain colors, were collected to create a dataset. An improved multi-scale object detection model for mycelium vitality detection, CCHA YOLO, was proposed by enhancing the Backbone through the combination of YOLOv8m and the Swin Transformer (SwinT). The Convolutional Block Attention Module (CBAM) was introduced into the Head, and the post-processing was optimized to further improve model performance. The results indicated that CCHA YOLO achieved a mAP50:95 (mean average precision) of 89.02% with a computational load of 98.61 GFLOPs, a 16.67% accuracy improvement over the baseline YOLOv8m at a cost of only 11.3 additional GFLOPs. CCHA YOLO was also deployed on a web-based edge platform to facilitate the detection of microscopic images, highlighting its practical applicability in determining mycelium vitality.
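The Head modification centers on CBAM, which gates detection-head feature maps with channel attention followed by spatial attention. The abstract does not give implementation details; the following is a minimal PyTorch sketch of a generic CBAM block under assumed defaults (reduction ratio 16, 7×7 spatial kernel, illustrative channel count), not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Channel attention: squeeze spatial dims with avg- and max-pooling,
    pass both through a shared MLP, and gate channels with a sigmoid."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return x * torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    """Spatial attention: pool across channels, then a 7x7 conv produces
    a per-pixel gate."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """CBAM: channel attention followed by spatial attention, applied to a
    detection-head feature map before the prediction layers."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))


# Example: refine a hypothetical 512-channel, 40x40 head feature map.
feat = torch.randn(1, 512, 40, 40)
refined = CBAM(512)(feat)
print(refined.shape)  # torch.Size([1, 512, 40, 40])
```

Because CBAM preserves the feature-map shape, a block like this can be dropped in front of each detection-head branch without changing the surrounding YOLO architecture.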
