Abstract

Fungal rot is the most serious defect in post-harvest citrus, and the timely detection and removal of early rotten citrus is particularly crucial for reducing economic losses. Imaging under structured illumination is a promising method for detecting fruit surface defects. This study assesses the feasibility of combining a structured-illumination reflectance imaging (SIRI) system with deep learning for the identification and segmentation of early rot in citrus. Phase-shifted images of oranges were acquired at four spatial frequencies and demodulated to obtain direct component (DC) and amplitude component (AC) images. The optimal spatial frequency for detecting rot was determined to be 0.20 cycles mm⁻¹ based on the contrast index between decayed and sound areas in the images. The AC images were then subjected to brightness correction and augmentation. Three segmentation methods, global thresholding, watershed segmentation and Unet, were used to segment the rotten areas in the images. Unet achieved the best results on AC images, with an overall accuracy of 99.4% and an IoU of 0.903. Gradient-weighted class activation mapping (Grad-CAM) was used to visualize the regions on which Unet based its recognition of orange rot, yielding satisfactory results. This study demonstrates the defect recognition capability of SIRI combined with deep learning and provides a reliable solution for early decay detection in oranges.
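
The abstract refers to demodulating phase-shifted images into DC and AC components and to ranking spatial frequencies by a contrast index between decayed and sound regions. The sketch below is a minimal illustration of these two steps, assuming the conventional three-phase SIRI demodulation formulas and a Michelson-style contrast index; the function names, phase offsets and the exact contrast definition are assumptions for illustration, not the authors' implementation.

import numpy as np

def demodulate_siri(i1, i2, i3):
    # Conventional three-phase demodulation: i1, i2, i3 are images captured
    # under sinusoidal illumination with assumed phase offsets of 0, 2*pi/3
    # and 4*pi/3. Returns the direct component (DC) and amplitude component
    # (AC) images.
    dc = (i1 + i2 + i3) / 3.0
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i1 - i3) ** 2
    )
    return dc, ac

def contrast_index(image, decay_mask, sound_mask):
    # Hypothetical Michelson-style contrast between the mean intensity of the
    # decayed region and the sound region; a higher value at a given spatial
    # frequency (e.g. 0.20 cycles/mm) indicates better rot visibility.
    mu_decay = image[decay_mask].mean()
    mu_sound = image[sound_mask].mean()
    return abs(mu_sound - mu_decay) / (mu_sound + mu_decay)

In this scheme the AC image suppresses diffuse subsurface reflectance and emphasizes shallow structural differences, which is why the contrast comparison and subsequent segmentation are applied to the AC images.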
