Abstract

Image-based automatic detection systems have demonstrated efficient identification of pests such as sugarcane aphids. However, most existing algorithms were developed on images captured under stable lighting, and natural illumination has rarely been considered. Here, we developed a fusion superpixel method and compared it with eight state-of-the-art superpixel algorithms on a purpose-built dataset of images acquired under four natural light conditions: strong light, diffuse light, weak light, and direct sunlight. We evaluated algorithm performance according to visual quality, deviation from ground truth (target compactness, regularity index, boundary recall, and under-segmentation error), and segmentation quality and runtime. We found that 400 is the best number of superpixels for segmenting pest images with a resolution of 408 × 306 pixels captured at a distance of ∼0.2 m between the target and the camera lens. Compared with the other methods, the proposed fusion algorithm achieved relatively better target compactness (0.61), boundary recall (0.76), regularity index (16.4), and runtime (0.13 s). Furthermore, diffuse light was the ideal light condition for the sugarcane aphid identification task. Our study suggests the developed algorithm as an appropriate method for handling varying natural light conditions and pest distribution in the field.

• Holistically-Nested Edge detection was introduced into the superpixel framework for image segmentation.
• The proposed superpixel algorithm performed better than previous methods.
• Diffuse light was the ideal light condition for the pest identification task.
• K = 400 is the best number of superpixels for segmenting our sugarcane aphid images.
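
As a rough illustration of the evaluation pipeline summarized above, the sketch below generates K = 400 superpixels with the generic SLIC algorithm from scikit-image (not the fusion algorithm proposed in the paper) and computes boundary recall and under-segmentation error against a binary ground-truth mask. The file names, boundary tolerance, and compactness parameter are placeholder assumptions for this example.

```python
# Minimal sketch, assuming SLIC as a stand-in superpixel generator and
# placeholder image/mask paths; not the paper's HED-based fusion method.
import numpy as np
from skimage import io, segmentation
from skimage.morphology import binary_dilation, disk

def boundary_recall(gt_mask, sp_labels, tol=2):
    """Fraction of ground-truth boundary pixels within `tol` pixels
    of a superpixel boundary."""
    gt_edges = segmentation.find_boundaries(gt_mask, mode="thick")
    sp_edges = segmentation.find_boundaries(sp_labels, mode="thick")
    sp_edges_dilated = binary_dilation(sp_edges, disk(tol))
    return (gt_edges & sp_edges_dilated).sum() / max(gt_edges.sum(), 1)

def under_segmentation_error(gt_mask, sp_labels):
    """Corrected under-segmentation error: leakage of superpixels that
    straddle the ground-truth object boundary, normalized by image size."""
    leak = 0
    for sp in np.unique(sp_labels):
        inside = np.logical_and(sp_labels == sp, gt_mask).sum()
        outside = np.logical_and(sp_labels == sp, ~gt_mask).sum()
        if inside and outside:  # superpixel crosses the object boundary
            leak += min(inside, outside)
    return leak / gt_mask.size

image = io.imread("aphid_sample.png")        # 408 x 306 RGB field image (placeholder path)
gt = io.imread("aphid_sample_gt.png") > 0    # binary aphid mask (placeholder path)
labels = segmentation.slic(image, n_segments=400, compactness=10, start_label=1)
print("boundary recall:", boundary_recall(gt, labels))
print("under-segmentation error:", under_segmentation_error(gt, labels))
```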
