Abstract

Multiple source images acquired from diverse sensors mounted on unmanned aerial vehicles (UAVs) offer valuable complementary information for ground vegetation analysis. However, accurately aligning heterogeneous UAV images is challenging because differing imaging principles produce differences in geometry, intensity, and noise. This paper presents a two-stage registration method for fusing visible RGB and multispectral images for cotton leaf lesion grading. The coarse alignment stage uses the Scale Invariant Feature Transform (SIFT), while the refined alignment stage employs template matching with a novel correlation coefficient. The proposed method first applies the EfficientDet network to detect infected cotton leaves with lesions in RGB images. Lesion leaves in the multispectral imagery (red, green, red-edge, and near-infrared bands) are then located using the perspective transformation matrix derived from SIFT together with the lesion-leaf coordinates in the RGB images. Refined registration between the RGB and multispectral imagery is achieved through template matching with the new correlation coefficient. The registered reflectance data from the spectral bands and the RGB components are used to classify the pixels of each infected leaf into lesion, healthy, and soil classes, and the lesion grade is determined from the ratio of lesion pixels to the total leaf area. Compared with manual assessment, experiments show a lesion-leaf detection mAP@0.5 of 91.01% and a lesion grading accuracy of 92.01%. These results validate the proposed method for UAV RGB and multispectral image registration, enabling automated cotton leaf lesion grading.
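The refined-alignment step can be illustrated with correlation coefficient-based template matching. The sketch below uses the standard normalized correlation coefficient as a stand-in for the paper's novel coefficient (which the abstract does not define), and the toy image data is hypothetical; it only shows the sliding-window search that finds where an RGB-derived template best matches a multispectral band.

```python
import numpy as np

def ncc_match(image, template):
    """Slide `template` over `image`, scoring each offset with the
    normalized correlation coefficient; return the best (x, y) offset
    and its score. A stand-in for the paper's novel coefficient."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()               # zero-mean template
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_xy = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()                    # zero-mean window
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score

# Toy example: cut a template out of a synthetic "band" image at a
# known offset and verify the search recovers that offset.
rng = np.random.default_rng(0)
img = rng.random((40, 40))
tmpl = img[10:20, 15:25].copy()                  # planted at x=15, y=10
(x, y), score = ncc_match(img, tmpl)
print(x, y, round(score, 3))                     # → 15 10 1.0
```

In practice this exhaustive search would be restricted to a small neighborhood around the SIFT-predicted lesion-leaf location, since the coarse stage already brings the two images into approximate alignment.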
