Abstract

Currently, unmanned rice harvesters based on satellite navigation are affected by steering errors of the tracked chassis and by limited navigation accuracy, which can result in harvesting omissions and undercutting. We therefore propose a cutting width measurement method for unmanned rice harvesters based on RGB-D images. The proposed method can provide navigational aids for driving and may also provide a basis for feed quantity detection and adjustment. We designed a lightweight unharvested area segmentation model (UANet) with a mean pixel accuracy (mPA) of 98.28% and a mean intersection over union (mIoU) of 97.15%, both higher than those of lightweight semantic segmentation models such as DABNet, CFPNet-V2, and ENet. We performed unharvested area segmentation and harvesting boundary point extraction on RGB images, then combined the results with depth maps to convert the pixel coordinates of the harvesting boundary points into 3D world coordinates for cutting width measurement. Experiments on rice harvest field images under different working conditions showed that the success rate of harvesting boundary point detection was 98.33%, the cutting width measurement error was within 2.34%, and the algorithm ran at approximately 9.7 fps. The proposed method can be adapted to complex harvesting scenarios, offers low hardware cost, high measurement accuracy, and good robustness, and can be extended to wheat harvesting.
