Abstract

To address the low accuracy of image feature matching in horticultural robot visual navigation, an effective image feature matching algorithm was proposed that combines an improved Oriented FAST and Rotated BRIEF (ORB) with the Lucas–Kanade (LK) optical flow algorithm. First, image feature points were extracted according to an adaptive threshold calculated from the Michelson contrast. Then, the extracted feature points were distributed uniformly using a quadtree structure, which reduces the computational cost of feature matching, and the uniformly distributed ORB feature points were roughly matched by estimating their positions in the target image with the improved LK optical flow. Finally, the Hamming distance between the roughly matched points was calculated for precise matching. Feature extraction and matching experiments were performed in four typical scenes: normal light, low light, high texture, and low texture. Compared with the traditional algorithm, the uniformity and accuracy of the feature points extracted by the proposed algorithm were enhanced by 0.22 and 50.47%, respectively. Meanwhile, the results revealed that the matching accuracy of the proposed algorithm increased by 14.59%, whereas the matching time and total time decreased by 39.18% and 44.79%, respectively. The proposed algorithm shows great potential for application in the visual simultaneous localization and mapping (V-SLAM) of horticultural robots to achieve higher accuracy of real-time positioning and map construction.
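
As a rough illustration of the rough-to-precise matching pipeline described above, the Python/OpenCV sketch below combines an adaptive FAST threshold, LK optical flow prediction, and Hamming-distance refinement. The Michelson-contrast scaling factor, search radius, and Hamming threshold are illustrative assumptions rather than the paper's exact parameters, and the quadtree uniformization step is omitted for brevity.

```python
import cv2
import numpy as np

def match_features(img1, img2, search_radius=10.0, max_hamming=40):
    """Sketch of ORB + LK rough matching followed by Hamming-distance refinement."""
    gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

    # Adaptive FAST threshold derived from the Michelson contrast
    # (I_max - I_min) / (I_max + I_min); the scaling factor 40 is an assumed
    # stand-in for the paper's formula.
    i_max, i_min = float(gray1.max()), float(gray1.min())
    contrast = (i_max - i_min) / (i_max + i_min + 1e-6)
    fast_thresh = max(5, int(40 * contrast))

    orb = cv2.ORB_create(nfeatures=1000, fastThreshold=fast_thresh)
    kps1, des1 = orb.detectAndCompute(gray1, None)
    kps2, des2 = orb.detectAndCompute(gray2, None)
    if des1 is None or des2 is None:
        return []

    # Rough matching: predict where each keypoint of img1 lands in img2
    # using pyramidal LK optical flow.
    pts1 = np.float32([kp.pt for kp in kps1]).reshape(-1, 1, 2)
    pred, status, _ = cv2.calcOpticalFlowPyrLK(gray1, gray2, pts1, None,
                                               winSize=(21, 21), maxLevel=3)

    pts2 = np.float32([kp.pt for kp in kps2])
    matches = []
    for i, (p, ok) in enumerate(zip(pred.reshape(-1, 2), status.reshape(-1))):
        if not ok:
            continue
        # Precise matching: among detected img2 keypoints near the predicted
        # position, keep the one with the smallest Hamming distance, provided
        # it falls below the threshold.
        dist_to_pred = np.linalg.norm(pts2 - p, axis=1)
        candidates = np.where(dist_to_pred < search_radius)[0]
        if candidates.size == 0:
            continue
        hamming = [cv2.norm(des1[i], des2[j], cv2.NORM_HAMMING) for j in candidates]
        best = int(np.argmin(hamming))
        if hamming[best] < max_hamming:
            matches.append(cv2.DMatch(i, int(candidates[best]), float(hamming[best])))
    return matches
```

Restricting the Hamming check to keypoints near the flow-predicted position is what keeps the precise-matching stage cheap relative to brute-force descriptor matching.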
