Abstract
This article addresses the arbitrarily oriented object detection problem with application to robotic grasping. A novel Jensen–Shannon divergence (JSD)–You Only Look Once (YOLO) model is proposed that enables real-time grasp detection with high performance. The one-stage object detection network YOLOv5 is modified with a decoupled head that solves the angle classification problem and the rectangle parameter regression problem separately, making YOLOv5 applicable to robotic grasping and significantly improving detection accuracy. A circular smooth label angle classification method is proposed to tackle the boundary discontinuity problem in angle regression while guaranteeing the periodicity of the angle prediction. A novel Jensen–Shannon intersection over union is designed to compute the intersection over union of oriented rectangles; it better measures the discrepancy between the prediction and the ground truth and avoids the singularity problem that arises when two rectangles do not overlap. Extensive evaluation on the Cornell dataset and the Visual Manipulation Relationship Dataset (VMRD) demonstrates the effectiveness of the JSD–YOLO model in general robotic grasp operations, with image-wise split accuracies of 99.7% and 95.7%, respectively.
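The circular smooth label idea mentioned above turns angle regression into classification over discretized angle bins, with a window function that wraps around the angle boundary so that, e.g., 179° and 0° are treated as neighbors. The sketch below is a minimal illustration of that general technique, not the paper's exact implementation; the bin count, window radius, and Gaussian window shape are assumptions.

```python
import numpy as np

def circular_smooth_label(angle_deg, num_classes=180, radius=6):
    """Build a Gaussian-windowed circular smooth label for one angle.

    Illustrative sketch (not the paper's code): one class per degree in
    [0, 180), with a window that wraps circularly so the boundary
    discontinuity between 179 deg and 0 deg disappears.
    """
    classes = np.arange(num_classes)
    center = int(round(angle_deg)) % num_classes
    # circular distance from every class bin to the target bin
    diff = np.abs(classes - center)
    circ_dist = np.minimum(diff, num_classes - diff)
    # Gaussian window, truncated at the chosen radius
    label = np.exp(-(circ_dist ** 2) / (2.0 * radius ** 2))
    label[circ_dist > radius] = 0.0
    return label
```

Because the distance is circular, a prediction of 179° for a ground truth of 1° receives nearly full credit instead of the maximal penalty a plain one-hot angle classifier would assign.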
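A common way to realize a divergence-based IoU surrogate like the one described above is to model each oriented rectangle as a 2-D Gaussian (mean at the box center, covariance from its width, height, and angle) and map a divergence between the two Gaussians into a bounded similarity score. The sketch below follows that general recipe under stated assumptions: the moment-matched Gaussian approximation of the mixture in the Jensen–Shannon divergence and the `1/(1+JS)` score mapping are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def box_to_gaussian(cx, cy, w, h, theta):
    """Model an oriented box (center, size, angle in radians) as a 2-D Gaussian."""
    mean = np.array([cx, cy], dtype=float)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    cov = R @ np.diag([(w / 2.0) ** 2, (h / 2.0) ** 2]) @ R.T
    return mean, cov

def kl_gauss(m0, S0, m1, S1):
    """Closed-form KL divergence between two 2-D Gaussians."""
    S1i = np.linalg.inv(S1)
    d = m1 - m0
    return 0.5 * (np.trace(S1i @ S0) + d @ S1i @ d - 2.0
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def js_iou(box_a, box_b):
    """Illustrative JS-based IoU surrogate: finite even for disjoint boxes."""
    ma, Sa = box_to_gaussian(*box_a)
    mb, Sb = box_to_gaussian(*box_b)
    # Moment-matched Gaussian approximation of the mixture 0.5*P + 0.5*Q
    mm = 0.5 * (ma + mb)
    Sm = (0.5 * (Sa + np.outer(ma, ma)) + 0.5 * (Sb + np.outer(mb, mb))
          - np.outer(mm, mm))
    js = 0.5 * kl_gauss(ma, Sa, mm, Sm) + 0.5 * kl_gauss(mb, Sb, mm, Sm)
    return 1.0 / (1.0 + js)
```

Unlike the geometric IoU of two rotated rectangles, which is exactly zero (and gradient-free) once the boxes separate, this divergence-based score degrades smoothly with distance, which is the singularity-avoidance property the abstract refers to.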