Abstract
An object’s six-degree-of-freedom (6DoF) pose is of great importance in many fields. Existing pose estimation methods usually detect two-dimensional (2D)–three-dimensional (3D) feature point pairs and directly estimate the pose through Perspective-n-Point (PnP) algorithms. However, this approach ignores the spatial association between pixels, making it difficult to obtain high-precision results. To bring deep-learning-based pose estimation to real-world scenarios, we aim to design a method that remains robust in more complex scenes. We therefore introduce a method for 3D object pose estimation from color images based on farthest point sampling (FPS) and the object’s 3D bounding box. The method detects the 2D projections of 3D feature points with a convolutional neural network, matches them to the object’s 3D model, and then uses the PnP algorithm to recover the object pose from the resulting feature point pairs. Owing to the global nature of the bounding box, the approach remains effective even under partial occlusion or in complex environments. In addition, we propose a heatmap suppression method based on weighted coordinates to further improve the accuracy of feature point prediction and, in turn, the accuracy of the pose solved by the PnP algorithm. Compared with other algorithms, our method achieves higher accuracy and better robustness, yielding 93.8% on the ADD(-S) metric on the LINEMOD dataset and 47.7% on the Occlusion LINEMOD dataset. These results show that our method is more effective than existing methods for pose estimation of large objects.
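The abstract describes a two-stage pipeline: predicting 2D keypoint locations from network heatmaps via weighted coordinates, then recovering the 6DoF pose with a PnP solver. The sketch below illustrates those two steps in a minimal form; it is not the paper's exact implementation, and the array shapes, function names, and the choice of OpenCV's `cv2.solvePnP` with the EPnP flag are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): weighted-coordinate
# keypoint extraction from heatmaps, followed by PnP pose recovery.
import numpy as np
import cv2


def weighted_keypoints(heatmaps):
    """heatmaps: (K, H, W) array of per-keypoint confidence maps.
    Returns a (K, 2) array of (x, y) pixel coordinates, each computed as the
    confidence-weighted average of pixel positions (a soft-argmax-style readout)."""
    K, H, W = heatmaps.shape
    ys, xs = np.mgrid[0:H, 0:W]
    pts = np.zeros((K, 2), dtype=np.float64)
    for k in range(K):
        w = np.clip(heatmaps[k], 0.0, None)  # suppress negative responses
        s = w.sum()
        if s > 0:
            pts[k, 0] = (w * xs).sum() / s
            pts[k, 1] = (w * ys).sum() / s
    return pts


def pose_from_keypoints(pts_2d, pts_3d, camera_matrix):
    """pts_3d: (K, 3) model keypoints (e.g., FPS-sampled points or 3D bounding-box
    corners); pts_2d: their predicted image projections.
    Returns a rotation matrix and translation vector from a PnP solve."""
    ok, rvec, tvec = cv2.solvePnP(
        pts_3d.astype(np.float64),
        pts_2d.astype(np.float64),
        camera_matrix,
        distCoeffs=None,
        flags=cv2.SOLVEPNP_EPNP,
    )
    R, _ = cv2.Rodrigues(rvec)  # convert rotation vector to 3x3 matrix
    return R, tvec
```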