Abstract
The demand for object pose estimation is steadily increasing, and deep learning has driven rapid progress in the field. However, most existing research is difficult to apply in industrial production. This is primarily because annotating 3D data is expensive, which places higher demands on the generalization capability of neural network models. In addition, existing methods struggle with the textureless objects that are common in industrial settings, and industrial production processes impose strict real-time requirements. In this study, we therefore introduce a dual-channel Siamese framework to address these challenges in industrial applications. The architecture employs a Siamese structure for template matching, learning to match templates built from high-fidelity simulated data against real-world scenes, which provides the generalization to unseen objects. Building on this, two feature-extraction channels process RGB and depth information separately, compensating for the scarce appearance features of textureless objects. Our experiments show that this architecture effectively estimates the three-dimensional pose of objects, achieving a 6.0% to 10.9% improvement over state-of-the-art methods while exhibiting robust generalization and real-time processing capability.
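To illustrate the core idea described above, the following is a minimal, hypothetical sketch of dual-channel Siamese template matching: two separate branches embed the RGB and depth inputs, the embeddings are fused by concatenation, and the scene embedding is compared against a bank of template embeddings by cosine similarity. The linear-plus-ReLU "branches", the function names, and all shapes are illustrative stand-ins, not the paper's actual network.

```python
import numpy as np

def extract_features(image, weights):
    # Toy stand-in for one CNN branch: linear projection + ReLU, L2-normalised.
    feat = np.maximum(image.flatten() @ weights, 0.0)
    return feat / (np.linalg.norm(feat) + 1e-8)

def embed(rgb, depth, w_rgb, w_depth):
    # Dual-channel design: RGB and depth are processed by separate branches,
    # then fused into a single embedding by concatenation.
    return np.concatenate([extract_features(rgb, w_rgb),
                           extract_features(depth, w_depth)])

def match_template(scene_emb, template_embs):
    # Siamese matching: cosine similarity between the scene embedding and each
    # template embedding (templates would be rendered from simulated data);
    # the best-matching template supplies the pose hypothesis.
    sims = template_embs @ scene_emb / (
        np.linalg.norm(template_embs, axis=1) * np.linalg.norm(scene_emb) + 1e-8)
    return int(np.argmax(sims)), float(np.max(sims))
```

In a real system the two branches share no weights with each other but each branch is shared (Siamese) between the template and scene inputs, so templates from unseen objects can be matched without retraining.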