Abstract
3D object tracking is the task of capturing the 3D position and pose of an object from each frame of a time-series of images. 3D sensing can be realized by stereo vision, structured light, or ToF (time-of-flight) cameras, all of which capture point cloud data describing the depth information in a workspace. In past research, reliability has been a major obstacle to applying 3D object tracking in real industrial settings; we therefore take a different approach that enhances tracking accuracy and stabilizes the tracking path in order to raise reliability. To build the 3D tracking model and the workspace environment, we adopted an RGB-D camera, an Intel® RealSense™ D400 Series depth module, to collect point cloud data and RGB values. The resulting 3D tracking model contains points, normals, and texture, from which many 2D object images with different perspectives are rendered. These images are fed to an SSD (single-shot detector) neural network to learn the object's features for 2D tracking. During dynamic tracking, each image frame passes through semantic image segmentation with DeepLabV3+ so that only the object information is extracted, excluding hands and background; the retained data thus consist solely of the object's point cloud and texture information in the workspace. We then apply the iterative closest point (ICP) algorithm together with an RGB intensity correlation method to determine the object's position and posture in the workspace. The results show that our method outperforms the SSD-only method for tracking a predefined object.
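To illustrate the pose-estimation step described above, the following is a minimal sketch of ICP-based alignment between the pre-built object model and a segmented live point cloud. It assumes the Open3D library and hypothetical file names; the authors' actual implementation, parameters, and RGB intensity correlation step are not specified in the abstract.

```python
# Minimal ICP pose-refinement sketch (assumption: Open3D; not the authors' exact code).
# The segmented object point cloud from the live frame is aligned against the
# pre-built 3D tracking model to recover the object's position and orientation.
import numpy as np
import open3d as o3d

def estimate_pose(model_pcd: o3d.geometry.PointCloud,
                  scene_pcd: o3d.geometry.PointCloud,
                  init_T: np.ndarray = np.eye(4),
                  max_corr_dist: float = 0.01) -> np.ndarray:
    """Return the 4x4 transform that maps the model onto the observed object."""
    result = o3d.pipelines.registration.registration_icp(
        model_pcd, scene_pcd, max_corr_dist, init_T,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation

# Example usage (hypothetical file names):
# model = o3d.io.read_point_cloud("object_model.ply")        # built 3D tracking model
# scene = o3d.io.read_point_cloud("segmented_frame.ply")     # DeepLabV3+-segmented frame
# T = estimate_pose(model, scene)                            # object pose in the workspace
```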