Abstract
In many industrial applications, the shape of mechanical parts can be approximated with geometric primitives such as spheres, boxes, and cylinders. This information can be used to plan robotic grasping and manipulation procedures. The work presented in this paper investigated the use of the state-of-the-art PointNet deep neural network for primitive shape recognition in 3D scans of real-life objects. To obviate the need to collect a large set of training models, it was decided to train PointNet using examples generated from artificial geometric models. The motivation of the study was the achievement of fully automated disassembly operations in remanufacturing applications. PointNet was chosen for its suitability for processing 3D models and its ability to recognise objects irrespective of their poses. The use of simpler shallow neural network procedures was also evaluated. Twenty-eight point cloud scenes of everyday objects selected from the popular Yale-CMU-Berkeley benchmark model set were used in the experiments. Experimental evidence showed that PointNet is able to generalise the knowledge gained on artificial shapes to recognise shapes in ordinary objects with reasonable accuracy. However, the experiments showed some limitations in this ability to generalise, in terms of average accuracy (circa 78%) and consistency of the learning procedure. Using a feature extraction procedure, a multi-layer-perceptron architecture was able to achieve nearly 83% classification accuracy. A practical solution was proposed to improve PointNet's generalisation capabilities: by training the neural network using an error-corrupted scene, its accuracy could be raised to nearly 86%, and the consistency of the learning results was visibly improved.
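The training strategy described above, generating labelled point clouds from artificial geometric primitives and optionally corrupting them with noise, can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, sampling choices, and noise model (isotropic Gaussian) are illustrative assumptions only.

```python
import numpy as np

def sample_sphere(n, radius=1.0):
    # Sample points uniformly on a sphere surface by normalising Gaussian vectors.
    v = np.random.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return radius * v

def sample_box(n, half_extent=1.0):
    # Sample points on the surface of an axis-aligned cube:
    # pick a random face, then a random point on that face.
    pts = np.random.uniform(-half_extent, half_extent, size=(n, 3))
    face_axis = np.random.randint(0, 3, size=n)
    face_sign = np.random.choice([-1.0, 1.0], size=n)
    pts[np.arange(n), face_axis] = face_sign * half_extent
    return pts

def sample_cylinder(n, radius=1.0, height=2.0):
    # Sample points on the lateral surface of a cylinder (caps omitted for brevity).
    theta = np.random.uniform(0, 2 * np.pi, size=n)
    z = np.random.uniform(-height / 2, height / 2, size=n)
    return np.stack([radius * np.cos(theta), radius * np.sin(theta), z], axis=1)

def make_training_example(shape, n=1024, noise_std=0.0):
    # Generate one labelled point cloud; noise_std > 0 mimics the
    # error-corrupted training data the abstract reports as improving accuracy.
    samplers = {"sphere": sample_sphere, "box": sample_box, "cylinder": sample_cylinder}
    cloud = samplers[shape](n)
    if noise_std > 0:
        cloud += np.random.normal(scale=noise_std, size=cloud.shape)
    return cloud, shape

cloud, label = make_training_example("cylinder", noise_std=0.02)
print(cloud.shape, label)
```

Each example is a fixed-size (n, 3) array with a class label, which is the input format PointNet-style classifiers typically consume.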
Published in: The International Journal of Advanced Manufacturing Technology