Abstract

This paper proposes a stereo-vision-based method that detects and registers the positions and postures of multi-type, randomly placed miniature circuit breaker (MCB) components within scene point clouds acquired by a 3D stereo camera. The method is designed for the flexible assembly of MCBs, improving the precision of gripping small-sized, complex-structured components. The proposed method consists of the following stages: First, the 3D computer-aided design (CAD) models of the components are converted into surface point cloud models by voxel down-sampling to form matching templates. Second, the scene point cloud is filtered, clustered, and segmented to obtain candidate matching regions. Third, point cloud features are extracted from the templates and the candidate matching regions with Intrinsic Shape Signatures (ISS) and described by the Fast Point Feature Histogram (FPFH). Sample Consensus Initial Alignment (SAC-IA) is applied to the extracted features to obtain a rough match. Fourth, fine registration is performed between the templates and the roughly matched targets using the Iterative Closest Point (ICP) algorithm accelerated by a K-dimensional tree (KD-tree). Meanwhile, Random Sample Consensus (RANSAC), which effectively mitigates the local optimum problem of the classic ICP algorithm, is employed to remove incorrectly matched point pairs and further improve precision. The experimental results show that the proposed method achieves spatial positioning errors smaller than 0.2 mm and posture estimation errors smaller than 0.5°. The precision and efficiency meet the requirements of robotic flexible assembly of MCBs.
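For readers who wish to prototype the coarse-to-fine registration pipeline described above, the sketch below outlines the main stages using the Open3D Python library. The library choice, file paths, voxel size, and search radii are assumptions not specified by the paper, and Open3D's RANSAC-based feature matching is used here as a stand-in for SAC-IA; this is an illustrative sketch under those assumptions, not the authors' implementation.

# Minimal sketch: voxel down-sampling, FPFH description, coarse feature-based
# alignment, and ICP fine registration with Open3D (assumed library; all
# parameters and file names are placeholders, not values from the paper).
import open3d as o3d

VOXEL = 0.5  # assumed down-sampling resolution in the scene's units (e.g. mm)

def preprocess(pcd):
    """Down-sample the cloud, estimate normals, and compute FPFH descriptors."""
    down = pcd.voxel_down_sample(voxel_size=VOXEL)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * VOXEL, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down,
        o3d.geometry.KDTreeSearchParamHybrid(radius=5 * VOXEL, max_nn=100))
    return down, fpfh

# Template from the CAD-derived surface point cloud; target from a segmented
# candidate region of the scene (both file names are hypothetical).
template = o3d.io.read_point_cloud("template_component.pcd")
target = o3d.io.read_point_cloud("scene_cluster.pcd")

tmpl_down, tmpl_fpfh = preprocess(template)
tgt_down, tgt_fpfh = preprocess(target)

# ISS keypoints could be used to thin the clouds before matching; for brevity
# this sketch matches the down-sampled clouds directly.
tmpl_keypoints = o3d.geometry.keypoint.compute_iss_keypoints(tmpl_down)

# Coarse alignment: RANSAC-based feature matching (substituting for SAC-IA).
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    tmpl_down, tgt_down, tmpl_fpfh, tgt_fpfh,
    mutual_filter=True,
    max_correspondence_distance=3 * VOXEL,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
    ransac_n=3,
    checkers=[o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(3 * VOXEL)],
    criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

# Fine registration: point-to-point ICP seeded with the coarse transform
# (correspondence search uses a KD-tree internally).
fine = o3d.pipelines.registration.registration_icp(
    tmpl_down, tgt_down, VOXEL, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("Estimated pose (4x4 homogeneous transform from template to scene):")
print(fine.transformation)

The resulting 4x4 transform encodes the position and posture of the matched component in the camera frame; in a gripping application it would then be converted to the robot base frame via the hand-eye calibration.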
