Abstract
This paper proposes methods for fusing range images and intensity images measured from multiple viewpoints. Distributed sensing is a key technology for multi-robot systems. As sensory information for such systems, range images and intensity images are both useful and complementary, so fusing the two is expected to be effective. In this paper, each robot is assumed to carry both a range image sensor and an intensity image sensor, and it extracts planar regions, 3D edges, and cylindrical regions by fusing a range image with an intensity image. Methods for fusing these features, measured from multiple viewpoints by multiple robots, are proposed. The methods are formulated as least-squares problems that account for the errors in each robot's position and orientation as well as the errors in the images. Experiments demonstrate the effectiveness of the proposed fusion methods.
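The abstract does not give the formulation itself, but the general idea of combining one feature observed from several viewpoints by inverse-covariance-weighted least squares can be sketched as below. The plane parameterization, covariance values, and function name are illustrative assumptions, not taken from the paper, and the simple per-observation covariance stands in for the paper's more detailed model of image and robot pose errors.

```python
import numpy as np

def fuse_plane_observations(params, covariances):
    """Fuse plane parameters (n_x, n_y, n_z, d) observed by several robots,
    already transformed into a common world frame, by weighted least squares.

    Each observation is weighted by its inverse covariance, which here is
    assumed to summarize both image noise and robot pose uncertainty
    (an illustrative simplification, not the paper's error model).
    """
    info = np.zeros((4, 4))   # accumulated information matrix
    info_vec = np.zeros(4)    # accumulated information vector
    for x, cov in zip(params, covariances):
        w = np.linalg.inv(cov)          # inverse covariance as weight
        info += w
        info_vec += w @ x
    fused = np.linalg.solve(info, info_vec)
    fused[:3] /= np.linalg.norm(fused[:3])  # renormalize the plane normal
    return fused

# Example: two viewpoints observing roughly the same plane
obs = [np.array([0.0, 0.0, 1.0, 2.0]),
       np.array([0.02, -0.01, 0.999, 2.05])]
covs = [0.01 * np.eye(4), 0.02 * np.eye(4)]
print(fuse_plane_observations(obs, covs))
```

In this sketch, a less certain viewpoint (larger covariance) contributes less to the fused estimate, which is the basic effect a least-squares fusion of multi-viewpoint features is meant to achieve.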