Abstract

This paper presents an efficient stereovision-based motion compensation method for moving robots. The vision system of a moving robot enables it to detect and localize known objects in the images obtained from the camera mounted on its head; however, ego-motion introduces errors that must be eliminated. We therefore propose an ego-motion compensation method that eliminates the errors in environment recognition caused by the ego-motion of a moving robot while efficiently improving recognition accuracy. The proposed method uses the disparity map obtained from three-dimensional (3D) vision and is divided into three modules: segmentation, feature extraction, and estimation. In the segmentation module, we propose the use of extended type-2 fuzzy information theory (ET2FIT) to extract the objects; the results obtained with ET2FIT are then compared with and analyzed against those obtained using the type-2 fuzzy set and the conventional type-1 fuzzy set. Because the conventional fuzzy information theory [11] can only be applied to binary images, we modify the existing method. In the feature extraction module, features are extracted using the wavelet level-set transform, and in the estimation module, least-squares ellipse approximation is used to calculate the displacement for the rotation and translation between image sequences. Experimental results indicate that the proposed method is highly effective when applied to moving robots, especially humanoid robots walking and operating in the real world.
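
As an illustration of the estimation step, the sketch below shows a generic algebraic least-squares ellipse fit in Python; it is not the paper's implementation, and the function names, toy data, and use of NumPy are assumptions. Given boundary points of a segmented object, the conic coefficients are fitted in a least-squares sense, and the recovered centre and axis orientation can then be compared between consecutive frames to estimate translation and rotation.

```python
import numpy as np

def fit_ellipse_lstsq(x, y):
    """Algebraic least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1
    to 2D boundary points; returns the coefficient vector [a, b, c, d, e]."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: one row per point, one column per conic term.
    A = np.column_stack([x * x, x * y, y * y, x, y])
    rhs = np.ones_like(x)
    coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return coeffs

def ellipse_center_and_angle(coeffs):
    """Recover the ellipse centre and a principal-axis orientation
    (up to a 90-degree ambiguity between major and minor axes)."""
    a, b, c, d, e = coeffs
    # Centre: zero of the conic's gradient.
    M = np.array([[2 * a, b], [b, 2 * c]])
    center = np.linalg.solve(M, np.array([-d, -e]))
    # Principal-axis orientation from tan(2*theta) = b / (a - c).
    angle = 0.5 * np.arctan2(b, a - c)
    return center, angle

# Toy usage: points sampled from a rotated, translated ellipse (hypothetical data).
t = np.linspace(0, 2 * np.pi, 50)
theta, x0, y0 = 0.3, 2.0, -1.0
x = x0 + 3 * np.cos(t) * np.cos(theta) - 1.5 * np.sin(t) * np.sin(theta)
y = y0 + 3 * np.cos(t) * np.sin(theta) + 1.5 * np.sin(t) * np.cos(theta)
center, angle = ellipse_center_and_angle(fit_ellipse_lstsq(x, y))
print(center, angle)  # centre ~ (2.0, -1.0); angle gives a principal-axis direction
```

In a frame-to-frame setting, the difference between fitted centres approximates the translation and the difference between axis orientations approximates the rotation of the object region induced by ego-motion.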

