Abstract

There are two kinds of depth perception for robot vision systems: quantitative and qualitative. The first can be used to reconstruct visible surfaces numerically, while the second describes them qualitatively. In this paper, we present a qualitative vision system suitable for intelligent robots. The goal of such a system is to perceive depth information qualitatively from monocular 2-D images. We first establish a set of propositions relating depth information, such as 3-D orientation and distance, to the changes in an image region caused by camera motion. We then introduce an approximation-based visual tracking system. Given an object, the tracking system tracks its image while the camera moves in a way that depends on the particular depth property to be perceived. Checking the data generated by the tracking system against our propositions yields the depth information about the object. Because the visual tracking system can track image regions in real time even when implemented on a PC AT clone, and because mobile robots naturally provide the inputs to the tracker, we are able to construct a real-time, cost-effective, monocular, qualitative, three-dimensional robot vision system. To verify the idea, we present examples of perceiving planar surface orientation, distance, size, dimensionality, and convexity/concavity.
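As a rough illustration of the kind of inference the abstract describes, the sketch below shows one way a tracker's output could be checked against a qualitative proposition: after a small, known forward camera translation, the image region of a nearer surface expands faster than that of a farther one, so the area ratio alone supports a coarse near/far judgment. This is a minimal sketch under assumed names (`TrackedRegion`, `relative_distance`, the 1.05 threshold), not the paper's implementation.

```python
# Minimal sketch (illustrative, not the authors' system): the camera makes a
# known forward motion, a tracker reports how the object's image region
# changed, and a proposition maps that change to a qualitative depth label.

from dataclasses import dataclass

@dataclass
class TrackedRegion:
    area: float        # image-region area in pixels
    centroid_x: float  # region centroid (pixels)
    centroid_y: float

def relative_distance(before: TrackedRegion, after: TrackedRegion,
                      expansion_threshold: float = 1.05) -> str:
    """Qualitative distance cue after a small forward camera translation.

    Image size scales roughly as f/Z, so moving the camera forward makes
    a near surface (small Z) expand proportionally more than a far one.
    The threshold value here is an arbitrary illustrative choice.
    """
    ratio = after.area / before.area
    if ratio > expansion_threshold:
        return "near"   # strong expansion -> object is relatively close
    return "far"        # little expansion -> object is relatively distant

# Example: the tracked region grew from 400 to 520 pixels after the camera
# moved forward, so the object is judged qualitatively "near".
print(relative_distance(TrackedRegion(400, 160, 120),
                        TrackedRegion(520, 162, 121)))
```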
