Abstract

A method is presented for efficient and reliable object recognition within noisy, cluttered, and occluded range images. The method is based on a strategy which hypothesizes the intersection of the object with some selected image point, and searches for additional surface data at locations relative to that point. At each increment, the image is queried for the existence of surface data at a specific spatial location, and the set of possible object poses is further restricted. Eventually, either the object is identified and localized, or the initial hypothesis is refuted. The strategy is implemented in the discrete domain as a binary decision tree classifier. The tree leaf nodes represent individual voxel templates of the model. The internal tree nodes represent the union of the templates of their descendant leaf nodes. The union of all leaf node templates is the complete template set of the model over its discrete pose space. Each internal node also references a single voxel which is the most common element of its child node templates. Traversing the tree is equivalent to efficiently matching the large set of templates at a selected image seed location. The process is approximately 3 orders of magnitude more efficient than brute-force template matching. Experimental results are presented in which objects are reliably recognized and localized in 6 dimensions in less than 60 seconds within noisy and significantly occluded range images.
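The tree construction and traversal described above can be sketched roughly as follows. This is not the paper's implementation: the template contents, the splitting rule (here, the most common voxel that still discriminates between templates), and all names are invented for illustration. Each template is a set of occupied voxel offsets relative to the hypothesized seed point, and each internal node queries the image for surface data at one voxel, halving the surviving hypothesis set.

```python
from collections import Counter

def build_tree(templates):
    """Recursively split a list of voxel templates on a shared voxel.

    Each template is a frozenset of (x, y, z) voxel offsets relative to
    the seed point. Internal nodes store the most common voxel among
    their descendant templates that does not appear in all of them.
    """
    if len(templates) == 1:
        return ("leaf", templates[0])
    counts = Counter(v for t in templates for v in t)
    # Candidate split voxels: present in some, but not all, templates.
    splitters = [(c, v) for v, c in counts.items() if c < len(templates)]
    if not splitters:
        return ("leaf", templates[0])  # identical templates collapse
    _, voxel = max(splitters)  # most common discriminating voxel
    yes = [t for t in templates if voxel in t]
    no = [t for t in templates if voxel not in t]
    return ("node", voxel, build_tree(yes), build_tree(no))

def classify(tree, occupied):
    """Traverse the tree, querying occupancy at each node's voxel.

    `occupied` stands in for the range image: a set of voxels holding
    surface data near the seed. Returns the surviving leaf template;
    a final verification against the image (omitted here) would either
    confirm the match or refute the initial seed hypothesis.
    """
    while tree[0] == "node":
        _, voxel, yes_branch, no_branch = tree
        tree = yes_branch if voxel in occupied else no_branch
    return tree[1]

# Toy templates (hypothetical poses of a tiny model).
t1 = frozenset({(0, 0, 0), (1, 0, 0)})
t2 = frozenset({(0, 0, 0), (0, 1, 0)})
t3 = frozenset({(0, 0, 0), (0, 0, 1), (1, 0, 0)})
tree = build_tree([t1, t2, t3])
```

Because each query discards roughly half the remaining templates, traversal touches only about log2(N) voxels instead of matching all N templates exhaustively, which is the source of the large speedup the abstract reports.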
