Abstract

For a robot, recognizing its location in a real environment is important for tasks such as visual navigation. This paper proposes a memory‐based self‐localization method that uses omnidirectional images. In the proposed method, information that is invariant to rotation around the axis of the omnidirectional image sensor is first extracted by generating autocorrelation images from omnidirectional images, which contain global information useful for self‐localization. Next, eigenspaces are formed from the generated autocorrelation images, and self‐localization is performed by searching for the learning images closest to the input image in the eigenspace. Additionally, performing self‐localization in two stages (global and local) makes the method robust. Lastly, experiments were conducted using time‐series omnidirectional images captured both indoors and outdoors to verify the effectiveness of the proposed method. © 2003 Wiley Periodicals, Inc. Syst Comp Jpn, 34(5): 56–68, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.1203
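The sketch below illustrates the general pipeline the abstract describes: a rotation-invariant autocorrelation feature is computed from an unwrapped omnidirectional (panoramic) image, the learning images are compressed into an eigenspace via PCA, and localization is done by nearest-neighbor search in that eigenspace. This is a minimal illustration under assumed conventions, not the paper's implementation; all function names, the panoramic unwrapping, and the single-stage matching (the two-stage global/local refinement is omitted) are assumptions.

```python
# Minimal sketch of autocorrelation + eigenspace self-localization.
# Assumes each omnidirectional image is unwrapped into a 2-D panorama
# whose columns correspond to azimuth angles (an assumption, not the
# paper's stated representation).
import numpy as np

def rotation_invariant_autocorrelation(panorama):
    """Circularly autocorrelate each row over the azimuth (column) axis.

    Rotating the sensor about its vertical axis circularly shifts the
    panorama along the columns; the circular autocorrelation is unchanged
    by such shifts, so the resulting feature is rotation invariant.
    """
    spectrum = np.fft.fft(panorama, axis=1)
    # Wiener-Khinchin: inverse FFT of the power spectrum gives the
    # circular autocorrelation of each row.
    autocorr = np.fft.ifft(spectrum * np.conj(spectrum), axis=1).real
    return autocorr.ravel()

def build_eigenspace(learning_images, dim=10):
    """Form an eigenspace (PCA) from the autocorrelation feature vectors."""
    features = np.stack([rotation_invariant_autocorrelation(img)
                         for img in learning_images])
    mean = features.mean(axis=0)
    centered = features - mean
    # Right singular vectors give the principal axes of the learning set.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:dim]                      # (dim, D) projection matrix
    projections = centered @ basis.T      # learning images in eigenspace
    return mean, basis, projections

def localize(input_image, mean, basis, projections):
    """Return the index of the closest learning image in the eigenspace."""
    feature = rotation_invariant_autocorrelation(input_image) - mean
    point = basis @ feature
    distances = np.linalg.norm(projections - point, axis=1)
    return int(np.argmin(distances))
```

In practice, the returned index would map to the location (or location class) at which the corresponding learning image was captured; a second, local matching stage over neighboring learning images could then refine the estimate, in the spirit of the two-stage scheme the abstract mentions.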

