Abstract

This paper describes an interactive autonomous tour guide robot designed to guide visitors through the Asia Pacific University Engineering Labs. Although tour guide robots with various self-localization abilities, such as mapping, have been introduced in the past, their performance remains challenged by indoor navigation obstacles. The present approach implements a low-cost autonomous indoor tour guide robot running on an embedded system, the Raspberry Pi 2. Autonomous navigation is achieved through wall following using ultrasonic sensors and image processing using a simple webcam. The bitwise image processing comparison method introduced here is written in OpenCV and runs on the Raspberry Pi; it grabs images and looks for tags that identify each lab. A recognition accuracy of 98% was attained during navigation testing in the labs. User interaction is achieved through voice recognition on an Android tablet placed on top of the robot, with the Google Speech Recognition API used for communication between the robot and the visitors.
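The sketch below illustrates one plausible form of the bitwise comparison step described above: a webcam frame is binarized and XORed against stored tag images, and the tag with the fewest differing pixels is taken as the current lab. It is a minimal illustration only; the file names, threshold values, and function names are assumptions, not details taken from the paper.

```python
# Minimal sketch of a bitwise tag-comparison step, assuming the tags are
# stored as binarized, same-size grayscale images; file names and the
# match threshold are illustrative assumptions.
import cv2

# Hypothetical reference tags, one per lab, loaded as grayscale images.
TAG_TEMPLATES = {
    "lab_1": cv2.imread("tags/lab_1.png", cv2.IMREAD_GRAYSCALE),
    "lab_2": cv2.imread("tags/lab_2.png", cv2.IMREAD_GRAYSCALE),
}

def identify_lab(frame, match_threshold=0.95):
    """Return the name of the best-matching lab tag, or None."""
    # Binarize the incoming webcam frame so it can be compared bitwise.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

    best_name, best_score = None, 0.0
    for name, template in TAG_TEMPLATES.items():
        # Resize to the template size and XOR: identical pixels give 0,
        # so fewer non-zero pixels means a closer match.
        resized = cv2.resize(binary, (template.shape[1], template.shape[0]))
        diff = cv2.bitwise_xor(resized, template)
        score = 1.0 - cv2.countNonZero(diff) / diff.size
        if score > best_score:
            best_name, best_score = name, score

    return best_name if best_score >= match_threshold else None

# Example usage with a single frame grabbed from the webcam.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(identify_lab(frame))
cap.release()
```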
