Abstract

In the Autonomous City Explorer (ACE) project, a mobile robot is developed that is capable of finding its way to a given destination in an unknown urban environment. An exemplary mission is to find the way from our institute to Marienplatz, a public square in the center of Munich, without any prior knowledge or GPS information. Inspired by the behavior of humans in unknown environments, ACE must find its way by asking pedestrians. The route is about 1.5 kilometers long and includes heavily traveled roads and crowded public places. Navigating safely in an unknown urban environment poses several challenges for the vision system. Robust human detection, tracking, and estimation of human body poses are essential for natural interaction with pedestrians. Furthermore, the robot needs to be able to detect sidewalks and crossroads. A visual odometry system is used to support conventional navigation. Outdoor experiments were conducted successfully twice: after about 5 hours, and after interacting with 25 and 38 persons respectively, ACE arrived at Marienplatz. This paper describes both the architecture of the vision system used for ACE and the algorithms used to address the described challenges.
