Abstract
In this paper, we propose an Embedded Real-time Monocular SLAM (Simultaneous Localization and Mapping) System for an autonomous indoor mobile robot. An autonomous mobile robot must estimate its own pose and build a map of its environment at the same time. SLAM performs these tasks using one or more external sensors (e.g., LiDAR, a camera, or an inertial measurement unit). Previous SLAM systems suffered from large sensor size, high power consumption, and high cost, making them difficult to deploy on a small indoor robot. We propose an Embedded (small-size, low-power, and low-cost) Real-time Monocular SLAM System that combines an ORB feature extraction-based SLAM (ORB-SLAM), a monocular camera, and a dynamically reconfigurable processor (DRP). The system achieves real-time (over 30 fps) and low-power (less than 2 W) SLAM by exploiting the hardware acceleration capability of the DRP. In future work, we will investigate accelerating the entire processing pipeline and integrating the system into a device.
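For readers unfamiliar with the front-end workload being accelerated, the sketch below shows CPU-based ORB feature extraction on monocular camera frames using OpenCV. It is only an illustration of the per-frame feature-extraction stage, not the authors' DRP implementation; the use of OpenCV, the camera index, and the keypoint budget are assumptions made for the example.

```cpp
// Illustrative sketch: CPU-based ORB feature extraction with OpenCV,
// standing in for the stage that the paper offloads to the DRP accelerator.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);                       // monocular camera (assumed device index 0)
    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);  // up to 1000 ORB keypoints per frame (assumed budget)

    cv::Mat frame, gray, descriptors;
    std::vector<cv::KeyPoint> keypoints;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        // ORB keypoint detection and descriptor computation: the per-frame
        // front-end work that dominates tracking time and is the natural
        // candidate for hardware acceleration.
        orb->detectAndCompute(gray, cv::noArray(), keypoints, descriptors);
        // ...tracking, mapping, and loop closing would consume these features.
    }
    return 0;
}
```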