Abstract

In this article, we propose and visually validate a navigation framework for a wheeled robot that disinfects surfaces. Because dynamic environments are challenging to navigate, advanced sensors are integrated into the hardware platform to support the navigation task. A Hokuyo UTM-30LX 2D lidar mounted at the front of the robot covers a wide scanning area. To improve laser scan matching, an inertial measurement unit is integrated into the robot’s body. The output of this combination feeds into a global costmap for monitoring and navigation. Additionally, incremental encoders attached to the rear wheels provide high-resolution position data; these positioning sensors determine the robot’s current location in a local costmap. To detect the presence of humans, a Kinect depth camera is mounted on top of the robot. All feedback signals are combined in the host computer to navigate the autonomous robot. For disinfection missions, the robot carries several ultraviolet lamps and autonomously patrols unknown environments. The approach was validated in both a virtual simulation and an experimental test to demonstrate the robot’s effectiveness. The contributions of this work are summarized as follows: (i) a hardware structure for ultraviolet-based disinfection is established; (ii) theoretical computations for the robot’s localization in the 3D workspace are derived, laying a foundation for further developments; and (iii) data from advanced sensing devices are fused to enable navigation in uncertain environments.
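As a concrete illustration of the encoder-based localization step described above, the sketch below implements standard differential-drive dead reckoning, in which incremental rear-wheel encoder ticks are integrated into a planar pose estimate that can feed a local costmap. This is a minimal sketch of the generic technique, not the paper's exact formulation: the class name, wheel radius, track width, and encoder resolution are assumed placeholder values.

```python
import math

class WheelOdometry:
    """Dead reckoning for a differential-drive robot from incremental
    wheel encoders (illustrative sketch; parameters are assumptions)."""

    def __init__(self, wheel_radius=0.05, track_width=0.40,
                 ticks_per_rev=4096):
        self.wheel_radius = wheel_radius   # meters (assumed value)
        self.track_width = track_width     # meters between wheels (assumed)
        # Distance traveled per encoder tick.
        self.m_per_tick = 2.0 * math.pi * wheel_radius / ticks_per_rev
        self.x = self.y = self.theta = 0.0  # pose in the odometry frame

    def update(self, d_ticks_left, d_ticks_right):
        """Integrate one pair of encoder increments into the pose."""
        d_left = d_ticks_left * self.m_per_tick
        d_right = d_ticks_right * self.m_per_tick
        d_center = 0.5 * (d_left + d_right)              # forward motion
        d_theta = (d_right - d_left) / self.track_width  # heading change
        # Midpoint integration of the unicycle kinematic model.
        self.x += d_center * math.cos(self.theta + 0.5 * d_theta)
        self.y += d_center * math.sin(self.theta + 0.5 * d_theta)
        # Wrap heading to (-pi, pi].
        self.theta = (self.theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
        return self.x, self.y, self.theta


odom = WheelOdometry()
# Equal tick counts on both wheels -> straight-line motion along x.
print(odom.update(1000, 1000))
```

In a system like the one described, such an odometry estimate would typically be fused with the lidar/IMU scan-matching output, since pure dead reckoning drifts over time.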
