Abstract

Autonomous elevator button operation is an indispensable capability for robot–elevator interaction and has long been regarded as an intelligent solution for the multifloor navigation of mobile robots. In this article, we present an autonomous robotic system with an eye-in-hand configuration to address the button operation problem. First, we develop a deep neural network for simultaneous button detection and character recognition, which provides accurate and robust perception inputs for the button operation system. Second, we present a button pose estimation algorithm that accounts for perception uncertainties, estimating the button pose accurately by fitting a least-uncertainty model. Building on the perception and pose estimation algorithms, we then propose a coarse-to-fine control scheme to drive the manipulator through the button operation task. Experimental results show that the proposed perception algorithm outperforms state-of-the-art methods in both recognition accuracy and running efficiency, and the pose estimation and control scheme demonstrate their effectiveness in real-world button operation tasks. The data and code are available on our project webpage.
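The abstract does not spell out how the least-uncertainty fit is computed. As a rough illustration of the general idea only, the sketch below estimates a button's supporting plane from noisy 3-D depth points by inverse-variance weighted least squares, so that low-noise measurements dominate the fit. The function name fit_plane_weighted and the per-point noise model are hypothetical and are not taken from the paper.

    import numpy as np

    def fit_plane_weighted(points, sigmas):
        """Fit a plane to 3-D points via uncertainty-weighted least squares.

        points : (N, 3) array of measured surface points.
        sigmas : (N,) per-point measurement standard deviations; lower-noise
                 points receive higher weight (w = 1 / sigma^2).
        Returns (normal, centroid): unit plane normal and weighted centroid.
        """
        w = 1.0 / np.square(sigmas)            # inverse-variance weights
        w /= w.sum()
        centroid = (w[:, None] * points).sum(axis=0)
        centered = points - centroid
        # Weighted covariance of the centered points; the plane normal is the
        # eigenvector with the smallest eigenvalue (direction of least spread).
        cov = centered.T @ (w[:, None] * centered)
        eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
        normal = eigvecs[:, 0]
        return normal, centroid

    # Example: fit a plane to synthetic depth points with varying noise.
    rng = np.random.default_rng(0)
    pts = np.column_stack([rng.uniform(-0.05, 0.05, 200),
                           rng.uniform(-0.05, 0.05, 200),
                           np.zeros(200)])
    noise = rng.uniform(0.001, 0.01, 200)       # per-point depth noise
    pts[:, 2] += rng.normal(0.0, noise)
    n, c = fit_plane_weighted(pts, noise)       # n should be close to (0, 0, 1)

Under this (assumed) formulation, the button's pressing direction would follow from the fitted normal, while the weighting keeps uncertain depth returns from skewing the pose; the paper's actual model may differ.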
