Abstract

Odometry, an indispensable part of robot behavior control, plays an essential role in the localization of humanoid robots. Odometry on humanoid robots is usually computed from one or a combination of vision, laser scanners, magnetic sensors, and pressure sensors. Vision- and laser-scanner-based approaches require high computational power to analyze the sensed data, which makes them unsuitable for many small humanoid robots. Magnetic sensors, in turn, are known to be unstable across different environments. Furthermore, computing accurate dead reckoning (pure odometry) is very difficult because of the complex mechanical system of humanoid robots, the presence of many sources of uncertainty, and inaccuracies in motion execution such as foot slippage. Therefore, this paper presents a robust learning method for humanoid robot localization, named the Lightweight Humanoid robot Odometric Learning (LHOL) method. The method employs no vision, magnetic, or additional pressure sensors, nor laser scanners, and therefore, for the first time, eliminates the dependency on these sensors. Its learning core is an artificial neural network (ANN) that takes kinematic computations, IMU roll and pitch data, and the robot's actuators' internal present-load readings as inputs. The proposed LHOL method demonstrates high accuracy on a novel, fully 3D-printed kid-sized humanoid robot platform (ARC) with both open-loop and closed-loop walk engines on floors with different coverings.
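To make the abstract's input/output structure concrete, the following is a minimal sketch of the kind of feed-forward ANN it describes: a small network mapping per-step kinematic estimates, IMU roll/pitch, and per-actuator present-load readings to a corrected odometric displacement. The joint count, layer sizes, and feature layout are illustrative assumptions, not the paper's published architecture.

```python
# Hypothetical sketch of an odometry-learning ANN in the spirit of LHOL.
# All dimensions and layer widths are assumptions for illustration only.
import torch
import torch.nn as nn

N_JOINTS = 20  # assumed actuator count for a kid-sized humanoid


class OdometryANN(nn.Module):
    def __init__(self):
        super().__init__()
        # Inputs: kinematic step estimate (dx, dy, dtheta) -> 3 values,
        #         IMU roll and pitch -> 2 values,
        #         per-joint present load -> N_JOINTS values.
        in_dim = 3 + 2 + N_JOINTS
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 3),  # corrected (dx, dy, dtheta) per step
        )

    def forward(self, x):
        return self.net(x)


# Usage: one forward pass on a dummy feature vector.
model = OdometryANN()
features = torch.zeros(1, 3 + 2 + N_JOINTS)
step_estimate = model(features)  # corrected per-step displacement
```

In such a setup, the network would be trained against ground-truth displacements so that, at run time, integrating its per-step outputs yields the robot's pose without any vision, magnetic, or laser sensing.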
