Abstract
Odometry is commonly used in localization applications, especially on wheeled platforms, since wheel encoders are readily available. It is often used on its own or fused with other sensor data to obtain a better position estimate. Its main limitation, however, is that it is exclusive to wheeled platforms, whereas similar encoder-odometry options are often desired on other systems. Given that WiFi is ubiquitous in most commercial and industrial areas, this paper proposes a method for obtaining odometry from WiFi scans for position estimation. Unlike wheeled odometry, the method is not constrained to wheeled robots, and it does not rely on the traditional fingerprinting approach. The proposed method trains a neural network model to predict the distance moved based on features extracted from WiFi scans of the environment. These predicted distances are then summed to obtain the trajectory. Experiments are conducted and the methods are evaluated using Root Mean Square Error (RMSE). Experimental results show that the proposed method achieves an RMSE of at most 8.39 m across the various test cases.

Note to Practitioners — This paper was motivated by the limited range of sensors available for odometry. Existing methods either require a wheeled platform or exteroceptive sensors mounted on the robot so that it can observe its environment. This paper proposes a new, low-cost method of performing odometry using a WiFi receiver and an Inertial Measurement Unit (IMU) together with a neural network model. This provides an alternative that exploits existing WiFi infrastructure, allowing more flexibility in robot design by removing wheel and sensor-placement constraints. We show how the features are selected and propose several similarity methods to choose from. We then show how the neural network model is trained and used during implementation.
Preliminary physical experiments suggest that the method can recover the trajectory of a robot in two different environments, using the same model and at different speeds.
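The integration step described in the abstract — summing model-predicted per-step distances into a trajectory and scoring it with RMSE — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names are hypothetical, and it assumes the neural network supplies a distance per step while the IMU supplies a heading per step.

```python
import numpy as np

def trajectory_from_steps(step_distances, headings):
    """Integrate per-step distances (e.g. predicted by a neural network
    from WiFi-scan features) and headings (e.g. from an IMU) into a 2-D
    path by cumulatively summing the displacement vectors."""
    dx = step_distances * np.cos(headings)  # per-step x displacement
    dy = step_distances * np.sin(headings)  # per-step y displacement
    return np.cumsum(np.stack([dx, dy], axis=1), axis=0)

def trajectory_rmse(estimated, ground_truth):
    """Root Mean Square Error between two (N, 2) position sequences."""
    sq_err = np.sum((estimated - ground_truth) ** 2, axis=1)
    return float(np.sqrt(np.mean(sq_err)))
```

For example, three 1 m steps heading due east (`headings` all zero) integrate to positions (1, 0), (2, 0), (3, 0), and the RMSE of a trajectory against itself is zero.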
Published in: IEEE Transactions on Automation Science and Engineering