Abstract
Mobile robot localization in GPS-denied environments plays an increasingly fundamental role in a wide range of applications such as SFM (structure from motion) and SLAM (simultaneous localization and mapping). However, traditional single-sensor positioning methods are either unreliable or inaccurate in the long term. This paper presents a novel moving-agent localization approach that combines RGB-D cues and wheel odometry measurements within a particle-filter-based probabilistic framework. Unlike traditional RGB-D localization methods, which are computationally expensive and non-robust, we exploit the wheel odometry measurements as prior information, i.e., as initial values, during the RGB-D pose optimization process. Additionally, the optimal pose derived from the visual sensor can, in turn, determine the reliability of the wheel odometry inputs. This verification step is particularly useful in the presence of wheel slip. Experimental results validate that our approach is effective and reliable for wheeled robot localization.

Introduction

Mobile robot localization is the process of determining the robot's location in an unknown environment, and it is the core capability toward realizing autonomous mobile robot navigation. In recent years, particle filter algorithms [1, 2] have become a hot topic in autonomous robot positioning. According to the sensors employed, positioning methods can be divided into distinct categories: vision-based, laser-based, wheel-based, etc. At present, sensors such as odometers, ultrasonic sensors, laser sensors, and visual sensors are widely used.
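The abstract describes fusing wheel odometry (as the motion prior) with a visual pose estimate inside a particle filter. As a hedged illustration only, a minimal sketch of that predict/update/resample loop might look as follows; all function names, noise parameters, and the Gaussian weighting model are our own assumptions, not details taken from the paper:

```python
import math
import random

# Illustrative particle-filter fusion sketch (not the paper's implementation).
# A particle is a pose (x, y, theta). Odometry increments drive the motion
# model; a visual (e.g., RGB-D) pose estimate drives the weight update.

def predict(particles, d_s, d_theta, trans_noise=0.02, rot_noise=0.01):
    """Propagate every particle by the odometry increment plus Gaussian noise."""
    out = []
    for (x, y, th) in particles:
        ds = d_s + random.gauss(0.0, trans_noise)
        th_new = th + d_theta + random.gauss(0.0, rot_noise)
        out.append((x + ds * math.cos(th_new),
                    y + ds * math.sin(th_new),
                    th_new))
    return out

def update(particles, visual_pose, sigma=0.1):
    """Weight particles by closeness to the visual pose estimate (assumed
    Gaussian likelihood on planar position only)."""
    vx, vy, _ = visual_pose
    weights = [math.exp(-((x - vx) ** 2 + (y - vy) ** 2) / (2.0 * sigma ** 2))
               for (x, y, _) in particles]
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def resample(particles, weights):
    """Stratified resampling proportional to weight."""
    n = len(particles)
    cums, c = [], 0.0
    for w in weights:
        c += w
        cums.append(c)
    out, j = [], 0
    for i in range(n):
        p = (i + random.random()) / n
        while j < n - 1 and cums[j] < p:
            j += 1
        out.append(particles[j])
    return out
```

A single filter step would call `predict` with the latest odometry increment, then `update` and `resample` whenever a visual pose estimate is available; the paper's wheel-slip check could be realized by comparing the odometry increment against the visually estimated motion before trusting it in `predict`.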
Ultrasonic and laser sensors, owing to their single sensing mode and long sensing period, are mostly used as auxiliary positioning sensors. The odometer is designed primarily for wheeled mobile robot localization: it estimates the distance traveled by incrementally integrating the wheel encoder data, which works well only under favorable conditions (e.g., smooth ground without topographic irregularity). The encoder readings are, by and large, inaccurate, and the positioning errors accumulate in the long run. In this paper, depth and visual cues are adopted. Complementary to the RGB sensor, which provides the color of a world point, the depth sensor senses the additional depth information. By combining color and depth cues, the feature point extraction time is significantly reduced; nevertheless, the odometer is still slightly faster than visual feature extraction. To make better use of the information perceived by the sensors and to accurately estimate the position of the mobile robot in an indoor environment, we use a particle filter to fuse the odometer measurements with the environmental depth information perceived by the visual sensor; this realizes autonomous positioning of the mobile robot and is verified by experiments [3, 4, 5].

Sensor Model

Odometer Model and Positioning Principle. This paper considers a two-wheel differential-drive mobile robot. The odometer periodically reads the pulse count of the photoelectric encoder installed on the motor shaft; from the pulses read, the current position of the robot can be identified through the distance and angle they represent [6, 7]. In this way the odometer detects the angle through which each driving wheel has turned within a given time interval. Assume that the driving wheel radius is r, the encoder resolution is w lines per revolution, and the reduction ratio of the gear motor is i_sc. If the encoder outputs n pulses in time t, the wheel moving distance ds is:

ds = 2*pi*r*n / (w * i_sc)    (1)

Assume that the left and right driving wheels move distances ds_L and ds_R in time t, respectively, and that the spacing between the driving wheels is l. The distance and angle the robot travels in time t are then:

d = (ds_R + ds_L) / 2,    dtheta = (ds_R - ds_L) / l    (2)

6th International Conference on Electronic, Mechanical, Information and Management (EMIM 2016). © 2016 the authors. Published by Atlantis Press.
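The pulse-to-distance relation (1) and the differential-drive dead-reckoning update can be written directly as code. This is a minimal sketch under the definitions above; the function names are our own:

```python
import math

def wheel_distance(n_pulses, r, w, i_sc):
    """Eq. (1): distance moved by one wheel given n encoder pulses,
    wheel radius r, encoder resolution w lines/rev, reduction ratio i_sc."""
    return 2.0 * math.pi * r * n_pulses / (w * i_sc)

def odometry_step(pose, ds_left, ds_right, wheel_spacing):
    """Differential-drive dead reckoning: update pose (x, y, theta) from
    the left/right wheel distances travelled in one sampling period."""
    x, y, theta = pose
    d = 0.5 * (ds_left + ds_right)                  # distance of robot center
    dtheta = (ds_right - ds_left) / wheel_spacing   # heading change
    theta_new = theta + dtheta
    return (x + d * math.cos(theta_new),
            y + d * math.sin(theta_new),
            theta_new)
```

Integrating `odometry_step` over successive encoder readings reproduces the accumulated-drift behavior discussed in the Introduction: any per-step error in ds_L or ds_R (e.g., from wheel slip) is summed into the pose and never corrected without an external sensor.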