Abstract

The vision tracking system in this paper estimates the robot's position relative to a target and rotates the camera towards the target. To estimate the position of the mobile robot, the system combines information from an accelerometer, a gyroscope, two encoders, and a vision sensor. The encoders provide fairly accurate position information, but their data are not reliable when the robot's wheels slip. Accelerometer data can provide position information even when the wheels are slipping, but long-term position estimation is difficult because integrating the accelerometer's bias and noise errors causes drift. To overcome the drawbacks of each method mentioned above, the proposed system uses data fusion with two Kalman filters and a slip detector. One Kalman filter is for the slip case and the other for the no-slip case; each uses a different sensor combination for estimating the robot motion. The slip detector compares the data from the accelerometer with the data from the encoders and decides whether a slip condition has occurred. Based on this decision, the system selects the output of one of the two Kalman filters, which is then used to calculate the camera angle of the vision tracking system. The vision tracking system is implemented on a two-wheeled robot. To evaluate the tracking and recognition performance of the implemented system, experiments are performed for various robot motion scenarios in various environments.
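
The abstract does not give the filter equations, so the following is only a minimal sketch of the slip-gated fusion idea it describes. The 1D position/velocity state model, the noise parameters, the sensor-to-filter pairing, the slip threshold, and all function names below are illustrative assumptions, not the paper's actual design: one filter is fed by encoder odometry (no-slip case), the other by accelerometer dead reckoning (slip case), and a slip flag selects which output drives the camera-angle computation.

```python
# Sketch of slip-gated selection between two Kalman filters.
# All models, parameters, and thresholds here are assumptions for illustration.
import numpy as np


class KalmanFilter1D:
    """Constant-velocity Kalman filter over [position, velocity] (assumed model)."""

    def __init__(self, q, r, dt):
        self.x = np.zeros(2)                         # state: position, velocity
        self.P = np.eye(2)                           # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        self.Q = q * np.eye(2)                       # process noise
        self.H = np.array([[1.0, 0.0]])              # measure position only
        self.R = np.array([[r]])                     # measurement noise

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with a position measurement z
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]


def slip_detected(enc_accel, imu_accel, threshold=0.5):
    """Flag slip when encoder-derived and accelerometer accelerations
    disagree by more than a hypothetical threshold (m/s^2)."""
    return abs(enc_accel - imu_accel) > threshold


# Two filters: one for the no-slip case (encoder odometry), one for the
# slip case (accelerometer dead reckoning); this pairing is assumed.
kf_encoder = KalmanFilter1D(q=1e-4, r=1e-3, dt=0.01)
kf_imu     = KalmanFilter1D(q=1e-3, r=1e-2, dt=0.01)


def fused_position(enc_pos, imu_pos, enc_accel, imu_accel):
    """Run both filters and select one output according to the slip decision."""
    pos_noslip = kf_encoder.step(enc_pos)
    pos_slip   = kf_imu.step(imu_pos)
    return pos_slip if slip_detected(enc_accel, imu_accel) else pos_noslip


def camera_angle(robot_x, robot_y, heading, target_x, target_y):
    """Pan angle that points the camera at the target (assumed geometry);
    in the full system the heading would come from the gyroscope."""
    return np.arctan2(target_y - robot_y, target_x - robot_x) - heading
```

In this sketch, fused_position() would be called once per control cycle and its output passed to camera_angle(); both filters run every cycle so that switching on a slip decision does not require re-initializing either estimate.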
