Abstract

This paper proposes a new real-time target tracking method based on open-loop monocular vision motion control. The method uses a particle filter to predict the moving target's position in the image; because the particle filter makes no linearity or Gaussianity assumptions, it can effectively capture both linear and nonlinear motion behaviors. Simple mathematical operations map the target's image information to its real-world coordinates, so the method requires few computing resources, and the monocular vision approach needs only a single camera. The method first estimates the target's position and size in the image at the next time step, then predicts the corresponding real-world position of the target, and finally controls the mobile robot so that the target remains at the center of the camera's field of view. Tracking tests on L-shaped and S-shaped trajectories show that the proposed method achieves good tracking performance and outperforms the Kalman filter in both cases.
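The final control step, keeping the target centered in the camera's view, can be sketched as a simple angular correction. This is an illustrative sketch, not the paper's implementation: the image width and horizontal field of view below are assumed values, and a linear pixel-to-angle mapping is assumed.

```python
# Hypothetical camera parameters (not from the paper).
IMG_W = 640        # image width in pixels
HFOV_DEG = 60.0    # assumed horizontal field of view, degrees

def steering_angle(target_cx):
    """Rotation (degrees) that brings the target to the image center.

    Assumes pixels map linearly to angle across the horizontal FOV;
    positive means the target is right of center, so turn right.
    """
    offset_px = target_cx - IMG_W / 2
    return offset_px * HFOV_DEG / IMG_W

print(steering_angle(480))  # target 160 px right of center -> 15.0 degrees
print(steering_angle(320))  # target already centered -> 0.0
```

In practice such an angle would feed a proportional controller on the robot's rotational velocity, with forward velocity driven by the estimated range to the target.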

Highlights

  • Target tracking is a perennially active field of study, and the motion of tracked objects is usually nonlinear and non-Gaussian

  • This paper proposes a new real-time target tracking method based on open-loop monocular vision motion control


Summary

Introduction

Target tracking is a perennially active field of study, and the motion of tracked objects is usually nonlinear and non-Gaussian. At each time step, all possible states are predicted, and the similarity between each candidate state and the observed target is measured; resampling according to these similarities yields a new set of state values. The most similar state is then used to compute the target's actual depth, horizontal distance, and angle, from which the movement of the mobile robot is controlled. By combining target tracking, target localization, and the motion control rule for the mobile robot, the actual distance between the tracked target and the camera can be obtained from the position and size of the target in the image, and the motion control rule drives the mobile robot toward the tracked target.
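The predict–weight–resample cycle described above, together with the depth-from-image-size step, can be sketched as a minimal bootstrap particle filter. This is an illustrative sketch only: the focal length, target width, noise levels, and the Gaussian likelihood (standing in for the paper's image-similarity measure) are all assumed values, not the paper's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Pinhole-camera range estimate (hypothetical parameters) ---
FOCAL_PX = 800.0    # assumed focal length in pixels
TARGET_W_M = 0.30   # assumed real target width in meters

def estimate_range(pixel_width):
    """Depth from apparent size under the pinhole model: z = f * W / w."""
    return FOCAL_PX * TARGET_W_M / pixel_width

# --- Minimal bootstrap particle filter over the (x, y) image position ---
N = 500
particles = rng.normal(loc=[320.0, 240.0], scale=20.0, size=(N, 2))
weights = np.full(N, 1.0 / N)

def predict(particles, motion_std=5.0):
    # Diffuse each candidate state with random-walk motion noise.
    return particles + rng.normal(0.0, motion_std, particles.shape)

def update(particles, weights, measurement, meas_std=10.0):
    # Weight each particle by its likelihood under a Gaussian observation
    # model (a stand-in for an image-similarity score against the target).
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    w = weights * np.exp(-0.5 * d2 / meas_std**2)
    w += 1e-300  # guard against an all-zero weight vector
    return w / w.sum()

def resample(particles, weights):
    # Resampling by weight concentrates particles on high-likelihood states.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# One tracking step with a simulated detection near (330, 245).
particles = predict(particles)
weights = update(particles, weights, np.array([330.0, 245.0]))
particles, weights = resample(particles, weights)
estimate = np.average(particles, axis=0, weights=weights)
print(estimate)            # posterior mean of the target's image position
print(estimate_range(60))  # range if the target appears 60 px wide
```

The posterior mean gives the predicted image position; feeding the target's apparent width through the pinhole relation then yields the depth used by the motion controller.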

Software and Hardware Specification
Experimental Results
Conclusion
