Abstract

A unified detection-tracking-estimation vision navigation framework for position and pose estimation of an unmanned aerial vehicle (UAV) is proposed in this paper. The major contribution of this work is a novel cooperative object detection algorithm for robust detection of the UAV: the spatial color pattern and temporal flickering pattern of the cooperative markers (groups of LEDs) mounted on the UAV are adaptively adjusted to make them more salient against the background, so that the vision system can detect the UAV more easily in unconstrained environments, particularly during the take-off/landing phase, which typically occurs near the ground and suffers greater interference from complex scenes and changing illumination. After the vision system extracts position and pose information, particle filtering based on an established vehicle dynamics model is used to smooth the UAV trajectory. Experimental results show that the vision-based position and pose measurements are consistent with IMU measurements and do not exhibit the long-term drift error seen in the IMU measurements.

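As a rough illustration of the filtering stage summarized above, the sketch below shows one predict-update-resample cycle of a particle filter applied to a vision-derived position measurement. The constant-velocity dynamics model, noise parameters, and function names are assumptions made for illustration and are not taken from the paper.

```python
# Minimal particle-filter smoothing sketch, assuming a constant-velocity
# dynamics model and Gaussian measurement noise. Names and parameters are
# illustrative only, not the paper's implementation.
import numpy as np

def particle_filter_step(particles, weights, z_meas, dt=0.02,
                         process_std=0.05, meas_std=0.10):
    """One predict-update-resample cycle.

    particles : (N, 6) array of [x, y, z, vx, vy, vz] hypotheses
    weights   : (N,) importance weights
    z_meas    : (3,) position measured by the vision system
    """
    n = len(particles)
    # Predict: propagate each particle with the assumed dynamics model.
    particles[:, :3] += particles[:, 3:] * dt
    particles += np.random.normal(0.0, process_std, particles.shape)

    # Update: weight particles by the likelihood of the vision measurement.
    err = np.linalg.norm(particles[:, :3] - z_meas, axis=1)
    weights *= np.exp(-0.5 * (err / meas_std) ** 2)
    weights += 1e-300            # guard against numerical underflow
    weights /= weights.sum()

    # Resample when the effective sample size becomes too small.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = np.random.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)

    # Smoothed state estimate: weighted mean over particles.
    estimate = np.average(particles, axis=0, weights=weights)
    return particles, weights, estimate
```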