To address the complex calibration process and poor adaptability of traditional indoor visual positioning for unmanned aerial vehicles (UAVs), this paper proposes an omnidirectional spatial tracking and localization method for indoor UAVs based on a two-axis rotary table. First, feature points on the UAV fuselage are extracted from images captured by the rotary-table camera to obtain their pixel coordinates. The positions of these feature points in the camera coordinate system are then computed with the Perspective-n-Point (PnP) algorithm, using the known spatial positions of multiple feature points together with their corresponding pixel coordinates. Next, the calibrated rotary-axis parameters and the rotation angles of the rotary table are substituted into Rodrigues' rotation formula to unify the UAV positions acquired by the camera at different turntable orientations into a single coordinate system, yielding UAV localization over the omnidirectional space. Finally, the angles through which the rotary table should rotate are calculated from the obtained UAV pose and the spatial positions of the camera optical center and the rotary axes; these angles are fed back to the turntable, which rotates until the UAV lies at the center of the camera image, thereby realizing tracking and localization of the UAV. Experimental results show that the spatial range of localization is greatly expanded while the localization accuracy reaches the level of binocular visual localization. The proposed method conveniently realizes omnidirectional spatial tracking and localization of indoor UAVs.
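The coordinate-unification step described above can be illustrated with a minimal sketch. The code below implements Rodrigues' rotation formula in plain Python and uses it to map a UAV position measured in the camera frame at a given turntable angle back into a common base frame, then computes a hypothetical pan-angle feedback to re-center the UAV in the image. The function names (`rodrigues_rotate`, `to_base_frame`, `pan_feedback_angle`), the choice of the optical axis as +z, and the axis/angle conventions are illustrative assumptions, not the paper's actual implementation.

```python
import math

def rodrigues_rotate(v, axis, theta):
    """Rotate vector v about a unit axis by angle theta (radians) using
    Rodrigues' rotation formula:
    v' = v*cos(t) + (k x v)*sin(t) + k*(k.v)*(1 - cos(t))."""
    kx, ky, kz = axis
    vx, vy, vz = v
    c, s = math.cos(theta), math.sin(theta)
    # cross product k x v
    cx = ky * vz - kz * vy
    cy = kz * vx - kx * vz
    cz = kx * vy - ky * vx
    # dot product k . v
    d = kx * vx + ky * vy + kz * vz
    return (vx * c + cx * s + kx * d * (1 - c),
            vy * c + cy * s + ky * d * (1 - c),
            vz * c + cz * s + kz * d * (1 - c))

def to_base_frame(p_cam, axis, axis_point, theta):
    """Map a point measured in the camera frame at turntable angle theta
    into the unified base frame, assuming the calibrated rotary axis is
    given by a unit direction `axis` through the point `axis_point`.
    (Sign convention for theta is an assumption.)"""
    rel = tuple(p - a for p, a in zip(p_cam, axis_point))
    rot = rodrigues_rotate(rel, axis, theta)
    return tuple(r + a for r, a in zip(rot, axis_point))

def pan_feedback_angle(p_cam):
    """Hypothetical feedback: the pan angle (radians) needed to bring a
    point at camera-frame position p_cam onto the optical axis, taking
    +z as the viewing direction and x as the horizontal image axis."""
    x, _, z = p_cam
    return math.atan2(x, z)
```

For example, a point at (1, 0, 0) rotated 90 degrees about the z-axis lands at (0, 1, 0), and a UAV seen directly on the optical axis (x = 0) requires zero pan correction.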