Abstract

Ground-target three-dimensional positions measured from optical remote-sensing images taken by an unmanned aerial vehicle play an important role in related military and civil applications. When a single unmanned aerial vehicle is used, however, the localization accuracy is unstable and the efficiency is low. In this paper, a novel multi-unmanned aerial vehicle cooperative target localization measurement method is proposed to overcome these issues. In the target localization measurement stage, three or more unmanned aerial vehicles simultaneously observe the same ground target and acquire multiple remote-sensing images. According to the principle of perspective projection, the target point, its image point, and the camera's optical center are collinear, and nonlinear observation equations are established. These equations are then converted to linear equations using a Taylor expansion. Robust weighted least-squares estimation is used to solve the equations, with the objective of minimizing the weighted sum of squared re-projection errors of the target point over the multiple image pairs; this makes the best use of the effective information while limiting the influence of erroneous observations. An automatic weight-calculation strategy is designed, and the weight matrix and the target-position coordinates are updated in each iteration until the stopping condition is satisfied. Compared with the stereo-image-pair cross-target localization method, the multi-unmanned aerial vehicle cooperative method uses more observation information, which results in higher rendezvous accuracy and improved performance. Finally, the effectiveness and robustness of the method are verified by numerical simulation and flight testing. The results show that the proposed method effectively improves the precision of target localization and demonstrates great potential for providing more accurate target localization in engineering applications.
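
The workflow described above can be condensed into an iteratively reweighted least-squares loop. The following is a minimal numerical sketch, assuming pinhole cameras with known intrinsics K, orientation R, and optical center C for each UAV; the Huber-type weight function, the finite-difference linearization, and all variable names are illustrative assumptions of this sketch rather than the paper's exact formulation.

```python
import numpy as np

def project(K, R, C, X):
    """Pinhole projection of world point X into pixel coordinates.
    R rotates world coordinates into the camera frame; C is the optical center."""
    x_cam = R @ (X - C)
    uvw = K @ (x_cam / x_cam[2])
    return uvw[:2]

def localize_target(observations, X0, max_iter=20, k0=2.0):
    """Multi-view target localization by iteratively reweighted least squares.

    observations: list of (K, R, C, uv) tuples, one per UAV image.
    X0: initial guess of the 3-D target position (e.g. a two-view intersection).
    """
    X = np.asarray(X0, dtype=float)
    n = len(observations)
    for _ in range(max_iter):
        r = np.zeros(2 * n)            # stacked re-projection residuals
        J = np.zeros((2 * n, 3))       # Jacobian of the residuals w.r.t. X
        for i, (K, R, C, uv) in enumerate(observations):
            r[2*i:2*i+2] = uv - project(K, R, C, X)
            # First-order (Taylor) linearization via finite differences.
            eps = 1e-4
            for j in range(3):
                dXj = np.zeros(3); dXj[j] = eps
                J[2*i:2*i+2, j] = -(project(K, R, C, X + dXj) - project(K, R, C, X)) / eps
        # Robust re-weighting (illustrative Huber-type scheme, not the paper's exact rule).
        sigma = 1.4826 * np.median(np.abs(r)) + 1e-9    # robust scale of the residuals
        a = np.abs(r) / sigma
        w = np.minimum(1.0, k0 / np.maximum(a, 1e-12))  # weight factor per residual
        # Weighted normal equations of the linearized system: (J^T W J) dX = -J^T W r
        WJ = J * w[:, None]
        dX = np.linalg.solve(J.T @ WJ, -WJ.T @ r)
        X = X + dX
        if np.linalg.norm(dX) < 1e-6:                   # iteration stopping condition
            break
    return X
```

Given three or more (K, R, C, uv) observations and a rough initial position (for example, from intersecting any two lines of sight), the loop refines the estimate while automatically down-weighting observations whose re-projection residuals are large.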

Highlights

  • In recent years, unmanned aerial vehicles (UAVs) have been successfully applied in various fields

  • In this paper, based on the principle of stereo-image-pair cross-target localization, the same target point is measured multiple times by multiple UAVs, and the observations are combined by robust weighted least-squares estimation

  • When multi-UAV cooperative target localization is performed at large flight heights, the Z-axis component of the result is more sensitive to errors, and the deviation of the localization result is larger

Summary

Introduction

In recent years, unmanned aerial vehicles (UAVs) have been successfully applied in various fields. In the target localization process, the UAV position (longitude, latitude, height) and attitude (roll, pitch, yaw) can be provided by GPS and an IMU, respectively, and the azimuth and elevation angles of the camera can also be measured. This paper proposes multi-UAV cooperative target localization based on robust weighted least-squares estimation: building on the principle of stereo-image-pair cross-target localization, the same target point is measured multiple times by multiple UAVs, and the redundant observations are combined by robust weighted least-squares estimation. During UAV target localization, the attitude and height of the UAV and the azimuth and elevation angles of the camera differ at each measurement point.
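
Each measurement therefore has to be expressed in a common frame before the intersection can be computed. The sketch below shows one way to turn the UAV attitude and the camera azimuth and elevation angles into a unit line-of-sight vector; the NED frame, the Z-Y-X rotation order, and the gimbal convention are assumptions of this sketch, since the paper's exact conventions are not reproduced in this summary.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def line_of_sight(yaw, pitch, roll, cam_azimuth, cam_elevation):
    """Unit pointing vector of the camera expressed in a local NED frame.

    Assumes a Z-Y-X (yaw-pitch-roll) body rotation and a gimbal whose azimuth
    rotates about the body z-axis and whose elevation rotates about the resulting
    y-axis; the camera boresight is the body x-axis when every angle is zero.
    All angles are in radians.
    """
    R_body_to_ned = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
    R_cam_to_body = rot_z(cam_azimuth) @ rot_y(cam_elevation)
    return R_body_to_ned @ (R_cam_to_body @ np.array([1.0, 0.0, 0.0]))
```

With the UAV position p_i converted into the same local frame, observation i constrains the target to lie on the ray X = p_i + t·d_i, and intersecting the rays from two UAVs gives a convenient initial value for the iterative refinement sketched earlier.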

The weight factor v_i is assigned to each observation and, together with the target-position coordinates, is updated in every iteration so that observations with large residuals receive smaller weights; a typical form is sketched below.
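
The paper's own definition of v_i is not reproduced in this summary; purely as an illustration, a commonly used robust weight of the Huber/IGG type is

\[
v_i =
\begin{cases}
1, & |\tilde r_i| \le k_0, \\
\dfrac{k_0}{|\tilde r_i|}, & |\tilde r_i| > k_0,
\end{cases}
\qquad
\tilde r_i = \frac{r_i}{\hat\sigma},
\]

where \(r_i\) is the re-projection residual of observation \(i\), \(\hat\sigma\) is a robust estimate of its standard deviation, and \(k_0\) is a tuning constant (typically 1.5 to 2.5). The weight matrix \(W = \mathrm{diag}(v_1, \dots, v_n)\) is then recomputed from the residuals of the previous iteration.
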
The remaining sections of the paper present the experimental results and analysis, covering the error size and an analysis of the localization results, and close with the conclusion.
