Abstract

A multi-camera network is proposed to estimate an aircraft’s motion parameters relative to the reference platform in large outdoor fields. Multiple cameras are arranged to cover the aircraft’s large-scale motion space by field stitching. A camera calibration method using dynamic control points created by a multirotor unmanned aerial vehicle is presented for the condition in which the cameras’ fields of view are void. The relative deformation of the camera network caused by external environmental factors is measured and compensated using a combination of cameras and laser rangefinders. A series of field experiments was carried out using a fixed-wing aircraft without artificial markers, and the accuracy was evaluated against an onboard Differential Global Positioning System. The experimental results show that the multi-camera network is precise, robust, and highly dynamic, and can improve the aircraft’s landing accuracy.

Highlights

  • Estimating an aircraft’s motion parameters is an important application of multi-camera networks,[1] such as vision-based aircraft landing.[2,3] It is a basic requirement of flight tests to measure the aircraft’s distance, azimuth, velocity, and pose relative to the reference platform on which it will land, so that the aircraft can land universally and dynamically with high precision, even when the platform is moving.[4,5] A number of motion-parameter measurement systems for aircraft are currently in service.

  • The multi-camera network performs as expected: the distance error decreases as the aircraft approaches, and drops significantly on entering the near field from the middle field, because the calibration accuracy of the near-field cameras (C5–C6) is higher than that of the middle- and far-field cameras (C1–C4). This reflects the different calibration methods adopted: control points measured by a total station and dynamic control points created by an unmanned aerial vehicle (UAV), respectively.

  • A calibration method is presented for cameras whose field of view (FOV) is void, using dynamic control points created by a multirotor UAV; cameras combined with laser rangefinders (LRFs) are adopted to measure the relative deformation between the measurement unit on the left side and the one on the right side.


Summary

Introduction

Estimating an aircraft’s motion parameters is an important application of multi-camera networks,[1] such as vision-based aircraft landing.[2,3] It is a basic requirement of flight tests to measure the aircraft’s distance, azimuth, velocity, and pose relative to the reference platform on which it will land, so that the aircraft can land universally and dynamically with high precision, even when the platform is moving.[4,5] The workflow of our multi-camera network divides roughly into three phases: (1) calibrate the camera parameters (C1–C8) in the calibration coordinate system O_W–X_W Y_W Z_W; (2) measure and compensate the relative deformation between the left and right measurement units on the platform; (3) calculate the motion parameters of the aircraft relative to the platform coordinate system O_P–X_P Y_P Z_P. The homogeneous transformation H_78′ between the camera coordinate systems of C7 and C8′ is calculated by minimizing the sum of the reprojection errors of the control points in the two measurement units, under the distance constraint provided by the two LRFs.
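As a sketch of phase (1), a camera's projection matrix can be recovered from 3D–2D correspondences, such as dynamic control points created by a UAV, using the classical Direct Linear Transform (DLT). The control-point coordinates, intrinsics, and pose below are hypothetical values chosen only to make the example self-contained; they do not reproduce the paper's actual calibration pipeline.

```python
import numpy as np

# Hypothetical world coordinates of dynamic control points (UAV hover
# positions, in metres) — chosen to be non-coplanar, as the DLT requires.
X = np.array([
    [0, 0, 10], [5, 0, 12], [0, 5, 14], [5, 5, 16],
    [2, 8, 11], [8, 2, 13], [3, 3, 18], [7, 7, 15],
], dtype=float)

# Ground-truth camera (assumed intrinsics and pose), used here only to
# synthesise the image observations of the control points.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
Rt = np.hstack([np.eye(3), np.array([[1.0], [2.0], [3.0]])])
P_true = K @ Rt

def project(P, Xw):
    """Project 3D points with a 3x4 projection matrix (pinhole model)."""
    Xh = np.hstack([Xw, np.ones((len(Xw), 1))])
    uvw = (P @ Xh.T).T
    return uvw[:, :2] / uvw[:, 2:3]

x = project(P_true, X)  # simulated image measurements (pixels)

def dlt(Xw, xi):
    """Estimate the 3x4 projection matrix from >= 6 3D-2D correspondences.

    Each correspondence contributes two linear equations; the solution is
    the right singular vector of the smallest singular value.
    """
    A = []
    for (Xp, Yp, Zp), (u, v) in zip(Xw, xi):
        A.append([Xp, Yp, Zp, 1, 0, 0, 0, 0, -u*Xp, -u*Yp, -u*Zp, -u])
        A.append([0, 0, 0, 0, Xp, Yp, Zp, 1, -v*Xp, -v*Yp, -v*Zp, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)  # defined up to scale

P_est = dlt(X, x)
err = np.abs(project(P_est, X) - x).max()
print(f"max reprojection error: {err:.2e} px")
```

In practice, such a linear estimate would be refined by nonlinear minimization of the reprojection error; the same reprojection-error objective is what the paper minimizes, under the LRF distance constraint, when solving for the transformation between the two measurement units.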

Experiments and results
Conclusion
