Abstract

Unmanned aerial vehicles (UAVs) equipped with integrated global navigation satellite system/inertial navigation system (GNSS/INS) units together with RGB and hyperspectral (HS) cameras have become a popular data acquisition tool for several applications. To derive accurately georeferenced products from such systems, the spatial and rotational offsets between the onboard sensors and the GNSS/INS unit must be determined. While the spatial relationship (the lever arm) can be measured manually, establishing the angular offset, i.e., the boresight angles, is more challenging and requires a calibration strategy. Since the majority of RGB cameras use a frame imaging mechanism, boresight calibration of these sensors is usually conducted through automated triangulation of overlapping images. Most current HS cameras, on the other hand, are based on push-broom technology (also known as line cameras) and capture one 1D image line at a time. Automated triangulation of these non-overlapping scan lines is therefore not possible, so current boresight calibration strategies for HS cameras rely on dedicated calibration missions in which special targets are deployed in the study site and manually measured in the imagery. Although such calibration missions can yield accurate system calibration parameters, they are expensive and labor-intensive. To address these limitations, and motivated by the new trend of UAVs carrying both RGB and HS cameras, this study proposes a multi-modal triangulation approach that conducts an in-situ boresight calibration for the two cameras simultaneously. Experimental results over an agricultural field and an urban area show that the proposed approach produces orthophotos with high visual quality and geolocation accuracy.
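For context, the lever arm and boresight rotation enter the standard direct georeferencing model; the following is a textbook formulation (with notation assumed here, not taken from this paper):

\mathbf{r}_P^m = \mathbf{r}_b^m(t) + \mathbf{R}_b^m(t)\left(\mathbf{a}^b + \lambda\,\mathbf{R}_c^b\,\mathbf{r}_p^c\right)

where \mathbf{r}_b^m(t) and \mathbf{R}_b^m(t) are the time-dependent GNSS/INS-derived position and attitude of the body frame relative to the mapping frame, \mathbf{a}^b is the manually measured lever arm, \mathbf{R}_c^b is the boresight rotation matrix parameterized by the boresight angles to be calibrated, \lambda is a point-dependent scale factor, and \mathbf{r}_p^c is the image ray toward ground point P expressed in the camera frame. Errors in \mathbf{R}_c^b propagate directly into the ground coordinates \mathbf{r}_P^m, which is why an accurate boresight calibration is a prerequisite for well-georeferenced orthophotos.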
