Abstract

Multi- and hyperspectral cameras on drones can be valuable tools in environmental monitoring. A significant shortcoming complicating their use in quantitative remote sensing applications is the lack of sufficiently robust radiometric calibration methods. In the direct reflectance transformation method, the drone is equipped with a camera and an irradiance sensor, allowing transformation of image pixel values to reflectance factors without ground reference data. This method requires the sensors to be calibrated with higher accuracy than is usually required by the empirical line method (ELM), but in return it offers benefits in robustness, ease of operation, and the ability to be used on beyond visual line of sight (BVLOS) flights. The objective of this study was to develop and assess a drone-based workflow for direct reflectance transformation and to implement it on our hyperspectral remote sensing system. A novel atmospheric correction method is also introduced, using two reference panels; unlike in the ELM, the correction is not directly affected by changes in illumination. The sensor system consists of a hyperspectral camera (Rikola HSI, by Senop) and an onboard irradiance spectrometer (FGI AIRS), both of which were given thorough radiometric calibrations. In laboratory tests and in a flight experiment, the FGI AIRS tilt-corrected irradiances had an accuracy better than 1.9% at solar zenith angles up to 70°. The system's low-altitude reflectance factor accuracy was assessed in a flight experiment using reflectance reference panels, where the normalized root mean square errors (NRMSE) were less than ±2% for the light panels (25% and 50%) and less than ±4% for the dark panels (5% and 10%). In the high-altitude images, taken at 100–150 m altitude, the NRMSEs without atmospheric correction were within 1.4%–8.7% for VIS bands and 2.0%–18.5% for NIR bands. Significant atmospheric effects appeared already at 50 m flight altitude.
The proposed atmospheric correction was found to be practical, and it decreased the high-altitude NRMSEs to 1.3%–2.6% for VIS bands and to 2.3%–5.3% for NIR bands. Overall, the workflow was found to be efficient and to provide accuracies similar to the ELM, while offering operational advantages in challenging scenarios such as forest monitoring, large-scale autonomous mapping tasks, and real-time applications. Tests in varying illumination conditions showed that the reflectance factors of the gravel and vegetation targets varied by up to 8% between sunny and cloudy conditions due to reflectance anisotropy effects, while the direct reflectance workflow retained better accuracy. This suggests that varying illumination conditions have to be further accounted for in drone-based quantitative remote sensing applications.
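The core of the direct reflectance transformation is the conversion of at-sensor radiance to a reflectance factor using the simultaneously measured onboard irradiance, and accuracies above are reported as NRMSE against reference panels. A minimal sketch of both computations is below; the function names and array interfaces are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def reflectance_factor(radiance, irradiance):
    """Direct reflectance transformation: R = pi * L / E.

    radiance:   at-sensor radiance L (same band as irradiance)
    irradiance: downwelling irradiance E from the onboard sensor
    Assumes a Lambertian reference, so a perfect white panel gives R = 1.
    """
    return np.pi * np.asarray(radiance, dtype=float) / np.asarray(irradiance, dtype=float)

def nrmse_percent(estimated, reference):
    """Root mean square error normalized by the mean reference value, in %."""
    estimated = np.asarray(estimated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((estimated - reference) ** 2))
    return 100.0 * rmse / np.mean(reference)
```

For example, measured panel reflectances of 0.26 and 0.24 against a nominal 0.25 panel give an NRMSE of 4%.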

Highlights

  • The use of light-weight multi- and hyperspectral camera technologies is increasing rapidly in different applications

  • We present the cosine response calibration and improvements made to the AIRS cosine collector optics, which improve the absolute accuracy of the AIRS irradiances in varying illumination conditions

  • In our mapping flight test, we showed that the FGI AIRS was able to measure the irradiance on board the drone with normalized root mean square errors (NRMSE) of 1.26% in sunny conditions and 1.89% in fully cloudy conditions, relative to an ASD FieldSpec irradiance probe on the ground

Introduction

The use of light-weight multi- and hyperspectral camera technologies is increasing rapidly in different applications. Earlier research has shown that such hyperspectral cameras can be useful and accurate tools in drone-based remote sensing (Honkavaara et al., 2013; Suomalainen et al., 2014; Ristorto et al., 2015; de Oliveira et al., 2016; Yang et al., 2017; Barreto et al., 2019). Miniature multispectral cameras, such as MicaSense RedEdge and Altum (MicaSense, 2020a) and Tetracam MiniMCA (Tetracam, 2020), have spread widely in drone usage, as they are often smaller and cheaper than true hyperspectral cameras. Miniaturized multi- and hyperspectral cameras are suitable for remote sensing drone applications, providing interesting and cost-effective techniques for accurate geometric and radiometric characterization of objects.

