Abstract

Compact multi-spectral sensors that can be mounted on lightweight drones are now widely available and applied within the geo- and environmental sciences. However, the spatial consistency and radiometric quality of data from such sensors are relatively poorly explored beyond the lab, in operational settings and against other sensors. This study explores the extent to which accurate hemispherical-conical reflectance factors (HCRF) and vegetation indices (specifically the normalised difference vegetation index (NDVI) and the chlorophyll red-edge index (CHL)) can be derived from a low-cost multispectral drone-mounted sensor (Parrot Sequoia). The drone datasets were assessed using reference panels and a high-quality 1 m resolution reference dataset collected near-simultaneously by an airborne imaging spectrometer (HyPlant). Relative errors relating to the radiometric calibration to HCRF values were in the 4 to 15% range, whereas deviations assessed for a maize field case study were larger (5 to 28%). Drone-derived vegetation indices showed relatively good agreement for NDVI with both HyPlant and Sentinel-2 products (R² = 0.91). The HCRF, NDVI and CHL products from the Sequoia showed bias for surfaces of both high and low reflectance. The spatial consistency of the products was high, with minimal view-angle effects in the visible bands. In summary, compact multi-spectral sensors such as the Parrot Sequoia show good potential for use in index-based vegetation monitoring studies across scales, but care must be taken when assuming derived HCRF to represent the true optical properties of the imaged surface.

Highlights

  • Spectral information and derived vegetation indices are at the core of a wide array of methodologies for the monitoring of vegetation

  • This study explores the extent to which accurate hemispherical-conical reflectance factors (HCRF) and vegetation indices (the normalised difference vegetation index (NDVI) and the chlorophyll red-edge index (CHL)) can be derived from a low-cost multispectral drone-mounted sensor (Parrot Sequoia)

  • This study assessed the accuracy and spatial consistency of drone-based HCRF acquired under ideal illumination conditions, as well as the accuracy of the vegetation indices and their comparability to products from other sensors, including a fine-resolution HCRF comparison against spectral bands simulated from the HyPlant DUAL hyperspectral imaging sensor (see the sketch after this list)
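
The band-simulation procedure used in the study is not described in this excerpt. The following is a minimal Python sketch of the general technique: weighting a fine-resolution HCRF spectrum with a sensor spectral response function (SRF) to approximate what a broadband sensor would record. The Gaussian SRF, the function names and the toy spectrum are illustrative assumptions, not the study's actual implementation.

```python
import numpy as np

def gaussian_srf(wavelengths, centre, fwhm):
    """Illustrative Gaussian spectral response function (assumption:
    real sensor SRFs are measured, not necessarily Gaussian)."""
    sigma = fwhm / 2.355  # convert full width at half maximum to sigma
    return np.exp(-0.5 * ((wavelengths - centre) / sigma) ** 2)

def simulate_band(reflectance, srf):
    """SRF-weighted average of a hyperspectral reflectance spectrum,
    approximating the reflectance a broadband sensor would record."""
    return np.sum(srf * reflectance) / np.sum(srf)

# Example: simulate a red-edge band (~735 nm, ~10 nm wide) from a
# hypothetical 1 nm resolution HCRF spectrum.
wl = np.arange(400.0, 1000.0, 1.0)                        # wavelengths in nm
spectrum = 0.05 + 0.45 / (1 + np.exp(-(wl - 720) / 15))   # toy vegetation-like HCRF
red_edge = simulate_band(spectrum, gaussian_srf(wl, 735, 10))
print(f"Simulated red-edge HCRF: {red_edge:.3f}")
```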


Introduction

Spectral information and derived vegetation indices are at the core of a wide array of methodologies for the monitoring of vegetation. In tandem with the development and production of compact and lightweight turnkey multispectral sensors, there has been a surge in drone-based multispectral data applications in recent years, which make use of the fine spatial resolution or frequent revisit capabilities [12,13,14,15]. Popular sensor solutions are multi-camera array (MCA) systems, such as the Parrot Sequoia (Parrot SA, France) and Micasense RedEdge (Micasense, US), which consist of individual cameras with different band-pass filters that record reflected light in specific narrow (10–40 nm) wavelength intervals. Designed primarily for applications in precision agriculture, these sensors exhibit a number of qualities which have made them attractive to the scientific community, primarily their low financial cost, simple integration into lightweight drone systems, and accompanying software options.
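
To make the indices concrete, the sketch below shows how NDVI and CHL can be computed from per-band HCRF values of Sequoia-like bands (red ~660 nm, red edge ~735 nm, NIR ~790 nm). It assumes CHL denotes the red-edge chlorophyll index NIR/RedEdge − 1; the exact index definitions used in the study are not given in this excerpt, and the array values are toy numbers.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def chl_red_edge(nir, red_edge):
    """Red-edge chlorophyll index, here assumed to be NIR / RedEdge - 1."""
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return nir / red_edge - 1.0

# Toy per-pixel HCRF values for three pixels
# (two vegetated pixels and one bare-soil pixel).
red      = np.array([0.05, 0.08, 0.30])
red_edge = np.array([0.25, 0.30, 0.33])
nir      = np.array([0.45, 0.50, 0.35])

print(ndvi(nir, red))            # high for vegetation, low for soil
print(chl_red_edge(nir, red_edge))
```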

