Abstract

High spatial resolution hyperspectral data, often used in precision farming applications, are not available from current satellite sensors and are difficult or expensive to acquire from standard aircraft. In precision farming, unmanned aerial vehicles (UAVs) are emerging as a lower-cost and more flexible means to acquire very high resolution imagery. Miniaturized hyperspectral sensors have been developed for UAVs, but the sensors, associated hardware, and data processing software are still cost prohibitive for individual farmers or small remote sensing firms. This study simulated hyperspectral image data by fusing multispectral camera imagery with spectrometer data. We mounted a low-cost, lightweight multispectral camera and spectrometer on a standard UAV and developed procedures for their precise data alignment, followed by fusion of the spectrometer data with the image data to produce estimated spectra for all multispectral camera image pixels. To align the data collected from the two sensors in both the time and space domains, a post-acquisition correlation-based global optimization method was used. Data fusion, to estimate hyperspectral reflectance, was implemented using several methods for comparison. Flight data from two crop sites, one of tomatoes and the other of corn and soybeans, were used to evaluate the alignment procedure and the data fusion results. The data alignment procedure produced a peak R² between the spectrometer and camera data of 0.95 and 0.72 for the two test sites, respectively. The multispectral camera data at these space and time offsets were taken as the best match to a given spectrometer reading and used in modelling to estimate hyperspectral imagery from the multispectral camera pixel data. Of the fusion approaches evaluated, principal component analysis (PCA) based models and Bayesian imputation reached similar accuracy and outperformed simple spline interpolation. Mean absolute error (MAE) between predicted and observed spectra was 17% relative to the mean of the observed spectra, and root mean squared error (RMSE) was 0.028. This approach to deriving estimated hyperspectral image data can be applied simply and at very low cost for crop assessment and monitoring within individual fields.
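The abstract does not detail the exact formulation of the PCA-based fusion models, so the following is only a hedged illustration of how such a model could be assembled: compress the aligned spectrometer spectra with PCA, regress the component scores on the co-located multispectral band values, and reconstruct an estimated spectrum for every camera pixel. The function names, the use of ordinary least squares, the number of components, and the relative-MAE helper are assumptions for illustration, not details taken from the study.

    # Hedged sketch (assumed formulation, not the authors' code) of a PCA-based
    # fusion model linking multispectral band values to full reflectance spectra.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    def fit_pca_fusion(X_bands, Y_spectra, n_components=5):
        """Fit PCA on aligned spectrometer spectra (n_samples x n_wavelengths)
        and a linear map from multispectral band values (n_samples x n_bands)
        to the resulting PCA scores."""
        pca = PCA(n_components=n_components).fit(Y_spectra)
        reg = LinearRegression().fit(X_bands, pca.transform(Y_spectra))
        return pca, reg

    def predict_spectra(pca, reg, X_pixels):
        """Estimate a reflectance spectrum for each pixel by predicting PCA
        scores from its band values and inverting the PCA."""
        return pca.inverse_transform(reg.predict(X_pixels))

    def relative_mae(Y_true, Y_pred):
        """Mean absolute error relative to the mean observed reflectance,
        the accuracy measure quoted in the abstract."""
        return np.mean(np.abs(Y_true - Y_pred)) / np.mean(Y_true)

A spline-interpolation baseline of the kind used for comparison in the study could be obtained by interpolating each pixel's band reflectances across wavelength, for example with scipy.interpolate.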

Highlights

  • Hyperspectral sensors with many narrow spectral bands have been shown to be able to characterize vegetation type, health, and function [1,2,3,4]

  • This study demonstrated that a multispectral camera coupled with a spectrometer on an unmanned aerial vehicle (UAV) can be used to estimate hyperspectral image data through data fusion

  • In the sensor alignment process, as an alternative to more expensive and complex hyperspectral cameras, and to complex positional and orientation measurement and processing, the multispectral camera and spectrometer were visually aligned in the UAV mount


Introduction

Hyperspectral sensors with many narrow spectral bands have been shown to be able to characterize vegetation type, health, and function [1,2,3,4]. Hyperspectral data have been reported to perform better in modelling vegetation chlorophyll content [5]. They can be used to calculate narrow band indices for modelling crown temperature, carotenoids, fluorescence, and plant disease [6,7], as well as crop growth period [8], soil status [9], net photosynthesis, and crop water stress [10], amongst other vegetation parameters. Unmanned aerial vehicles (UAVs) are a rapidly evolving and flexible platform for remote sensing. They offer a promising alternative to standard aircraft in terms of timing flexibility and the capability to fly at very low altitudes, thereby acquiring imagery of very high spatial resolution.
