Abstract

Optical sensors mounted on uncrewed aerial vehicles (UAVs) are typically pointed straight downward to simplify structure-from-motion and image processing. However, the high horizontal and vertical image overlap of typical UAV missions means that each object is imaged from a range of view angles, yielding a rich multi-angular reflectance dataset. We propose a method to extract reflectance data, together with the associated view zenith angles (VZA) and view azimuth angles (VAA), from UAV-mounted optical cameras, enhancing plant parameter classification compared to standard orthomosaic reflectance retrieval. A standard (nadir) and a multi-angular 10-band multispectral dataset were collected for maize using a UAV on two different days. Reflectance data were grouped by VZA and VAA (on average 2594 spectra/plot/day for the multi-angular data, 890 spectra/plot/day for nadir flights only, and 13 spectra/plot/day for a standard orthomosaic) and served as predictor variables for classifying leaf chlorophyll content (LCC), leaf area index (LAI), green leaf area index (GLAI), and nitrogen balanced index (NBI). Results consistently showed higher accuracy using VZA/VAA-grouped reflectance than using standard orthomosaic data. Pooling all reflectance values across viewing directions did not yield satisfactory results. Performing multiple flights to obtain a multi-angular dataset did not improve performance over a multi-angular dataset obtained from a single nadir flight, indicating that one nadir flight is sufficient. Our openly shared code (https://github.com/ReneHeim/proj_on_uav) facilitates access to reflectance data from pre-defined VZA/VAA groups, helping cross-disciplinary and agricultural scientists harness the potential of multi-angular datasets.
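To illustrate the grouping step described above, the following is a minimal sketch, not the repository's actual API: it assumes per-pixel observations are already available in a pandas DataFrame with hypothetical columns plot_id, vza, vaa, and band_* reflectance values, and it bins each observation into discrete VZA/VAA groups before averaging one spectrum per plot and group.

```python
# Minimal sketch of VZA/VAA grouping (illustrative column names, not the
# proj_on_uav API). Each row is one pixel observation with its view angles.
import pandas as pd

def group_by_view_angle(df: pd.DataFrame,
                        vza_step: float = 10.0,
                        vaa_step: float = 45.0) -> pd.DataFrame:
    """Average per-pixel reflectance into discrete VZA/VAA bins per plot."""
    out = df.copy()
    # Discretize view zenith (0-90 deg) and view azimuth (0-360 deg) angles.
    out["vza_bin"] = (out["vza"] // vza_step) * vza_step
    out["vaa_bin"] = (out["vaa"] // vaa_step) * vaa_step
    band_cols = [c for c in out.columns if c.startswith("band_")]
    # One mean spectrum per plot and angular group, usable as predictors.
    return (out.groupby(["plot_id", "vza_bin", "vaa_bin"])[band_cols]
               .mean()
               .reset_index())
```

With, for example, 10-degree VZA steps and 45-degree VAA steps, each plot contributes one mean spectrum per occupied angular bin, which matches the idea of using angle-grouped reflectance rather than a single pooled or orthomosaic value as the classification input.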
