Abstract

Unmanned aerial vehicles (UAVs) equipped with imaging and ranging sensors have become an effective remote sensing data acquisition tool for digital agriculture. Among the potential products derived from UAVs, high-resolution orthophotos play an important role in several phenotyping activities, such as canopy cover estimation and flowering date identification. Current structure from motion (SfM) tools for image-based 3-D reconstruction and orthophoto generation do not perform well on large-scale imagery over mechanized agricultural fields. This failure is mainly due to their inability to identify enough conjugate points among overlapping images captured at low altitudes. This study addresses this limitation through a new strategy that uses plant row segments as linear features in the triangulation process. The linear features are derived in two steps. First, an automated approach extracts plant row segments from the LiDAR data, which are then back-projected onto the imagery using the available trajectory and system calibration parameters. Second, a machine-assisted strategy adjusts the line segments in image space to derive accurate linear features. In the proposed framework, the triangulation process is conducted by investigating two mathematical models—referred to as object-space and image-space coplanarity constraints—for incorporating linear features into the bundle adjustment (BA). The orthophoto is generated using the refined trajectory and system calibration parameters derived from the BA process. Several experimental results over an agricultural field show that the proposed framework outperforms commonly used SfM tools, e.g., Pix4D Mapper Pro and Agisoft Metashape, in terms of generating orthophotos with high visual quality and geolocation accuracy.
The results also indicate that the object-space coplanarity constraint is more robust against potential noise in line measurements than the image-space coplanarity model. However, both models achieve high absolute accuracy in the range of ±2–4 cm when the noise level in the image measurements of points along the line is reasonable, i.e., ~5–10 pixels.
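To make the two core geometric operations named in the abstract concrete, the sketch below illustrates (a) back-projecting an object point into an image via the standard collinearity equations, given the trajectory (perspective center and orientation) and a focal length from system calibration, and (b) evaluating an object-space coplanarity residual: the ray through an image measurement of a point along a plant row must lie in the plane spanned by the perspective center and the 3-D line segment. This is a minimal illustration under a simple omega-phi-kappa camera model, not the paper's full BA formulation; all function names and the test geometry are hypothetical.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix (omega-phi-kappa convention, object-to-image axes).
    A common photogrammetric parameterization; assumed here for illustration."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def back_project(X, X0, R, f):
    """Collinearity equations: map object point X to image coordinates (x, y),
    given perspective center X0, rotation R, and focal length f."""
    u = R.T @ (X - X0)            # object point expressed in camera frame
    return -f * u[0] / u[2], -f * u[1] / u[2]

def coplanarity_residual(x_img, f, X0, R, A, B):
    """Object-space coplanarity: the ray through image point x_img must lie in
    the plane defined by the perspective center X0 and the 3-D line A-B.
    The residual is the scalar triple product; zero means coplanar."""
    n = np.cross(A - X0, B - X0)                       # normal of the X0-A-B plane
    ray = R @ np.array([x_img[0], x_img[1], -f])       # image ray in object space
    return float(n @ ray)
```

In a BA with linear features, residuals of this form (one per image measurement along a plant row segment) would be appended to the usual point-based collinearity residuals and minimized over the trajectory and calibration parameters.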
