Abstract

Automated and cost-effective phenotyping pipelines are needed to efficiently characterize the new lines and hybrids developed in plant breeding programs. In this study, we employ deep neural networks (DNNs), specifically the PointNet architecture, to model individual maize plants using 3D point cloud data derived from unmanned aerial system (UAS) imagery. The experiment was conducted at the Indiana Corn and Soybean Innovation Center at the Agronomy Center for Research and Education (ACRE) in West Lafayette, Indiana, USA. On June 17, 2020, a flight was carried out over maize trials using a custom-designed UAS platform equipped with a Sony Alpha ILCE-7R photogrammetric sensor. The RGB images were processed with a standard Structure from Motion (SfM) photogrammetric pipeline to reconstruct the study field as a scaled 3D point cloud. Fifty individual maize plants were manually segmented from the point cloud to train the DNN, and individual plants were subsequently extracted from a test trial containing more than 5,000 plants. Moreover, to reduce overfitting in the fully connected layers, we applied data augmentation not only in translation but also in color intensity. Results show a success rate of 72.4% for the extraction of individual plants. Our test trial demonstrates the feasibility of using deep learning to address the challenge of individual maize plant extraction from UAS data.
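
As a rough illustration of the augmentation strategy mentioned above, the sketch below jitters a training point cloud with a random translation and a random color-intensity scaling before it is fed to the network. This is a minimal NumPy sketch, not the authors' implementation; the array layout (N x 6, XYZ plus RGB) and all parameter magnitudes are assumptions chosen for illustration.

import numpy as np

def augment_plant_cloud(points, max_shift=0.05, intensity_range=(0.8, 1.2), rng=None):
    """Augment one point cloud: random XYZ translation plus random
    color-intensity scaling. `points` is an (N, 6) array with columns
    x, y, z, r, g, b, where colors lie in [0, 1]. Parameter values
    are illustrative assumptions, not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    out = points.copy()

    # Random translation: shift the whole plant by up to max_shift
    # along each axis so the network does not memorize absolute positions.
    out[:, :3] += rng.uniform(-max_shift, max_shift, size=3)

    # Random color-intensity scaling: multiply RGB by one shared factor
    # to mimic illumination differences across the field, then clip.
    out[:, 3:6] = np.clip(out[:, 3:6] * rng.uniform(*intensity_range), 0.0, 1.0)
    return out

In practice, such a function would be applied on the fly to each sample during training, so the network sees a slightly different version of every plant in each epoch.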
