Abstract

Phenomics technologies allow quantitative assessment of phenotypes across a larger number of plant genotypes than traditional phenotyping approaches. The utilization of such technologies has enabled the generation of multidimensional plant trait data, creating large datasets. However, to harness the power of phenomics technologies, more sophisticated data analysis methods are required. In this study, Aphanomyces root rot (ARR) resistance in 547 lentil accessions and lines was evaluated using Red-Green-Blue (RGB) images of roots. We created a dataset of 6,460 root images that were annotated by a plant breeder based on disease severity. Two approaches, a generalized linear model with elastic net regularization (EN) and a convolutional neural network (CNN), were developed to classify disease resistance into three classes: resistant, partially resistant, and susceptible. The results indicated that the image features selected using EN models were able to classify the three disease categories with an accuracy of up to 0.91 ± 0.004 (0.96 ± 0.005 resistant, 0.82 ± 0.009 partially resistant, and 0.92 ± 0.007 susceptible), compared to the CNN with an accuracy of about 0.84 ± 0.009 (0.96 ± 0.008 resistant, 0.68 ± 0.026 partially resistant, and 0.83 ± 0.015 susceptible). The resistant class was accurately detected using both classification methods. However, the partially resistant class was challenging to detect, as its features (data) often overlapped with those of the resistant and susceptible classes. Collectively, the findings provided insights into the use of phenomics techniques and machine learning approaches to provide quantitative measures of ARR resistance in lentil.

Highlights

  • Crop phenotyping refers to a key process in crop improvement programs, associated with the evaluation of expressed plant traits as a result of interaction between the genotype and the environment

  • In an effort to assist in the process of phenotyping, in our previous work [20], we evaluated the potential of RGB and hyperspectral image features extracted from lentil shoots/roots, integrated with an elastic net regression model, for disease class prediction

  • The third experiment was conducted in November 2018 and consisted of 334 lentil accessions from the lentil single plant-derived (LSP) collection, grown in a randomized complete block design with ten replicates



Introduction

Crop phenotyping refers to a key process in crop improvement programs, associated with the evaluation of expressed plant traits as a result of interaction between the genotype and the environment. A few studies have started to focus on explaining the "black boxes" (the inference/decision process) associated with deep learning (DL) architectures [14, 18]. Approaches such as top-K high-resolution profile maps have been proposed to visualize the predictions associated with DL-based detection of foliar stress symptoms in soybean, to better understand the model's behavior [14]. In an effort to assist in the process of phenotyping, in our previous work [20], we evaluated the potential of RGB and hyperspectral image features extracted from lentil shoots/roots, integrated with an elastic net regression model, for disease class prediction. Given the potential benefits of DL tools, in this study we built and compared two approaches, a generalized linear model with elastic net regularization (EN) and a deep learning (CNN) model, to classify ARR disease severity from lentil root images into three classes (resistant, partially resistant, and susceptible) using a larger dataset.
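The EN approach described above corresponds to a multinomial logistic regression with an elastic-net penalty applied to image-derived features. A minimal sketch using scikit-learn is shown below; the synthetic feature matrix, class labels, and hyperparameter values (`l1_ratio`, `C`) are illustrative assumptions, not the study's actual data or settings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for root-image features: 600 samples x 20 features,
# three classes (0 = resistant, 1 = partially resistant, 2 = susceptible).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))
# Derive labels from the first few features so the problem has linear signal.
score = X[:, :3].sum(axis=1)
y = (score > -1).astype(int) + (score > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Elastic-net-penalized multinomial logistic regression ("EN" classifier).
# l1_ratio mixes the L1 and L2 penalties; C is the inverse regularization
# strength; the "saga" solver is required for the elastic-net penalty.
clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(
        penalty="elasticnet", solver="saga", l1_ratio=0.5, C=1.0, max_iter=5000
    ),
)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

In practice, `l1_ratio` and `C` would be tuned by cross-validation (e.g. `LogisticRegressionCV`); the L1 component drives irrelevant feature coefficients to zero, which is what makes elastic net useful for selecting a subset of informative image features.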

Materials and Methods
Results
Discussion
Conflicts of Interest

