Abstract
The use of low-cost depth imaging sensors is investigated to automate plant pathology tests. The spatial evolution of depth images is explored to discriminate plant resistance expressed through a hypersensitive reaction involving cotyledon loss. A high temporal frame rate and a protocol operating on batches of plants compensate for the low spatial resolution of depth cameras. Despite the high planting density, a drop in the measured depth is observed when cotyledon loss occurs. We introduce a small and simple spatiotemporal feature space which is shown to carry enough information to automate the discrimination between batches of resistant plants (loss of cotyledons) and susceptible plants (no loss of cotyledons) with 97% accuracy, roughly 30 times faster than human annotation. The robustness of the method, with respect to plant density within a batch and possible desynchronization within a batch, is assessed successfully on hundreds of pepper varieties in various environments. A study of the generalizability of the method suggests that it can be extended to other pathosystems and also to segregating plants, i.e., intermediate cases with batches composed of both resistant and susceptible plants. The imaging system developed, combined with the feature extraction method and classification model, provides a full pipeline whose throughput and cost efficiency are unmatched by state-of-the-art systems. This system can be deployed as a decision-support tool and is also compatible with standalone operation in which computation is performed at the edge in real time.
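To illustrate the idea summarized above, the following is a minimal sketch, not the authors' actual pipeline: it assumes a stack of depth frames per batch, reduces it to a small hypothetical temporal feature vector (canopy-height drop, trend, and residual height), and applies a toy threshold rule in place of the paper's trained classifier. All function names, features, and thresholds are illustrative assumptions.

```python
# Illustrative sketch only: the paper's real feature space and classifier differ.
import numpy as np

def batch_depth_features(depth_frames, background_depth):
    """Reduce a (T, H, W) stack of depth frames for one batch of plants to a
    small temporal feature vector (hypothetical choice of features)."""
    # Canopy height per frame: tray (background) depth minus measured depth,
    # clipped at zero and averaged over the whole batch image.
    heights = np.clip(background_depth - depth_frames, 0.0, None)
    mean_height = heights.mean(axis=(1, 2))              # temporal height profile
    drop = mean_height.max() - mean_height[-1]           # overall drop in canopy height
    slope = np.polyfit(np.arange(len(mean_height)), mean_height, 1)[0]
    return np.array([drop, slope, mean_height[-1] / (mean_height.max() + 1e-9)])

def classify_batch(features, drop_threshold=5.0):
    """Toy decision rule: a large canopy-height drop suggests cotyledon loss
    (resistant batch); otherwise the batch is called susceptible."""
    return "resistant" if features[0] > drop_threshold else "susceptible"

# Example with synthetic data: 60 frames of a 64x64 depth map (mm).
rng = np.random.default_rng(0)
frames = 780.0 + rng.normal(0.0, 1.0, (60, 64, 64))   # plants ~20 mm above the tray
frames[30:] += 10.0                                    # simulated drop after cotyledon loss
print(classify_batch(batch_depth_features(frames, background_depth=800.0)))
```

In practice the decision rule would be replaced by a classifier trained on annotated batches, which is the role of the classification model mentioned in the abstract.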