Abstract

This paper investigates the detection of vegetation in unstructured outdoor environments to guide an autonomous robot safely while exploiting its mobility in cluttered terrain. The aim is an adaptive learning algorithm that is quantitatively accurate and fast enough for real-time operation. Chlorophyll-rich pixels are first selected by thresholding vegetation indices and serve as the seeds of a “spread vegetation” region. For each seed pixel, a convex combination of color and texture dissimilarities measures the difference between the pixel and its neighbors. This combination, trained via semi-supervised learning, models either the dissimilarity between two vegetation pixels or that between a vegetation pixel and a non-vegetation pixel, allowing a greedy decision process to expand the spread vegetation, a step we call vision-based spreading. To avoid overspreading, especially in the presence of noise, a spreading scale bounds the expansion. In parallel, a second vegetation spreading based on spectral reflectance is carried out. Finally, the intersection of the vision-based and spectral reflectance-based results is added to the spread vegetation. Because the approach combines visual cues with chlorophyll light-absorption properties, it captures more detailed vegetation features than prior art and yields a richer interpretation of the vegetation representation, even in scenes with significant overexposure or underexposure, or with mixed shadow and sunshine. In all real-world experiments we carried out, the approach achieved a detection accuracy above 90%.
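The seed-then-spread pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the specific vegetation index is not named in the abstract (Excess Green is used here as an assumption), the convex weight `alpha` stands in for the semi-supervised learned combination, and the thresholds and texture feature are hypothetical placeholders.

```python
import numpy as np
from collections import deque

def detect_vegetation(rgb, texture, alpha=0.6, seed_thresh=0.1,
                      spread_thresh=0.15, max_scale=5000):
    """Seed chlorophyll-rich pixels by thresholding a vegetation index,
    then greedily spread to neighbors whose convex-combined color/texture
    dissimilarity is small, capped by a spreading scale.

    rgb:     (H, W, 3) float array in [0, 1]
    texture: (H, W) scalar texture feature per pixel (assumed given)
    alpha:   convex weight (the paper learns this semi-supervised)
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                      # Excess Green index (assumed)
    veg = exg > seed_thresh                  # seed mask of chlorophyll-rich pixels
    frontier = deque(zip(*np.nonzero(veg)))  # start spreading from all seeds
    grown, (h, w) = 0, veg.shape
    while frontier and grown < max_scale:    # spreading scale guards overspreading
        y, x = frontier.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not veg[ny, nx]:
                d_color = np.abs(rgb[y, x] - rgb[ny, nx]).mean()
                d_tex = abs(texture[y, x] - texture[ny, nx])
                # convex combination of the two dissimilarities
                if alpha * d_color + (1 - alpha) * d_tex < spread_thresh:
                    veg[ny, nx] = True       # greedy accept: add to spread vegetation
                    frontier.append((ny, nx))
                    grown += 1
    return veg
```

In the full method this mask would then be intersected with a second, spectral reflectance-based spreading result before the final vegetation map is produced; that step is omitted here.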
