Abstract
Identifying terrain type is crucial to safely operating planetary exploration rovers. Vision-based terrain classifiers, which are typically trained on thousands of labeled images using machine learning methods, have proven particularly successful. However, since planetary rovers are to boldly go where no one has gone before, training data are usually not available a priori; instead, rovers have to learn quickly from their own experiences in an early phase of surface operation. This research addresses the challenge by combining two key ideas. The first idea is to use both onboard imagery and vibration data, and to let rovers learn from physical experiences through self-supervised learning. The underlying fact is that visually similar terrain may be disambiguated by mechanical vibrations. The second idea is to employ the co-training and self-training approaches. The idea of co-training is to train two classifiers separately on vision and vibration data, and to re-train them iteratively on each other's output. Meanwhile, the self-training approach, applied only to the vision-based classifier, re-trains the classifier on its own output. Both approaches essentially increase the number of labels, and hence enable the terrain classifiers to operate from a sparse training dataset. The proposed approach was validated with a four-wheeled test rover on Mars-analogous terrain, including bedrock, soil, and sand. The co-training setup, based on support vector machines with color and wavelet-based features, successfully estimated terrain types with 82% accuracy from only three labeled images.
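The co-training loop described above can be sketched in miniature: two classifiers, one per sensing modality, each pseudo-label their most confident unlabeled samples and feed those labels back into a shared training pool. This is an illustrative sketch only; the toy 1-D features, nearest-centroid classifiers, class names, and round/threshold parameters are assumptions standing in for the paper's SVMs on color/wavelet and vibration features.

```python
# Hedged sketch of co-training: two views (vision, vibration) alternately
# pseudo-label confident unlabeled samples for the shared label pool.
# Nearest-centroid classifiers on synthetic 1-D features stand in for the
# paper's SVMs; all data here are made up for illustration.

def centroid_fit(xs, ys):
    """Fit a nearest-centroid classifier: per-class mean of 1-D features."""
    sums, counts = {}, {}
    for x, y in zip(xs, ys):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {c: sums[c] / counts[c] for c in sums}

def centroid_predict(model, x):
    """Return (predicted class, confidence = negative centroid distance)."""
    c = min(model, key=lambda k: abs(x - model[k]))
    return c, -abs(x - model[c])

def co_train(vision, vibration, labels, n_labeled, rounds=3, per_round=2):
    """Co-training: each view's classifier pseudo-labels its most
    confident unlabeled samples, growing the shared label set."""
    labeled = set(range(n_labeled))
    y = {i: labels[i] for i in labeled}
    for _ in range(rounds):
        for feats in (vision, vibration):
            model = centroid_fit([feats[i] for i in labeled],
                                 [y[i] for i in labeled])
            unlabeled = [i for i in range(len(feats)) if i not in labeled]
            preds = [(i, *centroid_predict(model, feats[i])) for i in unlabeled]
            preds.sort(key=lambda t: t[2], reverse=True)  # most confident first
            for i, cls, _ in preds[:per_round]:
                labeled.add(i)
                y[i] = cls  # pseudo-label joins the shared pool
    return y

# Toy data: two terrain classes, separable in both views;
# only the first three samples carry ground-truth labels.
vision    = [0.1, 0.2, 0.9, 0.15, 0.85, 0.95, 0.05, 0.8]
vibration = [0.0, 0.1, 1.0, 0.05, 0.9, 1.1, 0.1, 0.95]
labels    = ["soil", "soil", "sand"]

pseudo = co_train(vision, vibration, labels, n_labeled=3)
print(pseudo)  # every sample ends up labeled after a few rounds
```

Self-training, applied to the vision classifier alone in the paper, is the degenerate case of the same loop with a single view re-training on its own pseudo-labels.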