Abstract

Occupancy grid maps built from geometric information alone are widely used for autonomous mobile robots. Outdoors, however, there are areas that robots should not enter, such as grass, and because geometric information is insufficient to distinguish them, they are not reflected in the occupancy grid map. This work adds semantic information about the ground surface to a prior occupancy grid map in order to recognize traversable regions. We create a semantically segmented bird's eye view (BEV) using semantic segmentation and inverse perspective mapping (IPM), and then apply a one-sided truncated Gaussian filter and a binary Bayes filter to handle the uncertainty of semantic segmentation and IPM. We evaluated the method on an approximately 1-km route at the University of Tsukuba and found that recognition accuracy is highest when the two filters are applied together.
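The following is a minimal sketch, not the authors' implementation, of how BEV cells could be fused with a binary Bayes filter in log-odds form while a one-sided truncated Gaussian down-weights distant, IPM-uncertain cells. All parameters (mean, standard deviation, the blending of the measurement toward the prior) are illustrative assumptions.

```python
import numpy as np

def one_sided_truncated_gaussian(d, mu=5.0, sigma=3.0):
    """Confidence weight for a BEV cell at distance d [m] from the camera.
    Full confidence up to mu, Gaussian decay beyond it (one-sided truncation).
    mu and sigma are assumed values for illustration only."""
    d = np.asarray(d, dtype=float)
    w = np.exp(-0.5 * ((d - mu) / sigma) ** 2)
    return np.where(d <= mu, 1.0, w)

def bayes_update(log_odds, p_meas, weight, prior=0.5):
    """Binary Bayes filter update in log-odds form.
    p_meas: per-cell probability of 'traversable' from the semantic BEV.
    weight: per-cell confidence in [0, 1]; blends the measurement toward the prior,
    so low-confidence (distant) cells barely change the estimate."""
    p = weight * p_meas + (1.0 - weight) * prior
    p = np.clip(p, 1e-6, 1.0 - 1e-6)
    return log_odds + np.log(p / (1.0 - p)) - np.log(prior / (1.0 - prior))

# Toy usage: a 1-D strip of BEV cells at increasing distance from the camera.
dist = np.linspace(0.5, 15.0, 8)                  # metres from the camera
p_traversable = np.array([0.9] * 4 + [0.2] * 4)   # segmentation marks far cells as grass
log_odds = np.zeros_like(dist)                    # start at the 0.5 prior
for _ in range(5):                                # five consecutive observations
    log_odds = bayes_update(log_odds, p_traversable,
                            one_sided_truncated_gaussian(dist))
posterior = 1.0 / (1.0 + np.exp(-log_odds))
print(np.round(posterior, 2))
```

With this weighting, nearby cells converge quickly to the segmentation result, while distant cells, where IPM projection error is largest, stay close to the prior until they are observed from a shorter range.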
