Abstract
Self-driving vehicles have attracted significant attention from both industry and academia. Despite intensive research efforts on environment-aware perception models, accurate decision-making in real-world driving scenarios remains challenging. Today's state-of-the-art solutions typically hinge on end-to-end DNN-based perception-control models, which map perception directly to driving decisions. However, such DNN models may fail in complex driving scenarios that require relational reasoning. This paper proposes a hierarchical perception decision-making framework for autonomous driving that employs hypergraph-based reasoning to fuse multiple perception models and integrate multimodal environmental information. The proposed framework exploits the high-order correlations behind driving behaviours, enabling better relational reasoning and generalisation and thus more precise driving decisions. Our work outperforms state-of-the-art results on the Udacity, Berkeley DeepDrive Video and DBNet datasets. The proposed techniques can be used to construct a unified driving decision-making framework for modular integration of autonomous driving systems.
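To make the hypergraph-based fusion idea concrete, the sketch below shows a standard hypergraph convolution layer in the style of HGNN (Feng et al., 2019), applied to per-modality perception features grouped by hyperedges. This is an illustrative assumption about how such fusion could be implemented, not the authors' code: the tensor names, shapes, hyperedge construction and uniform edge weights are all hypothetical.

```python
# Minimal sketch (assumed, not from the paper) of a hypergraph convolution layer
# that propagates multimodal perception features over hyperedges:
#   X' = sigma( Dv^-1/2 H W De^-1 H^T Dv^-1/2 X Theta )
import torch
import torch.nn as nn


class HypergraphConv(nn.Module):
    """One hypergraph convolution layer (HGNN-style propagation)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, incidence: torch.Tensor) -> torch.Tensor:
        # x:         (n_vertices, in_dim) node features, e.g. per-modality embeddings
        # incidence: (n_vertices, n_edges) binary incidence matrix H of the hypergraph
        w = torch.ones(incidence.size(1))                # uniform hyperedge weights (assumed)
        dv = (incidence * w).sum(dim=1).clamp(min=1e-6)  # vertex degrees
        de = incidence.sum(dim=0).clamp(min=1e-6)        # hyperedge degrees
        dv_inv_sqrt = dv.pow(-0.5).diag()
        de_inv = de.pow(-1.0).diag()
        # Normalised propagation operator over the hypergraph
        prop = dv_inv_sqrt @ incidence @ torch.diag(w) @ de_inv @ incidence.t() @ dv_inv_sqrt
        return torch.relu(prop @ self.theta(x))


# Toy usage: 6 hypothetical perception vertices (e.g. camera, LiDAR, map features)
# connected by 3 hyperedges, each grouping the modalities relevant to one decision.
if __name__ == "__main__":
    H = torch.tensor([[1, 0, 1],
                      [1, 1, 0],
                      [0, 1, 1],
                      [1, 0, 0],
                      [0, 1, 0],
                      [1, 1, 1]], dtype=torch.float32)
    feats = torch.randn(6, 32)
    layer = HypergraphConv(32, 16)
    fused = layer(feats, H)   # (6, 16) hyperedge-aware fused features
    print(fused.shape)
```

The high-order correlations the abstract refers to would, under this reading, correspond to hyperedges that connect more than two perception vertices at once, which pairwise graph edges cannot express.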