Abstract
Environmental perception is one of the critical aspects of autonomous driving for maritime applications, especially for self-navigation and maneuver planning. For near-field recognition, this paper proposes a novel data-fusion framework that simultaneously determines the occupied static space and tracks dynamic objects. An unmanned surface vessel (USV) is equipped with LiDAR sensors, a GNSS receiver, and an Inertial Navigation System (INS). In the framework, the point cloud from the LiDAR sensors is first clustered into objects, which are then associated with known objects. After dynamic segmentation, static objects are represented in an optimized occupancy grid map, while dynamic objects are tracked and matched to the corresponding Automatic Identification System (AIS) messages. The proposed algorithms are validated with data collected from real-world tests conducted in Rostock Harbor, Germany. After applying the proposed algorithm, the perceived test area can be represented as a 3D occupancy grid map with a 10 cm resolution. At the same time, dynamic objects in view are detected and tracked successfully with an error of less than 10%. The plausibility of the results is qualitatively evaluated by comparison with Google Maps© and the corresponding AIS messages.
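The static-space representation mentioned above can be illustrated with a minimal sketch of a sparse 3D occupancy grid at 10 cm resolution. This is not the paper's implementation: the class and parameter names, the log-odds increments, and the clamping bounds are all illustrative assumptions, and LiDAR points are assumed to be already transformed into a fixed map frame using the GNSS/INS pose.

```python
# Minimal sketch of a sparse 3D occupancy grid with 10 cm voxels.
# Assumptions (not from the paper): log-odds update constants,
# clamping bounds, and that points arrive in a fixed map frame.
from collections import defaultdict

RESOLUTION = 0.10           # voxel edge length in metres (10 cm)
L_OCC = 0.85                # log-odds increment per LiDAR hit (assumed)
L_MAX = 3.5                 # upper clamp so the map stays adaptive


def voxel_index(point, res=RESOLUTION):
    """Map a metric (x, y, z) point to an integer voxel index."""
    x, y, z = point
    return (int(x // res), int(y // res), int(z // res))


class OccupancyGrid3D:
    """Sparse log-odds occupancy grid keyed by voxel index."""

    def __init__(self):
        self.log_odds = defaultdict(float)

    def update_occupied(self, point):
        """Increase the occupancy belief of the voxel containing `point`."""
        idx = voxel_index(point)
        self.log_odds[idx] = min(L_MAX, self.log_odds[idx] + L_OCC)

    def is_occupied(self, point, threshold=0.0):
        """A voxel is considered occupied once its log-odds exceed the threshold."""
        return self.log_odds.get(voxel_index(point), 0.0) > threshold


grid = OccupancyGrid3D()
for p in [(1.23, 4.56, 0.12), (1.24, 4.57, 0.13)]:  # two nearby LiDAR hits
    grid.update_occupied(p)
```

Keying the grid on a dictionary of voxel indices keeps memory proportional to the observed surface rather than the full harbor volume, which is one common way such grids are made tractable at centimetre-scale resolutions.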