Abstract

Research is increasingly focused on the enabling technologies required for autonomous navigation, across industrial fields such as automotive, aerospace, and maritime. The first capability required for autonomous vehicles is real-time obstacle detection, followed by object tracking. The most significant results have been obtained in the automotive sector, where far greater financial resources are available. There, obstacle detection typically relies on neural networks trained on large datasets of LiDAR point clouds and images. This approach cannot be replicated in the marine sector, where no comparably large datasets exist for training a neural network. For this reason, this paper presents a low-computational-cost multi-object tracking method based on unsupervised learning. The proposed method is tailored to the challenging marine environment and is suitable for detecting and tracking both fixed and moving obstacles. It is discussed and analysed step by step, highlighting its strengths and weaknesses. Moreover, the tracking has been tested on both experimental LiDAR point clouds and virtual LiDAR point clouds generated in a tailored virtual scenario. The results are reported and discussed.
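To make the described pipeline concrete, the sketch below illustrates one possible unsupervised detect-and-track loop over LiDAR scans. The abstract does not name specific algorithms; DBSCAN clustering and greedy nearest-centroid association are used here purely as illustrative stand-ins, and all function names and parameters are hypothetical.

```python
# Hypothetical sketch of an unsupervised detect-and-track pipeline for LiDAR
# point clouds. DBSCAN and nearest-centroid matching are illustrative choices,
# not the method described in the paper.
import numpy as np
from sklearn.cluster import DBSCAN


def detect_obstacles(points, eps=1.5, min_points=5):
    """Group raw LiDAR returns (N x 3 array) into obstacle centroids."""
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(points)
    centroids = [points[labels == lbl].mean(axis=0)
                 for lbl in set(labels) - {-1}]  # label -1 = noise returns
    return np.array(centroids)


def associate(prev_tracks, centroids, max_dist=3.0):
    """Greedily match new centroids to existing tracks between scans."""
    tracks = {}
    unmatched = list(range(len(centroids)))
    for track_id, prev_pos in prev_tracks.items():
        if not unmatched:
            break
        dists = [np.linalg.norm(centroids[i] - prev_pos) for i in unmatched]
        j = int(np.argmin(dists))
        if dists[j] < max_dist:
            tracks[track_id] = centroids[unmatched.pop(j)]
    next_id = max(prev_tracks, default=-1) + 1
    for i in unmatched:  # unmatched detections start new tracks
        tracks[next_id] = centroids[i]
        next_id += 1
    return tracks


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic obstacles drifting between scans (stand-in for real data).
    scan_t0 = np.vstack([rng.normal([10, 0, 0], 0.3, (30, 3)),
                         rng.normal([0, 20, 0], 0.3, (30, 3))])
    scan_t1 = scan_t0 + np.array([0.5, 0.2, 0.0])
    tracks = associate({}, detect_obstacles(scan_t0))
    tracks = associate(tracks, detect_obstacles(scan_t1))
    print(tracks)
```

In such a loop, detection runs independently on each scan and tracking is reduced to frame-to-frame association of cluster centroids, which keeps the computational cost low; the paper's own association and filtering steps may differ.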
