Abstract
Inland waterway transportation remains a crucial and cost-effective mode of global transport despite challenges such as shortages of skilled personnel, increasing competition, and ecological impacts. Autonomous technology enhances vessel navigation, scheduling, safety, and efficiency, making it a viable solution for smart ship and port operations. Reliable object detection is essential for autonomous ships to navigate safely and avoid collisions with static structures such as bridges, piers, bollards, and locks. This paper presents an innovative approach for training a network using the Autodistill pipeline for bridge detection and segmentation. We generate a labeled bridge dataset using GroundedSAM, which integrates Grounding DINO and the Segment Anything Model (SAM) to detect and segment regions based on text input. The system focuses on identifying ‘bridge’ and ‘water’ classes, producing high-quality labeled data. Manual filtering improves label quality, enhancing the training of the YOLOv8 model, known for its strong object detection capabilities. Our approach demonstrates high performance in accurately detecting bridges, confirmed through evaluations with and without manual filtering. To validate our solution’s feasibility in real-world applications, we deployed the model on an NVIDIA Jetson AGX Orin for performance evaluation. Future work will extend this approach to additional static and mobile objects relevant to smart ship and port operations, such as ship locks and various ship types.