Abstract

With the rapid development of low-end Internet of Things (IoT) devices and shipborne sensors, efficient multi-source data fusion methods for autonomous surface vehicles (ASVs) have recently attracted significant research interest in intelligent edge-enabled maritime applications. This fusion capability can enhance the situational awareness of ASVs, leading to improved efficacy and safety in ASV-empowered maritime IoT (MIoT). Cameras and automatic identification system (AIS) equipment, which provide visual and positioning information, respectively, have become commonly adopted, cost-effective sensors. In this work, we first introduce a lightweight YOLOX-s network with transfer learning to accurately and robustly detect moving vessels at different scales in real time. A data augmentation method is then proposed to improve its generalization ability. The detected vessels and synchronized AIS messages are finally fused to make full use of the multi-source sensing data, yielding an augmented reality (AR)-based maritime navigation system running on shipborne intelligent edges. The AR system superimposes both static and dynamic information from the collected AIS messages onto the video-captured images, providing auxiliary information for early warning of navigation risks for ASVs in MIoT networks. Compared with traditional single-sensor-based navigation methods, our data fusion framework yields more reliable and robust results and shows substantial potential for practical applications. Extensive experiments demonstrate the superior performance of our framework under different navigational conditions.
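
As a rough illustration of the vision-AIS fusion step summarized above, the Python sketch below projects an AIS-reported vessel position into the image plane with a pinhole camera model and matches it to a detected bounding box by pixel distance. This is a minimal sketch under simplifying assumptions, not the authors' implementation: the function names, camera parameters (fx, cx), bore-sighted camera orientation, and matching threshold are all illustrative.

import math

def ais_to_camera_frame(lat, lon, own_lat, own_lon, heading_deg):
    """Convert an AIS position to metric (x, z) offsets in the camera frame.

    Uses a local equirectangular approximation, which is adequate for the
    short ranges (a few km) relevant to shipborne cameras.
    """
    R = 6371000.0  # mean Earth radius in metres
    d_north = math.radians(lat - own_lat) * R
    d_east = math.radians(lon - own_lon) * R * math.cos(math.radians(own_lat))
    # Rotate world offsets into the camera frame; the camera is assumed to be
    # bore-sighted with the vessel's heading (an illustrative assumption).
    h = math.radians(heading_deg)
    x = d_east * math.cos(h) - d_north * math.sin(h)   # lateral offset (m)
    z = d_east * math.sin(h) + d_north * math.cos(h)   # range along optical axis (m)
    return x, z

def project_to_pixel(x, z, fx=1000.0, cx=960.0):
    """Pinhole projection of a lateral offset to a horizontal pixel coordinate."""
    if z <= 0:
        return None  # target lies behind the camera
    return fx * x / z + cx

def match_ais_to_detections(ais_targets, detections, max_px_gap=80.0):
    """Greedily pair each projected AIS target (name, u) with the nearest
    unused detection centre; targets with no detection nearby are dropped."""
    pairs, used = [], set()
    for name, u in ais_targets:
        if u is None:
            continue
        best, best_d = None, max_px_gap
        for i, (x1, y1, x2, y2) in enumerate(detections):
            d = abs((x1 + x2) / 2.0 - u)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            pairs.append((name, detections[best]))
    return pairs

if __name__ == "__main__":
    # Hypothetical AIS target roughly 240 m ahead of own ship.
    x, z = ais_to_camera_frame(30.0020, 122.0010, 30.0000, 122.0000, heading_deg=25.0)
    u = project_to_pixel(x, z)
    # One hypothetical YOLOX detection given as (x1, y1, x2, y2) pixels.
    print(match_ais_to_detections([("MV EXAMPLE", u)], [(900, 400, 980, 460)]))

Once a detection is paired with an AIS message, the static (name, type) and dynamic (speed, course) fields of that message can be rendered next to the bounding box, which is the basis of the AR overlay described in the abstract.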
