Due to the limited detection range of optical sensors in the ocean, sonar is the predominant means of detecting dynamic objects in the deep sea. Because such objects produce stronger acoustic reflections than their surroundings, they appear more salient than the background in sonar images. In this paper, we propose a saliency detection framework for underwater moving objects that consists of three stages. In the first stage, we use optical flow to extract rough global motion cues despite the background interference caused by an unstable sonar platform. In the second stage, we propose a trajectory analysis paradigm, which converts the motion cues into isolated connected domains for tracking and then evaluates the resulting trajectories to identify the path of the real target. In the third stage, we propose a salient object detection (SOD) model that predicts local saliency maps through feature fusion and multi-level supervision; the global saliency map is then obtained by remapping. Finally, extensive experiments conducted on eight sonar videos demonstrate that the proposed method outperforms fifteen other salient object detection approaches.
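As a rough illustration of the first two stages, the sketch below (Python with OpenCV) computes dense optical flow between consecutive sonar frames, thresholds the flow magnitude into a motion mask, and extracts isolated connected regions as candidates for trajectory analysis. The choice of the Farneback optical flow algorithm, the threshold values, and the function names are assumptions for illustration only, not the paper's implementation.

```python
import cv2
import numpy as np


def motion_cue_regions(prev_frame, curr_frame, mag_thresh=1.5, min_area=20):
    """Sketch of stages one and two (assumed details): dense optical flow
    between two consecutive frames, thresholded into isolated connected
    regions that could be handed to a tracker for trajectory analysis."""
    # Sonar frames may already be single-channel; convert only if needed.
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY) if prev_frame.ndim == 3 else prev_frame
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY) if curr_frame.ndim == 3 else curr_frame

    # Dense optical flow; Farneback is used here only as a stand-in,
    # since the abstract does not specify the optical flow algorithm.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)

    # Keep only pixels with significant motion as rough motion cues.
    motion_mask = (magnitude > mag_thresh).astype(np.uint8)

    # Split the mask into isolated connected regions; tiny blobs are
    # discarded as platform-induced background interference.
    num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(motion_mask)
    regions = [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
               for i in range(1, num_labels)          # label 0 is background
               if stats[i, cv2.CC_STAT_AREA] >= min_area]
    return regions  # list of (centroid, area) candidates for tracking
```

Linking these per-frame candidates across frames and scoring the resulting trajectories, as the second stage does, would then separate the real target's path from spurious background motion.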