Abstract
Recently, real-time obstacle detection by monocular vision has shown great promise for enhancing the safety of unmanned surface vehicles (USVs). Since the obstacles that may threaten USVs generally appear below the water edge, most existing methods first detect the horizon line and then search for obstacles below it. However, these methods detect the horizon line using only edge or line features, which are susceptible to interference edges from clouds, waves, and land, eventually resulting in poor obstacle detection. To avoid being affected by interference edges, in this paper we propose a novel horizon line detection method based on semantic segmentation. The method fits a Gaussian mixture model (GMM) with spatial smoothness constraints to the semantic structure of marine images and simultaneously generates a water segmentation mask. The horizon line is then estimated from the water boundary points via straight-line fitting. Further, inspired by human visual attention mechanisms, an efficient saliency detection method based on background prior and contrast prior is presented to detect obstacles below the estimated horizon line. To reduce false positives caused by sun glitter, waves, and foam, the continuity of adjacent frames is employed to filter the detected obstacles. An extensive evaluation was conducted on a large marine image dataset collected by our 'Jinghai VIII' USV. The experimental results show that the proposed method significantly outperformed the recent state-of-the-art marine obstacle detection method by 22.07% in terms of F-score while running at over 24 fps on an NVIDIA GTX 1080 Ti GPU.
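The abstract's horizon estimation step (extracting water boundary points from a segmentation mask and fitting a straight line to them) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the mask here is synthetic, the boundary point is assumed to be the topmost water pixel per column, and the fit is ordinary least squares via `numpy.polyfit`.

```python
import numpy as np

def estimate_horizon(water_mask):
    """Fit a straight horizon line y = a*x + b to the water boundary.

    Assumption (for illustration): the boundary point in each image
    column is the topmost water pixel of the binary mask.
    """
    cols, rows = [], []
    for x in range(water_mask.shape[1]):
        ys = np.flatnonzero(water_mask[:, x])
        if ys.size:                    # skip columns with no water
            cols.append(x)
            rows.append(ys[0])         # topmost water pixel = boundary point
    # Least-squares straight-line fit through the boundary points
    a, b = np.polyfit(cols, rows, 1)
    return a, b

# Synthetic 100x200 mask whose water region starts near y = 0.1*x + 30
h, w = 100, 200
yy, xx = np.mgrid[0:h, 0:w]
mask = yy >= 0.1 * xx + 30

a, b = estimate_horizon(mask)
print(f"slope ~ {a:.2f}, intercept ~ {b:.1f}")
```

In practice a robust estimator (e.g. RANSAC) is often preferred over plain least squares, since stray boundary points from land or segmentation errors can skew the fit.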