Abstract

In recent years, the rapid accumulation of marine waste has endangered the ecological environment and polluted seawater. Traditional manual salvage is inefficient and poses safety risks to human operators, making automated underwater waste recycling an increasingly attractive alternative. In this paper, we propose a lightweight multi-scale cross-level network for underwater waste segmentation based on sonar images, which provides pixel-level location information and waste categories for autonomous underwater robots. In particular, we introduce hybrid perception and multi-scale attention modules to capture multi-scale contextual features and enhance high-level critical information, respectively. At the same time, we use sampling attention modules to down-sample features and cross-level interaction modules to fuse detailed and semantic features. Experimental results indicate that our method outperforms other semantic segmentation models, achieving 74.66% mIoU with only 0.68 M parameters. Compared with the representative convolutional model PIDNet Small, our method improves mIoU by 1.15 percentage points while reducing model parameters by approximately 91%. Compared with the representative transformer model SeaFormer T, our approach improves mIoU by 2.07 percentage points while reducing model parameters by approximately 59%. Our approach thus maintains a satisfactory balance between model size and segmentation performance. This solution provides new insights into intelligent underwater waste recycling and helps promote sustainable marine development.
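The abstract does not detail the internals of the hybrid perception module, so as a purely illustrative sketch, the general idea of capturing multi-scale context can be shown with a generic pyramid-pooling-style block: pool a feature map at several window sizes, upsample each branch back to the input resolution, and concatenate. The function name, scale choices, and pooling/upsampling scheme below are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def multi_scale_context(feature_map, scales=(1, 2, 4)):
    """Generic multi-scale context sketch (hypothetical, not the paper's module):
    average-pool at several window sizes, upsample back, and concatenate
    along the channel axis so each position mixes local and wider context."""
    c, h, w = feature_map.shape
    branches = []
    for s in scales:
        # Average-pool with an s x s window and stride s
        # (assumes h and w are divisible by every scale).
        pooled = feature_map.reshape(c, h // s, s, w // s, s).mean(axis=(2, 4))
        # Nearest-neighbour upsample back to (h, w).
        up = pooled.repeat(s, axis=1).repeat(s, axis=2)
        branches.append(up)
    # Output has c * len(scales) channels at the original spatial size.
    return np.concatenate(branches, axis=0)

x = np.random.rand(8, 16, 16).astype(np.float32)
y = multi_scale_context(x)
print(y.shape)  # (24, 16, 16)
```

In a real segmentation network the pooled branches would typically pass through learned convolutions before fusion; this sketch only illustrates the multi-scale aggregation pattern itself.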
