Abstract

A better way to understand marine life and ecosystems is to monitor and analyze the activities of marine organisms. Research on marine video surveillance has recently become increasingly popular. With the rapid development of deep learning (DL), convolutional neural networks (CNNs) have made remarkable progress in image and video understanding tasks. In this article, we explore a visual attention and relation mechanism for marine organism detection and propose a new way to apply an improved attention-relation (AR) module to an efficient marine organism detector (EMOD), which enhances the discrimination of organisms in complex underwater environments. We design EMOD by integrating current state-of-the-art (SOTA) detection methods in order to detect organisms and surveil marine environments in a fast, real-time fashion for high-resolution marine video surveillance. We implement EMOD and AR on the annotated video data sets provided by the public data challenges held in conjunction with the CVPR 2018 and 2019 workshops, which are supported by the National Oceanic and Atmospheric Administration (NOAA) and its research work (NMFS-PIFSC-83). Experimental results and visualizations demonstrate that our application of the AR module is effective and efficient, and that EMOD equipped with AR modules outperforms SOTA performance on the experimental data sets. To meet application requirements, we also provide usage suggestions for the EMOD framework. Our code is publicly available at <uri xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">https://github.com/zhenglab/EMOD</uri>.
