Abstract

The development of intensive aquaculture has increased the need for video-based underwater monitoring technology that can generate statistics on multi-class fish stocks. However, the complex marine environment, e.g., light fluctuations, shape deformations, the similar appearance of fish species, and occlusions, makes this a challenging task, and relatively few studies address it. This paper proposes a real-time multi-class fish stock statistics method (RMCF), which reaches an accuracy of 95.6%, surpassing the previous best approach. The method uses YOLOv4 as its backbone network together with a parallel two-branch deep-learning structure to detect and track fish in real time in a real marine ranch environment. The detection branch identifies fish species and improves tracking accuracy and online tracking speed; the tracking branch follows individual fish and counts them. Finally, the two branches are combined to generate multi-class fish stock statistics, with the detection branch enabling the tracking branch to perform multi-class tracking. Using the tracking results, we further analyze how the abundance of different fish species changes over time. Experimental results demonstrate that, compared with state-of-the-art video detection and tracking methods, the proposed method achieves better fish detection and tracking performance in a complex real-world marine environment.
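To make the two-branch idea concrete, the sketch below shows how per-frame class-labeled detections (as a detection branch like YOLOv4 would emit) can drive a tracking branch that assigns track IDs and counts fish per species. This is a minimal illustration, not the paper's actual algorithm: the `Track`, `iou`, and `update_tracks` names, the greedy IoU matching, and the synthetic detections are all assumptions introduced here.

```python
# Minimal sketch of a detection-plus-tracking counting pipeline in the spirit
# of RMCF. The detection branch is simulated by hard-coded (species, box)
# tuples; the tracking branch is a simple greedy IoU matcher. All names and
# thresholds are illustrative assumptions, not the paper's method.

from dataclasses import dataclass


@dataclass
class Track:
    track_id: int
    cls: str     # fish species supplied by the detection branch
    box: tuple   # bounding box as (x1, y1, x2, y2)


def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def update_tracks(tracks, detections, next_id, iou_thresh=0.3):
    """Greedily match same-species detections to existing tracks by IoU;
    each unmatched detection starts a new track (a newly counted fish)."""
    matched = set()
    for det_cls, det_box in detections:
        best, best_iou = None, iou_thresh
        for t in tracks:
            if t.track_id in matched or t.cls != det_cls:
                continue  # class labels restrict matching to one species
            score = iou(t.box, det_box)
            if score > best_iou:
                best, best_iou = t, score
        if best is not None:
            best.box = det_box
            matched.add(best.track_id)
        else:
            tracks.append(Track(next_id, det_cls, det_box))
            matched.add(next_id)
            next_id += 1
    return tracks, next_id


# Toy run: two frames of synthetic detections. The sea bream moves slightly
# (same track), while the second flounder appears far from the first (new track).
frames = [
    [("sea bream", (10, 10, 50, 50)), ("flounder", (100, 20, 150, 60))],
    [("sea bream", (14, 12, 54, 52)), ("flounder", (200, 20, 250, 60))],
]
tracks, next_id = [], 0
for dets in frames:
    tracks, next_id = update_tracks(tracks, dets, next_id)

counts = {}
for t in tracks:
    counts[t.cls] = counts.get(t.cls, 0) + 1
print(counts)  # {'sea bream': 1, 'flounder': 2}
```

Because each track keeps the species label assigned by the detection branch, the final per-species track counts are exactly the multi-class stock statistics the abstract describes, and logging them per frame would yield the trends over time.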

