The precise monitoring and quantification of fish migration are crucial for enhancing agricultural productivity and promoting environmental conservation. However, conducting these tasks in natural environments is challenging due to the subtle visual characteristics of fish and the inherent complexities of detection. This study addresses these challenges by introducing DVE-YOLO (Dynamic Vision Enhanced YOLO), a novel framework based on the YOLOv8 architecture, complemented by a tailored sample allocation strategy and a dedicated loss function. Operating on dual-frame input, DVE-YOLO integrates deep features from consecutive images to create composite anchor boxes from adjacent frames. This design enables DVE-YOLO to capture dynamic object features, reveal correlations between detected objects across frames, and facilitate efficient tracking and detection. Furthermore, this research proposes an innovative method for identifying fish migration through fish counting, documenting both the migration area and the duration of fish presence for subsequent analysis. Evaluation on an extensive fish migration dataset demonstrates that DVE-YOLO outperforms YOLOv8 and other mainstream detection algorithms, achieving superior detection accuracy with higher AP50 and AP50–95 metrics. In terms of counting accuracy, DVE-YOLO achieves a lower Mean Squared Error (MSE) than YOLOv8+BoTSORT and YOLOv8+ByteTrack, indicating improved counting performance. Additionally, DVE-YOLO exhibits enhanced precision in identifying fish migration compared to YOLOv8+BoTSORT and YOLOv8+ByteTrack. Ultimately, these machine learning methods hold promise for ecological applications.
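The counting comparison above reports Mean Squared Error between predicted and ground-truth fish counts. As a minimal sketch of that metric (the count values below are hypothetical, not from the paper's dataset):

```python
def mean_squared_error(predicted, actual):
    """MSE over paired per-clip fish counts: mean of squared count errors."""
    assert len(predicted) == len(actual), "count lists must align"
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted)

# Hypothetical per-video-clip counts for illustration only.
true_counts = [12, 7, 30, 18, 5]
pred_counts = [11, 7, 28, 19, 5]
print(mean_squared_error(pred_counts, true_counts))  # → 1.2
```

A lower MSE means the tracker-counter pipeline's per-clip counts sit closer to the annotated ground truth, which is the sense in which DVE-YOLO improves on the YOLOv8+BoTSORT and YOLOv8+ByteTrack baselines.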