Edge computing on mobile marine platforms is paramount for automated ecological monitoring. We demonstrated the computational feasibility of an Artificial Intelligence (AI)-powered camera for fully automated, real-time species classification on deep-sea crawler platforms by running a You-Only-Look-Once (YOLO) model on an edge computing device (NVIDIA Jetson Nano), evaluating the achievable animal detection performance, execution time, and power consumption using all available cores. We processed a total of 337 rotating video scans (∼180°), acquired over approximately 4 months in 2022 at the methane hydrates site of Barkley Canyon (Vancouver Island, BC, Canada), focusing on three abundant species: Sablefish (Anoplopoma fimbria), Hagfish (Eptatretus stoutii), and Rockfish (Sebastes spp.). The model was trained on 1926 manually annotated video frames and achieved high detection performance on the test set in terms of accuracy (0.98), precision (0.98), and recall (0.99). The trained model was then applied to the 337 videos; in 288 of them it detected a total of 133 Sablefish, 31 Hagfish, and 321 Rockfish in near real-time (about 0.31 s/image) with very low power consumption (0.34 J/image). Our results have broad implications for intelligent ecological monitoring: YOLO models can meet operational-autonomy criteria for fast image processing under limited computational and energy budgets.