Abstract

Phytoplankton play a critical role in marine food webs and biogeochemical cycles, and their abundance must be monitored to provide early warning of marine ecological disasters and to support marine environmental management. Although algorithms for automatic phytoplankton identification at the image level exist, there are currently no video-level algorithms, in large part because no suitable video dataset is available. This lack of datasets is a significant obstacle to the development of video-level automatic identification algorithms for phytoplankton observation; deep learning-based algorithms in particular require high-quality datasets to achieve optimal results. To address this issue, we propose PMOT2023 (Phytoplankton Multi-Object Tracking), a video multi-object tracking dataset built from 48,000 micrographs captured by in situ observation devices. The dataset comprises 21 classes of phytoplankton and can aid in the development of advanced video-level identification methods. Multi-object tracking algorithms can detect, classify, and count phytoplankton and estimate their density. As a video-level automatic identification approach, multi-object tracking addresses trajectory tracking, concentration estimation, and other requirements of in situ phytoplankton observation, helping to prevent marine ecological disasters. Additionally, the PMOT2023 dataset will serve as a benchmark to evaluate the performance of future phytoplankton identification models and provide a foundation for further research on automatic phytoplankton identification algorithms.
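
To illustrate why tracking (rather than per-frame detection alone) supports counting and concentration estimation, the following is a minimal sketch, not taken from the paper: it assumes a hypothetical MOT output format of (frame index, track ID, class name) records and an assumed imaged sample volume. Because a tracker assigns one ID to the same organism across frames, counting distinct IDs avoids double-counting.

```python
# Minimal sketch (illustrative assumptions, not the PMOT2023 annotation schema):
# count organisms and estimate concentration from MOT output.
from collections import defaultdict

# Hypothetical MOT results: one record per detection,
# (frame_index, track_id, class_name).
tracks = [
    (0, 1, "Chaetoceros"),
    (0, 2, "Skeletonema"),
    (1, 1, "Chaetoceros"),  # same organism, re-identified in the next frame
    (1, 3, "Skeletonema"),
]

def count_by_class(records):
    """Count distinct track IDs per class: each ID corresponds to one
    tracked organism, so repeated detections are not double-counted."""
    ids_per_class = defaultdict(set)
    for _, track_id, cls in records:
        ids_per_class[cls].add(track_id)
    return {cls: len(ids) for cls, ids in ids_per_class.items()}

def density(counts, sampled_volume_ml):
    """Convert counts to concentration (cells/mL) for an assumed imaged volume."""
    return {cls: n / sampled_volume_ml for cls, n in counts.items()}

counts = count_by_class(tracks)                # {'Chaetoceros': 1, 'Skeletonema': 2}
print(density(counts, sampled_volume_ml=0.5))  # {'Chaetoceros': 2.0, 'Skeletonema': 4.0}
```

The sampled volume here is a placeholder; in practice it would come from the in situ imaging device's calibration.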
