Abstract

Estimating the number of fry plays a critical role in fish breeding management, transportation, and the preservation of marine resources in aquaculture. These statistics are generally recorded manually by fishers and government agencies, a process that is time-consuming and adds to fishers' workload. Compared with traditional physical shunting devices, vision-based algorithms offer benefits such as reduced labor requirements and low equipment installation and maintenance costs. However, existing methods generally demand heavy computation and large numbers of model parameters, or handle fry aggregation poorly and count imprecisely. This paper proposes a fry counting method named MSENet for portable fry counting devices. First, a lightweight network is designed with a small parameter footprint (Params: 139.46 kB) for portable embedding. The network takes the original images as input, predicts visualized single-channel fry density maps, and calculates the number of fry by integrating over those maps. Then, a Squeeze-and-Excitation block is utilized to strengthen the features of informative channels. Model training is refined through hyperparameter studies, and the shortened preparation stage further enhances portability. Moreover, a fry counting dataset NCAUF and an extra set NCAUF-EX are built to verify network generalization. The results demonstrate that the lightweight MSENet outperforms comparable methods in fry counting precision and competently handles fry aggregation (MAE: 3.33). The source code and pre-trained models are available at: https://github.com/vranlee/MSENet.
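To make the pipeline described above concrete, below is a minimal PyTorch-style sketch of its two core ingredients: a Squeeze-and-Excitation block that reweights feature channels by learned importance, and density-map integration, where summing the predicted single-channel density map yields the fry count. The class names, reduction ratio, and tensor shapes here are illustrative assumptions, not the paper's actual implementation; consult the linked repository for the authors' code.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation block: rescales channels by learned importance.

    The reduction ratio r is an illustrative choice, not taken from MSENet.
    """
    def __init__(self, channels: int, r: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)      # global spatial pooling
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // r),
            nn.ReLU(inplace=True),
            nn.Linear(channels // r, channels),
            nn.Sigmoid(),                           # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.squeeze(x).view(b, c)              # squeeze: (B, C)
        w = self.excite(w).view(b, c, 1, 1)         # excitation: channel weights
        return x * w                                # reweight feature channels


def count_from_density_map(density: torch.Tensor) -> torch.Tensor:
    """Integrate a predicted single-channel density map to obtain the count.

    density: (B, 1, H, W) tensor whose pixels hold fractional object density,
    so the discrete integral (sum over all pixels) approximates the count.
    """
    return density.sum(dim=(1, 2, 3))


if __name__ == "__main__":
    # Toy check: a synthetic density map whose values sum to ~42.
    fake_density = torch.full((1, 1, 64, 64), 42.0 / (64 * 64))
    print(count_from_density_map(fake_density))     # ~tensor([42.])
```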
