Abstract

The widespread presence of Pomacea canaliculata and Pomacea maculata in North America and Asia has caused significant harm to local ecosystems and to residents' health. Timely knowledge of the distribution of the eggs of these two Pomacea spp. in a region can reduce treatment costs and improve the effectiveness of prevention. Most existing methods can only classify the eggs of the two Pomacea spp., or detect them in controlled rather than natural environments, and they perform poorly in complex real-world scenes. This letter proposes a detection model for the eggs of the two Pomacea spp. based on dynamic convolution and multiscale feature fusion, which can effectively identify and locate the eggs. We further combined the proposed model with the scale-invariant feature transform (SIFT) algorithm to design an egg-counting system that automatically identifies eggs in the natural environment and alleviates the duplicate counting caused by overlapping image acquisition. In addition, we built a dataset of 20,000 unmanned aerial vehicle (UAV) images of Pomacea canaliculata and Pomacea maculata eggs. Experimental results show that the proposed deep learning model outperforms comparable models and that the proposed computer vision system can be successfully applied to support Pomacea spp. disease management.
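
The abstract does not describe how the SIFT step suppresses duplicate counts across overlapping UAV frames. The following Python/OpenCV sketch illustrates one plausible realization under stated assumptions, not the authors' implementation: consecutive frames are registered with SIFT keypoint matching and a RANSAC homography, and detections already counted in the previous frame are projected into the current frame so they are not counted twice. All function names, box formats, and thresholds here are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code): suppress duplicate egg counts
# between overlapping UAV frames by registering consecutive images with SIFT
# and discarding detections already counted in the previous frame.
# Assumes the detector returns axis-aligned boxes (x1, y1, x2, y2) in pixels.
import cv2
import numpy as np

def register_frames(prev_gray, curr_gray, ratio=0.75, min_matches=10):
    """Estimate a homography mapping points in prev_gray into curr_gray via SIFT."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    # Lowe's ratio test to keep only distinctive matches.
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    if len(good) < min_matches:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def box_center(box):
    x1, y1, x2, y2 = box
    return (0.5 * (x1 + x2), 0.5 * (y1 + y2))

def count_new_detections(prev_boxes, curr_boxes, H, dist_thresh=20.0):
    """Count current detections whose centers do not coincide with a previously
    counted detection projected into the current frame."""
    if H is None or not prev_boxes:
        return len(curr_boxes)
    prev_centers = np.float32([box_center(b) for b in prev_boxes]).reshape(-1, 1, 2)
    projected = cv2.perspectiveTransform(prev_centers, H).reshape(-1, 2)
    new = 0
    for box in curr_boxes:
        d = np.linalg.norm(projected - np.array(box_center(box)), axis=1)
        if d.min() > dist_thresh:  # no earlier detection maps close to this one
            new += 1
    return new
```

In such a pipeline, the running egg count would be incremented by count_new_detections for each new frame; the distance threshold trades off missed duplicates against double counting and would need tuning to the UAV flight altitude and image resolution.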
