Abstract

In recent years, deep learning models have driven substantial progress in insect recognition tasks. However, training deep neural networks requires large amounts of data, and collecting and labeling such data is time-consuming and labor-intensive. This study proposes a method for establishing a synthetic image dataset of stored-product insects, providing well-labeled image data for insect detection tasks. Proxy virtual worlds are leveraged to obtain well-labeled synthetic data. A dynamic generation approach is presented that produces synthetic images with diverse insect targets, varied backgrounds, and changing lighting conditions using the camera module in the constructed virtual scene. The bounding-box coordinates and category label of the insect targets in each synthetic image are obtained by calculating the geometric relationships between the insect targets and the camera module. A texture translation network is developed to perform image-to-image translation, enhancing the verisimilitude of the synthetic images. A synthetic dataset is established for three species of insects: <i>Cryptolestes ferrugineus</i> (Stephens), <i>Sitophilus oryzae</i> (Linnaeus), and <i>Tribolium castaneum</i> (Herbst). A set of assessments is introduced to evaluate the synthetic image dataset, covering both statistical characteristics and experimental verification. Experimental results demonstrate that using synthetic data can reduce the demand for real data. The proposed method may offer a novel way to provide well-annotated training data for insect detection without tedious image collection and manual labeling.
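The automatic labeling step described above can be illustrated with a standard pinhole-camera projection: the known 3D extent of an insect model in the virtual scene is projected into the image plane, and the axis-aligned extremes of the projected points give the 2D bounding box. This is a minimal sketch, not the paper's actual implementation; the intrinsics (`fx`, `fy`, `cx`, `cy`) and the corner coordinates are assumed example inputs.

```python
import numpy as np

def project_bbox(corners_cam, fx, fy, cx, cy, width, height):
    """Project 3D points (camera coordinates, z > 0) to a 2D bounding box.

    corners_cam: (N, 3) array of points on the insect model, e.g. the
    corners of its 3D bounding volume, expressed in the camera frame.
    Returns (x_min, y_min, x_max, y_max) in pixel coordinates.
    """
    corners_cam = np.asarray(corners_cam, dtype=float)
    # Pinhole projection: u = fx * x / z + cx, v = fy * y / z + cy
    u = fx * corners_cam[:, 0] / corners_cam[:, 2] + cx
    v = fy * corners_cam[:, 1] / corners_cam[:, 2] + cy
    # Axis-aligned box over all projected points, clipped to the image
    x_min = max(0.0, u.min())
    y_min = max(0.0, v.min())
    x_max = min(width - 1.0, u.max())
    y_max = min(height - 1.0, v.max())
    return x_min, y_min, x_max, y_max
```

In a real renderer the object corners would first be transformed from world to camera coordinates with the camera's extrinsic pose; the category label comes for free, since the generator knows which insect model it placed in the scene.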
