Abstract

Tomato disease control is an urgent requirement in the field of intelligent agriculture, and one of its keys is the quantitative identification and precise segmentation of tomato leaf diseases. Some diseased areas on tomato leaves are tiny and easily missed during segmentation, and blurred lesion edges further reduce segmentation accuracy. Based on UNet, we propose an effective image-based tomato leaf disease segmentation method, MC-UNet, which combines a Cross-layer Attention Fusion Mechanism with a Multi-scale Convolution Module. First, a Multi-scale Convolution Module is proposed. This module obtains multi-scale information about tomato disease by employing three convolution kernels of different sizes, and it highlights the edge feature information of tomato disease using a Squeeze-and-Excitation module. Second, a Cross-layer Attention Fusion Mechanism is proposed. This mechanism highlights tomato leaf disease locations via a gating structure and a fusion operation. Then, we employ SoftPool rather than MaxPool to retain valid information on tomato leaves, and we use the SELU activation function to avoid dying neurons in the network. We compared MC-UNet with existing segmentation networks on our self-built tomato leaf disease segmentation dataset; MC-UNet achieved 91.32% accuracy with 6.67M parameters. Our method achieves good results for tomato leaf disease segmentation, demonstrating the effectiveness of the proposed components.
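
To make the described components more concrete, the following is a minimal PyTorch sketch of a multi-scale convolution block with Squeeze-and-Excitation recalibration and an exponentially weighted SoftPool layer. The specific kernel sizes (1/3/5), the concatenate-then-1x1-fuse strategy, and the SE reduction ratio are illustrative assumptions; the abstract only states that three kernel sizes, an SE module, SoftPool, and SELU are used, not their exact configuration in MC-UNet.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: channel-wise recalibration of feature maps."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.SELU(),                       # SELU instead of ReLU, per the paper's activation choice
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)   # squeeze (GAP) then excite
        return x * w                                        # reweight channels


class MultiScaleConv(nn.Module):
    """Hypothetical multi-scale block: three parallel convolutions with
    different receptive fields, fused and then passed through SE."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in (1, 3, 5)  # assumed kernel sizes
        )
        self.fuse = nn.Conv2d(3 * out_ch, out_ch, kernel_size=1)  # assumed fusion by concat + 1x1 conv
        self.se = SEBlock(out_ch)
        self.act = nn.SELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.se(self.act(self.fuse(feats)))


class SoftPool2d(nn.Module):
    """SoftPool: exponentially weighted average pooling, so small but
    informative activations are down-weighted rather than discarded
    as in max pooling."""
    def __init__(self, kernel_size: int = 2, stride: int = 2):
        super().__init__()
        self.kernel_size, self.stride = kernel_size, stride

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.exp(x)
        num = F.avg_pool2d(x * w, self.kernel_size, self.stride)
        den = F.avg_pool2d(w, self.kernel_size, self.stride)
        return num / (den + 1e-8)


if __name__ == "__main__":
    x = torch.randn(1, 64, 128, 128)            # dummy encoder feature map
    y = SoftPool2d()(MultiScaleConv(64, 128)(x))
    print(y.shape)                               # torch.Size([1, 128, 64, 64])
```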
