Abstract

Wheat scab is one of the most important diseases threatening wheat health, severely reducing both yield and quality, so timely diagnosis is critical. However, under natural conditions it is difficult to distinguish asymptomatic wheat, which shows no visible spots on the surface, from healthy wheat by traditional visual inspection, and this greatly hinders the diagnosis of wheat scab. To address the poor classification performance caused by this difficulty, we use near-infrared spectral data from healthy, symptomatic, and visually indistinguishable asymptomatic wheat samples and propose a new approach, Transfer Learning and Neural Architecture Search for Near-infrared with Convolutional Networks and Recurrent Networks (TranNas-NirCR). The approach integrates neural architecture search with transfer learning and uses a search space that combines convolutional neural networks and recurrent neural networks. Compared with other methods, TranNas-NirCR achieved better classification results, with an accuracy of 90.42%: 2.68% higher than support vector machines (SVM), 5.36% higher than neural architecture search (NAS), and 4.21% higher than transfer learning with neural architecture search (Tran_NAS). The method generalizes well even with only a small amount of near-infrared spectral data, making it a useful reference for diagnosing early wheat scab under real conditions.
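To make the abstract's description concrete, the sketch below shows what one candidate architecture from a CNN+RNN search space for 1-D near-infrared spectra might look like. This is a minimal illustration, not the authors' released code: the kernel sizes, channel counts, choice of GRU, and three-class head (healthy / asymptomatic / symptomatic) are all assumptions, since the abstract does not disclose the exact cells in the TranNas-NirCR search space.

```python
# Hypothetical candidate network for NIR spectral classification:
# a 1-D convolutional stem feeding a recurrent layer, as the
# TranNas-NirCR search space combines CNNs and RNNs. All layer
# sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class NirCnnRnn(nn.Module):
    def __init__(self, n_classes: int = 3, hidden: int = 64):
        super().__init__()
        # Convolutional stem extracts local absorbance features
        # from the spectrum (input shape: batch x 1 x n_wavelengths).
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Recurrent layer models longer-range dependencies across
        # the wavelength axis of the convolved feature sequence.
        self.rnn = nn.GRU(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.conv(x)            # (B, 32, L')
        feats = feats.transpose(1, 2)   # (B, L', 32) for the GRU
        _, h = self.rnn(feats)          # final hidden state: (1, B, hidden)
        return self.head(h.squeeze(0))  # class logits: (B, n_classes)

# Example: classify a batch of 8 spectra sampled at 256 wavelengths.
logits = NirCnnRnn()(torch.randn(8, 1, 256))
print(logits.shape)  # torch.Size([8, 3])
```

In a NAS setting, components such as the kernel sizes, number of convolutional blocks, and recurrent cell type would be the searchable choices, with transfer learning used to initialize weights from a related spectral dataset before fine-tuning on the small wheat scab dataset.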
