Abstract

Automated image classification methods have become increasingly popular in recent years, with agricultural applications including weed identification, fruit classification, and disease detection in plants and trees. Convolutional neural networks (CNNs) have shown exceptional results in image classification, but they may fail to extract some relevant features of the input image. Recurrent neural networks (RNNs), on the other hand, can fully exploit the relationships among image features. In this paper, the performance of combined CNN and RNN models is evaluated on images of diseased apple leaves. The article proposes combining a pre-trained CNN with a long short-term memory (LSTM) network, a particular type of RNN. Using transfer learning, deep features are extracted from the fully connected layers of pre-trained deep models, namely Xception, VGG16, and InceptionV3. The deep features from the CNN and the RNN are concatenated and fed into a fully connected layer, allowing the proposed model to focus on relevant information in the input data. Finally, the integrated model determines the class labels of the apple foliar disease images. Experimental findings demonstrate that the proposed approach outperforms the individual pre-trained models.
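For illustration, the sketch below shows one way such a CNN–LSTM fusion for apple foliar disease classification could be assembled in Keras. The backbone choice (VGG16), the layer sizes, the four-class output, and the treatment of the spatial feature map as a sequence are assumptions for the sketch, not the paper's exact configuration.

```python
# Hypothetical sketch of a CNN + LSTM hybrid with feature concatenation.
# Backbone, layer sizes, and the 4-class output are assumptions, not the
# authors' reported configuration.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 4                 # assumed: e.g. healthy, scab, rust, multiple
IMG_SHAPE = (224, 224, 3)

# Pre-trained CNN backbone (transfer learning: convolutional weights frozen).
backbone = VGG16(include_top=False, weights="imagenet", input_shape=IMG_SHAPE)
backbone.trainable = False

inputs = layers.Input(shape=IMG_SHAPE)
feature_map = backbone(inputs)                              # (7, 7, 512)

# CNN branch: global pooling yields a single deep-feature vector.
cnn_features = layers.GlobalAveragePooling2D()(feature_map)  # (512,)

# RNN branch: treat the 7x7 spatial grid as a sequence of 49 feature vectors
# so the LSTM can model relationships among image regions.
sequence = layers.Reshape((7 * 7, 512))(feature_map)
rnn_features = layers.LSTM(256)(sequence)                     # (256,)

# Concatenate the CNN and RNN features, then classify with dense layers.
merged = layers.Concatenate()([cnn_features, rnn_features])
x = layers.Dense(256, activation="relu")(merged)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The same pattern applies to the other backbones mentioned (Xception, InceptionV3) by swapping the imported application model and adjusting the reshape to that backbone's feature-map dimensions.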
