Abstract

Recent technological advances and their applications to agriculture provide leverage for the new paradigm of smart agriculture. Remote sensing applications can help optimise resources, making agriculture more ecological, increasing productivity and helping farmers to anticipate events that could not otherwise be avoided. Considering that losses caused by anomalies such as diseases, weeds and pests account for 20–40% of overall agricultural productivity, a successful research effort in this area would be a breakthrough for agriculture. In this paper, we propose a methodology with which to discover and classify anomalies in images of crops, taken from a wide range of distances, using different Convolutional Neural Network architectures. This methodology also deals with several difficulties that usually appear in this kind of problem, such as class imbalance, an insufficient number and variety of images, overtraining and a lack of model generalisation. We have implemented four convolutional neural network architectures in a high-performance computing environment, and propose a methodology based on data augmentation with the addition of Gaussian noise to the images in order to solve the above problems. Our approach was tested using two well-established open datasets that are quite dissimilar: DeepWeeds, which provides a classification of 8 weed species native to Australia using images taken at a distance of 1 m, and Agriculture-Vision, which classifies 6 types of crop anomalies using multispectral satellite imagery. Our methodology attained accuracies of 98% and 95.3% respectively, improving on the state of the art by several points. In order to ease reproducibility and model selection, we have provided a comparison in terms of computational time and other metrics, thus enabling the choice between architectures to be made according to the resources available.
The complete code is available in an open repository in order to encourage reproducibility and promote scientific advances in sustainable agriculture.
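The augmentation step mentioned above — adding Gaussian noise to training images — can be sketched as follows. This is a minimal illustration using NumPy, not the authors' implementation; the function name and the noise parameters (`mean`, `sigma`) are illustrative assumptions.

```python
import numpy as np

def add_gaussian_noise(image, mean=0.0, sigma=10.0, seed=None):
    """Return a copy of an 8-bit image with additive Gaussian noise.

    `sigma` (standard deviation in pixel-intensity units) controls the
    noise strength; values are clipped back to the valid [0, 255] range.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(mean, sigma, size=image.shape)
    noisy = image.astype(np.float64) + noise
    return np.clip(noisy, 0, 255).astype(np.uint8)

# Example: augment a dummy 64x64 RGB image (a stand-in for a crop photo)
img = np.full((64, 64, 3), 128, dtype=np.uint8)
augmented = add_gaussian_noise(img, sigma=15.0, seed=42)
```

In a training pipeline, such a function would typically be applied on the fly to each batch, so that the network sees a slightly different perturbation of every image at each epoch.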
