The identification of individual trees is an important research topic in forestry, remote sensing, and computer vision, and it provides a tool for effectively and efficiently managing and maintaining forests and orchards. However, the task is not as simple as it seems: tree detection and counting can be time-consuming, cost-prohibitive, and accuracy-limited, especially when performed manually at large scale. The availability of very high-resolution UAV imagery makes the counting process easier, faster, and more precise, and intelligent algorithms such as convolutional neural networks (CNNs) allow it to be further automated. This work presents an OBIA-CNN (Object-Based Image Analysis-Convolutional Neural Network) approach that combines CNNs with OBIA to automatically detect and count olive trees from Phantom 4 Advanced drone imagery. A CNN-based classifier was first created, trained, validated, and applied to generate olive-tree probability maps over the ortho-photo. A post-classification refinement based on OBIA was then conducted, using super-pixel segmentation and the Excess Green index, and a detailed accuracy analysis was carried out to establish the suitability of the proposed method. The approach was applied successfully to an RGB ortho-mosaic of an olive grove in the eastern region of Morocco, using a manually elaborated training dataset of 4500 images of 24×24 pixels. The CNN detected and counted 2934 olive trees on the ortho-photo, achieving an overall accuracy of 97%, which rose to 99% after the OBIA refinement. The results of the proposed OBIA-CNN method were also compared with those of the template matching technique, the CNN method alone, and OBIA analysis alone to evaluate the performance of the approach. Our findings suggest that combining very high-resolution images with object-based deep learning is promising for the automatic detection and counting of olive trees, supporting accurate and sustainable agricultural monitoring.
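To illustrate the OBIA refinement step described above, the following is a minimal sketch, assuming scikit-image's SLIC super-pixel implementation and the standard Excess Green formulation ExG = 2G − R − B. It is not the authors' exact pipeline; the thresholds, segment count, and the `cnn_prob_map` input are hypothetical placeholders.

```python
# Minimal sketch of an OBIA-style refinement of a CNN probability map
# (illustrative only; thresholds and inputs are assumptions, not the paper's values).
import numpy as np
from skimage import io
from skimage.segmentation import slic
from skimage.measure import label

def excess_green(rgb):
    """Excess Green index ExG = 2G - R - B on an 8-bit RGB image scaled to [0, 1]."""
    rgb = rgb.astype(np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

def refine_and_count(ortho_rgb, cnn_prob_map,
                     n_segments=5000, prob_thr=0.5, exg_thr=0.05):
    """Keep super-pixels that look like olive canopy according to both the
    CNN probability map and the Excess Green index, then count connected blobs."""
    exg = excess_green(ortho_rgb)
    segments = slic(ortho_rgb, n_segments=n_segments, compactness=10, start_label=1)

    tree_mask = np.zeros(segments.shape, dtype=bool)
    for seg_id in np.unique(segments):
        inside = segments == seg_id
        # A super-pixel is kept only if it is both "probably a tree" (CNN)
        # and "green enough" (ExG) on average.
        if cnn_prob_map[inside].mean() > prob_thr and exg[inside].mean() > exg_thr:
            tree_mask[inside] = True

    # Each connected component of the refined mask is counted as one tree crown.
    labeled = label(tree_mask)
    return labeled.max(), tree_mask

# Example usage (file names are placeholders):
# ortho = io.imread("olive_orthomosaic.tif")[..., :3]
# prob  = np.load("cnn_probability_map.npy")   # same H x W as the ortho-photo
# n_trees, mask = refine_and_count(ortho, prob)
# print(f"Detected {n_trees} olive trees")
```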