Abstract
Spectral confusion among land cover classes is common, and it is exacerbated in complex, heterogeneous systems such as the semi-arid Mediterranean environment; thus, combining new developments in remote sensing, such as multispectral imagery (MSI) captured by unmanned aerial vehicles (UAVs) and airborne light detection and ranging (LiDAR), with deep learning (DL) algorithms for land cover classification can help to address this problem. Therefore, we propose an image-based land cover classification methodology that fuses multispectral and airborne LiDAR data using convolutional neural network (CNN)-based semantic segmentation in a semi-arid Mediterranean area of the northeastern Aegean, Greece. The methodology consists of three stages: (i) data pre-processing, (ii) semantic segmentation, and (iii) accuracy assessment. The multispectral bands were stacked with the calculated Normalized Difference Vegetation Index (NDVI) and with the LiDAR-derived attributes (height, intensity, and number of returns) converted into two-dimensional (2D) images. A hyper-parameter analysis was then performed to investigate the impact of the input tile size, the prediction patch size, the learning rate, and the optimizer on the classification accuracy and training time of the U-Net architecture. Finally, comparative experiments were conducted by varying the input data type to test our hypothesis, and the CNN model's performance was analyzed using accuracy assessment metrics and visual comparison of the segmentation maps. The findings of this investigation showed that fusing multispectral and LiDAR data improves the classification accuracy of the U-Net, as it yielded the highest overall accuracy (OA) of 79.34% and a kappa coefficient (K) of 0.6966, compared to using multispectral (OA: 76.03%; K: 0.6538) or LiDAR (OA: 37.79%; K: 0.0840) data separately. Although some confusion still exists among the seven land cover classes observed, the U-Net delivered a detailed and reasonably accurate segmentation map.
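For readers who want a concrete picture of the two computational steps summarized above (the channel-level fusion of the multispectral bands, NDVI, and LiDAR-derived rasters, and the accuracy assessment with overall accuracy and kappa), a minimal NumPy sketch is given below. It assumes all layers have already been co-registered and rasterized to the same grid; every function and variable name is illustrative and does not come from the study.

```python
# Illustrative sketch only (not the authors' code). Assumes co-registered
# rasters of identical height/width; band indices are hypothetical.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); eps avoids division by zero."""
    return (nir - red) / (nir + red + eps)

def stack_inputs(ms_bands: np.ndarray,       # (H, W, B) multispectral bands
                 nir_idx: int, red_idx: int, # assumed band positions
                 lidar_height: np.ndarray,   # (H, W) rasterized LiDAR attributes
                 lidar_intensity: np.ndarray,
                 lidar_returns: np.ndarray) -> np.ndarray:
    """Stack MSI bands, NDVI, and 2D LiDAR rasters along the channel axis."""
    ndvi_layer = ndvi(ms_bands[..., nir_idx].astype(np.float32),
                      ms_bands[..., red_idx].astype(np.float32))
    extra = np.stack([ndvi_layer, lidar_height, lidar_intensity, lidar_returns],
                     axis=-1)
    return np.concatenate([ms_bands.astype(np.float32),
                           extra.astype(np.float32)], axis=-1)  # (H, W, B + 4)

def overall_accuracy_and_kappa(y_true: np.ndarray, y_pred: np.ndarray,
                               n_classes: int) -> tuple[float, float]:
    """Overall accuracy and Cohen's kappa from a pixel-wise confusion matrix."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(cm, (y_true.ravel(), y_pred.ravel()), 1)
    total = cm.sum()
    po = np.trace(cm) / total                        # observed agreement (OA)
    pe = (cm.sum(0) * cm.sum(1)).sum() / total ** 2  # chance agreement
    return float(po), float((po - pe) / (1 - pe))
```

The fused array would then be tiled and fed to the U-Net, with the single-source experiments obtained simply by dropping either the multispectral or the LiDAR channels from the stack.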