Deep Neural Networks (DNNs), particularly Convolutional Neural Networks (CNNs) with their exceptional accuracy in image processing tasks, have garnered significant attention across various research domains due to their impressive performance. However, the opaque nature of DNNs has raised concerns about their trustworthiness, as users often cannot understand how a model arrives at its predictions or decisions. This lack of transparency is particularly problematic in high-stakes fields such as healthcare, finance, and law. Consequently, there has been a surge in the development of explanation methods for DNNs. Typically, the effectiveness of these methods is assessed subjectively through human inspection of the heatmaps or attribution maps generated by eXplainable AI (XAI) methods. In this paper, a novel GeoStatistics Explainable Artificial Intelligence (GSEAI) framework is proposed, which integrates spatial pattern analysis from geostatistics with XAI algorithms to assess and compare the understandability of XAI methods. Global and local Moran's I indices, commonly used to assess the spatial autocorrelation of geographic data, help characterize the spatial distribution patterns of attribution maps produced by XAI methods by measuring their degree of aggregation or dispersion. Interpreting and analyzing attribution maps with the Moran's I scatterplot and LISA cluster maps provides an objective, quantitative assessment of the global spatial distribution of feature attributions and yields a more understandable local interpretation. We conduct experiments on aircraft detection in SAR images based on the widely used YOLOv5 network and evaluate four mainstream XAI methods both quantitatively and qualitatively. By using GSEAI to analyze the explanations of a given DNN, we can gain deeper insight into the network's behavior and thereby enhance the trustworthiness of DNN applications. To the best of our knowledge, this is the first time XAI has been integrated with geostatistical algorithms in the SAR domain, which broadens the analytical approaches of XAI and promotes its development within SAR image analytics.
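As a rough illustration of the spatial-autocorrelation statistics the framework builds on, the sketch below computes global Moran's I and per-pixel local Moran's I (LISA) for a small attribution map using standard rook-contiguity weights on the pixel grid. This is a minimal sketch derived from the textbook Moran's I formulas, not the authors' GSEAI implementation; the function names, grid size, and synthetic data are assumptions for illustration only.

```python
import numpy as np

def grid_rook_weights(h, w):
    """Binary rook-contiguity weight matrix for an h x w pixel grid (hypothetical helper)."""
    n = h * w
    W = np.zeros((n, n))
    for r in range(h):
        for c in range(w):
            i = r * w + c
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w:
                    W[i, rr * w + cc] = 1.0
    return W

def morans_i(attr_map):
    """Global Moran's I and per-pixel local Moran's I (LISA) of an attribution map."""
    x = attr_map.ravel().astype(float)
    W = grid_rook_weights(*attr_map.shape)
    z = x - x.mean()                       # deviations from the mean attribution
    s2 = (z ** 2).sum() / x.size           # second moment used by local Moran's I
    lag = W @ z                            # spatial lag of deviations
    I_global = (x.size / W.sum()) * (z @ lag) / (z ** 2).sum()
    I_local = (z / s2) * lag               # LISA values; quadrants give HH/LL/HL/LH clusters
    return I_global, I_local.reshape(attr_map.shape)

# Example: a synthetic 8x8 attribution map with one clustered "hot" region
attr = np.zeros((8, 8))
attr[2:5, 2:5] = 1.0
g, local = morans_i(attr)
print(f"global Moran's I = {g:.3f}")       # clearly > 0: attributions are spatially clustered
```

In this reading, a large positive global Moran's I indicates that an XAI method concentrates attribution into coherent spatial clusters (e.g., on the aircraft target), whereas values near zero suggest scattered, noise-like attributions; the local values map where such clustering occurs.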