Abstract

The fusion of synthetic aperture radar (SAR) and optical satellite data is widely used for deep-learning-based scene classification. Counter-intuitively, such neural networks are still sensitive to changes in single data sources, which can lead to unexpected behavior and a significant drop in performance when individual sensors fail or when clouds obscure the optical image. In this paper, we incorporate source-wise out-of-distribution (OOD) detection into the fusion process at test time so that useless or even harmful information is not considered for the prediction. To this end, we propose a modified training procedure together with an adaptive fusion approach that weights the extracted information based on the source-wise in-distribution probabilities. We evaluate the proposed approach on the BigEarthNet multilabel scene classification data set and on several additional OOD test cases, such as missing or damaged data, clouds, unknown classes, and coverage by snow and ice. The results show a significant improvement in robustness to different types of OOD data affecting only individual data sources, while the approach maintains the classification performance of the compared baseline approaches. The code for the experiments in this paper is available on GitHub: https://github.com/JakobCode/OOD_DataFusion
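The core idea of the adaptive fusion can be illustrated with a minimal sketch: each source's feature vector is weighted by its in-distribution probability before fusion, so a source flagged as OOD (e.g. a clouded optical image) contributes little to the prediction. The function name `adaptive_fusion`, the normalized weighted sum, and the example probabilities below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def adaptive_fusion(features, in_dist_probs):
    """Fuse per-source feature vectors by a weighted sum, where each
    weight is the source's (assumed) in-distribution probability,
    normalized so the weights sum to one."""
    features = np.asarray(features, dtype=float)   # shape: (n_sources, d)
    w = np.asarray(in_dist_probs, dtype=float)     # shape: (n_sources,)
    w = w / max(w.sum(), 1e-12)                    # normalize; guard against all-zero
    return (w[:, None] * features).sum(axis=0)     # fused feature, shape: (d,)

# Example: the SAR source is in-distribution (p=0.9), while the optical
# source is likely OOD, e.g. obscured by clouds (p=0.1).
sar_feat = [1.0, 0.0]
opt_feat = [0.0, 1.0]
fused = adaptive_fusion([sar_feat, opt_feat], [0.9, 0.1])
# fused is dominated by the trusted SAR feature: [0.9, 0.1]
```

In this sketch a failing sensor simply receives a near-zero weight, which matches the robustness behavior the abstract describes for source-wise OOD data.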
