In recent years, object detection in remote sensing imagery has attracted increasing attention. However, traditional optical detection is highly susceptible to illumination and weather conditions. Effectively exploiting cross-modality information from multisource remote sensing images, especially optical and synthetic aperture radar (SAR) images, to achieve all-day, all-weather detection with high accuracy and speed remains a challenge. Toward this end, a fast multisource fusion detection framework, FusionDet, is proposed in this paper. A novel distance-decay intersection over union is employed to encode the shape properties of targets with scale invariance, so that the same target in multisource images can be paired accurately. Furthermore, weighted Dempster–Shafer evidence theory is utilized to combine the optical and SAR detections, which avoids the drawback of feature-level fusion, namely its need for a large amount of paired data. In addition, paired optical and SAR images are collected to demonstrate the fusion algorithm. Extensive experiments and ablation studies on multiple datasets show that the method is broadly effective and achieves state-of-the-art detection performance.
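The abstract does not give FusionDet's exact mass assignment, so the following is only a minimal sketch of how a weighted Dempster–Shafer combination of two detectors' confidences could look. It assumes each detector's confidence, discounted by a reliability weight, is assigned to the "target" hypothesis and the remainder to the full frame of discernment (uncertainty); the function name `ds_fuse` and the weighting scheme are illustrative, not the paper's.

```python
# Hedged sketch: decision-level fusion of optical and SAR detection
# confidences via a weighted Dempster-Shafer combination.

def ds_fuse(p_opt, p_sar, w_opt=1.0, w_sar=1.0):
    """Combine two detection confidences with Dempster's rule.

    p_opt, p_sar : detector confidences in [0, 1]
    w_opt, w_sar : reliability weights discounting each source
    """
    # Discounted basic mass assignments: mass on {target},
    # remainder on the frame of discernment Theta (uncertainty).
    m1_t, m1_theta = w_opt * p_opt, 1.0 - w_opt * p_opt
    m2_t, m2_theta = w_sar * p_sar, 1.0 - w_sar * p_sar
    # With no mass placed on {not-target}, the conflict term K is zero;
    # the combined mass on {target} is everything except joint uncertainty.
    m_t = m1_t * m2_t + m1_t * m2_theta + m1_theta * m2_t
    m_theta = m1_theta * m2_theta
    return m_t / (m_t + m_theta)  # normalised fused confidence
```

For example, `ds_fuse(0.9, 0.8)` yields 0.98: two mutually supporting detections reinforce each other, while lowering a source's weight pulls the fused score back toward the other source. A feature-level fusion network would need jointly registered training pairs to learn this combination; the evidence-theoretic rule needs none, which is the drawback the abstract says decision-level fusion avoids.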