Abstract

Combining multiple satellite remote sensing sources provides a far richer, more frequent view of the Earth than any single source can; the challenge is in distilling these petabytes of heterogeneous sensor imagery into meaningful characterizations of the imaged areas. Meeting this challenge requires effective algorithms for combining multi-modal imagery over time to identify subtle but real changes amid the intrinsic data variation. Here, we implement a joint-distribution framework for multi-sensor anomalous change detection (MSACD) that can effectively account for these differences in modality and does not require any signal resampling of the pixel measurements. This flexibility enables the use of satellite imagery from different sensor platforms and modalities. We use the multi-year construction of SoFi Stadium in California as our testbed, and exploit synthetic aperture radar imagery from Sentinel-1 and multispectral imagery from both Sentinel-2 and Landsat 8. We show results for MSACD using real imagery with implanted, measurable changes, as well as real imagery with real, observable changes, including scaling our analysis over multiple years.
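The paper's specific joint-distribution method is not reproduced here, but the general idea behind Gaussian anomalous change detection can be sketched: model the joint distribution of co-registered pixel features from the two acquisitions, and score each pixel by how unlikely the *pair* is relative to how likely each measurement is on its own. The function below is an illustrative sketch under a Gaussian assumption, not the authors' implementation; the function name, regularization constant, and feature shapes are assumptions for the example.

```python
import numpy as np

def acd_scores(x, y):
    """Illustrative Gaussian anomalous-change-detection scores.

    x : (n_pixels, dx) pixel features from acquisition 1 (e.g., SAR channels)
    y : (n_pixels, dy) pixel features from acquisition 2 (e.g., multispectral bands)
    Returns a per-pixel anomalousness score: large where the (x, y) pair is
    unlikely under the joint model even if x and y are individually typical.
    """
    n, dx = x.shape
    z = np.hstack([x, y]).astype(float)
    z -= z.mean(axis=0)                      # center the stacked features
    xc, yc = z[:, :dx], z[:, dx:]

    def mahal(a):
        # Regularized sample covariance, then per-pixel Mahalanobis distance.
        cov = (a.T @ a) / n + 1e-6 * np.eye(a.shape[1])
        return np.einsum('ij,jk,ik->i', a, np.linalg.inv(cov), a)

    # Proportional to -log p(x, y) + log p(x) + log p(y) under Gaussian models:
    # high when the joint pair is surprising relative to its marginals.
    return mahal(z) - mahal(xc) - mahal(yc)
```

Because the score compares the joint term against the marginals, pervasive differences between sensors (each measurement being individually typical for its own modality) are absorbed by the marginal terms, and only pixels whose cross-sensor relationship breaks the learned pattern score highly.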
