Abstract
Cloud cover inevitably degrades the visibility of remote sensing images; consequently, cloud detection is an essential prerequisite for applications of remote sensing imagery. In Refined UNets, we explored solutions to edge-precise cloud and shadow detection for remote sensing images of the Landsat 8 OLI set; in particular, v2 and v3 attempted to provide end-to-end inference prototypes. However, Refined UNets exploit iterative CRF inference to refine the edges of cloud and shadow regions, which makes it difficult to train the entire model jointly. We instead investigate a fully differentiable model that satisfies the requirement of end-to-end training. To this end, we present an efficient and lightweight model for identifying cloud regions, referred to as Refined UNet lite, which facilitates end-to-end training and inference and contributes to edge-precise cloud detection. Specifically, the UNet backbone locates cloud regions coarsely, and the subsequent guided-filter layer refines their fine-grained edges. All operators within our model are fully differentiable, yielding an end-to-end trainable model rather than a partially trainable one. In addition, practical time consumption decreases significantly owing to the lightweight architecture of our model. Comprehensive experiments confirm our contributions to efficient, edge-precise cloud detection. The implementation of Refined UNet lite is currently available at https://github.com/92xianshen/refined-unet-lite.
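To illustrate the refinement step the abstract describes, the following is a minimal NumPy sketch of the classic guided filter (He et al.) on which a guided-filter layer is based: a guidance image I steers the smoothing of a coarse probability map p so that output edges align with edges in I. The function names, window radius r, and regularization eps here are illustrative assumptions, not the paper's actual implementation, which realizes these operations as differentiable layers inside the network.

```python
import numpy as np

def box_filter(x, r):
    """Mean over a (2r+1)x(2r+1) window, edge-padded, via 2-D cumulative sums."""
    k = 2 * r + 1
    xp = np.pad(x, r, mode="edge")
    c = np.cumsum(np.cumsum(xp, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))  # zero row/col so window sums are a 4-corner difference
    window_sum = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return window_sum / (k * k)

def guided_filter(I, p, r=2, eps=1e-4):
    """Refine coarse map p (e.g., cloud probabilities) using guidance image I.

    Computes the local linear model q = a * I + b per window, then averages
    the coefficients; every step is a box filter or pointwise op, hence
    differentiable when re-expressed with framework tensor ops.
    """
    mean_I = box_filter(I, r)
    mean_p = box_filter(p, r)
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    var_I = box_filter(I * I, r) - mean_I ** 2
    a = cov_Ip / (var_I + eps)          # edge-aware gain
    b = mean_p - a * mean_I             # offset
    return box_filter(a, r) * I + box_filter(b, r)
```

Because the filter reduces to box filters and pointwise arithmetic, the same computation can be expressed with standard convolution/pooling operators of a deep learning framework, which is what makes an end-to-end trainable UNet-plus-guided-filter pipeline possible.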