Abstract

Shadow detection has developed rapidly with the widespread adoption of deep neural networks. Benefiting from large numbers of training images annotated with pixel-level ground-truth masks, current deep shadow detectors have achieved state-of-the-art performance. However, providing a pixel-level ground-truth mask for every training image is expensive and time-consuming. To address this, this paper proposes the first unsupervised deep shadow detection framework, which consists of an initial pseudo label generation (IPG) module, a curriculum learning (CL) module, and a self-training (ST) module. The supervision signals used in our learning framework are generated by several existing traditional unsupervised shadow detectors and therefore typically contain considerable noise. Accordingly, each module in our unsupervised framework is designed to reduce the adverse influence of this noise on model training. Specifically, the IPG module fuses shadow maps from different traditional unsupervised detectors to exploit their complementary shadow information. After the initial pseudo labels are obtained, the CL module and the ST module work in conjunction to gradually learn new shadow patterns while simultaneously improving the quality of the pseudo labels. Extensive experimental results on various benchmark datasets demonstrate that our deep shadow detector not only outperforms traditional unsupervised shadow detection methods by a large margin but also achieves results comparable to some recent state-of-the-art fully-supervised deep shadow detection methods.
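To make the IPG → CL → ST pipeline described above concrete, the following is a minimal sketch of one plausible instantiation. All names here (fuse_pseudo_labels, TinyShadowNet, label_confidence, the averaging-plus-thresholding fusion rule, and the agreement-based curriculum score) are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch: fuse traditional shadow maps into pseudo labels (IPG), then
# alternate curriculum-ordered training (CL) with pseudo-label refresh (ST).
import torch
import torch.nn as nn
import torch.nn.functional as F

def fuse_pseudo_labels(maps):
    """IPG: combine several traditional unsupervised shadow maps (values in
    [0, 1]) into one initial pseudo label. Averaging + thresholding is an
    assumed fusion rule, standing in for the paper's combination strategy."""
    fused = torch.stack(maps).mean(dim=0)
    return (fused > 0.5).float()

class TinyShadowNet(nn.Module):
    """Stand-in detector: a few conv layers producing per-pixel shadow logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),
        )
    def forward(self, x):
        return self.net(x)

def label_confidence(pred, pseudo):
    """Per-image agreement between prediction and pseudo label, used here as
    a simple curriculum score (high agreement = 'easy' sample)."""
    bce = F.binary_cross_entropy(pred.clamp(1e-6, 1 - 1e-6), pseudo,
                                 reduction="none")
    return 1.0 - bce.mean(dim=(1, 2, 3))

def train_round(model, opt, images, pseudo, keep_ratio):
    """CL: train only on the easiest `keep_ratio` fraction of images."""
    with torch.no_grad():
        conf = label_confidence(torch.sigmoid(model(images)), pseudo)
    k = max(1, int(keep_ratio * len(images)))
    easy = conf.topk(k).indices
    opt.zero_grad()
    loss = F.binary_cross_entropy_with_logits(model(images[easy]), pseudo[easy])
    loss.backward()
    opt.step()
    return loss.item()

def self_train(model, images, pseudo, rounds=5):
    """ST: after each round, refresh the pseudo labels with the model's own
    predictions, so label quality and the model improve together."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for r in range(rounds):
        # Curriculum schedule: grow from the easiest half to all samples.
        keep = 0.5 + 0.5 * r / max(1, rounds - 1)
        train_round(model, opt, images, pseudo, keep)
        with torch.no_grad():
            pseudo = (torch.sigmoid(model(images)) > 0.5).float()
    return model, pseudo

# Toy usage: 8 RGB images with 3 "traditional" shadow maps per image.
images = torch.rand(8, 3, 64, 64)
maps = [torch.rand(8, 1, 64, 64) for _ in range(3)]
pseudo = fuse_pseudo_labels(maps)
model, refined_labels = self_train(TinyShadowNet(), images, pseudo)
```

The key design point this sketch illustrates is the interaction between CL and ST: early rounds train only on samples whose pseudo labels the model already agrees with (limiting the influence of noisy labels), and each refresh step replaces the labels with current predictions so that later, harder samples are supervised by progressively cleaner targets.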
