Abstract

Rain streaks severely degrade the perceived content and structure of an image, so high-performance deraining algorithms are needed to remove the effects of various rain streaks before high-level computer vision tasks. Although existing deraining methods have made considerable progress, single image deraining remains challenging. In this paper, we first point out that existing Transformers lack sufficient capacity to capture channel attention, which restricts their deraining ability. To improve deraining performance, we propose a dual-branch deraining network based on the Transformer. One branch uses dense connections to link Transformer modules that embed a composite channel attention; it captures channel attention at a finer granularity to learn representations of rain-streak features. The other branch first extracts features at different scales by gradually enlarging the receptive field, then derives regional attention from these features, and finally uses this attention to guide the model toward regions with dense, large-scale rain streaks. By fusing the two branches, the model captures channel attention more finely while focusing on regions with dense, large-scale rain streaks. Extensive experiments on synthetic and real datasets demonstrate that the proposed method outperforms most state-of-the-art deraining methods.
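To make the described dual-branch design more concrete, the following is a minimal PyTorch sketch of the idea as stated in the abstract: one branch stacks Transformer-style blocks with channel attention and joins them with dense connections, the other builds multi-scale features with progressively larger receptive fields and turns them into a spatial attention map, and the two branches are fused to predict the rain layer. All module names, layer widths, dilation rates, and the exact attention formulations here are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the dual-branch deraining idea; architectural details are assumptions.
import torch
import torch.nn as nn


class ChannelAttentionTransformerBlock(nn.Module):
    """Transformer-style block with a squeeze-and-excitation style channel attention
    (a stand-in assumption for the paper's 'composite channel' attention)."""
    def __init__(self, channels, heads=4, reduction=8):
        super().__init__()
        self.norm = nn.GroupNorm(1, channels)
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.ca = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        tokens = self.norm(x).flatten(2).transpose(1, 2)           # (B, HW, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        spatial = attn_out.transpose(1, 2).reshape(b, c, h, w)
        return x + spatial * self.ca(spatial)                      # channel re-weighting


class DenselyConnectedBranch(nn.Module):
    """Branch 1: Transformer blocks joined by dense connections."""
    def __init__(self, channels, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [ChannelAttentionTransformerBlock(channels) for _ in range(num_blocks)]
        )
        # 1x1 convs fuse the densely accumulated features back to `channels`.
        self.fuse = nn.ModuleList(
            [nn.Conv2d(channels * (i + 2), channels, 1) for i in range(num_blocks)]
        )

    def forward(self, x):
        feats = [x]
        for block, fuse in zip(self.blocks, self.fuse):
            feats.append(block(feats[-1]))
            feats[-1] = fuse(torch.cat(feats, dim=1))              # dense connection
        return feats[-1]


class MultiScaleAttentionBranch(nn.Module):
    """Branch 2: gradually enlarges the receptive field with dilated convolutions,
    then predicts a spatial attention map for regions with dense, large rain streaks."""
    def __init__(self, channels):
        super().__init__()
        self.scales = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=d, dilation=d) for d in (1, 2, 4)]
        )
        self.to_attn = nn.Sequential(
            nn.Conv2d(channels * 3, 1, 3, padding=1), nn.Sigmoid()
        )

    def forward(self, x):
        feats = [torch.relu(conv(x)) for conv in self.scales]
        attn = self.to_attn(torch.cat(feats, dim=1))               # (B, 1, H, W)
        return x * attn


class DualBranchDerainNet(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.embed = nn.Conv2d(3, channels, 3, padding=1)
        self.branch1 = DenselyConnectedBranch(channels)
        self.branch2 = MultiScaleAttentionBranch(channels)
        self.out = nn.Conv2d(channels * 2, 3, 3, padding=1)

    def forward(self, rainy):
        f = self.embed(rainy)
        fused = torch.cat([self.branch1(f), self.branch2(f)], dim=1)
        return rainy - self.out(fused)   # predict the rain layer and subtract it


if __name__ == "__main__":
    net = DualBranchDerainNet()
    print(net(torch.randn(1, 3, 64, 64)).shape)  # torch.Size([1, 3, 64, 64])
```

The residual formulation at the end (subtracting a predicted rain layer from the rainy input) is a common convention in deraining networks and is used here only to make the sketch self-contained; the paper may combine the branches differently.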
