Abstract

In this paper, we address the problem of video-based rain streak removal by developing an event-aware multi-patch progressive neural network. Rain streaks in video exhibit correlations in both the temporal and spatial dimensions, and existing methods have difficulty modeling these characteristics. Motivated by this observation, we propose a module that encodes events from neuromorphic cameras to facilitate deraining. Events are captured asynchronously at the pixel level only when the intensity change exceeds a certain threshold. Owing to this property, events carry considerable information about moving objects, including rain streaks passing through the camera's view across adjacent frames, and exploiting this information properly yields non-trivial gains in deraining performance. In addition, we develop a multi-patch progressive neural network: partitioning the input into patches provides varied receptive fields, and progressive learning across patch levels lets the model emphasize each level to a different extent. Extensive experiments show that our event-guided method outperforms state-of-the-art methods by a large margin on both synthetic and real-world datasets.
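To make the two ingredients described above more concrete, the sketch below simulates thresholded event generation from adjacent frames and a simple multi-patch partition. This is an illustrative approximation only, not the authors' implementation: the log-intensity formulation, the threshold value, and the 1/2/4 vertical patch split are assumptions chosen for clarity.

```python
import numpy as np

def simulate_events(prev_frame, curr_frame, threshold=0.2):
    """Approximate an event camera's output from two adjacent intensity frames.

    An event fires at a pixel only when the log-intensity change between frames
    exceeds `threshold`; the polarity records the sign of the change.
    Returns a map in {-1, 0, +1} (negative event, no event, positive event).
    NOTE: the threshold and log formulation are illustrative assumptions.
    """
    eps = 1e-6  # avoid log(0)
    diff = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    events = np.zeros_like(diff, dtype=np.int8)
    events[diff > threshold] = 1
    events[diff < -threshold] = -1
    return events

def partition_patches(frame, levels=(1, 2, 4)):
    """Split a frame into non-overlapping horizontal strips at several levels,
    giving a simple multi-patch hierarchy (1, 2, 4 patches per level).
    NOTE: the paper's actual partitioning scheme may differ."""
    h = frame.shape[0]
    pyramid = []
    for n in levels:
        step = h // n
        pyramid.append([frame[i * step:(i + 1) * step] for i in range(n)])
    return pyramid

# Example: grayscale frames in [0, 1]; a moving rain streak changes intensity
# sharply between frames and therefore triggers events along its path.
prev = np.random.rand(64, 64).astype(np.float32)
curr = prev.copy()
curr[10:40, 20] += 0.5  # hypothetical rain streak brightening a column
events = simulate_events(prev, curr)
patches = partition_patches(curr)
print(int(np.abs(events).sum()), [len(level) for level in patches])
```

In this toy setting, only the pixels crossed by the streak produce events, which is why the event map acts as a cheap motion cue for the deraining network, while the patch pyramid supplies inputs at several receptive-field scales.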
