Abstract

Current deep learning-based image manipulation localization methods achieve impressive performance when rich spatial features and information are fully utilized. However, most of them are distracted by semantic content that is irrelevant to identifying manipulation traces, which leads to false alarms when recognizing forged regions. In this paper, we propose a Progressively-Refined Neural Network (PR-Net) that localizes tampered regions progressively under a coarse-to-fine workflow. Specifically, PR-Net is composed of a Feature Extractor (FE) that captures intrinsic feature correlations and a Mask Generation Module (MGM) with three refining generators. The FE uses a CNN to extract image features and introduces an attention mechanism, the Convolutional Block Attention Module (CBAM), to suppress image content and guide the extractor toward the inconsistencies between manipulated and authentic regions. The MGM comprises three generators: the Coarse Mask RR-Generator produces a rough localization result, the Candidate Mask RR-Generator generates possible tampered regions based on that rough localization, and the Fine Mask RR-Generator produces the final prediction of the manipulated regions. We also employ a Rotated Residual (RR) structure to suppress image content during the generative process. Extensive experimental results on four benchmark datasets (NIST16, COVER, CASIA v1.0, and In-The-Wild) demonstrate the superior performance of PR-Net over state-of-the-art methods in localizing manipulated regions.
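The CBAM attention referenced in the abstract applies channel attention followed by spatial attention to reweight a feature map. Below is a minimal NumPy sketch of that two-step mechanism, not the paper's implementation: the shared two-layer MLP weights (`W1`, `W2`), the reduction ratio, and the k×k spatial kernel are illustrative placeholders rather than trained parameters from PR-Net.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, W1, W2):
    """Channel attention: shared MLP over average- and max-pooled descriptors.

    feat: (C, H, W); W1: (C//r, C); W2: (C, C//r) with reduction ratio r.
    """
    avg = feat.mean(axis=(1, 2))          # (C,) average-pooled descriptor
    mx = feat.max(axis=(1, 2))            # (C,) max-pooled descriptor
    att = sigmoid(W2 @ np.maximum(W1 @ avg, 0.0)
                  + W2 @ np.maximum(W1 @ mx, 0.0))
    return feat * att[:, None, None]      # reweight each channel

def spatial_attention(feat, kernel):
    """Spatial attention: convolve channel-pooled maps into one sigmoid mask.

    kernel: (2, k, k) applied to the stacked [avg, max] maps.
    """
    stacked = np.stack([feat.mean(axis=0), feat.max(axis=0)])  # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = feat.shape[1:]
    out = np.zeros((H, W))
    for i in range(H):                    # naive same-padding convolution
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return feat * sigmoid(out)[None]      # reweight each spatial location

def cbam(feat, W1, W2, kernel):
    """CBAM block: channel attention, then spatial attention."""
    return spatial_attention(channel_attention(feat, W1, W2), kernel)
```

Because both attention maps are sigmoid-valued, the block only attenuates responses, which matches its role here of suppressing image content so the extractor can focus on tampering inconsistencies.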

