Abstract

The rapid development of Deepfake technology has already had, and will continue to have, substantial negative impacts on our daily lives. As widely circulated Deepfake videos have become difficult to distinguish by eye, various Deepfake detection approaches based on deep learning models have been attempted. However, although some existing detection methods achieve reasonable performance on statistical evaluation metrics, the underlying Deepfake forensic traces are rarely discussed. In this study, we investigate the distinctive forensic noise traces within Deepfake image frames and propose a noise-based Deepfake detection approach using a deep neural network. We train a Siamese noise extractor with a novel face-background strategy to capture the differing forensic noise traces of a synthesized face region and an unmodified background region. A similarity matrix module then analyzes the noise-trace difference between a cropped face square and a cropped background square from a candidate image frame to make the Deepfake decision. Trained on the high-quality Celeb-DF dataset, our proposed model achieves state-of-the-art performance with 99.15% accuracy and a 99.92% AUC score on the in-dataset test set, and an 88.95% AUC score on a highly challenging unknown-attack Deepfake video dataset. Furthermore, visualization of the Deepfake forensic noise traces shows a clear distinction between synthesized faces and unmodified regions. We believe such visual evidence provides stronger support for Deepfake detection results than statistical evaluation numbers alone.
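
The sketch below illustrates, in simplified form, the kind of pipeline the abstract describes: a shared-weight (Siamese) noise extractor applied to a face crop and a background crop, followed by a similarity-matrix comparison feeding a real/fake classifier. This is a minimal, assumed reconstruction; the actual backbone, crop sizes, similarity computation, and classifier in the paper may differ, and all layer names and dimensions here are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoiseExtractor(nn.Module):
    """Small convolutional encoder mapping an image patch to a grid of noise descriptors."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),            # pool to an 8x8 spatial grid
            nn.Conv2d(64, feat_dim, 1),
        )

    def forward(self, x):
        return self.encoder(x)                  # (B, feat_dim, 8, 8)

class SiameseNoiseDetector(nn.Module):
    """Siamese extractor on face and background crops; a cosine-similarity matrix
    between the two feature grids is classified as real or fake."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.extractor = NoiseExtractor(feat_dim)   # shared weights for both branches
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, face, background):
        f = self.extractor(face)                    # (B, C, 8, 8)
        b = self.extractor(background)              # (B, C, 8, 8)
        f = F.normalize(f.flatten(2), dim=1)        # (B, C, 64), unit-norm channels
        b = F.normalize(b.flatten(2), dim=1)
        sim = torch.bmm(f.transpose(1, 2), b)       # (B, 64, 64) similarity matrix
        return self.classifier(sim)                 # logit: noise mismatch suggests fake

# Hypothetical usage: face and background squares cropped from the same frame.
model = SiameseNoiseDetector()
face = torch.randn(4, 3, 64, 64)
background = torch.randn(4, 3, 64, 64)
logits = model(face, background)                    # (4, 1)
```

The key design idea reflected here is that both crops pass through the same extractor, so any systematic difference in the similarity matrix can be attributed to forensic noise introduced by face synthesis rather than to differences between two separate networks.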
