Abstract
Deep learning has shown superiority in complicated, specialized tasks (e.g., computer vision, audio, and language processing). However, research has confirmed that Deep Neural Networks (DNNs) are vulnerable to carefully crafted adversarial perturbations, which confuse DNNs on specific tasks. In the object detection domain, the background contributes little to object classification; adversarial perturbations added to the background do not improve the attack's effectiveness in fooling deep neural detection models, yet they induce substantial distortion in the generated examples. Motivated by this, we introduce an adversarial attack algorithm named the Adaptive Object-oriented Adversarial Method (AO2AM). It fools deep neural object detection networks by adaptively accumulating object-based gradients and adding the resulting adaptive object-based adversarial perturbations only onto the objects rather than the whole frame of the input image. AO2AM effectively pushes the latent-space representations of the generated adversarial samples toward the decision boundary, forcing deep neural detection networks to yield inaccurate locations and false classifications during detection. Compared with existing adversarial attack methods whose perturbations act on the global scale of the original inputs, the adversarial examples produced by AO2AM effectively fool deep neural object detection networks while maintaining high structural similarity with the corresponding clean inputs. Attacking Faster R-CNN, AO2AM achieves an attack success rate (ASR) above 98.00% on pre-processed Pascal VOC 2007&2012 (Val) and an SSIM above 0.870. In fooling SSD, AO2AM attains an SSIM exceeding 0.980 under the L2-norm constraint. On SSIM and Mean Attack Ratio, AO2AM outperforms adversarial attack methods based on global-scale perturbations.
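The core idea of restricting perturbations to object regions can be sketched as follows. This is a minimal illustrative example, not the paper's actual AO2AM procedure: the helper names, the single FGSM-style sign step (in place of the adaptive gradient accumulation described above), and the `(x1, y1, x2, y2)` box format are all assumptions for the sake of the sketch.

```python
import numpy as np

def object_mask(shape, boxes):
    """Binary mask: 1 inside any detected object box, 0 on the background.

    boxes: list of (x1, y1, x2, y2) pixel coordinates (assumed format).
    """
    mask = np.zeros(shape[:2], dtype=np.float32)
    for x1, y1, x2, y2 in boxes:
        mask[y1:y2, x1:x2] = 1.0
    return mask[..., None]  # add channel axis so it broadcasts over C

def object_perturb(image, grad, boxes, eps=8.0 / 255.0):
    """One sign-gradient step applied only inside object regions.

    image, grad: H x W x C float arrays in [0, 1]; grad is the gradient of
    the detection loss w.r.t. the image, assumed precomputed elsewhere.
    """
    mask = object_mask(image.shape, boxes)
    adv = image + eps * np.sign(grad) * mask
    return np.clip(adv, 0.0, 1.0)

# Toy usage: an 8x8 gray image, one object box, a random "gradient".
rng = np.random.default_rng(0)
img = np.full((8, 8, 3), 0.5, dtype=np.float32)
grad = rng.standard_normal((8, 8, 3)).astype(np.float32)
adv = object_perturb(img, grad, boxes=[(2, 2, 6, 6)])
```

Because the perturbation is multiplied by the object mask, background pixels remain identical to the clean input, which is what keeps the structural similarity (SSIM) of the adversarial example high compared with global-scale perturbations.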