Abstract

Deep neural networks (DNNs) are known to be vulnerable to adversarial examples: adding carefully crafted perturbations to the inputs can mislead a DNN model, which poses security and safety threats in the real world. However, generating effective adversarial examples in the physical world is challenging because of many uncontrollable physical dynamics. Current physical attack methods aim to generate robust physical adversarial examples by simulating all possible physical dynamics. When attacking a new image or a new DNN model, they require either expensive manual effort to simulate the physical dynamics or considerable time for iterative optimization. To tackle these limitations, we propose a robust and generalized physical adversarial attack method with Meta-GAN (Meta-GAN Attack), which not only generates robust physical adversarial examples but also generalizes to attacking novel images and novel DNN models using only a few digital and physical images. First, we craft robust physical adversarial examples with a generative attack model by simulating color and shape distortions. Second, we formulate the physical attack as a few-shot learning problem and design a novel class-agnostic and model-agnostic meta-learning algorithm to solve it. Extensive experiments on two benchmark datasets under four challenging experimental settings verify the superior robustness and generalization of our method in comparison with state-of-the-art physical attack methods. The source code is released on GitHub.
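To make the first step concrete, color and shape distortions can be simulated in the digital domain and the attack optimized over many random draws of them, in the spirit of expectation over transformation. The sketch below is a minimal illustration of such a simulation step, not the paper's implementation; the function name, parameter ranges, and distortion choices are assumptions for illustration.

    import torch
    import torch.nn.functional as F

    def simulate_physical_dynamics(x, max_color_shift=0.1, max_rotate_deg=10.0):
        """Apply one random color and shape distortion to a batch of images x in [0, 1].

        Illustrative sketch only: parameter ranges and distortion types are assumed,
        not taken from the paper.
        """
        b = x.size(0)

        # Color distortion: per-image brightness scale and additive shift.
        scale = 1.0 + (torch.rand(b, 1, 1, 1, device=x.device) * 2 - 1) * max_color_shift
        shift = (torch.rand(b, 1, 1, 1, device=x.device) * 2 - 1) * max_color_shift
        x = (x * scale + shift).clamp(0.0, 1.0)

        # Shape distortion: small random rotation applied via an affine sampling grid.
        theta_deg = (torch.rand(b, device=x.device) * 2 - 1) * max_rotate_deg
        theta = theta_deg * torch.pi / 180.0
        cos, sin = torch.cos(theta), torch.sin(theta)
        zeros = torch.zeros_like(cos)
        affine = torch.stack(
            [torch.stack([cos, -sin, zeros], dim=1),
             torch.stack([sin, cos, zeros], dim=1)], dim=1)  # shape (b, 2, 3)
        grid = F.affine_grid(affine, x.shape, align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

    # Usage idea: the adversarial image produced by the generative attack model
    # (e.g., x_adv = x + generator(x), a hypothetical call) would be passed through
    # simulate_physical_dynamics several times, and the attack loss averaged over
    # these random draws so the perturbation survives printing and re-photographing.

In the full method, this distortion-sampling objective would be wrapped by the meta-learning stage, which optimizes the generative attack model in a MAML-style inner/outer loop over few-shot attack tasks so that it adapts quickly to novel images and novel DNN models.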
