Abstract

Deep neural networks are vulnerable to adversarial examples, which are crafted by applying small, human-imperceptible perturbations to the original images so as to mislead the networks into inaccurate predictions. Adversarial attacks can thus be an important tool for evaluating and selecting robust models in safety-critical applications. However, under the challenging black-box setting, most existing adversarial attacks achieve relatively low success rates on both normally trained and adversarially trained networks. In this paper, we regard the generation of adversarial examples as an optimization process similar to deep neural network training. From this point of view, we introduce the AdaBelief optimizer and crop invariance into the generation of adversarial examples, and propose the AdaBelief Iterative Fast Gradient Method (ABI-FGM) and the Crop-Invariant attack Method (CIM) to improve the transferability of adversarial examples. By adopting an adaptive learning rate in the iterative attack, ABI-FGM improves the convergence process, yielding more transferable adversarial examples. CIM is based on our discovery of the crop-invariant property of deep neural networks, which we leverage to optimize the adversarial perturbations over an ensemble of cropped copies, thereby avoiding overfitting to the white-box model under attack and improving the transferability of the adversarial examples. ABI-FGM and CIM can be readily integrated into a strong gradient-based attack that further boosts the success rates of black-box attacks. Moreover, our method can be naturally combined with other gradient-based attack methods to build a more robust attack that generates more transferable adversarial examples against defense models. Extensive experiments on the ImageNet dataset demonstrate the method's effectiveness: on both normally trained and adversarially trained networks, our method achieves higher success rates than state-of-the-art gradient-based attack methods.
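The two ideas described above can be sketched together: an iterative sign attack whose step direction comes from AdaBelief-style moment estimates, with each gradient averaged over randomly cropped copies of the current adversarial image. The sketch below is illustrative only and assumes a user-supplied `grad_fn` (the loss gradient of the white-box model); the zero-padded crop masking, hyperparameter values, and function name are our assumptions, not the authors' implementation.

```python
import numpy as np

def abi_cim_attack(x, grad_fn, eps=0.1, steps=5, n_crops=4,
                   beta1=0.9, beta2=0.999, delta=1e-8, rng=None):
    """Illustrative sketch: AdaBelief-style iterative sign attack (ABI-FGM)
    with gradients averaged over random crop copies (CIM)."""
    rng = rng or np.random.default_rng(0)
    alpha = eps / steps                 # per-step budget within the L_inf ball
    x_adv = x.copy()
    m = np.zeros_like(x)                # first moment (exponential moving average)
    s = np.zeros_like(x)                # AdaBelief second moment: EMA of (g - m)^2
    h, w = x.shape[-2], x.shape[-1]
    for t in range(1, steps + 1):
        # CIM: average the gradient over randomly cropped (zero-padded) copies
        g = np.zeros_like(x)
        for _ in range(n_crops):
            mask = np.zeros_like(x)
            ch = rng.integers(h // 2, h + 1)      # crop height
            cw = rng.integers(w // 2, w + 1)      # crop width
            top = rng.integers(0, h - ch + 1)
            left = rng.integers(0, w - cw + 1)
            mask[..., top:top + ch, left:left + cw] = 1.0
            g += grad_fn(x_adv * mask)
        g /= n_crops
        # ABI-FGM: AdaBelief update adapts to the "belief" in the gradient
        m = beta1 * m + (1 - beta1) * g
        s = beta2 * s + (1 - beta2) * (g - m) ** 2
        m_hat = m / (1 - beta1 ** t)              # bias correction
        s_hat = s / (1 - beta2 ** t)
        x_adv = x_adv + alpha * np.sign(m_hat / (np.sqrt(s_hat) + delta))
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project back to the L_inf ball
    return x_adv
```

In this sketch the perturbation stays within the `eps` L-infinity ball by construction, and the AdaBelief denominator shrinks the effective step where the gradient disagrees with its running mean, which is the adaptive behavior the abstract attributes to ABI-FGM.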

