Abstract
Deep neural networks (DNNs) are widely used in AI-controlled Cyber-Physical Systems (CPS) to control cars, robots, water treatment plants, and railways. However, DNNs are vulnerable to well-designed input samples called adversarial examples. Adversarial attacks are an important technique for detecting and improving the security of neural networks. Existing attacks, including the state-of-the-art black-box attack, have a low success rate and make invalid queries that do not help determine the direction in which to generate adversarial examples. For these reasons, this paper proposes a CMA-ES-based adversarial attack on black-box DNNs. First, an efficient method to reduce the number of invalid queries is introduced. Second, a black-box attack that generates adversarial examples by fitting a high-dimensional independent Gaussian distribution of the local solution space is proposed. Finally, a new CMA-based perturbation compression method is applied to make the process of reducing perturbation smoother. Experimental results on ImageNet classifiers show that the proposed attack achieves a higher success rate than the state-of-the-art black-box attack while reducing the number of queries by 30%.
Highlights
Artificial intelligence techniques are increasingly applied to decision-making problems in Cyber-Physical Systems, such as robotics, autonomous vehicles, chemical plants, etc.
The attack may get stuck in an endless loop and even obtain worse results than before [34]–[36]. To solve this problem of invalid querying under the local information setting, we propose a practical valid-query positioning algorithm based on the covariance matrix adaptation evolution strategy (CMA-ES) [37], which we call the valid evolution algorithm
This paper focuses on generating adversarial examples for the black-box model under the local information setting
Summary
Artificial intelligence techniques are increasingly applied to decision-making problems in Cyber-Physical Systems, such as robotics, autonomous vehicles, chemical plants, etc. A few black-box attack methods have noticed the problem of invalid queries, for example, NES+PGD [19] and the decision-based attack [18]. Both methods start the attack by inputting two images simultaneously: one original image and one target image. Even so, the attack may get stuck in an endless loop and even obtain worse results than before [34]–[36]. To solve this problem of invalid querying under the local information setting, we propose a practical valid-query positioning algorithm based on the covariance matrix adaptation evolution strategy (CMA-ES) [37], which we call the valid evolution algorithm.
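The valid evolution algorithm itself is not detailed on this page. As a rough illustration of the underlying idea only, sampling candidate perturbations from a high-dimensional independent (diagonal) Gaussian and adapting its mean and per-coordinate scales from ranked black-box queries, here is a minimal separable evolution-strategy sketch. The function name, toy loss, and all parameter values are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def separable_es_attack(loss, x0, sigma=0.5, pop=20, iters=50, seed=0):
    """Sketch of a separable (diagonal-covariance) evolution strategy:
    sample from an independent Gaussian, rank samples by a black-box
    loss, then update the mean and per-coordinate step sizes."""
    rng = np.random.default_rng(seed)
    mean = np.array(x0, dtype=float)
    stds = np.full_like(mean, sigma)          # one step size per coordinate
    mu = pop // 2                             # number of selected elites
    weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights /= weights.sum()                  # log-rank recombination weights
    for _ in range(iters):
        z = rng.standard_normal((pop, mean.size))
        cand = mean + z * stds                # independent Gaussian samples
        order = np.argsort([loss(c) for c in cand])[:mu]
        sel = z[order]                        # elite directions, best first
        mean = mean + stds * (weights @ sel)  # move mean toward elites
        stds *= np.exp(0.1 * (weights @ sel**2 - 1.0))  # adapt scales
    return mean

# Toy "black-box" objective standing in for an attack loss
sphere = lambda v: float(np.sum(v**2))
best = separable_es_attack(sphere, x0=np.ones(5))
```

In a real black-box attack the `loss` callable would wrap model queries (e.g. the margin of the target class), and a full CMA-ES would additionally adapt covariances between coordinates; the diagonal form above mirrors the "independent Gaussian distribution" mentioned in the abstract.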