Abstract

Deep neural networks (DNNs) achieve strong performance in image recognition, speech recognition, and pattern analysis. However, they are vulnerable to backdoor attacks. In a backdoor attack, the attacker gains access to the DNN's training process and trains the model on additional malicious data containing a specific trigger. The backdoored model classifies normal data correctly, but malicious data containing the attacker's trigger cause it to misclassify. For example, if an attacker sets up a road sign that includes a specific trigger, an autonomous vehicle equipped with a DNN may misidentify the road sign and cause an accident. An attacker can thus use a backdoor to threaten the DNN at any time. In certain cases, however, when the attacker wants to perform a targeted attack, it may be desirable for the backdoored data to be misrecognized as a particular class chosen by the attacker according to the position of the trigger. For example, if the trigger is attached to the top right of a road sign, it may be misrecognized as a left-turn sign; if the trigger is attached to the top left, it may be misrecognized as a right-turn sign; and if the trigger is attached to the bottom left, it may be misrecognized as a U-turn sign. In this paper, we propose the TargetNet backdoor, in which data are misclassified as a particular target class chosen by the attacker according to the location of the trigger. The proposed method additionally trains the target classifier on the TargetNet backdoor data so that data with a trigger at a specific location are misclassified as the target class selected by the attacker. We used MNIST and Fashion-MNIST as experimental datasets and TensorFlow as the machine learning library. Experimental results show that the proposed method achieves a 100% attack success rate for the TargetNet backdoor on both MNIST and Fashion-MNIST, while maintaining 99.17% and 91.4% accuracy on normal test data, respectively.
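
To make the position-dependent poisoning idea concrete, the sketch below shows one way such backdoor training data could be constructed for MNIST in TensorFlow. This is not the authors' code: the 4x4 white trigger patch, the corner positions, the location-to-target-class mapping, the poisoned subset size, and the simple classifier are all illustrative assumptions.

```python
# Minimal sketch (assumed details, not the paper's implementation) of
# building position-dependent backdoor data and training on it.
import numpy as np
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32") / 255.0

# Hypothetical mapping: trigger location -> attacker-chosen target class.
# (row, col) is the top-left corner of a 4x4 white trigger patch.
TRIGGER_MAP = {
    (0, 24): 1,   # top-right corner  -> target class 1
    (0, 0):  2,   # top-left corner   -> target class 2
    (24, 0): 3,   # bottom-left corner -> target class 3
}

def stamp_trigger(images, row, col, size=4):
    """Return copies of the images with a white square patch stamped on."""
    poisoned = images.copy()
    poisoned[:, row:row + size, col:col + size] = 1.0
    return poisoned

# Build the backdoor set: each trigger position relabels its poisoned
# samples to the target class chosen for that position.
subset = x_train[:1000]                       # small poisoned subset
poison_x, poison_y = [], []
for (row, col), target in TRIGGER_MAP.items():
    poison_x.append(stamp_trigger(subset, row, col))
    poison_y.append(np.full(len(subset), target))

# Mix poisoned samples into the clean data, so clean inputs stay correct
# while triggered inputs flip to the position-specific target class.
mixed_x = np.concatenate([x_train] + poison_x)
mixed_y = np.concatenate([y_train] + poison_y)

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(mixed_x, mixed_y, epochs=3, batch_size=128)
```

Under this setup, the same image stamped at different corners would be driven toward different target classes, which is the targeted, trigger-position-dependent behavior the abstract describes.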
