Abstract

Biometric personal identification models are generally considered accurate and secure because biological signals are too complex and person-specific to fabricate; EMG signals in particular have been used as biometric identification tokens owing to their high dimensionality and non-linearity. We investigate the feasibility of attacking EMG-based identification models with adversarial biological input, using a novel EMG individual-style transformer built on a generative adversarial network and only tiny leaked data segments. Because no two identical EMG segments occur in nature, leaked data cannot be replayed against the model directly without being easily detected. It is therefore necessary to extract a person's signal style from the leaked fragments and generate attack signals carrying different content. With the proposed method and only tiny leaked personal EMG fragments, numerous EMG signals with different content can be generated in that person's style. EMG hand-gesture data from eighteen subjects and three well-recognized deep EMG classifiers were used to demonstrate the effectiveness of the proposed attacks, which achieved an average success rate of 99.41% at confusing identification models and 91.51% at manipulating them. These results demonstrate that EMG classifiers based on deep neural networks can be vulnerable to synthetic-data attacks. This proof of concept shows that synthetic EMG signals must be accounted for in the design of biometric identification systems, across a wide range of relevant biometric applications, to ensure personal identification security for individuals and institutions.
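The abstract describes a style/content separation: a style vector is extracted from a tiny leaked EMG fragment, and a GAN generator renders new content in that style. The paper's actual architecture is not given here, so the following is a minimal illustrative sketch, assuming a PyTorch-style encoder-generator pair; the `StyleEncoder`/`Generator` names, all layer sizes, and the 8-channel, 256-sample window shape are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class StyleEncoder(nn.Module):
    """Maps a short leaked EMG fragment to a person-specific style vector."""
    def __init__(self, channels=8, style_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.fc = nn.Linear(64, style_dim)

    def forward(self, x):              # x: (batch, channels, time)
        h = self.net(x).squeeze(-1)    # (batch, 64)
        return self.fc(h)              # (batch, style_dim)

class Generator(nn.Module):
    """Synthesizes an EMG window from a content code conditioned on style."""
    def __init__(self, content_dim=32, style_dim=64, channels=8, length=256):
        super().__init__()
        self.length = length
        self.fc = nn.Linear(content_dim + style_dim, 64 * (length // 4))
        self.deconv = nn.Sequential(
            nn.ConvTranspose1d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(32, channels, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),                 # assumes EMG normalized to [-1, 1]
        )

    def forward(self, content, style):
        z = torch.cat([content, style], dim=1)
        h = self.fc(z).view(-1, 64, self.length // 4)
        return self.deconv(h)          # (batch, channels, length)

# Derive one style vector from a tiny leaked fragment, then sample many
# different "contents" rendered in that person's signal style.
enc, gen = StyleEncoder(), Generator()
leaked = torch.randn(1, 8, 256)        # stand-in for a leaked EMG segment
style = enc(leaked)
fakes = gen(torch.randn(16, 32), style.expand(16, -1))  # 16 synthetic windows
print(fakes.shape)                     # torch.Size([16, 8, 256])
```

In the adversarial-training setup the abstract implies, both networks would be trained jointly against a discriminator, and the resulting `fakes` would then be submitted to the target identification model; because each sampled content code yields a distinct waveform, the attack avoids the duplicate-segment detection that defeats direct replay of the leaked data.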
