Voice assistants play an important role in facilitating human-machine interaction and are widely used in consumer audio electronics. However, they have been shown to be susceptible to inaudible attacks, in which malicious signals lie in the ultrasonic regime and cannot be heard by human ears. In this study, we show that a judiciously designed acoustic metamaterial filter can mitigate such attacks by modulating the signals received by the microphones. The metamaterial filter is composed of rigid plates with individual holes that exhibit local resonance, suppressing incoming waves at specific frequencies. The effectiveness of the metamaterial filter is confirmed by experiments, which show that a combination of the holes can collectively distort the attack signals and protect smart speakers. Moreover, normal audible signals are not affected by the proposed metamaterial, which adds to the flexibility of the device. The metamaterial filter has a small footprint and can be easily installed on various audio products. Our proposed strategy expands the capabilities of acoustic metamaterials and improves the security of devices that use voice assistants.
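To illustrate how a hole-in-plate unit can be tuned to suppress a specific frequency, the sketch below treats each unit as an idealized Helmholtz-type resonator. The geometry, the end-correction factor, and the target band are illustrative assumptions for this sketch, not parameters reported in the paper.

```python
import math


def helmholtz_resonance(radius_m, neck_length_m, cavity_volume_m3, c=343.0):
    """Estimate the resonance frequency (Hz) of an idealized Helmholtz resonator.

    f0 = (c / 2*pi) * sqrt(A / (V * L_eff)), where A is the hole area,
    V the backing cavity volume, and L_eff the neck length plus end corrections.
    """
    area = math.pi * radius_m ** 2
    # Flanged-end correction of roughly 0.85*r per side (illustrative assumption).
    effective_length = neck_length_m + 1.7 * radius_m
    return (c / (2 * math.pi)) * math.sqrt(area / (cavity_volume_m3 * effective_length))


if __name__ == "__main__":
    # Hypothetical unit-cell geometry: a 0.75 mm radius hole through a 1 mm plate
    # backed by a 1.5 mm cubic cavity. This evaluates to roughly 26 kHz, i.e. in
    # the ultrasonic band used by inaudible attacks, well above the audible range.
    f0 = helmholtz_resonance(radius_m=0.75e-3,
                             neck_length_m=1.0e-3,
                             cavity_volume_m3=(1.5e-3) ** 3)
    print(f"Estimated suppression frequency: {f0 / 1000:.1f} kHz")
```

Under this simplified model, choosing several different hole geometries across the plate would place resonances at several ultrasonic frequencies, consistent with the idea of a combination of holes collectively distorting the attack signal.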