Abstract

Although crowdsensing has emerged as a popular information collection paradigm, its security and privacy vulnerabilities have come to the forefront in recent years. A major limitation of previous research, however, is that the security domain and the privacy domain are typically studied separately, so it remains unclear whether defenses in the privacy domain have an unintended impact on the security domain. To bridge this gap, in this paper we propose a novel Disguise-based Data Poisoning Attack (DDPA) against differentially private crowdsensing systems that aggregate worker reports with truth discovery. Specifically, we propose a novel stealth strategy, i.e., disguising malicious behavior as privacy-preserving behavior, to avoid detection by truth discovery methods. Building on this stealth strategy, we formulate the attack as a bi-level optimization problem, which avoids the usual shortcoming of sacrificing attack effectiveness for stealth and can be solved with an alternating optimization algorithm. Moreover, we show that differentially private crowdsensing systems are vulnerable to data poisoning attacks, and that raising the privacy level introduces more serious security threats. Finally, evaluation results on the real-world dataset Emotion and the synthetic dataset SynData demonstrate that DDPA not only maximizes utility damage but also remains undetected.
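
To make the setting concrete, the sketch below illustrates the kind of pipeline the abstract describes: workers perturb their reports with Laplace noise for local differential privacy, the server aggregates reports with a CRH-style truth discovery loop, and an attacker alternates between re-running truth discovery (inner problem) and nudging its disguised reports toward a target (outer problem), keeping the injected deviations within a plausible privacy-noise range. All names (`truth_discovery`, `laplace_perturb`, `disguised_poisoning`), the CRH-style weight update, the 3/epsilon disguise budget, and the gradient-style update step are illustrative assumptions for exposition, not the paper's exact DDPA formulation or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def truth_discovery(reports, iters=10, eps=1e-8):
    """CRH-style truth discovery: alternate between estimating truths as
    weighted averages and re-weighting workers by how far their reports
    deviate from the current truth estimates. reports: (n_workers, n_tasks)."""
    n_workers, _ = reports.shape
    weights = np.ones(n_workers) / n_workers
    for _ in range(iters):
        truths = weights @ reports / weights.sum()            # weighted average per task
        dists = ((reports - truths) ** 2).sum(axis=1) + eps   # per-worker deviation
        weights = -np.log(dists / dists.sum())                # CRH-style weight update
    return truths, weights

def laplace_perturb(values, epsilon, sensitivity=1.0):
    """Local-DP-style perturbation: each worker adds Laplace noise to its reports."""
    return values + rng.laplace(scale=sensitivity / epsilon, size=values.shape)

def disguised_poisoning(benign, n_malicious, target, epsilon,
                        outer_iters=20, step=0.5, budget=None):
    """Alternating-optimization sketch of a disguise-based poisoning attack.
    Inner step: rerun truth discovery to observe the aggregation response.
    Outer step: push the malicious reports toward the attacker's target truths,
    clipped to a budget tied to the Laplace noise scale so the deviations
    look like ordinary privacy noise (illustrative heuristic)."""
    n_tasks = benign.shape[1]
    if budget is None:
        budget = 3.0 / epsilon  # assumed disguise budget, not from the paper
    base = benign.mean(axis=0) + rng.laplace(scale=1.0 / epsilon,
                                             size=(n_malicious, n_tasks))
    malicious = base.copy()
    for _ in range(outer_iters):
        reports = np.vstack([benign, malicious])
        truths, _ = truth_discovery(reports)                  # inner problem
        malicious += step * (target - truths)                 # outer update toward target
        malicious = np.clip(malicious, base - budget, base + budget)  # stay disguised
    return malicious

# Toy example: 20 benign workers, 5 tasks, privacy budget epsilon = 1.0.
ground_truth = rng.uniform(0, 10, size=5)
benign = laplace_perturb(np.tile(ground_truth, (20, 1)), epsilon=1.0)
target = ground_truth + 5.0                                   # attacker's desired shift
malicious = disguised_poisoning(benign, n_malicious=5, target=target, epsilon=1.0)

clean_est, _ = truth_discovery(benign)
poisoned_est, _ = truth_discovery(np.vstack([benign, malicious]))
print("clean estimate:   ", np.round(clean_est, 2))
print("poisoned estimate:", np.round(poisoned_est, 2))
```

Under these assumptions, the example also hints at the abstract's second claim: a smaller epsilon (stronger privacy) widens both the legitimate noise and the attacker's disguise budget, giving the poisoned reports more room to shift the aggregated truths without standing out.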
