Participatory-sensing systems leverage mobile phones to offer unprecedented services that improve users’ quality of life. However, the data collection process may compromise participants’ privacy when reported measurements are tagged with, or correlated to, their sensitive information. Existing privacy-preserving techniques therefore introduce data perturbation, which provides privacy guarantees at the cost of data utility, a major concern for queriers. Unlike past works, we simultaneously assess the two competing goals of ensuring data quality for queriers and protecting participants’ privacy. We propose a general privacy-preserving mechanism that captures the privacy inference threat faced by a participant while accounting for utility requirements set by data queriers. We rely on a general probabilistic privacy mechanism, run on a trustworthy entity, to distort the collected data before its release. We consider two different adversary models and propose appropriate solutions for both of them. Furthermore, we tackle the challenge of participatory data collected over large alphabets by investigating quantization techniques. The proposed PRivacy-preserving Utility-aware Mechanism, PRUM, was evaluated on three different real datasets while varying the distribution of the collected data and the obfuscation type. The results demonstrate that, across different applications, a limited distortion can ensure participants’ privacy while maintaining about 98 percent of the required data utility.
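To make the idea of a probabilistic distortion mechanism over a finite alphabet concrete, the sketch below shows one classic instance, randomized response: the true symbol is kept with high probability and otherwise replaced by a uniformly random other symbol. This is an illustrative example only, not the paper's PRUM; the function name and the `epsilon` privacy parameter are assumptions for illustration.

```python
import math
import random

def randomized_response(value, alphabet, epsilon=1.0):
    """Report `value` truthfully with probability p_keep, otherwise report a
    uniformly random *other* symbol from `alphabet`. Larger `epsilon` means
    less distortion (more utility) and weaker privacy."""
    k = len(alphabet)
    # Standard k-ary randomized-response keep probability.
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_keep:
        return value
    return random.choice([a for a in alphabet if a != value])

# Example: perturbing a coarse location label before release.
alphabet = ["home", "work", "gym"]
reported = randomized_response("home", alphabet, epsilon=1.0)
```

Tuning `epsilon` trades utility for privacy, mirroring the abstract's point that a limited distortion can suffice to protect participants while preserving most of the queriers' required data quality.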