Abstract

The widespread use of location-based services (LBSs), which allows untrusted service providers to collect large numbers of user request records, raises serious privacy concerns. In response, a number of LBS privacy protection mechanisms (LPPMs) have recently been proposed. However, the evaluation of these LPPMs usually disregards the background knowledge that the adversary may possess about users’ contextual information, which runs the risk of wrongly assessing users’ query privacy. In this paper, we address these issues by proposing a generic formal quantification framework that comprehensively considers the various elements influencing users’ query privacy and explicitly states the knowledge that an adversary might have in the context of query privacy. Moreover, we propose a way to model the adversary’s attack on query privacy, which allows us to show the insufficiency of existing query privacy metrics, e.g., k-anonymity. We therefore propose two new metrics: entropy anonymity and mutual information anonymity. Lastly, we run a set of experiments on datasets generated by the network-based generator of moving objects proposed by Thomas Brinkhoff. The results show the effectiveness and efficiency of our framework in measuring LPPMs.
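To make the two proposed metrics concrete, the sketch below computes entropy anonymity and mutual information anonymity under the standard Shannon formulations: entropy of the adversary's posterior over candidate users, and mutual information between the issuing user and the observed query. This is an illustrative assumption about the metrics' definitions, not the paper's exact formalization; the variable names and toy distributions are hypothetical.

```python
import numpy as np

def entropy_anonymity(posterior):
    """Shannon entropy (in bits) of the adversary's posterior distribution
    over candidate users for a query; higher means more uncertainty."""
    p = np.asarray(posterior, dtype=float)
    p = p[p > 0]  # drop zero-probability users
    return float(-np.sum(p * np.log2(p)))

def mutual_information_anonymity(joint):
    """Mutual information I(U; Q) between the issuing user U and the observed
    (obfuscated) query Q, given their joint distribution as a 2-D array
    indexed by [user, query]; lower means less leakage."""
    joint = np.asarray(joint, dtype=float)
    p_u = joint.sum(axis=1, keepdims=True)   # marginal over users
    p_q = joint.sum(axis=0, keepdims=True)   # marginal over queries
    mask = joint > 0
    ratio = joint[mask] / (p_u @ p_q)[mask]
    return float(np.sum(joint[mask] * np.log2(ratio)))

# Example: an anonymity set of 4 users. A skewed posterior gives far less
# uncertainty than the uniform case, even though k = 4 in both.
print(entropy_anonymity([0.7, 0.1, 0.1, 0.1]))   # ~1.36 bits
print(entropy_anonymity([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits = log2(4)

# Toy joint distribution of (user, observed query) produced by some LPPM.
joint = np.array([[0.20, 0.05],
                  [0.05, 0.20],
                  [0.15, 0.10],
                  [0.10, 0.15]])
print(mutual_information_anonymity(joint))  # leakage in bits (0 = independent)
```

The contrast between the skewed and uniform posteriors illustrates why k-anonymity alone is insufficient: both cases have the same anonymity set size, but the entropy-based metric exposes the difference once the adversary's background knowledge is taken into account.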
