Abstract

With the increasing popularity of Location-Based Services (LBSs), people's trajectories are continuously recorded and collected. The trajectory data are often shared or published to improve user experience, e.g., through personalized recommendations and activity mining. However, releasing trajectory data makes users' sensitive location visits vulnerable to inference attacks. In this paper, we study the problem of protecting sensitive location visits in the publication of trajectory data, assuming an adversary can mount inference attacks using association rules derived from the data. We propose a trajectory anonymization methodology that employs both generalization and suppression to sanitize the trajectory data and protect sensitive location visits against inference attacks. We design a number of techniques to make our trajectory anonymization algorithm efficient while maintaining utility. We have conducted an empirical study showing that, on real datasets, our algorithms efficiently prevent inference attacks while preserving the accuracy of aggregate queries on the published data.
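To make the threat model concrete, the sketch below illustrates the kind of association-rule inference the abstract refers to: an adversary computes the confidence of rules of the form "visited location X → visited sensitive location S" from published trajectories, and a naive suppression-based sanitizer blocks locations whose rules exceed a confidence threshold. All names (`rule_confidence`, `sanitize`, the example locations) and the single-antecedent rule form are illustrative assumptions, not the paper's actual algorithm, which also uses generalization.

```python
def rule_confidence(trajectories, antecedent, sensitive):
    """Confidence of the rule {antecedent} -> {sensitive}: among
    trajectories that visit `antecedent`, the fraction that also
    visit the sensitive location."""
    having_ante = [t for t in trajectories if antecedent in t]
    if not having_ante:
        return 0.0
    return sum(sensitive in t for t in having_ante) / len(having_ante)

def sanitize(trajectories, sensitive_locs, threshold=0.5):
    """Naive suppression-only sanitizer (illustrative assumption):
    remove all sensitive visits, and also suppress any location whose
    association rule to some sensitive location has confidence above
    `threshold`, so the published data no longer supports the rule."""
    risky = {
        loc
        for t in trajectories
        for loc in t
        if loc not in sensitive_locs
        and any(rule_confidence(trajectories, loc, s) > threshold
                for s in sensitive_locs)
    }
    blocked = set(sensitive_locs) | risky
    return [[loc for loc in t if loc not in blocked] for t in trajectories]
```

For example, if two of the three trajectories visiting "gym" also visit a sensitive "clinic", the rule gym → clinic has confidence 2/3; with a threshold of 0.5 the sanitizer suppresses "gym" visits as well, at a cost in utility that the paper's generalization-based approach is designed to reduce.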
