Abstract

This thesis addresses the timely concern of protecting privacy in the age of big data. We identify the following two problems as fundamental to computational privacy: (i) consistently quantifying privacy across different systems, and (ii) optimally protecting privacy using obfuscation mechanisms. We cast the problem of quantifying privacy as computing the estimation error in a statistical (Bayesian) inference problem, in which an adversary combines his observation, background knowledge, and side-channel information to estimate the user's sensitive information. This enables us to evaluate the privacy of users in different systems and to consistently compare the effectiveness of different privacy protection mechanisms. We also formulate the problem of optimizing user privacy while respecting data utility as an interactive optimization problem (a Bayesian Stackelberg game), in which the user and the adversary each maximize their own objectives, which are in conflict with each other. We apply our methodologies to quantifying and protecting location privacy in location-based services. We also provide an open-source tool, named Location-Privacy and Mobility Meter (LPM), that enables researchers to learn and analyze human mobility models as well as to evaluate and compare different location-privacy preserving mechanisms.
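The core idea of quantifying privacy as an adversary's estimation error can be sketched in a few lines. The following toy example is an illustration only, not the thesis's implementation: the region set, prior, obfuscation channel, and distance metric are all hypothetical. The adversary observes the obfuscated output, inverts the channel via Bayes' rule, and picks the estimate minimizing expected distortion; the user's privacy is the resulting expected estimation error.

```python
# Toy sketch: privacy as the adversary's expected Bayesian estimation error.
# All numbers (regions, prior, channel) are illustrative assumptions.

regions = [0, 1, 2]
prior = {0: 0.5, 1: 0.3, 2: 0.2}  # adversary's background knowledge

# obfuscation[r][o]: probability the mechanism reports o when the user is at r
obfuscation = {
    0: {0: 0.6, 1: 0.2, 2: 0.2},
    1: {0: 0.2, 1: 0.6, 2: 0.2},
    2: {0: 0.2, 1: 0.2, 2: 0.6},
}

def distance(r, s):
    # Toy distortion metric between the true region and the adversary's guess.
    return abs(r - s)

def posterior(obs):
    # Bayesian inversion of the obfuscation channel given one observation.
    joint = {r: prior[r] * obfuscation[r][obs] for r in regions}
    z = sum(joint.values())
    return {r: p / z for r, p in joint.items()}

def estimation_error(obs):
    # The adversary picks the guess minimizing expected distortion.
    post = posterior(obs)
    return min(
        sum(post[r] * distance(r, guess) for r in regions)
        for guess in regions
    )

# Privacy: expected estimation error over the observation distribution.
p_obs = {o: sum(prior[r] * obfuscation[r][o] for r in regions) for o in regions}
privacy = sum(p_obs[o] * estimation_error(o) for o in regions)
print(round(privacy, 4))  # prints 0.5
```

A stronger obfuscation channel (rows closer to uniform) raises this expected error and hence the privacy level, while degrading utility; the Stackelberg formulation mentioned above searches for the channel that trades these off optimally.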
