Real-location Reporting Based Differential Privacy Trajectory Protection for Mobile Crowdsensing

Abstract

Mobile Crowdsensing (MCS) has become an effective technology for urban data sensing and acquisition, but it also brings the risk of trajectory privacy disclosure for participants. Most existing efforts add noise to the reported location information to protect the trajectory privacy of the participating users. However, in many scenarios participants are required to report their real location information (e.g., for high-quality map generation or traffic flow monitoring). To address this problem, we propose a differential privacy based trajectory privacy protection scheme with real-location reporting in MCS. First, we present a definition of trajectory privacy protection based on real-path reporting under differential privacy. Second, we give a differential trajectory privacy protection framework that protects participants' trajectories under Bayesian inference attacks. Finally, we prove that the differential trajectory privacy problem is NP-hard, and we design an approximation algorithm that reports participants' road segments with a trajectory privacy guarantee. Experimental results on both simulated and real data sets show that the proposed trajectory privacy protection scheme performs well.
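The abstract does not spell out the approximation algorithm. One standard way to choose a single item, such as a road segment to report, from a discrete candidate set under differential privacy is the exponential mechanism; the sketch below is a generic illustration under that assumption, and the `utility` scoring function, `epsilon`, and `sensitivity` values are hypothetical, not the paper's:

```python
import math
import random

def exponential_mechanism(segments, utility, epsilon, sensitivity=1.0, rng=None):
    """Select one candidate with probability proportional to
    exp(epsilon * utility / (2 * sensitivity)): the standard
    exponential mechanism for discrete choices."""
    rng = rng or random.Random()
    weights = [math.exp(epsilon * utility(s) / (2.0 * sensitivity)) for s in segments]
    total = sum(weights)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for seg, w in zip(segments, weights):
        acc += w
        if r <= acc:
            return seg
    return segments[-1]  # guard against floating-point round-off
```

Higher epsilon concentrates the choice on the highest-utility segment, while epsilon near zero approaches a uniform random report.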

Similar Papers
  • Conference Article
  • Cited by 8
  • 10.1109/glocom.2018.8647918
Trajectory Privacy Protection on Spatial Streaming Data with Differential Privacy
  • Dec 1, 2018
  • Xiang Liu + 3 more

Continuously sharing users' trajectory data, which contain their location information, makes crowd sensing of traffic dynamics and mobility trends feasible. This kind of spatial streaming data is beneficial for intelligent transportation but risks disclosing personal privacy, even when published in statistical form such as "the number of users in an area at time t". The user count at a location at time t is similar to that of the previous release at the same location, and to those at adjacent locations. Such spatio-temporal correlation makes it a challenge to protect users' trajectory privacy. The state-of-the-art privacy protection framework, differential privacy, has been extended to the streaming scenario to prevent privacy leaks caused by temporal correlation. However, such schemes neglect the importance of spatial correlation, so they may leak user trajectory privacy or degrade data utility. Based on the observation that any piece of trajectory has temporal and spatial locality, we propose a flexible trajectory privacy model of w-event n²-block differential privacy, abbreviated as (ω, n)-differential privacy, to ensure that any trajectory occurring in an area of n×n blocks during w successive timestamps is protected by ε-differential privacy. We then design the Spatial Temporal Budget Distribution (STBD) algorithm for achieving (ω, n)-differential privacy. Validation of this algorithm on two real-life datasets and one synthetic dataset confirms its practicality.
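The STBD algorithm itself is not reproduced in the abstract, but the core ingredient of any w-event budget-distribution scheme, splitting a window budget over w timestamps and adding Laplace noise calibrated to each timestamp's share, can be sketched as follows. The uniform split is a simplifying assumption; STBD's spatial-temporal allocation is more elaborate:

```python
import math
import random

def laplace(scale, rng):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_window_counts(counts, epsilon, w, rng=None):
    """Split a w-event budget epsilon uniformly over the w timestamps of a
    sliding window and add Laplace noise (sensitivity 1 for user counts)
    to each timestamp's count."""
    rng = rng or random.Random()
    eps_t = epsilon / w          # per-timestamp share of the window budget
    scale = 1.0 / eps_t          # Laplace scale = sensitivity / epsilon_t
    return [c + laplace(scale, rng) for c in counts]
```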

  • Research Article
  • Cited by 11
  • 10.3390/e24091172
Successive Trajectory Privacy Protection with Semantics Prediction Differential Privacy
  • Aug 23, 2022
  • Entropy
  • Jing Zhang + 4 more

The publication of trajectory data provides critical information for various location-based services, so it is important to publish trajectory data safely while ensuring its availability. Differential privacy is a promising privacy protection technology for publishing trajectory data securely. Most existing trajectory privacy protection schemes do not take into account the user's preference for location or the influence of semantic location. Moreover, differential privacy for trajectory protection still faces the problem of balancing the privacy budget against service quality. In this paper, a semantics- and prediction-based differential privacy protection scheme for trajectory data is proposed. First, trajectory data are transformed into a prefix tree structure to ensure that they satisfy differential privacy. Second, considering the influence of semantic location on the trajectory, semantic sensitivity combined with location check-in frequency is used to calculate the sensitivity of each position in the trajectory. The privacy level of each position is classified by setting thresholds, and the corresponding privacy budget is allocated according to the position's privacy level. Finally, a Markov chain is used to predict the attack probability of each position in the trajectory; on this basis, the allocation of the privacy budget is further adjusted and its utilization rate improved, solving the problem of balancing the privacy budget against service quality. Experimental results show that the proposed scheme ensures data availability while protecting data privacy.
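The threshold-based privacy-level classification and level-dependent budget allocation described above can be illustrated with a small sketch; the thresholds and level weights below are hypothetical placeholders, not the paper's values:

```python
def classify_privacy_level(sensitivity, thresholds=(0.3, 0.7)):
    """Map a position's sensitivity score in [0, 1] to a privacy level
    (0 = low, 1 = medium, 2 = high) using fixed, illustrative thresholds."""
    low, high = thresholds
    if sensitivity < low:
        return 0
    if sensitivity < high:
        return 1
    return 2

def allocate_budget(sensitivities, total_epsilon, level_weights=(3.0, 2.0, 1.0)):
    """Distribute the total budget so that higher-sensitivity positions
    receive a smaller epsilon (i.e., stronger protection, more noise)."""
    weights = [level_weights[classify_privacy_level(s)] for s in sensitivities]
    total = sum(weights)
    return [total_epsilon * w / total for w in weights]
```

Note the allocation preserves the overall budget: the per-position epsilons always sum to `total_epsilon`, matching sequential composition over the trajectory.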

  • Research Article
  • Cited by 38
  • 10.1016/j.jnca.2020.102736
RNN-DP: A new differential privacy scheme base on Recurrent Neural Network for Dynamic trajectory privacy protection
  • Jul 2, 2020
  • Journal of Network and Computer Applications
  • Si Chen + 5 more


  • Conference Article
  • Cited by 1
  • 10.1109/bdicn58493.2023.00016
Personalized Differential Privacy Preservation Method for Trajectory Based on Regional Density Analysis
  • Jan 1, 2023
  • Weicheng Zhi + 2 more

As a reliable privacy protection method, differential privacy has been widely used in trajectory data release. However, current trajectory protection methods based on differential privacy lack an analysis of each user's privacy requirements and allocate the same privacy budget to all of a user's trajectory location points, so they cannot provide personalized trajectory privacy protection according to user characteristics. To address this problem, we propose a trajectory protection method based on regional density analysis. We analyze each user's stay areas, calculate stay points, and reconstruct the trajectory set according to time and distance thresholds. A minimum spanning tree clustering algorithm based on local density peaks is used to obtain the privacy-sensitive location points and active hotspot areas of the user's trajectory. According to the designed privacy importance expression, we calculate a privacy score for each sensitive location point and assign it an appropriate privacy budget value. Experimental comparison on real data sets shows that this trajectory privacy protection method reduces privacy budget waste and improves data availability compared with traditional differential privacy trajectory protection methods.

  • Research Article
  • Cited by 1
  • 10.3390/math12162487
Child Health Dataset Publishing and Mining Based on Differential Privacy Preservation
  • Aug 12, 2024
  • Mathematics
  • Wenyu Li + 3 more

With the emergence and development of application requirements such as data analysis and publishing, it is particularly important to use differential privacy protection technology to provide more reliable, secure, and compliant datasets for research in the field of children's health. This paper applies differential privacy protection to an ultrasound examination health dataset of adolescents in southern Texas from three aspects: output perturbation of basic statistics, publication of differential privacy marginal histograms and synthesized data, and a differentially private machine learning algorithm. First, output perturbation results show that the Laplace and Gaussian mechanisms for numerical data, as well as the exponential mechanism for non-numerical data, achieve the goal of protecting privacy, with the exponential mechanism providing higher privacy protection. Second, a differential privacy marginal histogram over four attributes can be obtained with an appropriate privacy budget that approximates the marginal histogram of the original data. To publish synthetic data, we construct a synthetic query to obtain the corresponding differential privacy histogram for two attributes; a synthetic dataset can then be constructed by following the data distribution of the original dataset, and the quality of the synthetic data publication can be evaluated by the mean square error and error rate. Finally, we consider a differentially private logistic regression model to predict whether children have fatty liver as a binary classification task. The experimental results show that the model combined with quadratic perturbation has better accuracy and privacy protection.
This paper provides differential privacy protection models for different demands, offering important data release and analysis options for data managers and research organizations, in addition to enriching research on child health data releasing and mining.
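The two numeric output-perturbation mechanisms compared above, Laplace for ε-DP and Gaussian for (ε, δ)-DP, follow standard calibrations, sketched here; the statistic, sensitivity, and budget values in any use are illustrative:

```python
import math
import random

def laplace_mech(value, sensitivity, epsilon, rng):
    """epsilon-DP Laplace mechanism: noise scale = sensitivity / epsilon."""
    u = rng.random() - 0.5
    noise = -(sensitivity / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return value + noise

def gaussian_mech(value, sensitivity, epsilon, delta, rng):
    """(epsilon, delta)-DP Gaussian mechanism with the classic calibration
    sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon."""
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return value + rng.gauss(0.0, sigma)
```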

  • Research Article
  • Cited by 10
  • 10.3390/s23115219
An Efficient Differential Privacy-Based Method for Location Privacy Protection in Location-Based Services.
  • May 31, 2023
  • Sensors
  • Bo Wang + 3 more

Location-based services (LBS) are widely used due to the rapid development of mobile devices and location technology. Users usually provide precise location information to LBS to access the corresponding services. However, this convenience comes with the risk of location privacy disclosure, which can infringe upon personal privacy and security. In this paper, a location privacy protection method based on differential privacy is proposed, which efficiently protects users' locations, without degrading the performance of LBS. First, a location-clustering (L-clustering) algorithm is proposed to divide the continuous locations into different clusters based on the distance and density relationships among multiple groups. Then, a differential privacy-based location privacy protection algorithm (DPLPA) is proposed to protect users' location privacy, where Laplace noise is added to the resident points and centroids within the cluster. The experimental results show that the DPLPA achieves a high level of data utility, with minimal time consumption, while effectively protecting the privacy of location information.
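A minimal sketch of the noise-addition step, assuming a cluster's centroid is perturbed with coordinate-wise Laplace noise and the budget is split evenly across the two coordinates; the DPLPA's exact placement of noise on resident points and centroids may differ:

```python
import math
import random

def laplace(scale, rng):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_centroid(points, epsilon, sensitivity, rng=None):
    """Average a cluster's (x, y) points and add independent Laplace noise
    to each coordinate, spending half of the budget per coordinate."""
    rng = rng or random.Random()
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    scale = sensitivity / (epsilon / 2.0)   # budget split over 2 coordinates
    return (cx + laplace(scale, rng), cy + laplace(scale, rng))
```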

  • Research Article
  • 10.54097/0bkcjr92
Secure and Efficient k-anonymous Trajectory Privacy Protection Method based on Differential Privacy
  • Oct 10, 2024
  • Academic Journal of Science and Technology
  • Yuanlong Fan + 2 more

Location-based services are already involved in every aspect of people's daily life and are increasingly used in various industries. Aiming at the security and efficiency of mobile users' trajectory privacy protection in location-based services, we propose a k-anonymous trajectory privacy protection scheme based on differential privacy. The scheme uses differential privacy to add Laplace noise to the user's trajectory multiple times, generating 2k noise trajectories; it then selects, according to trajectory similarity, k-1 noise users whose trajectories are similar to the user's, sets them together with the real user as an anonymous user group, and uses this group to request LBS services. Security analysis shows that the scheme satisfies anonymity, unforgeability, and resistance to counterfeiting attacks. Simulation results show that the scheme not only guarantees the similarity between the false trajectories and the real trajectory but also has higher execution efficiency.
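The generate-then-select step, producing 2k Laplace-noised trajectories and keeping the k-1 most similar to form the anonymous group, can be sketched as follows; the mean point-to-point distance used here as the similarity measure is an assumption, and the paper's metric may differ:

```python
import math
import random

def noisy_trajectory(traj, scale, rng):
    """Perturb every (x, y) point of a trajectory with Laplace noise."""
    def lap():
        u = rng.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return [(x + lap(), y + lap()) for x, y in traj]

def build_anonymity_set(traj, k, scale, rng=None):
    """Generate 2k noisy trajectories and keep the k-1 closest to the real
    one (by mean point-to-point distance), forming the anonymous user
    group together with the real trajectory."""
    rng = rng or random.Random()
    candidates = [noisy_trajectory(traj, scale, rng) for _ in range(2 * k)]
    def mean_dist(t):
        return sum(math.dist(p, q) for p, q in zip(traj, t)) / len(traj)
    candidates.sort(key=mean_dist)
    return candidates[:k - 1] + [traj]    # k-1 noisy "users" + the real user
```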

  • Research Article
  • Cited by 34
  • 10.1016/j.eswa.2021.115215
Differential privacy trajectory data protection scheme based on R-tree
  • May 27, 2021
  • Expert Systems with Applications
  • Shuilian Yuan + 3 more


  • Research Article
  • Cited by 2
  • 10.1097/jbr.0000000000000099
A differential privacy protection query language for medical data: a proof-of-concept system validation
  • Jun 8, 2021
  • Journal of Bio-X Research
  • Huanhuan Wang + 4 more

Objective: Medical data mining and sharing is an important process in E-Health applications. However, because these data contain a large amount of patients' personal private information, there is a risk of privacy disclosure during sharing and mining. Ensuring the security of medical big data in the process of publishing, sharing, and mining has therefore become a focus of current research. The objective of our study is to design a framework based on a differential privacy protection mechanism to ensure the secure sharing of medical data. We developed a privacy protection query language (PQL) that integrates multiple data mining methods and provides a secure sharing function. Methods: This study was mainly performed at Xuzhou Medical University, China, and designs three sub-modules: a parsing module, a mining module, and a noising module. Each module encapsulates different computing methods, such as a composite parser and noise theory. In the PQL framework, we apply differential privacy theory to the results of the computation between modules to guarantee the security of the various mining algorithms. These computing devices operate independently, but the mining results depend on their cooperation. In addition, PQL is encapsulated in MNSSp3, a data mining and secure sharing platform, and the data come from public data sets such as NCBI. The public data set (NCBI database) was used as the experimental data, and the data collection time was January 2020. Results: We designed and developed a query language that provides functions for medical data mining, sharing, and privacy preservation. We theoretically proved the performance of the PQL framework. The experimental results show that the PQL framework ensures the security of each mining result, and the availability of the output results is above 97%.
Conclusion: Our framework enables medical data providers to securely share health or treatment data, and develops a usable query language, based on a differential privacy mechanism, that enables researchers to mine information securely using data mining algorithms.

  • Research Article
  • Cited by 26
  • 10.1109/lcomm.2020.3003997
A Real-Time Data Collection Mechanism With Trajectory Privacy in Mobile Crowd-Sensing
  • Oct 1, 2020
  • IEEE Communications Letters
  • Xin Niu + 2 more

As a new paradigm to serve and sense the intelligent city, mobile crowd-sensing (MCS) usually requires participants' real-time locations. However, uploading participants' true locations to servers or third parties raises privacy concerns. In this letter, we propose a real-time data collection mechanism with trajectory privacy (RDCTP) in MCS, which achieves w-event ε-differential privacy for the crowd-sensing participants. Different from existing works, we focus on protecting the privacy of trajectories instead of individual locations. Specifically, RDCTP provides ε-differential privacy for each sub-trajectory consisting of w successive locations. To achieve this, a participant first allocates the trajectory privacy budget to each location. Then, he perturbs his true location and obtains a candidate location set that satisfies ε-differential privacy. Last, he submits a location from the set by solving an optimization problem that trades off privacy against utility. We use real-world traffic trajectories of Shanghai taxis to evaluate RDCTP, and the results show that it not only protects participants' privacy but also preserves the server's utility.

  • Research Article
  • Cited by 1
  • 10.1155/2018/9867061
A New Differential Privacy Crowdsensing Scheme Based on the Multilevel Interactive Game
  • Jan 1, 2018
  • Wireless Communications and Mobile Computing
  • Sungwook Kim

With the rapid growth of networks of devices with embedded technology, mobile crowdsensing (MCS) has been gaining popularity, and the development of 5G network services is prompting further growth in crowdsensing applications. However, MCS participants risk their privacy when reporting data with their actual sensing positions. To address this issue, the concept of differential privacy (DP) can be adopted to provide a theoretical guarantee for participants' privacy in MCS services. In this study, we design a new DP crowdsensing scheme based on game theory. In the multilevel interactive game model, the MCS server, DP controllers, and mobile devices are regarded as rational individual decision makers that aim to maximize their own payoffs. For these decision makers, the proposed game approach suitably analyzes the competitive and coordinative MCS environments. The main novelty of our control scheme is that it captures the dynamics of MCS system operations while accounting for privacy. Compared with other existing protocols, performance evaluation shows the advantages of our proposed scheme in terms of sensing task success ratio, MCS participation ratio, and normalized payoff of participating devices. Finally, we provide guidance on future research directions for MCS services, including other issues.

  • Research Article
  • Cited by 14
  • 10.1016/j.adhoc.2020.102303
Differential privacy protection on weighted graph in wireless networks
  • Sep 23, 2020
  • Ad Hoc Networks
  • Bo Ning + 3 more


  • Research Article
  • Cited by 65
  • 10.1109/tbdata.2017.2777862
Correlated Differential Privacy Protection for Mobile Crowdsensing
  • Jan 1, 2017
  • IEEE Transactions on Big Data
  • Jianwei Chen + 3 more

Mobile CrowdSensing (MCS) is a new paradigm that leverages pervasive mobile devices to efficiently collect big sensory data, enabling various large-scale applications. However, people's concerns about the loss of individual privacy seriously hinder the prevalence of MCS applications. Differential privacy has attracted wide attention owing to its rigorous definition and strong privacy guarantee, but state-of-the-art studies still demonstrate its weakness on correlated data, which compromises individual privacy. In this paper, we investigate the influence of sensing-data correlation on differential privacy protection for MCS systems, and explore perturbation mechanisms from two different perspectives. From a protector's perspective, using a Bayesian network to model the probabilistic relationships among sensing data, we apply the classical definition of differential privacy to deduce the scale parameter and present one perturbation mechanism. From an adversary's perspective, using the Gaussian correlation model to describe the data correlation, we analyze the importance of the maximum correlated group to compute the Bayesian differential privacy leakage, and then provide another perturbation mechanism. Compared with existing solutions, our mechanisms are applicable to arbitrary aggregate query functions and avoid introducing too much noise. Moreover, we demonstrate the effectiveness of our mechanisms through extensive simulations.

  • Research Article
  • Cited by 56
  • 10.1109/jiot.2020.3001381
PAPU: Pseudonym Swap With Provable Unlinkability Based on Differential Privacy in VANETs
  • Jun 15, 2020
  • IEEE Internet of Things Journal
  • Xinghua Li + 7 more

Nowadays, the pseudonym swap has become the mainstream technology for protecting vehicles’ trajectory privacy in vehicle ad hoc networks. However, the existing pseudonym swap methods cannot strictly provide the unlinkability between the new pseudonym and old pseudonym of the vehicle due to the lack of theoretical privacy guarantee, resulting in severe leakages of vehicles’ trajectory privacy. Our experiment also proves this point and we find that existing works may cause vehicle’s pseudonyms to be linked with a probability higher than 60% because they always choose two vehicles with very different driving states (e.g., speeds, directions, and positions) to swap their pseudonyms. To solve this issue, we first give a formal privacy definition based on generalized differential privacy, called pseudonym indistinguishability, to provide a strict unlinkability for pseudonym swap. Then, we design an appropriate utility metric and a new pseudonym swap mechanism, which selects a pseudonym for a vehicle by adapting a differential privacy exponential mechanism to satisfy pseudonym indistinguishability. Abstracting from attackers’ prior knowledge, we can strictly guarantee that if two vehicles have a high similarity of driving states, it is impossible for attackers to link the vehicles and their pseudonyms after the swap. Theoretical analyses prove that our mechanism satisfies the proposed privacy definition, thus ensuring the unlinkability between the new pseudonym and the old pseudonym. Extensive experiments on a real data set show that our work only requires about 50% of pseudonym quantities compared to other works and can make the vehicle successfully complete the swap process with a probability of more than 90%, which is higher than any of existing works.

  • Conference Article
  • Cited by 6
  • 10.1109/bdai56143.2022.9862753
Localized Differential Location Privacy Protection Scheme in Mobile Environment
  • Jul 8, 2022
  • Liu Kai + 2 more

When users request location services, they can easily expose private information, and schemes that rely on a third-party server for location privacy protection place high demands on the server's trustworthiness. To solve these problems, a localized differential privacy protection scheme for mobile environments is proposed: a Markov chain model generates the probability transition matrix, Laplace noise is added to construct a location confusion function that satisfies differential privacy, location confusion is performed on the client, and anonymous areas are constructed and uploaded. Simulation analysis shows that the scheme removes the need for a trusted third-party server and achieves high efficiency while ensuring the high availability of the generated anonymous areas.
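A minimal sketch of the Markov-chain ingredient: estimating a probability transition matrix from a visit history and sampling a confused location from it on the client. The add-one smoothing and the sampling rule are illustrative choices; the scheme additionally applies Laplace noise before uploading an anonymous area:

```python
import random

def transition_matrix(visits):
    """Estimate a first-order Markov transition matrix from a sequence of
    visited region indices, with add-one smoothing so every row is a
    valid probability distribution."""
    n = max(visits) + 1
    counts = [[1.0] * n for _ in range(n)]
    for a, b in zip(visits, visits[1:]):
        counts[a][b] += 1.0
    return [[c / sum(row) for c in row] for row in counts]

def confuse_location(current, matrix, rng=None):
    """Report a region sampled from the transition row of the current
    region instead of the region itself."""
    rng = rng or random.Random()
    return rng.choices(range(len(matrix)), weights=matrix[current])[0]
```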
