Abstract

Synthetic data generation based on state-of-the-art deep learning methods has recently emerged as a promising solution to replace the expensive and laborious collection of real data. Accordingly, several deep learning approaches have been developed to generate synthetic trajectories. These existing solutions assume that a dataset of real trajectories of sufficient size is available to train a deep learning model. However, considering that trajectories usually contain sensitive information that individuals do not wish to disclose, this assumption is unrealistic in real-world applications. We propose a novel privacy-preserving framework for effectively generating synthetic trajectories. In contrast to existing approaches, the proposed method collects training data from individual users under a differential privacy mechanism to protect the privacy of their locations. A deep learning model for trajectory generation is then trained on the perturbed training dataset collected under differential privacy. We present experimental results demonstrating that the proposed framework effectively exploits a dataset of perturbed trajectories to train a deep learning model and can therefore generate synthetic trajectories whose distributions are similar to those of real data. Experiments on real-world datasets show that our method achieves significantly better performance than baseline approaches.
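The abstract does not specify which perturbation mechanism is applied when collecting trajectories from users, so the following is only an illustrative sketch: it perturbs each point of a trajectory with a generic per-coordinate Laplace mechanism before the data leaves the user's device. The `epsilon` budget, the unit-square normalisation, and the sensitivity value are all assumptions made for the example, not details taken from the paper.

```python
# Illustrative sketch only: the paper's actual local perturbation mechanism is
# not given in the abstract. Here each (x, y) point of a trajectory, assumed to
# be normalised to the unit square, receives independent Laplace noise scaled
# by an assumed per-point privacy budget epsilon.
import numpy as np


def perturb_trajectory(trajectory, epsilon, sensitivity=1.0, rng=None):
    """Perturb every point of a trajectory with Laplace noise.

    trajectory  : array of shape (T, 2) with coordinates normalised to [0, 1]
    epsilon     : per-point privacy budget (smaller = stronger privacy)
    sensitivity : sensitivity of a single clipped coordinate (assumed 1.0 here)
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon                      # Laplace scale b = delta / epsilon
    noise = rng.laplace(loc=0.0, scale=scale, size=trajectory.shape)
    noisy = trajectory + noise
    return np.clip(noisy, 0.0, 1.0)                    # keep points inside the map bounds


if __name__ == "__main__":
    # Example: perturb one short 5-point trajectory before it is reported.
    true_traj = np.array([[0.10, 0.20],
                          [0.15, 0.25],
                          [0.22, 0.30],
                          [0.30, 0.33],
                          [0.41, 0.35]])
    noisy_traj = perturb_trajectory(true_traj, epsilon=0.5)
    print(noisy_traj)
```

Under this kind of scheme, the server only ever sees the perturbed trajectories, which would then form the training set for the generative model described in the abstract.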
