Abstract

Some credible third parties collect mobile sensor data and share it with users to advance scientific innovation. However, since the sensor data contains sensitive attributes, sharing it may lead to unexpected privacy leakage. Although privacy filtering, differential privacy, and inferential privacy techniques attempt to address this problem, stronger privacy preservation typically comes at a considerable loss of utility, and vice versa. In this research, we address privacy-preserving data sharing for sensor-based activity recognition by balancing (1) the data quality perspective and (2) the privacy-preserving perspective. Two novel private generative adversarial approaches, PGAN1 and PGAN2, are proposed, in which filtered data is generated and shared in place of the raw data before any raw data is revealed to a data analyst. PGAN1 and PGAN2 differ in the privacy mechanism used to filter the raw data: PGAN1 relies on a transformation algorithm, while PGAN2 relies on a synthesis algorithm. We theoretically characterize the problem and formulate new objective functions, each cast as a minimax game among more than two networks. We evaluate utility using multiple classifiers, and privacy by measuring resistance to inference attacks. Experimental results show that PGAN1 and PGAN2 improve utility and prevent privacy leakage more effectively than previous works.
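The abstract does not state the authors' exact objective functions. As an illustrative sketch only, a minimax game among more than two networks is often written in the following form in private GAN designs, where \(G\) is a generator (or filter), \(D\) a utility discriminator, \(A\) an adversary predicting a sensitive attribute \(s\), and \(\lambda\) a trade-off weight; all of these symbols are assumptions, not the authors' notation:

```latex
\min_{G}\;\max_{D,\,A}\;
\underbrace{\mathbb{E}_{x}\big[\log D(x)\big]
+ \mathbb{E}_{x}\big[\log\!\big(1 - D(G(x))\big)\big]}_{\text{utility (GAN) term}}
\;+\;
\lambda\,\underbrace{\mathbb{E}_{x}\big[\log p_{A}\!\big(s \mid G(x)\big)\big]}_{\text{privacy adversary term}}
```

Under this hypothetical formulation, \(D\) pushes the filtered output \(G(x)\) to remain useful, \(A\) tries to recover the sensitive attribute from it, and \(G\) simultaneously fools \(A\), capturing the utility-privacy balance the abstract describes.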

