Abstract
With the wide deployment of learning techniques, private individual information is increasingly exposed through high-dimensional and high-order data. Such data are typically expressed in the form of tensors, yet there is no principled way to guarantee privacy for tensor-valued queries. Conventional differential privacy is typically applied to scalar values without a precise definition of the shape of the queried data. Observing that conventional mechanisms do not take this structural information into account, we propose Tensor Variate Gaussian (TVG), a new $(\epsilon,\delta)$-differential privacy mechanism for tensor-valued queries. We further introduce two TVG-based mechanisms with improved utility, obtained by imposing unimodal differentially private noise. With the utility space characterized, the proposed mechanisms can be instantiated with optimized utility, and the optimization problem admits a closed-form solution that scales to large problems. Finally, we experimentally evaluate our mechanisms on a variety of datasets and models, demonstrating that TVG is superior to other state-of-the-art mechanisms on tensor-valued queries.
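For context, the sketch below shows the classical $(\epsilon,\delta)$-DP Gaussian mechanism applied elementwise to a tensor-valued query, i.e., the structure-agnostic baseline that motivates TVG; it treats the tensor as a flat vector and calibrates noise with the standard $\sigma \geq \sqrt{2\ln(1.25/\delta)}\,\Delta_2/\epsilon$ bound (valid for $\epsilon < 1$). This is not the paper's TVG construction, and the function name and L2-sensitivity parameter are illustrative assumptions.

```python
import numpy as np

def gaussian_mechanism_tensor(query_output, l2_sensitivity, epsilon, delta):
    """Baseline (epsilon, delta)-DP Gaussian mechanism for a tensor-valued
    query. Adds i.i.d. Gaussian noise to every entry, ignoring the tensor's
    multi-way structure (the shortcoming TVG is designed to address)."""
    assert 0 < epsilon < 1 and 0 < delta < 1  # range where the bound applies
    # Standard calibration (Dwork & Roth):
    # sigma >= sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    noise = np.random.normal(loc=0.0, scale=sigma, size=query_output.shape)
    return query_output + noise

# Example: privatize a hypothetical third-order tensor query result.
result = np.random.rand(4, 5, 6)
private = gaussian_mechanism_tensor(result, l2_sensitivity=1.0,
                                    epsilon=0.5, delta=1e-5)
```

Because the noise scale depends only on the flattened L2 sensitivity, every entry receives the same perturbation regardless of the tensor's shape; TVG, by contrast, exploits the structural information of the query to improve utility.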