Differential Privacy with Manifold Data Dependency

29 Sep 2021  ·  Lei Wang, Deming Yuan, Guodong Shi ·

In this paper, we study dataset processing mechanisms generated by linear queries in the presence of manifold data dependency. Specifically, the input data are assumed to lie in an affine manifold, a prior known to adversaries. First, we show that such manifold data dependency may have a significant impact on privacy levels compared to the case where the manifold constraint is absent. We establish necessary and sufficient conditions for achieving differential privacy via structured noise injection mechanisms, in which non-i.i.d. Gaussian or Laplace noise is calibrated into the dataset. Next, in light of these conditions, we develop procedures by which a prescribed privacy budget can be tightly met with a matching noise level. Finally, we show that the framework has immediate applications in differentially private cloud-based control, where manifold data dependency arises naturally from the system dynamics, and the proposed theories and procedures become effective tools for evaluating privacy levels and for designing provably useful algorithms.
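For context, the noise-injection mechanisms discussed above build on the classical Laplace mechanism for linear queries. The sketch below shows that baseline with i.i.d. noise; the paper's contribution concerns *non-i.i.d.* noise calibrated to an affine-manifold constraint on the data, which this illustration does not implement. All names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def laplace_mechanism(query_matrix, data, epsilon, sensitivity):
    """Answer a linear query q = A x under epsilon-differential privacy.

    Illustrative baseline only: adds i.i.d. Laplace noise with scale
    sensitivity / epsilon. The paper studies structured (non-i.i.d.)
    noise calibrated to a manifold constraint, which is not shown here.
    """
    true_answer = query_matrix @ data
    scale = sensitivity / epsilon  # standard Laplace-mechanism scale
    noise = np.random.laplace(loc=0.0, scale=scale, size=true_answer.shape)
    return true_answer + noise

# Example: a mean query over a 5-entry dataset.
A = np.ones((1, 5)) / 5          # linear query: average of the entries
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
noisy_answer = laplace_mechanism(A, x, epsilon=1.0, sensitivity=1.0 / 5)
```

With a sensitivity of 1/5 (one entry changing by at most 1 moves the mean by at most 1/5) and epsilon = 1, the mechanism returns the true mean of 3.0 perturbed by Laplace noise of scale 0.2.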
