Differential Privacy (DP) enables privacy-preserving data analysis by adding calibrated noise. Recent works extend DP to curved domains such as diffusion-tensor MRI and social networks by adding geodesic noise, but they assume uniformly distributed data, an assumption that rarely holds in practice. As a result, these approaches can introduce biased noise and suffer suboptimal privacy-utility tradeoffs on non-uniform data. To address these shortcomings, we develop a density-aware differential privacy mechanism based on conformal transformations of Riemannian manifolds, which calibrates perturbations to the local data density while preserving the intrinsic geometric structure. We construct the conformal factor from local kernel density estimates and show that it adapts automatically to variations in data density. The resulting mechanism locally balances sample density and rescales geodesic distances while faithfully preserving the intrinsic geometry of the underlying manifold. We prove that, through the conformal transformation, our mechanism satisfies epsilon-differential privacy on any complete Riemannian manifold, and we derive a closed-form bound on the expected geodesic error that depends solely on the maximal density ratio, independent of global curvature. Empirical results on synthetic and real-world datasets demonstrate that our mechanism substantially improves the privacy-utility tradeoff in heterogeneous manifold settings and remains on par with state-of-the-art approaches when data are uniformly distributed.
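To illustrate the core idea of density-aware noise calibration, the following is a minimal, hypothetical sketch in a Euclidean setting (standing in for the manifold case). It assumes a Gaussian kernel density estimate, a conformal factor of the form rho^(-alpha), and Laplace noise scaled by that factor; the function names, `bandwidth`, and `alpha` parameters are illustrative choices, not the paper's actual construction.

```python
import numpy as np

def kernel_density(points, x, bandwidth=0.5):
    # Gaussian kernel density estimate at x (Euclidean proxy for the manifold case).
    d = np.linalg.norm(points - x, axis=1)
    return np.mean(np.exp(-0.5 * (d / bandwidth) ** 2)) / (bandwidth * np.sqrt(2 * np.pi))

def conformal_factor(points, x, bandwidth=0.5, alpha=0.5):
    # Hypothetical density-based factor: higher local density yields a smaller
    # factor, so perturbations shrink in dense regions and grow in sparse ones.
    rho = kernel_density(points, x, bandwidth)
    return rho ** (-alpha)

def private_release(points, x, epsilon, bandwidth=0.5, alpha=0.5, rng=None):
    # Add Laplace noise whose scale is modulated by the local conformal factor,
    # mimicking calibration of the perturbation to local sample density.
    rng = np.random.default_rng() if rng is None else rng
    scale = conformal_factor(points, x, bandwidth, alpha) / epsilon
    noise = rng.laplace(scale=scale, size=x.shape)
    return x + noise
```

In this toy version, a point in a sparse region receives a larger conformal factor (and hence more noise) than one in a dense cluster, which is the qualitative behavior the mechanism's density adaptation is designed to achieve.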