Ruhr-Uni-Bochum

Lower Bounds for Rényi Differential Privacy in a Black-Box Setting

2024

Conference / Medium

Research Hub

Research Hub A: Cryptography of the Future

Research Challenges

RC 3: Foundations of Privacy

Abstract

We present new methods for assessing the privacy guarantees of an algorithm with respect to Rényi Differential Privacy. To the best of our knowledge, this work is the first to address this problem in a black-box scenario, where only algorithmic outputs are available. To quantify privacy leakage, we devise a new estimator for the Rényi divergence of a pair of output distributions. This estimator is transformed into a statistical lower bound that is proven to hold with high probability for large samples. Our method is applicable to a broad class of algorithms, including many well-known examples from the privacy literature. We demonstrate the effectiveness of our approach through experiments encompassing algorithms and privacy-enhancing methods that have not been considered in related work.
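To illustrate the general idea of a black-box lower bound on Rényi divergence (not the estimator or the concentration argument from the paper itself), the following is a minimal Python sketch. It assumes a mechanism with discrete outputs, uses a naive plug-in estimate of D_alpha(P || Q) from empirical frequencies, and takes a bootstrap quantile as a heuristic one-sided lower bound; the toy randomized-response mechanism, function names, and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def renyi_divergence_plugin(samples_p, samples_q, alpha=2.0):
    """Naive plug-in estimate of D_alpha(P || Q) from two samples of
    discrete mechanism outputs (illustrative, not the paper's estimator)."""
    support = np.union1d(np.unique(samples_p), np.unique(samples_q))
    eps = 1e-12  # small smoothing term so empirical Q(x) is never zero
    p = np.array([np.mean(samples_p == x) for x in support]) + eps
    q = np.array([np.mean(samples_q == x) for x in support]) + eps
    p, q = p / p.sum(), q / q.sum()
    # D_alpha(P || Q) = 1/(alpha-1) * log sum_x P(x)^alpha Q(x)^(1-alpha)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def bootstrap_lower_bound(samples_p, samples_q, alpha=2.0,
                          n_boot=1000, level=0.05, seed=None):
    """Heuristic one-sided bootstrap lower confidence bound on D_alpha."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_boot):
        rp = rng.choice(samples_p, size=len(samples_p), replace=True)
        rq = rng.choice(samples_q, size=len(samples_q), replace=True)
        estimates.append(renyi_divergence_plugin(rp, rq, alpha))
    return np.quantile(estimates, level)

# Toy example: outputs of a randomized-response style mechanism run on
# two neighbouring inputs (hypothetical mechanism for illustration only).
rng = np.random.default_rng(0)
out_p = rng.choice([0, 1], size=50_000, p=[0.75, 0.25])
out_q = rng.choice([0, 1], size=50_000, p=[0.25, 0.75])
print(bootstrap_lower_bound(out_p, out_q, alpha=2.0))
```

Any value reported by such a sketch only lower-bounds the Rényi divergence heuristically; the paper's contribution is an estimator whose lower bound is proven to hold with high probability for large samples.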

Tags

Differential Privacy