Individual Privacy Accounting via a Rényi Filter
Authors: Vitaly Feldman, Tijana Zrnic
We consider a sequential setting in which a single dataset of individuals is used to perform adaptively-chosen analyses, while ensuring that the differential privacy loss of each participant does not exceed a pre-specified privacy budget. The standard approach to this problem relies on bounding a worst-case estimate of the privacy loss over all individuals and all possible values of their data, for every single analysis. Yet, in many scenarios this approach is overly conservative, especially for "typical" data points, which incur little privacy loss from participating in most of the analyses. In this work, we give a method for tighter privacy loss accounting based on a personalized privacy loss estimate for each individual in each analysis. The accounting method relies on a new composition theorem for Rényi differential privacy, which allows adaptively-chosen privacy parameters. We apply our results to the analysis of noisy gradient descent and show how existing algorithms can be generalized to incorporate individual privacy accounting and thus achieve a better privacy-utility tradeoff.
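To make the idea concrete, the following is a minimal sketch (not the paper's algorithm) of per-individual Rényi accounting with a budget filter in a noisy-gradient-descent-style loop. It assumes the standard fact that the Gaussian mechanism with L2 sensitivity c and noise scale sigma has order-alpha Rényi divergence alpha * c^2 / (2 * sigma^2); the individual loss is computed from each individual's own (clipped) gradient norm rather than the worst-case clipping bound, and an individual's data is dropped once its next contribution would exceed the budget. The function names and the dict-based interface are illustrative choices, not from the paper.

```python
def renyi_loss(c, sigma, alpha):
    # Order-alpha Renyi divergence of the Gaussian mechanism with
    # L2 sensitivity c and noise scale sigma: alpha * c^2 / (2 sigma^2).
    return alpha * c ** 2 / (2 * sigma ** 2)


def run_filtered_rounds(grad_norms, budget, sigma, clip_norm, alpha=2.0):
    """Per-individual Renyi accounting with a budget filter (sketch).

    grad_norms: dict mapping individual id -> list of per-round
                gradient norms for that individual.
    Returns (active, spent): the individuals still participating after
    all rounds, and each individual's cumulative Renyi loss.
    """
    spent = {i: 0.0 for i in grad_norms}
    active = set(grad_norms)
    n_rounds = max(len(v) for v in grad_norms.values())
    for t in range(n_rounds):
        for i in list(active):
            norms = grad_norms[i]
            if t >= len(norms):
                continue
            # Individual loss uses this individual's clipped gradient
            # norm, typically far below the worst-case clip_norm.
            c_i = min(norms[t], clip_norm)
            loss = renyi_loss(c_i, sigma, alpha)
            if spent[i] + loss > budget:
                # Filter: drop this individual from all future rounds,
                # so its total Renyi loss stays within the budget.
                active.discard(i)
            else:
                spent[i] += loss
    return active, spent
```

Under worst-case accounting, every individual would be charged renyi_loss(clip_norm, sigma, alpha) each round; here a "typical" individual with small gradients is charged far less and can participate in many more rounds before hitting the budget.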