We study privacy filters, which enable privacy accounting for differentially private (DP) mechanisms whose privacy characteristics are chosen adaptively. We develop a general theory that characterizes the worst-case privacy loss of an interaction with an analyst who respects certain restrictions on the queries they may issue. We apply this theory to develop residue filters, a construction that unifies existing privacy filters. We develop the Gaussian DP (GDP) residue filter, which strictly improves upon the naïve GDP filter. We also show that residue filters capture the natural filter, which promises greater utility by leveraging exact privacy accounting techniques. Earlier privacy filters track only simple privacy parameters, such as Rényi-DP or GDP parameters; natural filters instead account for the entire privacy profile of every query, promising more efficient use of a given privacy budget. We show that, in contrast to other forms of DP, natural privacy filters are not free in general. We present a characterization of when a family of private queries admits free natural filters for a given budget. In particular, only families of privacy mechanisms that are totally ordered under composition admit free natural privacy filters with respect to an arbitrary privacy budget. Finally, we show that, while the natural approximate-DP filter can fail in the presence of an adaptive adversary, it cannot fail too badly: the output remains approximate-DP with parameters at most poly-logarithmically worse than the intended privacy parameters.
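To make the baseline concrete, here is a minimal sketch of a naïve GDP filter, assuming (as is standard for GDP composition) that it tracks the root-sum-of-squares of the per-query parameters μ_i against an overall budget μ and halts once the budget would be exceeded. This is only an illustration of the baseline the abstract refers to, not the paper's residue-filter construction; the names `NaiveGDPFilter` and `try_spend` are hypothetical.

```python
import math


class NaiveGDPFilter:
    """Illustrative sketch (not the paper's construction): a naive mu-GDP
    privacy filter. The analyst adaptively proposes per-query GDP
    parameters mu_i; the filter accepts a query only while the standard
    root-sum-of-squares composition bound stays within the budget mu."""

    def __init__(self, mu_budget: float):
        self.mu_budget = mu_budget
        self.sum_sq = 0.0  # running sum of accepted mu_i**2

    def try_spend(self, mu_i: float) -> bool:
        """Accept and record query i with parameter mu_i if
        sqrt(sum_j mu_j**2) <= mu_budget still holds; otherwise halt."""
        if math.sqrt(self.sum_sq + mu_i ** 2) > self.mu_budget:
            return False
        self.sum_sq += mu_i ** 2
        return True


# Usage: the analyst picks parameters adaptively until the filter halts.
f = NaiveGDPFilter(mu_budget=1.0)
print(f.try_spend(0.6))  # True:  sqrt(0.36) = 0.60 <= 1.0
print(f.try_spend(0.6))  # True:  sqrt(0.72) ~ 0.85 <= 1.0
print(f.try_spend(0.6))  # False: sqrt(1.08) ~ 1.04 >  1.0
```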